WorldWideScience

Sample records for perform quantitative analysis

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
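
    A minimal sketch of the pooling step described above: per-study sensitivity and specificity are computed from 2x2 counts and combined with inverse-variance weights on the logit scale. This only illustrates the idea; such meta-analyses typically fit bivariate random-effects models, and the study counts below are invented, not taken from the included articles.

```python
# Minimal sketch: pooling sensitivity/specificity from 2x2 counts on the logit
# scale with inverse-variance weights. Counts are illustrative placeholders.
import numpy as np

# (TP, FP, TN, FN) per study -- hypothetical values
studies = [(45, 10, 30, 5), (60, 8, 40, 12), (38, 14, 55, 7)]

def pooled_logit(events, totals):
    """Inverse-variance pooled proportion on the logit scale (0.5 continuity correction)."""
    p = (np.asarray(events) + 0.5) / (np.asarray(totals) + 1.0)
    logit = np.log(p / (1 - p))
    var = 1.0 / (np.asarray(totals) * p * (1 - p))
    pooled = np.sum(logit / var) / np.sum(1.0 / var)
    return 1.0 / (1.0 + np.exp(-pooled))

tp, fp, tn, fn = map(np.array, zip(*studies))
print("pooled sensitivity:", round(pooled_logit(tp, tp + fn), 3))
print("pooled specificity:", round(pooled_logit(tn, tn + fp), 3))
```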

  2. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  3. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  4. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    Full Text Available A comprehensive experimental study to analyze the security performance of a WLAN based on IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. It establishes the fact that there is always a tradeoff between the security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented the statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing a robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to the assessment of the suitability of security protocols for given real-time applications.

  5. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
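
    A minimal sketch of why an m-sequence-coded waveform behaves differently from a square wave in the frequency domain, using SciPy's maximal-length sequence generator. It does not reproduce the identification-system analysis of the paper; the sequence order and the square-wave frequency are arbitrary choices for illustration.

```python
# Minimal sketch: compare the amplitude spectrum of an m-sequence-coded waveform
# with that of a square wave of the same length, to show how the m-sequence
# spreads energy across frequencies. Parameters are arbitrary.
import numpy as np
from scipy.signal import max_len_seq

nbits = 8                                   # m-sequence order (arbitrary)
seq, _ = max_len_seq(nbits)                 # 0/1 maximal-length sequence, length 2**nbits - 1
m_wave = 2.0 * seq - 1.0                    # map to +/-1 transmit levels

n = m_wave.size
square = np.sign(np.sin(2 * np.pi * 4 * np.arange(n) / n))  # 4-cycle square wave

for name, w in [("m-sequence", m_wave), ("square wave", square)]:
    amp = np.abs(np.fft.rfft(w)) / n        # single-sided amplitude spectrum (DC excluded below)
    print(f"{name}: spectral peak {amp[1:].max():.3f}, mean {amp[1:].mean():.3f}")
```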

  6. High-performance hybrid Orbitrap mass spectrometers for quantitative proteome analysis

    DEFF Research Database (Denmark)

    Williamson, James C; Edwards, Alistair V G; Verano-Braga, Thiago

    2016-01-01

    We present basic workups and quantitative comparisons for two current generation Orbitrap mass spectrometers, the Q Exactive Plus and Orbitrap Fusion Tribrid, which are widely considered two of the highest performing instruments on the market. We assessed the performance of two quantitative methods...... on both instruments, namely label-free quantitation and stable isotope labeling using isobaric tags, for studying the heat shock response in Escherichia coli. We investigated the recently reported MS3 method on the Fusion instrument and the potential of MS3-based reporter ion isolation Synchronous...... Precursor Selection (SPS) and its impact on quantitative accuracy. We confirm that the label-free approach offers a more linear response with a wider dynamic range than MS/MS-based isobaric tag quantitation and that the MS3/SPS approach alleviates but does not eliminate dynamic range compression. We...

  7. The Quantitative Analysis of a team game performance made by men basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of a team game performance made by men's basketball teams at the 2008 Olympic Games Aims: To find the reasons for successes and failures of teams in the Olympic play-off using quantitative (numerical) observation of selected game statistics. Method: The thesis was made on the basis of a quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained describe the most essential events behind a team's win or loss. Keywords: basketball, team...

  8. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been opened for nationwide-common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 subjects of PIXE in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. Especially, a ''standard-free method for quantitative analysis'' made it possible to perform analysis of infinitesimal samples, powdered samples and untreated bio samples, which could not be well analyzed quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made the process for target preparation quite easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  9. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    Science.gov (United States)

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs of abuse testing, and forensic cases. As the variations of benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated accurate method is essential for the quantitative analysis of these main drug categories. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane: ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limit of detection and quantitation were 30 and 100 ng/mL respectively for the four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation within acceptable limits) and is suitable for application in the forensic toxicology laboratory. PMID:27635251
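
    A minimal sketch of the calibration arithmetic behind figures of merit like those reported above: a linear calibration curve is fitted and LOD/LOQ are estimated from the residual standard deviation and the slope (the ICH-style 3.3σ/S and 10σ/S rules). The concentrations and peak areas are invented placeholders, not data from this study.

```python
# Minimal sketch: linear calibration fit with LOD/LOQ from residual SD and slope.
# Concentrations and peak areas are invented for illustration.
import numpy as np

conc = np.array([30, 100, 300, 1000, 3000], dtype=float)   # ng/mL calibration levels
area = np.array([1.1, 3.6, 10.4, 35.2, 104.9])             # detector response (arbitrary units)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r = np.corrcoef(conc, area)[0, 1]
sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))  # residual standard deviation

print(f"r = {r:.4f}")
print(f"LOD ~ {3.3 * sigma / slope:.1f} ng/mL, LOQ ~ {10 * sigma / slope:.1f} ng/mL")
```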

  10. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration with an introduction, titration curves, and diazotization titration, precipitation titration, electrometric titration and quantitative analysis.

  11. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  12. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    Science.gov (United States)

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  13. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
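
    A minimal sketch of two of the technical-performance metrics this framework standardizes, computed from hypothetical test-retest measurements: the within-subject standard deviation with its 95% repeatability coefficient (RC = 2.77 × wSD) and the within-subject coefficient of variation. The measurement values are invented.

```python
# Minimal sketch: repeatability metrics from paired test-retest biomarker measurements.
import numpy as np

# rows = subjects, columns = test / retest measurements (invented values)
m = np.array([[10.2, 10.8],
              [ 7.9,  8.4],
              [12.5, 11.9],
              [ 9.1,  9.6]])

diff = m[:, 0] - m[:, 1]
mean = m.mean(axis=1)
wsd = np.sqrt(np.mean(diff ** 2) / 2.0)                    # within-subject SD from replicates
rc = 2.77 * wsd                                            # 95% repeatability coefficient
wcv = np.sqrt(np.mean((diff ** 2) / (2.0 * mean ** 2)))    # within-subject coefficient of variation

print(f"within-subject SD = {wsd:.3f}, RC = {rc:.3f}, wCV = {100 * wcv:.1f}%")
```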

  14. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  15. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, J., E-mail: takahashi.3ct.jun@jp.nssmc.com [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Kawakami, K. [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Raabe, D. [Max-Planck Institut für Eisenforschung GmbH, Department for Microstructure Physics and Alloy Design, Max-Planck-Str. 1, 40237 Düsseldorf (Germany)

    2017-04-15

    Highlights: • Quantitative analysis in Fe-Cu alloy was investigated in voltage and laser atom probe. • In voltage-mode, apparent Cu concentration exceeded actual concentration at 20–40 K. • In laser-mode, the concentration never exceeded the actual concentration even at 20 K. • Detection loss was prevented due to the rise in tip surface temperature in laser-mode. • Preferential evaporation of solute Cu was reduced in laser-mode. - Abstract: The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions.

  16. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the zinc concentration were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the values of sulphur concentration for 40 samples agree well with the average value for a typical Japanese subject and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method, too. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method.

  17. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Bone is a hard tissue, and it has long been difficult to observe the interior of living bone tissue. With the progress of microscopy and fluorescent probe technology in recent years, it has become possible to observe the various activities of the many cell types that form bone. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection alone. The development of methodologies for microscopic image processing and data analysis has therefore been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline the basic image processing technology for quantitative analysis of live imaging data of bone.

  18. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  19. Evaluation of breast lesions by contrast enhanced ultrasound: Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Wan Caifeng; Du Jing; Fang Hua; Li Fenghua; Wang Lin

    2012-01-01

    Objective: To evaluate and compare the diagnostic performance of qualitative, quantitative and combined analysis for characterization of breast lesions on contrast enhanced ultrasound (CEUS), with histological results used as the reference standard. Methods: Ninety-one patients with 91 breast lesions categorized BI-RADS 3–5 at US or mammography underwent CEUS. All lesions underwent qualitative and quantitative enhancement evaluation. Receiver operating characteristic (ROC) curve analysis was performed to evaluate the diagnostic performance of each analytical method for discrimination between benign and malignant breast lesions. Results: Histopathologic analysis of the 91 lesions revealed 44 benign and 47 malignant lesions. On qualitative analysis, benign and malignant lesions differed significantly in enhancement patterns. For the areas under the ROC curve of qualitative (Az1), quantitative (Az2 = 0.768) and combined (Az3 = 0.926) analysis, the values of Az1 and Az3 were significantly higher than that for Az2 (p = 0.024 and p = 0.008, respectively), but there was no significant difference between Az1 and Az3 (p = 0.625). Conclusions: The diagnostic performance of qualitative and combined analysis was significantly higher than that of quantitative analysis. Although quantitative analysis has the potential to differentiate benign from malignant lesions, it has not yet improved the final diagnostic accuracy.

  20. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their own characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for certain samples. Neutron diffraction has strong capability especially for oxides due to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become a stronger tool for the analysis of industrial materials. By doing this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with others from advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  1. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe.

    Science.gov (United States)

    Takahashi, J; Kawakami, K; Raabe, D

    2017-04-01

    The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Therefore chemical analysis, mostly performed by X-ray fluorescence (XRF), forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as microscopic point counting are well known. With XRD and Rietveld analysis a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software such as complexity, low performance and severe numerical instability were prohibitive for automated use. The latest developments of on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction.

  3. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.

  4. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    Science.gov (United States)

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. This independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https
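
    A minimal sketch of the general idea of a Hotelling-style T² statistic that uses an externally supplied, knowledge-based covariance instead of the sample covariance. This is not the authors' exact T2GA formulation; the log-ratios, association scores and assumed common variance are invented for illustration.

```python
# Minimal sketch: Hotelling-style T^2 for a protein set with a knowledge-based
# covariance built from pairwise association scores (all values invented).
import numpy as np

log_ratios = np.array([0.8, 1.1, -0.3, 0.6])      # mean log2 fold changes of 4 proteins

# Pairwise association scores (e.g. scaled STRING confidence), 1.0 on the diagonal
assoc = np.array([[1.0, 0.7, 0.2, 0.4],
                  [0.7, 1.0, 0.3, 0.5],
                  [0.2, 0.3, 1.0, 0.1],
                  [0.4, 0.5, 0.1, 1.0]])

n_samples = 3                                      # replicates per condition (assumed)
cov = assoc * 0.25                                 # assumed common variance of 0.25
t2 = n_samples * log_ratios @ np.linalg.solve(cov, log_ratios)
print(f"T^2 = {t2:.2f}")
```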

  5. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  6. Quantitative analysis of total retronecine esters-type pyrrolizidine alkaloids in plant by high performance liquid chromatography

    International Nuclear Information System (INIS)

    Zhang Fang; Wang Changhong; Xiong Aizhen; Wang Wan; Yang Li; Branford-White, Christopher J.; Wang Zhengtao; Bligh, S.W. Annie

    2007-01-01

    Pyrrolizidine alkaloids (PAs) are alkaloids which typically contain a necine (7-hydroxy-1-hydroxymethyl-6,7-dihydro-5H-pyrrolizidine) base unit, and they can be found in one third of the higher plants around the world. They are hepatotoxic, mutagenic and carcinogenic and pose a threat to human health and safety. A specific, quick and sensitive method is therefore needed to detect and quantify the PAs, sometimes present in trace amounts in herbs, tea or food products. Based on high performance liquid chromatography with prior derivatization of the alkaloids using o-chloranil and Ehrlich's reagent, we report an improved method for quantitative analysis of the total amount of retronecine esters-type pyrrolizidine alkaloids (RET-PAs) in a plant extract. The total quantitation of RET-PAs is achieved because a common colored retronecine marker, a 7-ethoxy-1-ethoxylmethyl retronecine derivative, is produced with all the different RET-PAs during the derivatization reaction. The chemical identity of the common retronecine marker was characterized on-line by positive mode electrospray ionization mass spectrometry and nuclear magnetic resonance spectroscopy. The limit of detection using the improved method is 0.26 nmol mL⁻¹ and the limit of quantitation is 0.79 nmol mL⁻¹. The advantages of this method are much enhanced sensitivity in detection and quantitation, and no restriction on the choice of RET-PA as a calibration standard. Application of the developed method to the quantitation of total RET esters-type PAs in Senecio scandens from different regions of China is also reported.

  7. Quantitative analysis of total retronecine esters-type pyrrolizidine alkaloids in plant by high performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Fang; Wang Changhong; Xiong Aizhen; Wang Wan; Yang Li [Key Laboratory of Standardization of Chinese Medicines of Ministry of Education, Shanghai University of Traditional Chinese Medicine, 1200 Cai Lun Road, Zhangjiang Hi-Tech Park, Shanghai 201203 (China); Branford-White, Christopher J. [Institute for Health Research and Policy, London Metropolitan University, 166-220 Holloway Road, London N7 8DB (United Kingdom); Wang Zhengtao [Key Laboratory of Standardization of Chinese Medicines of Ministry of Education, Shanghai University of Traditional Chinese Medicine, 1200 Cai Lun Road, Zhangjiang Hi-Tech Park, Shanghai 201203 (China); School of Chinese Pharmacy, China Pharmaceutical University, Nanjing 210038 (China)], E-mail: wangzt@shutcm.edu.cn; Bligh, S.W. Annie [Institute for Health Research and Policy, London Metropolitan University, 166-220 Holloway Road, London N7 8DB (United Kingdom)], E-mail: a.bligh@londonmet.ac.uk

    2007-12-12

    Pyrrolizidine alkaloids (PAs) are alkaloids which typically contain a necine (7-hydroxy-1-hydroxymethyl-6,7-dihydro-5H-pyrrolizidine) base unit, and they can be found in one third of the higher plants around the world. They are hepatotoxic, mutagenic and carcinogenic and pose a threat to human health and safety. A specific, quick and sensitive method is therefore needed to detect and quantify the PAs, sometimes present in trace amounts in herbs, tea or food products. Based on high performance liquid chromatography with prior derivatization of the alkaloids using o-chloranil and Ehrlich's reagent, we report an improved method for quantitative analysis of the total amount of retronecine esters-type pyrrolizidine alkaloids (RET-PAs) in a plant extract. The total quantitation of RET-PAs is achieved because a common colored retronecine marker, a 7-ethoxy-1-ethoxylmethyl retronecine derivative, is produced with all the different RET-PAs during the derivatization reaction. The chemical identity of the common retronecine marker was characterized on-line by positive mode electrospray ionization mass spectrometry and nuclear magnetic resonance spectroscopy. The limit of detection using the improved method is 0.26 nmol mL⁻¹ and the limit of quantitation is 0.79 nmol mL⁻¹. The advantages of this method are much enhanced sensitivity in detection and quantitation, and no restriction on the choice of RET-PA as a calibration standard. Application of the developed method to the quantitation of total RET esters-type PAs in Senecio scandens from different regions of China is also reported.

  8. WE-G-207-05: Relationship Between CT Image Quality, Segmentation Performance, and Quantitative Image Feature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J; Nishikawa, R [University of Pittsburgh, Pittsburgh, PA (United States); Reiser, I [The University of Chicago, Chicago, IL (United States); Boone, J [UC Davis Medical Center, Sacramento, CA (United States)

    2015-06-15

    Purpose: Segmentation quality can affect quantitative image feature analysis. The objective of this study is to examine the relationship between computed tomography (CT) image quality, segmentation performance, and quantitative image feature analysis. Methods: A total of 90 pathology-proven breast lesions in 87 dedicated breast CT images were considered. An iterative image reconstruction (IIR) algorithm was used to obtain CT images with different quality. With different combinations of 4 variables in the algorithm, this study obtained a total of 28 different qualities of CT images. Two imaging tasks/objectives were considered: 1) segmentation and 2) classification of the lesion as benign or malignant. Twenty-three image features were extracted after segmentation using a semi-automated algorithm and 5 of them were selected via a feature selection technique. Logistic regression was trained and tested using leave-one-out cross-validation and its area under the ROC curve (AUC) was recorded. The standard deviation of a homogeneous portion and the gradient of a parenchymal portion of an example breast were used as estimates of image noise and sharpness, respectively. The DICE coefficient was computed using a radiologist’s drawing on the lesion. Mean DICE and AUC were used as performance metrics for each of the 28 reconstructions. The relationship between segmentation and classification performance under different reconstructions was compared. Distributions (median, 95% confidence interval) of DICE and AUC for each reconstruction were also compared. Results: Moderate correlation (Pearson’s rho = 0.43, p-value = 0.02) between DICE and AUC values was found. However, the variation between DICE and AUC values for each reconstruction increased as the image sharpness increased. There was a combination of IIR parameters that resulted in the best segmentation with the worst classification performance. Conclusion: There are certain images that yield better segmentation or classification
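
    A minimal sketch of the two quantities related in this abstract: the DICE coefficient between an automated segmentation and a reference drawing, and the Pearson correlation between per-reconstruction mean DICE and classification AUC. All arrays are toy data, not the study's breast CT measurements.

```python
# Minimal sketch: DICE overlap of two masks and the correlation between
# per-reconstruction segmentation (DICE) and classification (AUC) performance.
import numpy as np
from scipy.stats import pearsonr

def dice(a, b):
    """DICE overlap of two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

seg = np.zeros((8, 8), dtype=bool); seg[2:6, 2:6] = True    # automated segmentation (toy)
ref = np.zeros((8, 8), dtype=bool); ref[3:7, 2:6] = True    # reference drawing (toy)
print(f"DICE = {dice(seg, ref):.2f}")

# Mean DICE and AUC across hypothetical reconstructions
mean_dice = np.array([0.78, 0.81, 0.74, 0.80, 0.76])
auc       = np.array([0.70, 0.74, 0.69, 0.71, 0.72])
rho, p = pearsonr(mean_dice, auc)
print(f"Pearson rho = {rho:.2f}, p = {p:.3f}")
```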

  9. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  10. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for the design of rapid methods for quality control of suppositories with different components.

  11. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enables the total number of conductions to be estimated from the instructions of the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined in its management. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists and task tree structures, as well as other supporting tables. Estimation of the number of task conductions based on the operating procedures for HEP calculation was enabled through the generic task database and framework. To verify the framework's applicability, a case study of the simulated experiments was performed and analyzed using graphical user interfaces developed in this study.
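
    A minimal sketch of the kind of relational structure the abstract describes, with tables for generic tasks, procedure lists, and a task tree that maps procedure steps to generic tasks so that conduction counts can be derived from procedures. The table and column names are hypothetical, not KAERI's actual schema.

```python
# Minimal sketch: hypothetical schema linking procedures, steps and generic tasks,
# with a query that estimates conduction counts per generic task. All names invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE generic_task (
    task_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL                -- e.g. 'verify indicator' (hypothetical)
);
CREATE TABLE procedure_list (
    proc_id INTEGER PRIMARY KEY,
    title   TEXT NOT NULL
);
CREATE TABLE task_tree (                 -- maps each procedure step to a generic task
    proc_id INTEGER REFERENCES procedure_list(proc_id),
    step_no INTEGER,
    task_id INTEGER REFERENCES generic_task(task_id),
    PRIMARY KEY (proc_id, step_no)
);
""")

conn.executemany("INSERT INTO generic_task VALUES (?, ?)",
                 [(1, "verify indicator"), (2, "manipulate valve")])
conn.execute("INSERT INTO procedure_list VALUES (1, 'Example operating procedure')")
conn.executemany("INSERT INTO task_tree VALUES (?, ?, ?)",
                 [(1, 1, 1), (1, 2, 2), (1, 3, 1)])

# Estimated number of conductions per generic task = procedure steps mapped to it
for name, n in conn.execute("""
    SELECT g.name, COUNT(*) FROM task_tree t
    JOIN generic_task g ON g.task_id = t.task_id
    GROUP BY g.name"""):
    print(name, n)
```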

  12. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enables the total number of conductions to be estimated from the instructions of the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined in its management. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists and task tree structures, as well as other supporting tables. Estimation of the number of task conductions based on the operating procedures for HEP calculation was enabled through the generic task database and framework. To verify the framework's applicability, a case study of the simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  13. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. SEM observations, EDS analyses and electron backscatter diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  14. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified

  15. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. Quantitative analysis of the constructed scenario is studied in terms of annual effective dose equivalent. This study is conducted sequentially according to the performance assessment of the radioactive waste disposal facility: ground water flow analysis, source term analysis, ground water transport, surface water transport, dose and pathways. The routine program module chain VAM2D-PAGAN-GENII is used for quantitative scenario analysis. Detailed data used in this module come from experimental data for Korean territory and default data given within the module. In the case of blank data for code execution, values are estimated through reasonable engineering sense.

  16. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  17. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Fingerprint analysis, multi-component quantitation, and antioxidant activity for the quality evaluation of Salvia miltiorrhiza var. alba by high-performance liquid chromatography and chemometrics.

    Science.gov (United States)

    Zhang, Danlu; Duan, Xiaoju; Deng, Shuhong; Nie, Lei; Zang, Hengchang

    2015-10-01

    Salvia miltiorrhiza Bge. var. alba C.Y. Wu and H.W. Li has wide prospects in clinical practice. A useful comprehensive method was developed for the quality evaluation of S. miltiorrhiza var. alba by three quantitative parameters: high-performance liquid chromatography fingerprint, ten-component contents, and antioxidant activity. The established method was validated for linearity, precision, repeatability, stability, and recovery. Principal components analysis and hierarchical clustering analysis were both used to evaluate the quality of the samples from different origins. The results showed that there were category discrepancies in quality of S. miltiorrhiza var. alba samples according to the three quantitative parameters. Multivariate linear regression was adopted to explore the relationship between components and antioxidant activity. Three constituents, namely, danshensu, rosmarinic acid, and salvianolic acid B, significantly correlated with antioxidant activity, and were successfully elucidated by the optimized multivariate linear regression model. The combined use of high-performance liquid chromatography fingerprint analysis, simultaneous multicomponent quantitative analysis, and antioxidant activity for the quality evaluation of S. miltiorrhiza var. alba is a reliable, comprehensive, and promising approach, which might provide a valuable reference for other herbal products in general to improve their quality control. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
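
    A minimal sketch of the specific binding ratio (SBR) computation evaluated here: SBR = (striatal uptake − reference) / reference, with the reference taken as the 75th percentile of whole-brain voxel intensities. The voxel values are simulated; a real analysis operates on spatially normalized SPECT volumes with predefined MNI-space ROIs.

```python
# Minimal sketch: putamen SBR with a 75th-percentile whole-brain reference.
# Voxel intensities are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
whole_brain = rng.normal(loc=10.0, scale=2.0, size=50_000)   # non-displaceable uptake (toy)
putamen     = rng.normal(loc=28.0, scale=3.0, size=800)      # specific + non-displaceable (toy)

reference = np.percentile(whole_brain, 75)                   # 75th percentile reference
sbr_putamen = (putamen.mean() - reference) / reference
print(f"reference = {reference:.2f}, putamen SBR = {sbr_putamen:.2f}")
```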

  20. Utility of DWI with quantitative ADC values in ovarian tumors: a meta-analysis of diagnostic test performance.

    Science.gov (United States)

    Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui

    2018-01-01

    Background Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.

  1. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer and for the radioactivity contribution from blood volume within the ROI, and the estimation of the non-displaceable ligand concentration, are also reviewed briefly

  2. The Performance of Indian Equity Funds in the Era of Quantitative Easing

    Directory of Open Access Journals (Sweden)

    Ömer Faruk Tan

    2015-10-01

    Full Text Available This study aims to evaluate the performance of Indian equity funds between January 2009 and October 2014. This study period coincides with the era of quantitative easing, which influenced the financial markets of developing economies. After the global financial crisis of 2008 came a period of quantitative easing (QE), creating an increase in the money supply and leading to a capital flow from developed countries to developing countries. During this 5-year 10-month period, in which the relevant quantitative easing continued, the Indian CNX500 price index yielded approximately 21% compounded on average, per annum. In this study, Indian equity funds are examined in order to compare these funds' performance within this period. Within this scope, 12 Indian equity funds are chosen. In order to measure these funds' performances, the Sharpe ratio (1966), Treynor ratio (1965), and Jensen's alpha (1968) methods are used. Jensen's alpha is also used in identifying the selectivity skills of fund managers. Additionally, the Treynor & Mazuy (1966) regression analysis method is applied to show the market timing ability of fund managers.
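
    The three classical performance measures and the Treynor & Mazuy timing regression mentioned above can be computed in a few lines; the sketch below uses invented monthly fund, market and risk-free returns purely to illustrate the formulas, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly returns: rm = market (e.g. CNX500), rp = fund, rf = risk-free.
rm = rng.normal(0.015, 0.05, 70)
rp = 0.002 + 0.9 * rm + rng.normal(0, 0.02, 70)
rf = 0.006

excess_p, excess_m = rp - rf, rm - rf
beta = np.cov(excess_p, excess_m)[0, 1] / np.var(excess_m, ddof=1)

sharpe  = excess_p.mean() / rp.std(ddof=1)           # reward per unit of total risk
treynor = excess_p.mean() / beta                     # reward per unit of market risk
jensen  = excess_p.mean() - beta * excess_m.mean()   # selectivity (alpha)

# Treynor & Mazuy: a positive gamma on the squared term suggests market-timing ability.
X = np.column_stack([np.ones_like(excess_m), excess_m, excess_m ** 2])
alpha, beta_tm, gamma = np.linalg.lstsq(X, excess_p, rcond=None)[0]

print(f"Sharpe = {sharpe:.3f}, Treynor = {treynor:.4f}, "
      f"Jensen alpha = {jensen:.4f}, timing gamma = {gamma:.3f}")
```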

  3. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    Science.gov (United States)

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for the differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Receiver operating characteristic curves were plotted and the areas under them (AUROC) were used to evaluate the diagnostic performance of both the qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis in this study obtained the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.

  4. Quantitative aspects of the clinical performance of transverse tripolar spinal cord stimulation

    NARCIS (Netherlands)

    Wesselink, W.A.; Holsheimer, J.; King, Gary W.; Torgerson, Nathan A.; Boom, H.B.K.

    1999-01-01

    A multicenter study was initiated to evaluate the performance of the transverse tripolar system for spinal cord stimulation. Computer modeling had predicted steering of paresthesia with a dual channel stimulator to be the main benefit of the system. The quantitative analysis presented here includes

  5. [Qualitative and quantitative analysis of amygdalin and its metabolite prunasin in plasma by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry].

    Science.gov (United States)

    Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi

    2014-06-01

    A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma sample was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time of flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma sample was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer with an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis results showed that amygdalin and its metabolite prunasin were detected in the plasma sample. The quantitative analysis results showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990, and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) lower than 9.20%, and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.

  6. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    Full Text Available The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). By using the single Bragg-edge analysis mode of RITS, evaluations of the crystal lattice plane spacing (d-spacing) relating to macro-strain and of the FWHM (full width at half maximum) of the d-spacing distribution relating to micro-strain can be achieved. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast method for determining grain orientation from the Bragg-dip neutron transmission spectrum. In this paper, these imaging examples, together with the spectrum analysis methods and their reliabilities evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.

  7. Quantitative Motion Analysis of Tai Chi Chuan: The Upper Extremity Movement

    Directory of Open Access Journals (Sweden)

    Tsung-Jung Ho

    2018-01-01

    Full Text Available A quantitative and reproducible analysis of the standard body movements in Tai Chi Chuan (TCC) was performed in this study. We aimed to provide a reference of the upper extremities for standardizing TCC practice. Microsoft Kinect was used to record the motion during the practice of TCC. The preparation form and eight essential forms of TCC performed by an instructor and 101 practitioners were analyzed in this study. The instructor completed an entire TCC practice cycle and performed the cycle 12 times. An entire cycle of TCC was performed by the practitioners and images were recorded for statistical analysis. The performance of the instructor showed high similarity (Pearson correlation coefficient, r = 0.71–0.84) to the first practice cycle. Among the 9 forms, the lay form had the highest similarity (rmean = 0.90) and the push form had the lowest similarity (rmean = 0.52). For the practitioners, the ward off form (rmean = 0.51) and the roll back form (rmean = 0.45) had the highest similarity, with moderate correlation. We used Microsoft Kinect to record the spatial coordinates of the upper extremity joints during the practice of TCC and used the data to perform quantitative and qualitative analysis of the joint positions and elbow joint angle.
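
    The similarity scores quoted above are Pearson correlations between movement curves; a minimal sketch with synthetic elbow-angle time series (standing in for resampled Kinect joint data) is shown below.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical elbow-angle time series (degrees), resampled to a common length:
# one reference cycle by the instructor and one cycle by a practitioner.
t = np.linspace(0, 1, 200)
reference    = 90 + 40 * np.sin(2 * np.pi * t)
practitioner = 90 + 38 * np.sin(2 * np.pi * t + 0.15) + rng.normal(0, 3, t.size)

# Pearson correlation as the similarity score between the two movement curves.
r = np.corrcoef(reference, practitioner)[0, 1]
print(f"similarity r = {r:.2f}")
```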

  8. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to the actually observed sequences.
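
    Two of the quantitative characteristics mentioned above, n-gram distributions and the Zipf's law coefficient, are easy to illustrate without the package itself; the sketch below is a plain-Python/NumPy illustration and does not reproduce Quantiprot's actual interface. The example sequence is arbitrary.

```python
from collections import Counter
import numpy as np

def ngram_counts(sequence, n=2):
    """Count overlapping n-grams in an amino acid sequence."""
    return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

def zipf_coefficient(counts):
    """Slope of log(frequency) versus log(rank); Zipf's law predicts roughly -1."""
    freqs = np.sort(np.array(list(counts.values()), dtype=float))[::-1]
    ranks = np.arange(1, freqs.size + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
counts = ngram_counts(seq, n=2)
print("most common bigrams:", counts.most_common(3))
print("Zipf coefficient:", round(zipf_coefficient(counts), 2))
```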

  9. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  10. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples, measured with a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample
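
    The two numerical ingredients mentioned above, a polynomial energy-channel calibration and least-squares Gaussian peak fitting, can be sketched as follows. The peak channels, energies and the spectrum slice are synthetic, and NumPy/SciPy are assumed purely for illustration, since the original code ran in a different environment on FELIX microcomputers.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Energy calibration: a low-order polynomial relating channel to energy (keV),
# fitted to peaks of known energy from a standard sample (values are hypothetical).
channels = np.array([125.0, 340.0, 680.0, 920.0])
energies = np.array([122.0, 332.5, 665.0, 900.2])
calibration = np.polynomial.Polynomial.fit(channels, energies, deg=2)

# Peak quantification: least-squares fit of a Gaussian on a linear background.
def peak(x, area, centre, sigma, b0, b1):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
    return gauss + b0 + b1 * x

x = np.arange(640.0, 700.0)
y = peak(x, 5000, 671, 2.5, 40, 0.05) + rng.normal(0, 8, x.size)   # synthetic spectrum slice
popt, _ = curve_fit(peak, x, y, p0=[4000, 670, 2.0, 30, 0.0])

print(f"peak at channel {popt[1]:.1f} -> {calibration(popt[1]):.1f} keV, "
      f"net area {popt[0]:.0f} counts")
```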

  11. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    Science.gov (United States)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of analyzing wet gluten directly from wheat seed with short-wave NIRS, 54 representative wheat seed samples were collected and scanned by a spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by the GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates of the three classes of 30% wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
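
    A minimal sketch of the modelling step, partial least squares regression with and without variable selection, is given below. The spectra and wet gluten values are synthetic, scikit-learn is an assumed tool, and the "selected" wavelengths are hard-coded here rather than chosen by a genetic algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)

# Synthetic stand-in for short-wave NIR spectra (54 samples x 125 variables) and
# wet gluten contents; only a handful of wavelengths carry the signal.
X = rng.normal(size=(54, 125))
informative = [10, 33, 47, 81, 102]
y = X[:, informative] @ np.array([1.2, -0.8, 0.9, 1.5, -1.1]) + rng.normal(0, 0.3, 54)

def rmse_cv(X, y, n_components=5):
    """Cross-validated root mean squared error of a PLS regression model."""
    pred = cross_val_predict(PLSRegression(n_components=n_components), X, y, cv=5)
    return float(np.sqrt(np.mean((np.ravel(pred) - y) ** 2)))

print("RMSECV, all variables:     ", round(rmse_cv(X, y), 3))
print("RMSECV, selected variables:", round(rmse_cv(X[:, informative], y), 3))
```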

  12. A new quantitative analysis on nitriding kinetics in the oxidized Zry-4 at 900-1200 °C

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sanggi [ACT Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    Two major roles of nitrogen in the degradation of zirconium-based cladding were identified: mechanical degradation of the cladding, and additional chemical heat release. It has long been known that accelerated oxidation can occur in air due to nitrogen. In addition, significant uptake of nitrogen can also occur. The nitriding of pre-oxidized zirconium-based alloys leads to microporous and less coherent oxide scales. This paper aims to quantitatively investigate the nitriding mechanism and kinetics by proposing a new methodology that couples mass balance analysis with optical microscope image processing analysis. The new quantitative analysis methodology is described in chapter 2 and the investigation of the nitriding kinetics is performed in chapter 3. The experimental details have been reported previously; only a qualitative analysis was performed in that earlier work, and hence the quantitative analysis is performed in this paper. In this paper, the nitriding kinetics and mechanism were quantitatively analyzed by the newly proposed analysis methods: the mass balance analysis and the optical microscope image processing analysis. Using these combined methods, the mass gain curves and the optical micrographs were analyzed in detail, and the mechanisms of the accelerated, stabilized and saturated nitriding behaviors were well understood. This paper has two distinctive achievements: 1) Development of very effective quantitative analysis methods using only the two main results of the oxidation tests; no detailed analytical sample measurements (e.g. TEM, EPMA) were required. These methods can effectively reduce the cost and effort of the post-test investigation. 2) The first identification of the nitriding behaviors and their accurate analysis in a quantitative way. Based on these quantitative analysis results on the nitriding kinetics, the new findings will contribute significantly to the understanding of air oxidation behaviors and models

  13. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI.

    Science.gov (United States)

    Bierry, Guillaume; Venkatasamy, Aïna; Kremer, Stéphane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-04-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation ("CT edema") on VNC DECT images; specificity, sensitivity, predictive values, intra- and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84%, specificity of 97%, and accuracy of 95%; intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85% (77%) and specificity of 82% (74%) for "CT edema" on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation.

  14. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  15. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  16. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  17. Single particle transfer for quantitative analysis with total-reflection X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Esaka, Fumitaka; Esaka, Konomi T.; Magara, Masaaki; Sakurai, Satoshi; Usuda, Shigekazu; Watanabe, Kazuo

    2006-01-01

    The technique of single particle transfer was applied to quantitative analysis with total-reflection X-ray fluorescence (TXRF) spectrometry. The technique was evaluated by performing quantitative analysis of individual Cu particles with diameters between 3.9 and 13.2 μm. Direct quantitative analysis of a Cu particle transferred onto a Si carrier gave a discrepancy between the measured and calculated Cu amounts due to the absorption of incident and fluorescent X-rays within the particle. By correcting for the absorption effects, the Cu amounts in individual particles could be determined with a deviation within 10.5%. When the Cu particles were dissolved with HNO3 solution prior to the TXRF analysis, the deviation improved to within 3.8%. In this case, no correction for the absorption effects was needed for quantification

  18. Quantitative analysis of elastography images in the detection of breast cancer

    International Nuclear Information System (INIS)

    Landoni, V.; Francione, V.; Marzi, S.; Pasciuti, K.; Ferrante, F.; Saracca, E.; Pedrini, M.; Strigari, L.; Crecco, M.; Di Nallo, A.

    2012-01-01

    Purpose: The aim of this study was to develop a quantitative method for breast cancer diagnosis based on elastosonography images in order to reduce, whenever possible, unnecessary biopsies. The proposed method was validated by correlating the results of the quantitative analysis with the diagnosis assessed by histopathologic exam. Material and methods: 109 images of breast lesions (50 benign and 59 malignant) were acquired with the traditional B-mode technique and with the elastographic modality. Images in Digital Imaging and Communications in Medicine (DICOM) format were exported into software, written in Visual Basic, developed especially for this study. The lesion was contoured, and the mean grey value and softness inside the region of interest (ROI) were calculated. The correlations between variables were investigated and receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic accuracy of the proposed method. Pathologic results were used as the standard reference. Results: Both the mean grey value and the softness inside the ROI were statistically different on the t-test for the two populations of lesions (i.e., benign versus malignant): p < 0.0001. The area under the curve (AUC) was 0.924 (0.834–0.973) and 0.917 (0.826–0.970) for the mean grey value and for the softness, respectively. Conclusions: Quantitative elastosonography is a promising ultrasound technique in the detection of breast cancer, but large prospective trials are necessary to determine whether quantitative analysis of images can help to overcome some pitfalls of the method.

  19. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of the Fe3+/Fe2+ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  20. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  1. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    OpenAIRE

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them...

  2. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments, with their inherent invariance properties, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R2) for the training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
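
    A sketch of the pipeline, Zernike moments of a grayscale image feeding a linear quantitative model, is shown below. It assumes the mahotas image-processing library for the moment computation and uses random images and invented concentrations, so it only illustrates the mechanics rather than reproducing the reported models.

```python
import numpy as np
import mahotas  # assumed available; provides a Zernike moment implementation

rng = np.random.default_rng(5)

def zernike_features(image, radius=32, degree=8):
    """Rotation-invariant Zernike moment magnitudes of a grayscale image."""
    return mahotas.features.zernike_moments(image, radius, degree=degree)

# Hypothetical stand-ins for 3D HPLC-DAD spectra rendered as grayscale images,
# one image per mixed sample, with invented analyte concentrations y.
images = [rng.random((64, 64)) for _ in range(20)]
y = rng.uniform(1.0, 10.0, 20)

X = np.array([zernike_features(im) for im in images])
X = np.column_stack([np.ones(len(X)), X[:, :5]])      # intercept + a few selected moments
coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # linear quantitative model
print("fitted coefficients:", np.round(coef, 3))
```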

  3. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near-field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  4. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In the quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
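
    Fingerprint similarity values of the kind reported above are typically correlation- or cosine-type measures between whole chromatograms. The sketch below illustrates a cosine (congruence) similarity on synthetic chromatograms; it is not the algorithm of the SFDA-recommended software, and all peak positions and heights are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

def congruence(a, b):
    """Cosine (congruence) similarity between two chromatographic fingerprints."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic chromatograms: a reference fingerprint and two batches with varying
# peak heights plus a little baseline noise (all values are illustrative).
t = np.linspace(0, 30, 1500)
centres = [4, 9, 13, 18, 24]

def chromatogram(heights):
    return sum(h * np.exp(-0.5 * ((t - c) / 0.15) ** 2) for h, c in zip(heights, centres))

reference = chromatogram([1.0, 0.6, 0.8, 0.4, 0.9])
batch_a = chromatogram([0.95, 0.62, 0.78, 0.41, 0.88]) + rng.normal(0, 0.005, t.size)
batch_b = chromatogram([0.60, 0.30, 1.10, 0.20, 0.50]) + rng.normal(0, 0.005, t.size)

print("batch A similarity:", round(congruence(reference, batch_a), 3))
print("batch B similarity:", round(congruence(reference, batch_b), 3))
```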

  5. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  6. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial

  7. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
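
    A much-simplified, deterministic sketch of the idea of a risk-adjusted net present value is given below: the yearly benefit is the reduction in expected (probability-weighted) fire loss attributable to the safety system. All figures are hypothetical, and the paper's full treatment uses quantitative risk analysis within Bayesian decision theory rather than fixed expected losses.

```python
import numpy as np

def risk_adjusted_npv(investment, annual_maintenance, expected_annual_loss_without,
                      expected_annual_loss_with, discount_rate, years):
    """NPV of a fire safety investment where the yearly benefit is the reduction
    in expected (probability-weighted) fire loss provided by the system."""
    t = np.arange(1, years + 1)
    annual_benefit = expected_annual_loss_without - expected_annual_loss_with
    discounted = (annual_benefit - annual_maintenance) / (1 + discount_rate) ** t
    return -investment + discounted.sum()

# Hypothetical figures: system cost, yearly maintenance, and expected annual losses
# (fire probability times consequence) without and with the system installed.
npv = risk_adjusted_npv(investment=500_000, annual_maintenance=10_000,
                        expected_annual_loss_without=0.02 * 4_000_000,
                        expected_annual_loss_with=0.02 * 800_000,
                        discount_rate=0.05, years=20)
print(f"Risk-adjusted NPV: {npv:,.0f}")
```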

  8. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which unlike PCR, work at a single temperature. These 'isothermal' methods, reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors and could also be used for quantitative molecular analysis. However there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.

  9. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  10. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta2C has been considered because of specific features of the diffraction patterns of the components, namely, the overlapping of the most intensive reflexes of both phases. The method of the standard binary system has been used for quantitative analysis. Because of the overlapping of the intensive reflexes d(101) = 2.36 Å (Ta2C) and d(110) = 2.33 Å (Ta), other, most intensive, reflexes have been used for the quantitative determination of Ta2C and Ta: d(103) = 1.404 Å for tantalum subcarbide and d(211) = 1.35 Å for tantalum. Besides, the Ta and Ta2C phases have been determined quantitatively with the use of another pair of reflexes: d(102) = 1.82 Å for Ta2C and d(200) = 1.65 Å for tantalum. The agreement between the results obtained from the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta2C, it is expedient to carry out the analysis with the use of the two above-mentioned pairs of reflexes located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta2C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the ability of Ta2C to develop texture in the process of preparation

  11. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    Science.gov (United States)

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  12. Quantitative analysis of cocaine and its metabolites in whole blood and urine by high-performance liquid chromatography coupled with tandem mass spectrometry.

    Science.gov (United States)

    Johansen, Sys Stybe; Bhatia, Helle Merete

    2007-06-01

    In forensic toxicology it is important to have specific and sensitive analyses for the quantification of illicit drugs in biological matrices. This paper describes a quantitative method for the determination of cocaine and its major metabolites (ecgonine methyl ester, benzoylecgonine, norcocaine and ethylene cocaine) in whole blood and urine by liquid chromatography coupled with tandem mass spectrometry (LC/MS/MS). The sample pre-treatment (0.20 g) consisted of acid precipitation, followed by centrifugation and solid phase extraction of the supernatant using mixed-mode sorbent columns (SPEC MP1, Ansys Diag. Inc.). Chromatographic separation was performed at 30 degrees C on a reverse-phase Zorbax C18 column with a gradient system consisting of formic acid, water and acetonitrile. The analysis was performed by positive electrospray ionisation with a triple quadrupole mass spectrometer operating in multiple reaction monitoring (MRM) mode. Two MRM transitions of each analyte were established and identification criteria were set up based on the retention time and the ion ratio. The quantification was performed using deuterated internal standards of cocaine, benzoylecgonine and ecgonine methyl ester. The calibration curves of extracted standards were linear over a working range of 0.001-2.00 mg/kg whole blood for all analytes. The limit of quantification was 0.008 mg/kg; the interday precision (measured as relative standard deviation, %RSD) was less than 10% and the accuracy (BIAS) less than 12% for all analytes in whole blood. Urine samples were estimated semi-quantitatively at a cut-off level of 0.15 mg/kg with an interday precision of 15%. A liquid chromatography tandem mass spectrometric (LC/MS/MS) method has been developed for confirmation and quantification of cocaine and its metabolites (ecgonine methyl ester, benzoylecgonine, norcocaine and ethylene cocaine) in whole blood and semi-quantitatively in urine. The method is specific and sensitive and thereby offers an excellent alternative to

  13. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    Energy Technology Data Exchange (ETDEWEB)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis [University Hospital of Strasbourg, Department of Radiology, Strasbourg (France)

    2014-04-15

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation (''CT edema'') on VNC DECT images; specificity, sensitivity, predictive values, intra and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for ''CT edema'' on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)

  14. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    International Nuclear Information System (INIS)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-01-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation (''CT edema'') on VNC DECT images; specificity, sensitivity, predictive values, intra and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for ''CT edema'' on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
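
    The acceptability criterion mentioned in the conclusion, a combined dose-difference/distance-to-agreement test such as 3%/3 mm, can be illustrated on one-dimensional dose profiles as below. This is a simplified stand-in for a full 2D gamma analysis, with synthetic planned and delivered profiles rather than the study's film or ion chamber data.

```python
import numpy as np

def pass_rate(reference, measured, spacing_mm=1.0, dose_tol=0.03, dist_tol_mm=3.0):
    """Fraction of points passing a combined dose-difference / distance-to-agreement
    test on 1D dose profiles (a simplified stand-in for a full 2D gamma analysis)."""
    positions = np.arange(reference.size) * spacing_mm
    passed = 0
    for i, dose in enumerate(measured):
        # Dose-difference criterion, expressed relative to the reference maximum.
        if abs(dose - reference[i]) <= dose_tol * reference.max():
            passed += 1
            continue
        # Distance-to-agreement: the reference reaches this dose level somewhere
        # within dist_tol_mm of the point (checked via the local dose range).
        window = np.abs(positions - positions[i]) <= dist_tol_mm
        if reference[window].min() <= dose <= reference[window].max():
            passed += 1
    return passed / reference.size

x = np.linspace(-40, 40, 81)                              # 1 mm spacing
planned   = np.exp(-0.5 * (x / 15) ** 2)                  # idealized planned profile
delivered = 1.01 * np.exp(-0.5 * ((x - 1.0) / 15) ** 2)   # shifted, slightly scaled delivery
print(f"pass rate: {100 * pass_rate(planned, delivered):.1f}%")
```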

  16. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

    Full Text Available Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.

  17. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
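
    A sketch of DE-based wavelength selection is shown below: SciPy's differential_evolution searches over continuous weights, a weight above 0.5 marks a wavelength as selected, and the objective is the cross-validated error of a linear model on the selected wavelengths. The spectra are synthetic, and this thresholding encoding is one common workaround for using a continuous optimizer on a selection problem, not necessarily the encoding used in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic stand-in for THz absorption spectra of binary mixtures:
# 40 samples x 30 wavelengths, of which only three are informative.
X = rng.normal(size=(40, 30))
informative = [5, 17, 26]
y = X[:, informative] @ np.array([2.0, -1.5, 1.0]) + rng.normal(0, 0.2, 40)

def objective(weights):
    """Cross-validated error of a linear model on the selected wavelengths.
    DE optimizes continuous variables, so a weight above 0.5 means 'selected'."""
    selected = np.flatnonzero(np.asarray(weights) > 0.5)
    if selected.size == 0:
        return 1e6
    scores = cross_val_score(LinearRegression(), X[:, selected], y,
                             scoring="neg_root_mean_squared_error", cv=5)
    return -scores.mean()

result = differential_evolution(objective, bounds=[(0, 1)] * X.shape[1],
                                maxiter=20, popsize=5, seed=7, polish=False)
print("selected wavelengths:", np.flatnonzero(result.x > 0.5))
print("cross-validated RMSE:", round(result.fun, 3))
```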

  18. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  19. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits has been presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits and oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, out of which the first three PCs explained 63% of the total variance. This helped in facilitating the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and relatively short vegetative growth duration, and those in cluster 9 combined moderate to low values for vegetative growth duration and moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
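
    The two multivariate steps described above, principal component analysis followed by hierarchical clustering, are sketched below on an invented genotype-by-trait matrix. Scikit-learn and SciPy are assumed tools, and no bootstrapping of the cluster number is included; it only illustrates the workflow, not the study's results.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(8)

# Hypothetical genotype-by-trait matrix: 30 genotypes x 6 quantitative traits
# (e.g. pods per plant, 1000-seed weight, oil content, ...); values are invented.
traits = rng.normal(size=(30, 6))
traits[:10] += 1.5                     # a group of high-performing genotypes

# Standardize, then summarize the variability with principal components.
z = (traits - traits.mean(axis=0)) / traits.std(axis=0)
pca = PCA(n_components=3).fit(z)
print("variance explained by the first three PCs:", np.round(pca.explained_variance_ratio_, 2))

# Hierarchical (Ward) clustering of the genotypes on the standardized traits.
clusters = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
print("cluster sizes:", np.bincount(clusters)[1:])
```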

  20. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements

  1. Chemical Fingerprint and Quantitative Analysis for the Quality Evaluation of Platycladi cacumen by Ultra-performance Liquid Chromatography Coupled with Hierarchical Cluster Analysis.

    Science.gov (United States)

    Shan, Mingqiu; Li, Sam Fong Yau; Yu, Sheng; Qian, Yan; Guo, Shuchen; Zhang, Li; Ding, Anwei

    2018-01-01

    Platycladi cacumen (dried twigs and leaves of Platycladus orientalis (L.) Franco) is a frequently utilized Chinese medicinal herb. To evaluate the quality of the phytomedicine, an ultra-performance liquid chromatographic method with diode array detection was established for chemical fingerprinting and quantitative analysis. In this study, 27 batches of P. cacumen from different regions were collected for analysis. A chemical fingerprint with 20 common peaks was obtained using the Similarity Evaluation System for Chromatographic Fingerprint of Traditional Chinese Medicine (Version 2004A). Among these 20 components, seven flavonoids (myricitrin, isoquercitrin, quercitrin, afzelin, cupressuflavone, amentoflavone and hinokiflavone) were identified and determined simultaneously. In the method validation, the seven analytes showed good regressions (R ≥ 0.9995) within the linear ranges and good recoveries from 96.4% to 103.3%. Furthermore, using the contents of these seven flavonoids, hierarchical clustering analysis was applied to divide the 27 batches into five groups. The chemometric results showed that these groups were almost consistent with the geographical positions and climatic conditions of the production regions. Integrating fingerprint analysis, simultaneous determination and hierarchical clustering analysis, the established method is rapid, sensitive, accurate and readily applicable, and provides a significant foundation for efficient quality control of P. cacumen. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  3. Quantitative Analysis on the Influence Factors of the Sustainable Water Resource Management Performance in Irrigation Areas: An Empirical Research from China

    Directory of Open Access Journals (Sweden)

    Hulin Pan

    2018-01-01

    Full Text Available Performance evaluation and influence-factor analysis are vital to sustainable water resources management (SWRM) in irrigation areas. Based on the objectives and the implementation framework of modern integrated water resources management (IWRM), this research systematically developed an index system of SWRM performances in irrigation areas and of their influence factors. Using multivariate regression combined with correlation analysis, this study quantitatively estimated the effect of multiple factors on the water resources management performances of irrigation areas in the Ganzhou District of Zhangye, Gansu, China. The results are presented below. The overall performance is mainly affected by the management enabling environment and the management institutions, with regression coefficients of 0.0117 and 0.0235, respectively. The performance of ecological sustainability is mainly influenced by the local economic development level and the enabling environment, with regression coefficients of 0.08642 and −0.0118, respectively. The performance of water use equity is mainly influenced by information publicity, administrators’ education level and ordinary water users’ participation level, with correlation coefficients of 0.637, 0.553 and 0.433, respectively. The performance of water use economic efficiency is mainly influenced by the management institutions and instruments, with regression coefficients of −0.07844 and 0.01808, respectively. In order to improve the overall performance of SWRM in irrigation areas, it is necessary to strengthen public participation, improve managers' abilities and provide sufficient financial support to the management organization.
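
    A minimal sketch of the kind of analysis reported above: ordinary least squares regression of an overall performance score on several candidate influence factors, plus a single-factor Pearson correlation. All variable names, coefficients and data are illustrative assumptions, not the Ganzhou District data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
enabling_env = rng.normal(size=n)     # management enabling-environment score
institutions = rng.normal(size=n)     # management-institution score
instruments  = rng.normal(size=n)     # management-instrument score
overall_perf = 0.012 * enabling_env + 0.024 * institutions + rng.normal(scale=0.01, size=n)

# ordinary least squares with an intercept term
X = np.column_stack([np.ones(n), enabling_env, institutions, instruments])
coef, *_ = np.linalg.lstsq(X, overall_perf, rcond=None)
print("intercept and regression coefficients:", np.round(coef, 4))

# simple Pearson correlation for a single factor, as used for the equity indicators
r = np.corrcoef(enabling_env, overall_perf)[0, 1]
print("Pearson r (enabling environment vs overall performance):", round(r, 3))
```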

  4. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  5. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
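
    For readers who want to reproduce the idea numerically, the sketch below writes the motion of two identical coupled oscillators as a superposition of the symmetric and asymmetric normal modes and derives the acceleration a phone sensor on one oscillator would record. The mass, spring constant and coupling constant are assumed values, not those of the experiment.

```python
import numpy as np

m, k, kc = 0.1, 25.0, 5.0                # mass, spring constant, coupling constant (assumed values)
w_sym  = np.sqrt(k / m)                  # symmetric mode: coupling spring stays unstretched
w_asym = np.sqrt((k + 2 * kc) / m)       # asymmetric mode: coupling spring stretched twice

t = np.linspace(0.0, 5.0, 2000)
x0 = 0.02                                # release oscillator 1 from 2 cm, oscillator 2 at rest

# general coupled motion = superposition of the two normal modes (produces beats)
x1 = 0.5 * x0 * (np.cos(w_sym * t) + np.cos(w_asym * t))
x2 = 0.5 * x0 * (np.cos(w_sym * t) - np.cos(w_asym * t))

# the acceleration a smartphone attached to oscillator 1 would record
a1 = -0.5 * x0 * (w_sym**2 * np.cos(w_sym * t) + w_asym**2 * np.cos(w_asym * t))
print(w_sym / (2 * np.pi), w_asym / (2 * np.pi))   # normal-mode frequencies in Hz
```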

  6. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability

  7. A Quantitative Analysis for the Correlation Between Corporate Financial and Social Performance

    Directory of Open Access Journals (Sweden)

    Wafaa Salah

    2016-12-01

    Full Text Available Recently, corporate social performance (CSP) has become no less important than corporate financial performance (CFP). Debate still exists about the nature of the relationship between CSP and CFP: whether it is a positive, negative or neutral correlation. The objective of this study is to explore the relationship between corporate social responsibility (CSR) reports and CFP. The study uses accounting-based and market-based quantitative measures to quantify the financial performance of seven organizations listed on the Egyptian Stock Exchange in 2007-2014. It then uses information retrieval technologies to quantify the contribution of each of the three dimensions of the corporate social responsibility report (environmental, social and economic). Finally, the correlation between these two sets of variables is viewed together in a model to detect the correlations between them. This model is applied to seven firms that generate social responsibility reports. The results show a positive correlation between earnings per share (a market-based measure) and the economic dimension of the CSR report. On the other hand, total assets and property, plant and equipment (accounting-based measures) are positively correlated with the environmental and social dimensions of the CSR reports, while there is no significant relationship between ROA, ROE, operating income and corporate social responsibility. This study contributes to the literature by providing more clarification of the relationship between CFP and the isolated CSR activities in a developing country.

  8. Rapid quantitative analysis of individual anthocyanin content based on high-performance liquid chromatography with diode array detection with the pH differential method.

    Science.gov (United States)

    Wang, Huayin

    2014-09-01

    A new quantitative technique for the simultaneous quantification of individual anthocyanins based on the pH differential method and high-performance liquid chromatography with diode array detection is proposed in this paper. The six individual anthocyanins (cyanidin 3-glucoside, cyanidin 3-rutinoside, petunidin 3-glucoside, petunidin 3-rutinoside, and malvidin 3-rutinoside) from mulberry (Morus rubra) and Liriope platyphylla were used for demonstration and validation. The elution of anthocyanins was performed using a C18 column with stepwise gradient elution, and individual anthocyanins were identified by high-performance liquid chromatography with tandem mass spectrometry. Based on the pH differential method, the high-performance liquid chromatography peak areas at the maximum and reference absorption wavelengths of the anthocyanin extracts were used to quantify individual anthocyanins. The calibration curves for these anthocyanins were linear within the range of 10-5500 mg/L. The correlation coefficients (r²) all exceeded 0.9972, and the limits of detection were in the range of 1-4 mg/L at a signal-to-noise ratio ≥5 for these anthocyanins. The proposed quantitative analysis was reproducible, with good accuracy for all individual anthocyanins, ranging from 96.3 to 104.2%, and relative recoveries were in the range 98.4-103.2%. The proposed technique is performed without anthocyanin standards and is a simple, rapid, accurate, and economical method to determine individual anthocyanin contents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
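
    The HPLC adaptation above builds on the classical spectrophotometric pH-differential calculation; a minimal sketch of that underlying calculation is given below. The molecular weight and molar absorptivity correspond to the common cyanidin-3-glucoside-equivalent convention, and the absorbance readings are invented for illustration.

```python
def anthocyanin_mg_per_l(a_max_ph1, a700_ph1, a_max_ph45, a700_ph45,
                         mw=449.2, dilution_factor=10, epsilon=26900, path_cm=1.0):
    """Total monomeric anthocyanin, expressed as cyanidin-3-glucoside equivalents (mg/L)."""
    a = (a_max_ph1 - a700_ph1) - (a_max_ph45 - a700_ph45)   # pH-differential absorbance
    return a * mw * dilution_factor * 1000.0 / (epsilon * path_cm)

# illustrative absorbance readings at the visible maximum and at 700 nm, at pH 1.0 and pH 4.5
print(round(anthocyanin_mg_per_l(0.85, 0.02, 0.12, 0.01), 1), "mg/L")
```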

  9. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.
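
    A hedged sketch of the kind of comparison reported: computing the area under the ROC curve for a qualitative score and for a quantitative parameter against the final clinical diagnosis. The data below are simulated, not the study's 61 patients.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
diagnosis = rng.integers(0, 2, size=61)                                   # 1 = final diagnosis of AM
llc_score = diagnosis * 2 + rng.normal(size=61)                           # qualitative (Lake-Louise) score
edema_ratio = 1.9 + 0.6 * diagnosis + rng.normal(scale=0.3, size=61)      # quantitative parameter (ER)

print("AUC, qualitative analysis :", round(roc_auc_score(diagnosis, llc_score), 2))
print("AUC, quantitative analysis:", round(roc_auc_score(diagnosis, edema_ratio), 2))
```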

  10. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  11. A quantitative impact analysis of sensor failures on human operator's decision making in nuclear power plants

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    2004-01-01

    In emergency or accident situations in nuclear power plants, human operators play important roles in generating appropriate control signals to mitigate the accident situation. In human reliability analysis (HRA) in the framework of probabilistic safety assessment (PSA), the failure probabilities of such appropriate actions are estimated and used for the safety analysis of nuclear power plants. Even though understanding the status of the plant is basically a process of information seeking and processing by human operators, conventional HRA methods such as THERP, HCR, and ASEP do not pay much attention to the possibility that wrong information is provided to human operators. In this paper, a quantitative impact analysis of providing wrong information to human operators due to instrument faults or sensor failures is performed. The quantitative impact analysis is performed based on a quantitative situation assessment model. By comparing the situation in which there are sensor failures and the situation in which there are no sensor failures, the impact of sensor failures can be evaluated quantitatively. It is concluded that the impact of sensor failures is quite significant in the initial stages, but the impact is gradually reduced as human operators make more and more observations. Even though the impact analysis is highly dependent on the situation assessment model, it is expected that conclusions based on other situation assessment models will be consistent with the conclusion made in this paper. (author)
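
    The paper's situation assessment model is not reproduced here, but the toy Bayesian update below illustrates the qualitative conclusion: an indication that may come from a failed sensor is discounted by the failure probability, and its influence on the operator's belief shrinks as consistent observations accumulate. All probabilities are assumptions.

```python
def update(prior, p_obs_given_event, p_obs_given_no_event, p_sensor_failure):
    # with probability p_sensor_failure the reading is uninformative (likelihood 0.5 either way)
    like_event    = (1 - p_sensor_failure) * p_obs_given_event    + p_sensor_failure * 0.5
    like_no_event = (1 - p_sensor_failure) * p_obs_given_no_event + p_sensor_failure * 0.5
    return prior * like_event / (prior * like_event + (1 - prior) * like_no_event)

belief = 0.5                          # prior belief that the accident scenario is in progress
for step in range(10):                # ten successive indications consistent with the scenario
    belief = update(belief, p_obs_given_event=0.9, p_obs_given_no_event=0.1, p_sensor_failure=0.2)
    print(step + 1, round(belief, 3))
# the dilution caused by possibly failed sensors matters most early on and fades with more observations
```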

  12. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    Science.gov (United States)

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented here enable qualitative and quantitative analysis of these metabolically important ureides.

  13. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    Directory of Open Access Journals (Sweden)

    Cheng Bai

    2007-01-01

    Full Text Available High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml–1/μMol ml–1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented here enable qualitative and quantitative analysis of these metabolically important ureides.

  14. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  15. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D. [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    This study includes three processes. First, a peak centering of the X-ray line was performed after a diffraction for Xenon La1 line was installed. Xe La1 peak was identified by a PWR spent fuel sample. Second, standard intensities of Xe was obtained by interpolation of the La1 intensities from a series of elements on each side of xenon. And then Xe intensities across the radial direction of a PWR spent fuel sample were measured by WDS-SEM. Third, the electron and X-ray depth distributions for a quantitative electron probe micro analysis were simulated by the CASINO Monte Carlo program to do matrix correction of a PWR spent fuel sample. Finally, the method and the procedure for local quantitative analysis of Xenon was developed in this study.

  16. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    International Nuclear Information System (INIS)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D.

    2012-01-01

    This study includes three processes. First, a peak centering of the X-ray line was performed after a diffraction for Xenon La1 line was installed. Xe La1 peak was identified by a PWR spent fuel sample. Second, standard intensities of Xe was obtained by interpolation of the La1 intensities from a series of elements on each side of xenon. And then Xe intensities across the radial direction of a PWR spent fuel sample were measured by WDS-SEM. Third, the electron and X-ray depth distributions for a quantitative electron probe micro analysis were simulated by the CASINO Monte Carlo program to do matrix correction of a PWR spent fuel sample. Finally, the method and the procedure for local quantitative analysis of Xenon was developed in this study

  17. Qualitative and quantitative analysis of branches in dextran using high-performance anion exchange chromatography coupled to quadrupole time-of-flight mass spectrometry.

    Science.gov (United States)

    Yi, Lin; Ouyang, Yilan; Sun, Xue; Xu, Naiyu; Linhardt, Robert J; Zhang, Zhenqing

    2015-12-04

    Dextran, a family of natural polysaccharides, consists of an α (1→6) linked-glucose main (backbone) chain having a number of branches. The determination of the types and the quantities of branches in dextran is important in understanding its various biological roles. In this study, a hyphenated method using high-performance anion exchange chromatography (HPAEC) in parallel with pulsed amperometric detection (PAD) and mass spectrometry (MS) was applied to qualitative and quantitative analysis of dextran branches. A rotary cation-exchange cartridge array desalter was used for removal of salt from the HPAEC eluent making it MS compatible. MS and MS/MS were used to provide structural information on the enzymatically prepared dextran oligosaccharides. PAD provides quantitative data on the ratio of enzyme-resistant, branched dextran oligosaccharides. Both the types and degree of branching found in a variety of dextrans could be simultaneously determined online using this method. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. A figure shows the main window of the program during dynamic analysis of the foot thermal image. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were portal films, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with offset values greater than 1 mm were identified. In addition, when the developed method was compared with the previously studied one, the data obtained were very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
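
    A minimal sketch of the quantitative step in a Winston-Lutz analysis: on each portal image the deviation is the distance between the centre of the radiation field and the centre of the ball bearing that marks the mechanical isocentre. The coordinates below are illustrative values already converted to millimetres, not the study's measurements.

```python
import numpy as np

def isocentre_deviation_mm(field_centre, ball_centre):
    """Distance between the radiation-field centre and the ball-bearing centre on one image."""
    return float(np.linalg.norm(np.asarray(field_centre) - np.asarray(ball_centre)))

# one (field centre, ball centre) pair per gantry/collimator/couch combination, in millimetres
images = [((100.2, 99.8), (100.0, 100.0)),
          ((101.1, 100.4), (100.0, 100.0)),
          ((100.0, 100.9), (100.0, 100.0))]

deviations = [isocentre_deviation_mm(f, b) for f, b in images]
print([round(d, 2) for d in deviations])
print("combinations exceeding 1 mm:", sum(d > 1.0 for d in deviations))
```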

  20. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in patients with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour, and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser-known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Fifty (n=50) cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they
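
    A hedged sketch of how labeling indices and the correlation between the two markers could be computed; the nuclear counts below are invented, not the study's cases.

```python
import numpy as np

def labeling_index(positive_nuclei, total_nuclei):
    return 100.0 * positive_nuclei / total_nuclei   # percentage of immunopositive nuclei

# (positive, total) nuclear counts per case for each marker -- invented numbers
ki67_counts = [(130, 1000), (90, 800), (260, 1200), (55, 900)]
mcm2_counts = [(240, 1000), (180, 800), (430, 1200), (140, 900)]

ki67_li = np.array([labeling_index(p, t) for p, t in ki67_counts])
mcm2_li = np.array([labeling_index(p, t) for p, t in mcm2_counts])

r = np.corrcoef(ki67_li, mcm2_li)[0, 1]             # Pearson correlation between the two LIs
print("median Ki-67 LI:", np.median(ki67_li), "median MCM-2 LI:", np.median(mcm2_li))
print("Pearson r:", round(r, 3))
```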

  1. Quantitative analysis and design of a spray aerosol inhaler. Part 2: improvements in mouthpiece performance.

    Science.gov (United States)

    Hindle, Michael; Longest, P Worth

    2013-10-01

    The objective of this study was to utilize previously identified critical design attributes for the capillary aerosol generator as a model spray inhaler in order to develop a second-generation device that minimized aerosol drug deposition in the mouthpiece. Computational fluid dynamics (CFD) predictive analysis of the critical design attributes indicated that turbulence intensity should be reduced and the effective mouthpiece diameter should be increased. Two second-generation inhaler mouthpieces meeting these specifications were manufactured and tested. The first device (Design 1) implemented a larger cross-sectional area in the mouthpiece and streamlined flow, whereas the second device (Design 2) used a perforated mouthpiece wall. An in vitro deposition study was performed to quantify the deposition of drug mass in the mouthpieces and connected induction ports, and the results were compared with the CFD predictions. The two second-generation mouthpieces reduced in vitro aerosol deposition from the original value of 7.8% to values of 2.1% (Device 1) and 4.3% (Device 2), without largely altering the induction port deposition. This was achieved by design alterations aimed at reducing turbulence intensity and increasing the effective mouthpiece diameter. CFD model predictions were in good agreement with the in vitro experimental data. A second-generation spray inhaler mouthpiece with low drug deposition was developed using a predictive CFD model and in vitro experiments. Applying this quantitative analysis and design methodology to medical devices, which is similar to the Quality by Design paradigm, could provide significant advantages compared with traditional approaches.

  2. Computer-assisted sequential quantitative analysis of gallium scans in pulmonary sarcoidosis

    International Nuclear Information System (INIS)

    Rohatgi, P.K.; Bates, H.R.; Noss, R.W.

    1985-01-01

    Fifty-one sequential gallium citrate scans were performed in 22 patients with biopsy-proven sarcoidosis. A computer-assisted quantitative analysis of these scans was performed to obtain a gallium score. The changes in gallium score were correlated with changes in serum angiotensin converting enzyme (SACE) activity and objective changes in clinical status. There was a good concordance between changes in gallium score, SACE activity and clinical assessment in patients with sarcoidosis, and changes in gallium index were slightly superior to SACE index in assessing activity of sarcoidosis. (author)

  3. Qualitative and quantitative analysis of anthraquinones in rhubarbs by high performance liquid chromatography with diode array detector and mass spectrometry.

    Science.gov (United States)

    Wei, Shao-yin; Yao, Wen-xin; Ji, Wen-yuan; Wei, Jia-qi; Peng, Shi-qi

    2013-12-01

    Rhubarb is well known in traditional Chinese medicines (TCMs) mainly due to its effective purgative activity. Anthraquinones, including anthraquinone derivatives and their glycosides, are thought to be the major active components in rhubarb. To improve the quality control method for rhubarb, we studied the extraction method and performed qualitative and quantitative analysis of the widely used rhubarbs, Rheum tanguticum Maxim. ex Balf. and Rheum palmatum L., by HPLC with photodiode array detection (HPLC-DAD) and HPLC-mass spectrometry (HPLC-MS) on a Waters SymmetryShield RP18 column (250 mm × 4.6 mm i.d., 5 μm). The amounts of five anthraquinones were used as the evaluation standard. A standardized characteristic fingerprint of rhubarb was provided. The quantitative analysis supported the rationale of the traditional practice of using these two species of rhubarb interchangeably. Under modern extraction methods, the amount of the five anthraquinones in Rheum tanguticum Maxim. ex Balf. is higher than that in Rheum palmatum L. Among the various extraction methods, ultrasonication with 70% methanol for 30 min is a promising one. For HPLC analysis, a mobile phase consisting of methanol and 0.1% phosphoric acid in water with a gradient program, with detection wavelengths of 280 nm for fingerprint analysis and 254 nm for quantitative analysis, proved to be a good choice. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Quantitative analysis of eyes and other optical systems in linear optics.

    Science.gov (United States)

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogenous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  5. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    Science.gov (United States)

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, using typically reverse phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.

  6. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  7. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed at 12-25 kDa mass range with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  8. Quantitative Surface Analysis by XPS (X-Ray Photoelectron Spectroscopy): Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and it is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth groundwork was necessary to know precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables a quantitative analysis to be performed, on a routine basis, with approximately 20% relative accuracy, which is quite acceptable for an analysis of such a nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that allows the sulphiding degree of molybdenum to be obtained quite reliably and reproducibly. The usage of this method is illustrated by two examples for which XPS spectroscopy has provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
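
    A minimal sketch of the standard quantification step such a set of response factors enables: each background-subtracted peak area is divided by its element response (sensitivity) factor and the results are normalized to atomic percent. The peak areas and factors below are illustrative assumptions, not IFP's calibrated values.

```python
def atomic_fractions(peak_areas, response_factors):
    """Normalize background-subtracted peak areas by element response factors."""
    corrected = {el: peak_areas[el] / response_factors[el] for el in peak_areas}
    total = sum(corrected.values())
    return {el: round(100.0 * a / total, 1) for el, a in corrected.items()}

areas = {"Mo 3d": 12000.0, "S 2p": 5200.0, "Al 2p": 30000.0}   # illustrative peak areas
rsf   = {"Mo 3d": 3.3, "S 2p": 0.67, "Al 2p": 0.54}            # assumed response factors
print(atomic_fractions(areas, rsf), "(surface atomic %)")
```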

  9. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    Science.gov (United States)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units have involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.

  11. Quantitative analysis of phylloquinone (vitamin K1) in soy bean oils by high-performance liquid chromatography.

    Science.gov (United States)

    Zonta, F; Stancher, B

    1985-07-19

    A high-performance liquid chromatographic method for determining phylloquinone (vitamin K1) in soy bean oils is described. Resolution of vitamin K1 from interfering peaks of the matrix was obtained after enzymatic digestion, extraction and liquid-solid chromatography on alumina. An isocratic reversed-phase chromatography with UV detection was used in the final stage. The quantitation was carried out by the standard addition method, and the recovery of the whole procedure was 88.2%.
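
    A minimal sketch of quantitation by the standard addition method mentioned above: known amounts of analyte are added to aliquots of the extract, the detector response is regressed on the added concentration, and the original concentration is read from the x-intercept. The peak areas and spike levels are invented for illustration.

```python
import numpy as np

added = np.array([0.0, 0.5, 1.0, 2.0])            # µg/mL of vitamin K1 added to aliquots (illustrative)
peak_area = np.array([410.0, 615.0, 820.0, 1230.0])

slope, intercept = np.polyfit(added, peak_area, 1)
c_unspiked = intercept / slope                    # concentration in the unspiked extract (x-intercept)
print(round(c_unspiked, 2), "µg/mL, before correcting for the 88.2% procedural recovery")
```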

  12. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of the microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
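
    A hedged sketch of the final combination step: independent relative uncertainty components are combined in quadrature to give the combined relative uncertainty of the result. The component values below are assumptions, not the paper's estimates.

```python
import math

# relative standard uncertainty components, as fractions (assumed values, not the paper's data)
components = {
    "type_of_microorganism": 0.15,
    "pharmaceutical_product": 0.20,
    "reading_and_interpreting": 0.18,
}
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {100 * u_combined:.1f}%")
```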

  13. Visualization and quantitative analysis of the CSF pulsatile flow with cine MR phase imaging

    International Nuclear Information System (INIS)

    Katayama, Shinji; Itoh, Takahiko; Kinugasa, Kazushi; Asari, Shoji; Nishimoto, Akira; Tsuchida, Shohei; Ono, Atsushi; Ikezaki, Yoshikazu; Yoshitome, Eiji.

    1991-01-01

    The visualization and the quantitative analysis of the CSF pulsatile flow were performed on ten healthy volunteers with cine MR phase imaging, a combination of the phase-contrast technique and the cardiac-gating technique. The velocities appropriate for the visualization and the quantitative analysis of the CSF pulsatile flow were from 6.0 cm/sec to 15.0 cm/sec. The applicability of this method for the quantitative analysis was proven with a steady-flow phantom. Phase images clearly demonstrated a to-and-fro motion of the CSF flow in the anterior subarachnoid space and in the posterior subarachnoid space. The flow pattern of CSF in healthy volunteers depends on the cardiac cycle. In the anterior subarachnoid space, the cephalic CSF flow continued until a 70-msec delay after the R-wave of the ECG and then reversed to caudal. At 130-190 msec, the caudal CSF flow reached its maximum velocity; thereafter it reversed again to cephalic. The same alternation appeared in the following phase, but with decreased amplitude. The cephalic flow peaked at 370-430 msec, while the caudal flow peaked at 490-550 msec. The flow pattern of the CSF flow in the posterior subarachnoid space was almost identical to that in the anterior subarachnoid space. Cine MR phase imaging is thus useful for the visualization and the quantitative analysis of the CSF pulsatile flow. (author)

  14. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  15. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cells morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  16. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?. We hope that our guide to good practices for conducting and presenting bias analyses will encourage
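
    As a concrete illustration of one elementary bias-analysis step, the sketch below corrects a 2×2 table for non-differential exposure misclassification using assumed sensitivity and specificity and recomputes the odds ratio. All counts and bias parameters are invented; they are not drawn from the article.

```python
def corrected_exposure_counts(observed_exposed, observed_unexposed, sensitivity, specificity):
    """Back-calculate true exposed/unexposed counts from observed counts."""
    total = observed_exposed + observed_unexposed
    true_exposed = (observed_exposed - total * (1 - specificity)) / (sensitivity + specificity - 1)
    return true_exposed, total - true_exposed

# observed 2x2 table (cases and controls) and assumed classification parameters -- all invented
a, b = corrected_exposure_counts(200, 800, sensitivity=0.85, specificity=0.95)   # cases
c, d = corrected_exposure_counts(150, 850, sensitivity=0.85, specificity=0.95)   # controls

observed_or = (200 / 800) / (150 / 850)
corrected_or = (a / b) / (c / d)
print(round(observed_or, 2), "->", round(corrected_or, 2))
```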

  17. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in the twentieth century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.

  18. Quantitative analysis of food and feed samples with droplet digital PCR.

    Directory of Open Access Journals (Sweden)

    Dany Morisset

    Full Text Available In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using a ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.
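
    The absolute quantification in ddPCR rests on Poisson statistics of droplet occupancy: the mean number of copies per droplet is estimated from the fraction of negative droplets. The sketch below illustrates that calculation and the resulting GMO content as a copy-number ratio; droplet counts and droplet volume are illustrative assumptions.

```python
import math

def copies_per_microlitre(positive_droplets, total_droplets, droplet_volume_nl=0.85):
    lam = -math.log(1.0 - positive_droplets / total_droplets)   # mean copies per droplet (Poisson)
    return lam / (droplet_volume_nl * 1e-3)                     # copies per µL of reaction mix

transgene = copies_per_microlitre(1500, 15000)   # e.g. MON810-specific target
reference = copies_per_microlitre(9000, 15000)   # e.g. hmg maize reference gene
print(f"transgene: {transgene:.0f} copies/µL, reference: {reference:.0f} copies/µL")
print(f"GMO content ≈ {100 * transgene / reference:.1f}% (copy/copy)")
```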

  19. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and in the development of optimal microstructure design and fabrication. Three-dimensional microstructure characterization in particular holds great promise for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced during image acquisition by focused ion beam serial sectioning. Alignment...

  20. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Graphical abstract: -- Highlights: •Quantitative image analysis shows potential to monitor activated sludge systems. •Staining techniques increase the potential for detection of operational problems. •Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  1. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behaviors in a radiation environment. However, few improvements have been made in the quantitative analysis of damage microstructure over recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could provide new information about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  2. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Villot-Kadri, M.; Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Michel, K.; Bruyère, D.; Laperche, V. [BRGM, Service Métrologie, Monitoring et Analyse, 3 avenue Claude Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux1.fr [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France)

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent from a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks well suited for non-linear and multi-variate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data as input data of the network is particularly detailed and finally, resulting errors of prediction lower than 20% for aluminum, calcium, copper and iron demonstrate the good efficiency of the artificial neural networks for on-site quantitative LIBS of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.

  3. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Villot-Kadri, M.; Ismaël, A.; Gallou, G.; Michel, K.; Bruyère, D.; Laperche, V.; Canioni, L.; Bousquet, B.

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser-induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited for non-linear and multivariate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input to the network is described in particular detail, and finally, prediction errors lower than 20% for aluminum, calcium, copper and iron demonstrate the efficiency of artificial neural networks for on-site quantitative LIBS analysis of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.

  4. Analysis of characteristic performance curves in radiodiagnosis by an observer

    International Nuclear Information System (INIS)

    Kossovoj, A.L.

    1988-01-01

    Methods for constructing performance characteristic curves (PX curves) in roentgenology, and for their qualitative and quantitative estimation, are described. The application of PX curves to the analysis of scintigraphic and sonographic images is assessed.

  5. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  6. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified after SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and MaxQuant output is presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in the quantitative analysis associated with the different sampling strategies to be examined, and a proper number of replicates for future quantitative analysis to be defined. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  7. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  8. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for the emergency tasks in the emergency operating procedures (EOPs) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been pointed out, because a wide range of quantitative estimates in the previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that can be objectively observed are collected from full-scope simulator experience. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined, and the determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse-type set of EOPs were analyzed by defining task, component, and system taxonomies.
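
The relation between nominal HEPs and PSF multipliers mentioned in this record can be illustrated with a small sketch in the style of multiplier-based HRA methods; the task types, PSFs and numerical values below are placeholders, not data from the study.

```python
# Illustrative sketch of a multiplier-based HEP adjustment: a nominal HEP for a
# task type is modified by performance shaping factor (PSF) multipliers, in the
# spirit of methods such as HEART or SPAR-H. All numbers are placeholders.
NOMINAL_HEP = {"verify indication": 1e-3, "manipulate component": 3e-3}
PSF_MULTIPLIER = {"high stress": 2.0, "poor HMI": 5.0, "ample time": 0.1}

def adjusted_hep(task: str, psfs: list[str]) -> float:
    hep = NOMINAL_HEP[task]
    for psf in psfs:
        hep *= PSF_MULTIPLIER[psf]
    return min(hep, 1.0)   # a probability cannot exceed 1

print(adjusted_hep("manipulate component", ["high stress", "poor HMI"]))  # 3e-3 * 2 * 5 = 0.03
```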

  9. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography

    International Nuclear Information System (INIS)

    Turmezei, Tom D.; Treece, Graham M.; Gee, Andrew H.; Fotiadou, Anastasia F.; Poole, Kenneth E.S.

    2016-01-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K and L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K and L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. (orig.)

  10. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied for the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained show the usefulness of these methods but indicate that their accuracy still needs to be improved. (Author) [pt
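
For reference, the matrix-flushing (reference intensity ratio) calculation behind such analyses can be sketched as follows; the phases, intensities and RIR values are illustrative assumptions.

```python
# Sketch of 'matrix-flushing' (adiabatic) quantification from XRD intensities:
# with reference intensity ratios k_i (I/I_corundum) known for every phase and
# all phases accounted for, weight fractions follow from
#   x_i = (I_i / k_i) / sum_j (I_j / k_j).
# Intensities and k values below are illustrative only.
def matrix_flushing(intensities: dict[str, float], rir: dict[str, float]) -> dict[str, float]:
    ratios = {phase: intensities[phase] / rir[phase] for phase in intensities}
    total = sum(ratios.values())
    return {phase: r / total for phase, r in ratios.items()}

print(matrix_flushing({"rutile": 1500.0, "anatase": 600.0, "barite": 300.0},
                      {"rutile": 3.4, "anatase": 3.3, "barite": 2.1}))
```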

  11. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one), were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
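
The AHP step of such an evaluation can be sketched briefly: priority weights are obtained from the principal eigenvector of a pairwise comparison matrix and checked for consistency. The comparison judgements below are invented for illustration.

```python
# Sketch of the AHP weighting step: derive priority weights of evaluation
# criteria from a pairwise comparison matrix (principal eigenvector) and check
# consistency. The comparison judgements are illustrative only.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # e.g. integration ease vs. GUI vs. extensibility
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # Saaty's random index for n = 3 is about 0.58
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```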

  12. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area, and give some results from performed analyses as well. The human factor is the human interaction with the design equipment and with the working environment, taking into account human capabilities and limits. Within the qualitative methods for human factor analysis, concepts and structural methods for classifying information connected with the human factor are considered. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are then considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  13. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Ling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); Smuts, Jonathan; Walsh, Phillip [VUV Analytics, Inc., Cedar Park, TX (United States); Qiu, Changling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); McNair, Harold M. [Department of Chemistry, Virginia Tech, Blacksburg, VA (United States); Schug, Kevin A., E-mail: kschug@uta.edu [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States)

    2017-02-08

    The vacuum ultraviolet detector (VUV) is a new non-destructive, mass-sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal standardization can be used to normalize systematic errors in quantitative analysis.
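
The idea of pseudo-absolute quantitation from a pre-recorded cross section can be sketched with a simple Beer-Lambert calculation; the cross-section value, path length, flow and peak shape below are placeholder assumptions, not calibration data for the actual detector.

```python
# Sketch of pseudo-absolute quantitation from an absorption cross section:
# Beer-Lambert in the form A(t) = sigma * n(t) * L / ln(10), with sigma a
# (base-e) absorption cross section in cm^2, n the number density in the flow
# cell and L the optical path length. Integrating n(t) * flow over the peak
# gives the number of molecules delivered to the detector. All values are
# illustrative placeholders.
import numpy as np

N_A = 6.02214076e23          # molecules per mole
sigma = 2.0e-17              # cm^2, assumed band-averaged cross section
L = 10.0                     # cm, assumed optical path length
flow = 1.0 / 60.0            # cm^3/s (1 mL/min total flow through the cell, assumed)

t = np.linspace(0, 10, 501)                                 # s, time axis across the peak
absorbance = 0.05 * np.exp(-0.5 * ((t - 5) / 0.8) ** 2)     # synthetic chromatographic peak

n_density = absorbance * np.log(10) / (sigma * L)           # molecules per cm^3
molecules = np.trapz(n_density * flow, t)                   # cm^-3 * cm^3/s * s -> molecules
print(f"amount on column ~ {molecules / N_A * 1e12:.2f} pmol")
```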

  14. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    International Nuclear Information System (INIS)

    Bai, Ling; Smuts, Jonathan; Walsh, Phillip; Qiu, Changling; McNair, Harold M.; Schug, Kevin A.

    2017-01-01

    The vacuum ultraviolet detector (VUV) is a new non-destructive, mass-sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal standardization can be used to normalize systematic errors in quantitative analysis.

  15. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    Science.gov (United States)

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also review the chromatographic theory utilized to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.

  16. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    Science.gov (United States)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and the characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method is widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis such as drug analysis, forensic analysis, food analysis, environmental analysis, biological analysis, etc. The hyphenated HPTLC-FTIR technique will be further developed in the future with the aim of taking full advantage of this method.

  17. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  18. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to extend the method through the introduction of quantitative analysis, which attempts to characterize the examined defect in detail over a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization places high requirements on the quality of the optical and analysis instrumentation. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are inherent to interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the divergence of the illumination and other geometrical factors. The difference between the two approaches can be attributed to these error factors. (Author)

  19. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm 3 , 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
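
A minimal sketch of the kind of segment-level logistic regression reported here, producing odds ratios per unit of each quantitative plaque measurement, is given below using statsmodels on synthetic data.

```python
# Sketch of segment-level logistic regression: quantitative plaque measurements
# as predictors of qualitatively defined high-risk plaque, with odds ratios per
# unit obtained from the fitted coefficients. The data frame is synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 888
df = pd.DataFrame({
    "lap_volume": rng.gamma(2.0, 5.0, n),          # low-attenuation plaque volume, mm^3
    "remodeling": rng.normal(1.1, 0.15, n),        # remodeling index
    "burden": rng.uniform(0.2, 0.9, n),            # plaque burden (fraction)
})
logit_p = -4 + 0.1 * df["lap_volume"] + 2.0 * (df["remodeling"] - 1.0) + 3.0 * df["burden"]
df["high_risk"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(df[["lap_volume", "remodeling", "burden"]])
fit = sm.Logit(df["high_risk"].astype(float), X).fit(disp=0)

odds_ratios = np.exp(fit.params)                   # OR per 1-unit increase of each predictor
print(pd.DataFrame({"OR": odds_ratios, "p": fit.pvalues}).round(3))
```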

  20. An approach for quantitative evaluation of operator performance in emergency conditions

    International Nuclear Information System (INIS)

    Ujita, Hiroshi; Kubota, Ryuji; Kawano, Ryutaro.

    1992-01-01

    To understand expert behavior and define what constitutes good performance, quantification of human performance was attempted from the viewpoints not only of error but also of various cognitive, psychological, and behavioral characteristics. Quantitative and qualitative indexes of human performance are proposed from both the individual operator and the crew points of view, among which the cognitive and behavioral aspects are the most important. (author)

  1. Quantitative assessment of safety barrier performance in the prevention of domino scenarios triggered by fire

    International Nuclear Information System (INIS)

    Landucci, Gabriele; Argenti, Francesca; Tugnoli, Alessandro; Cozzani, Valerio

    2015-01-01

    The evolution of domino scenarios triggered by fire critically depends on the presence and the performance of safety barriers that may have the potential to prevent escalation, delaying or avoiding the heat-up of secondary targets. The aim of the present study is the quantitative assessment of safety barrier performance in preventing the escalation of fired domino scenarios. A LOPA (layer of protection analysis) based methodology, aimed at the definition and quantification of safety barrier performance in the prevention of escalation, was developed. Data on the more common types of safety barriers were obtained in order to characterize the effectiveness and probability of failure on demand of relevant safety barriers. The methodology was exemplified with a case study. The results obtained define a procedure for the estimation of safety barrier performance in the prevention of fire escalation in domino scenarios. - Highlights: • We developed a methodology for the quantitative assessment of safety barriers. • We focused on safety barriers aimed at preventing domino effect triggered by fire. • We obtained data on effectiveness and availability of the safety barriers. • The methodology was exemplified with a case study of industrial interest. • The results showed the role of safety barriers in preventing fired domino escalation
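
The LOPA-style frequency calculation underlying such an assessment can be sketched as follows; barrier names, PFDs and effectiveness values are placeholders, not results from the study.

```python
# Sketch of a LOPA-style estimate of the frequency of escalation to a domino
# scenario: the initiating fire frequency is multiplied, for each independent
# barrier, by the probability that the barrier fails to stop escalation, which
# combines its probability of failure on demand (PFD) with its effectiveness
# given that it works. All figures are placeholders.
fire_frequency = 1.0e-3          # fires per year on the primary unit (assumed)

barriers = {
    # name: (probability of failure on demand, effectiveness given it works)
    "water deluge system":   (2.0e-2, 0.90),
    "fireproofing coating":  (1.0e-2, 0.95),
    "emergency team action": (1.0e-1, 0.70),
}

escalation_frequency = fire_frequency
for pfd, effectiveness in barriers.values():
    # barrier fails to stop escalation if it fails on demand OR works but is ineffective
    escalation_frequency *= pfd + (1 - pfd) * (1 - effectiveness)

print(f"estimated escalation frequency: {escalation_frequency:.2e} per year")
```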

  2. MEASURING ORGANIZATIONAL CULTURE: A QUANTITATIVE-COMPARATIVE ANALYSIS [doi: 10.5329/RECADM.20100902007

    Directory of Open Access Journals (Sweden)

    Valderí de Castro Alcântara

    2010-11-01

    Full Text Available This article analyzes the organizational culture of enterprises located in two towns with distinct quantitative traits, Rio Paranaíba and Araxá. While the surveyed enterprises in Rio Paranaíba are mostly micro and small enterprises (86%), in Araxá they are mostly medium and large companies (53%). The overall objective is to verify whether there are significant differences in organizational culture among these enterprises and whether they can be explained by organization size. The research was quantitative, and the instruments for data collection were a questionnaire and a scale for measuring organizational culture containing four dimensions: Hierarchical Distance Index (IDH), Individualism Index (INDI), Masculinity Index (MASC) and Uncertainty Control Index (CINC). Tabulation and analysis of the data were performed using PASW Statistics 18, applying descriptive and inferential statistical procedures. Using a Reduction Factor (-21), the indexes obtained were classified into 5 intensity categories (from "very low" to "very high"). Student's t-test for two means was performed, revealing significant differences in Hierarchical Distance and Individualism between Araxá and Rio Paranaíba enterprises (p < 0.05).   Keywords Organizational Culture; Dimensions of Organizational Culture; Araxá; Rio Paranaíba.

  3. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established previously in an extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of the nuclear reactions induced by the incident protons on the light elements of the target were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin-sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results related to chlorine are also presented, giving support to the reliability of this PIGE method for thin-film analysis.
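
The yield integral mentioned in this record can be illustrated with a rough numerical sketch, assuming synthetic cross-section and stopping-power curves rather than evaluated nuclear data.

```python
# Sketch of the thick/intermediate-target gamma-ray yield integral used in
# standard-free PIGE: for protons entering at E0, the yield per incident proton
# is proportional to the atomic fraction of the light element times the
# integral from E_exit to E0 of sigma(E) / S(E) dE, where sigma is the reaction
# cross section and S the stopping power per atom of the matrix. The curves
# below are synthetic placeholders.
import numpy as np

E = np.linspace(1.0, 3.0, 400)                       # proton energy grid, MeV
sigma = 1e-27 * np.exp(-((E - 2.4) / 0.3) ** 2)      # cm^2, synthetic resonance-like curve
S = 2.0e-20 * E ** -0.8                              # MeV cm^2 per atom, synthetic stopping power

def yield_per_proton(E0, E_exit, atomic_fraction):
    mask = (E >= E_exit) & (E <= E0)
    return atomic_fraction * np.trapz(sigma[mask] / S[mask], E[mask])

# thin-film limit: the proton loses little energy, so a single effective energy works
print(yield_per_proton(E0=2.5, E_exit=2.45, atomic_fraction=0.05))
print(yield_per_proton(E0=2.5, E_exit=1.0, atomic_fraction=0.05))   # thick-sample value
```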

  4. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code in coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  5. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
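
As a baseline illustration of the pooling step, the sketch below applies a standard DerSimonian-Laird random-effects meta-analysis to made-up repeatability estimates; the paper's point is precisely that such standard techniques may need alternatives when studies are small.

```python
# Compact sketch of standard random-effects pooling (DerSimonian-Laird) of a
# repeatability metric reported by several test-retest studies. Estimates and
# within-study variances below are made up for illustration.
import numpy as np

y = np.array([0.12, 0.18, 0.10, 0.22, 0.15])                 # e.g. within-subject coefficient of variation
v = np.array([0.0004, 0.0009, 0.0003, 0.0016, 0.0006])       # within-study variances

w = 1.0 / v                                                   # fixed-effect weights
y_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fe) ** 2)                               # Cochran's heterogeneity statistic
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1.0 / (v + tau2)                                       # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled estimate: {y_re:.3f} "
      f"(95% CI {y_re - 1.96 * se_re:.3f} to {y_re + 1.96 * se_re:.3f}), tau^2 = {tau2:.5f}")
```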

  6. Qualitative and Quantitative Analysis of Rhizoma Smilacis glabrae by Ultra High Performance Liquid Chromatography Coupled with LTQ OrbitrapXL Hybrid Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Shao-Dan Chen

    2014-07-01

    Full Text Available Rhizoma Smilacis glabrae, a traditional Chinese medicine (TCM) as well as a functional food, has been commonly used for detoxification treatments, relieving dampness and as a diuretic. In order to quickly define the chemical profiles and control the quality of Smilacis glabrae, ultra high performance liquid chromatography coupled with electrospray ionization hybrid linear trap quadrupole orbitrap mass spectrometry (UHPLC-ESI/LTQ-Orbitrap-MS) was applied for simultaneous identification and quantification of its bioactive constituents. A total of 56 compounds, including six new compounds, were identified or tentatively deduced on the basis of their retention behaviors and mass spectra, or by comparison with reference substances and literature data. The identified compounds belonged to flavonoids, phenolic acids and phenylpropanoid glycosides. In addition, an optimized UHPLC-ESI/LTQ-Orbitrap-MS method was established for quantitative determination of six marker compounds in five batches. The validation of the method, including linearity, sensitivity (LOQ), precision, repeatability and spike recoveries, was carried out and demonstrated to satisfy the requirements of quantitative analysis. The results suggest that the established method is a powerful and reliable analytical tool for the characterization of multiple constituents in complex chemical systems and for the quality control of TCM.

  7. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...
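
A small sketch of the kind of measurement that mathematical morphology and distance transforms provide on a segmented micrograph is given below, using scipy.ndimage on a synthetic binary image.

```python
# Small sketch of quantitative measurements from a binary (segmented)
# micrograph using mathematical morphology and a distance transform: particle
# count, area fraction and a local-thickness proxy. The image is synthetic.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = np.zeros((256, 256), dtype=bool)
for _ in range(30):                                   # scatter 30 disk-shaped "particles"
    r, c, rad = rng.integers(20, 236), rng.integers(20, 236), rng.integers(4, 12)
    yy, xx = np.ogrid[:256, :256]
    img |= (yy - r) ** 2 + (xx - c) ** 2 <= rad ** 2

opened = ndimage.binary_opening(img, structure=np.ones((3, 3)))   # morphological noise removal
labels, n_particles = ndimage.label(opened)
area_fraction = opened.mean()
dist = ndimage.distance_transform_edt(opened)          # Euclidean distance to the background
mean_half_thickness = dist[opened].mean()

print(f"{n_particles} particles, area fraction {area_fraction:.3f}, "
      f"mean local half-thickness {mean_half_thickness:.2f} px")
```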

  8. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies carrying out complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values of the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material.
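
The genetic-algorithm idea can be illustrated with a toy sketch that evolves parameter sets (selection, crossover, mutation) to fit a parametrized detector-efficiency curve; the efficiency model and data points are invented, not those of the paper.

```python
# Toy sketch of a genetic algorithm fitting a parametrized detector-efficiency
# curve eff(E) = a * E**(-b) * exp(-c / E) to "measured" efficiency points by
# selection, crossover and mutation. Model and data are made up.
import numpy as np

rng = np.random.default_rng(0)
E = np.array([1.0, 2.0, 3.5, 5.0, 8.0, 12.0])                 # energies (arbitrary units)
eff_meas = 0.8 * E ** -0.6 * np.exp(-1.5 / E)                 # synthetic "measured" efficiencies

def misfit(p):
    a, b, c = p
    return np.sum((a * E ** -b * np.exp(-c / E) - eff_meas) ** 2)

pop = rng.uniform([0.1, 0.1, 0.1], [2.0, 2.0, 5.0], size=(60, 3))
for _ in range(200):
    scores = np.array([misfit(p) for p in pop])
    parents = pop[np.argsort(scores)[:20]]                    # selection: keep the 20 fittest
    children = []
    while len(children) < 40:
        pa, pb = parents[rng.integers(20)], parents[rng.integers(20)]
        mix = rng.random(3)
        child = mix * pa + (1 - mix) * pb                     # crossover
        child *= rng.normal(1.0, 0.05, 3)                     # mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([misfit(p) for p in pop])]
print("best parameters:", np.round(best, 3), "misfit:", misfit(best))
```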

  9. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  10. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of processes of coagulation and spheroidisation, or for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and the quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies since the analysis provides a three-dimensional image of the microstructures examined.
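
The Scheil-Schwartz-Saltykov unfolding for spheres can be sketched compactly: the geometric probability that a sphere of a given diameter yields a section circle in each size class is inverted to convert the measured 2D histogram into a 3D density. The input histogram below is synthetic.

```python
# Sketch of a Scheil-Schwartz-Saltykov unfolding for spherical particles: the
# measured 2D section-diameter histogram N_A(i) (counts per unit area in equal
# classes of width delta) is converted to the 3D density N_V(j) by inverting
# the probability that a sphere of diameter j*delta gives a section circle in
# class i. The input data are synthetic.
import numpy as np

def saltykov(N_A, delta):
    n = len(N_A)
    d = np.arange(n + 1) * delta                     # class boundaries 0, delta, 2*delta, ...
    M = np.zeros((n, n))
    for j in range(1, n + 1):                        # spheres of diameter D_j = j*delta
        D = j * delta
        for i in range(1, j + 1):                    # section circles in class i <= j
            p = (np.sqrt(D**2 - d[i - 1]**2) - np.sqrt(D**2 - d[i]**2)) / D
            M[i - 1, j - 1] = D * p                  # expected sections per unit area per unit N_V
    return np.linalg.solve(M, N_A)                   # unfold N_A = M @ N_V

N_A = np.array([5.0, 9.0, 14.0, 10.0, 4.0])          # sections per mm^2 in 5 size classes
print(np.round(saltykov(N_A, delta=0.01), 1))        # spheres per mm^3 per class (delta in mm)
```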

  11. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish the strong relationship between the dependent variable and a host of independent variables, and the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity improvement.

  12. Quantitative performance allocation of multi-barrier system for high-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Ahn, Joon-Hong; Ikeda, Takao; Ohe, Toshiaki

    1995-01-01

    Performance assessment of each barrier constituting the geologic disposal system for high-level radioactive wastes is carried out quantitatively, and key radionuclides and parameters are pointed out. Chemical compositions and solubilities of radionuclides under repository conditions are determined by the PHREEQE code starting from compositions of granitic groundwater observed in Japan. Glass dissolution analysis based on mass transfer theory and precipitation analysis have been done in order to determine the inner boundary condition for radionuclide diffusion through a bentonite-filled buffer region, where multi-member decay chains and isotopic sharing of solubility at the inner boundary are considered. The natural barrier is treated as homogeneous porous rock, or porous rock with infinite planar fractures. The performance of each barrier is evaluated in terms of a non-dimensionalized hazard defined as the ratio of the annual radioactivity release from each barrier to the annual limit on intake. At the outer edge of the engineered barriers, 239 Pu is the key nuclide for the performance, whereas at the exit of the natural barrier, weakly sorbing fission product nuclides such as 135 Cs, 129 I and 99 Tc dominate the hazard. (author) 50 refs

  13. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
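
The multivariate-test idea (though not the functional linear model itself) can be illustrated with a standard MANOVA of several traits against a genetic predictor, which reports Pillai's trace, the Hotelling-Lawley trace and Wilks' lambda with approximate F-tests; the data are synthetic.

```python
# Illustration of a joint multivariate test of several quantitative traits
# against a genetic predictor using statsmodels MANOVA, which reports Pillai's
# trace, Hotelling-Lawley trace and Wilks' lambda with approximate F-tests.
# This is not the functional linear model of the paper; the data are synthetic.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(2)
n = 500
geno = rng.integers(0, 3, n)                      # additive genotype coding 0/1/2
df = pd.DataFrame({
    "geno": geno,
    "ldl":  0.3 * geno + rng.normal(0, 1, n),
    "hdl": -0.2 * geno + rng.normal(0, 1, n),
    "tg":   0.1 * geno + rng.normal(0, 1, n),
})

manova = MANOVA.from_formula("ldl + hdl + tg ~ geno", data=df)
print(manova.mv_test())                           # joint test across the three traits
```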

  14. Using quantitative image analysis to classify axillary lymph nodes on breast MRI: A new application for the Z 0011 Era

    Energy Technology Data Exchange (ETDEWEB)

    Schacht, David V., E-mail: dschacht@radiology.bsd.uchicago.edu; Drukker, Karen, E-mail: kdrukker@uchicago.edu; Pak, Iris, E-mail: irisgpak@gmail.com; Abe, Hiroyuki, E-mail: habe@radiology.bsd.uchicago.edu; Giger, Maryellen L., E-mail: m-giger@uchicago.edu

    2015-03-15

    Highlights: •Quantitative image analysis showed promise in evaluating axillary lymph nodes. •13 of 28 features performed better than guessing at metastatic status. •When all features were used together, a considerably higher AUC was obtained. -- Abstract: Purpose: To assess the performance of computer-extracted feature analysis of dynamic contrast-enhanced (DCE) magnetic resonance images (MRI) of axillary lymph nodes, and to determine which quantitative features best predict nodal metastasis. Methods: This institutional board-approved, HIPAA-compliant study, in which informed patient consent was waived, collected enhanced T1 images of the axilla from patients with breast cancer. Lesion segmentation and feature analysis were performed on 192 nodes using a laboratory-developed quantitative image analysis (QIA) workstation. The importance of 28 features was assessed. Classification used the features as input to a neural net classifier in a leave-one-case-out cross-validation and was evaluated with receiver operating characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) values for features in the task of distinguishing between positive and negative nodes ranged from just over 0.50 to 0.70. Five features yielded AUCs greater than 0.65: two morphological and three textural features. In cross-validation, the neural net classifier obtained an AUC of 0.88 (SE 0.03) for the task of distinguishing between positive and negative nodes. Conclusion: QIA of DCE MRI demonstrated promising performance in discriminating between positive and negative axillary nodes.
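
The two evaluation steps described here, per-feature AUC screening and leave-one-case-out evaluation of a neural-net classifier combining all features, can be sketched as follows on synthetic features.

```python
# Sketch of (1) the AUC of each individual feature for separating positive from
# negative nodes and (2) a small neural-net classifier combining all features,
# evaluated with leave-one-out cross-validation. Features and labels are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n, n_features = 192, 28
y = rng.random(n) < 0.4                                 # node metastatic status
X = rng.normal(0, 1, (n, n_features))
X[:, :5] += 0.8 * y[:, None]                            # make 5 features weakly informative

for k in range(5):                                      # per-feature AUC screening
    print(f"feature {k}: AUC = {roc_auc_score(y, X[:, k]):.2f}")

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0))
proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"leave-one-out AUC of the combined classifier: {roc_auc_score(y, proba):.2f}")
```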

  15. Quantitative analysis of pulmonary perfusion using time-resolved parallel 3D MRI - initial results

    International Nuclear Information System (INIS)

    Fink, C.; Buhmann, R.; Plathow, C.; Puderbach, M.; Kauczor, H.U.; Risse, F.; Ley, S.; Meyer, F.J.

    2004-01-01

    Purpose: to assess the use of time-resolved parallel 3D MRI for a quantitative analysis of pulmonary perfusion in patients with cardiopulmonary disease. Materials and methods: eight patients with pulmonary embolism or pulmonary hypertension were examined with a time-resolved 3D gradient echo pulse sequence with parallel imaging techniques (FLASH 3D, TE/TR: 0.8/1.9 ms; flip angle: 40°; GRAPPA). A quantitative perfusion analysis based on indicator dilution theory was performed using dedicated software. Results: patients with pulmonary embolism or chronic thromboembolic pulmonary hypertension revealed characteristic wedge-shaped perfusion defects at perfusion MRI. They were characterized by a decreased pulmonary blood flow (PBF) and pulmonary blood volume (PBV) and an increased mean transit time (MTT). Patients with primary pulmonary hypertension or Eisenmenger syndrome showed a more homogeneous perfusion pattern. The mean MTT of all patients was 3.3 - 4.7 s. The mean PBF and PBV showed a broader interindividual variation (PBF: 104-322 ml/100 ml/min; PBV: 8 - 21 ml/100 ml). Conclusion: time-resolved parallel 3D MRI allows at least a semi-quantitative assessment of lung perfusion. Future studies will have to assess the clinical value of this quantitative information for the diagnosis and management of cardiopulmonary disease. (orig.) [de

  16. Human performance analysis in the frame of probabilistic safety assessment of research reactors

    International Nuclear Information System (INIS)

    Farcasiu, Mita; Nitoi, Mirela; Apostol, Minodora; Turcu, I.; Florescu, Gh.

    2005-01-01

    Full text: The analysis of operating experience has identified the importance of human performance for the reliability and safety of research reactors. In the Probabilistic Safety Assessment (PSA) of nuclear facilities, human performance analysis (HPA) is used in order to estimate the contribution of human error to the failure of system components or functions. HPA is a qualitative and quantitative analysis of human actions identified for error-likely or accident-prone situations. Qualitative analysis is used to identify all man-machine interfaces that can lead to an accident, the types of human interactions which may mitigate or exacerbate the accident, the types of human errors and the performance shaping factors. Quantitative analysis is used to develop estimates of human error probability as effects of human performance on reliability and safety. The goal of this paper is to accomplish an HPA in the PSA framework for research reactors. Human error probabilities estimated as results of the analysis of human actions could be included in system event trees and/or system fault trees. The sensitivity analyses performed determine the sensitivity of human performance to systematic variations in both the level of dependence between human actions and the operator stress level. The necessary information was obtained from the operating experience of the TRIGA research reactor at INR Pitesti. The required data were obtained from generic databases. (authors)

  17. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  19. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon expert knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with a minimum of laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. When submitted to a test between several well-known laboratories, in which unknown quantities of 57 Co, 58 Co, 85 Sr, 88 Y, 131 I, 139 Ce and 141 Ce present in a sample had to be determined, the results yielded by our network placed it among the best. The method is described, including the experimental device and measurements, training-set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of the network error

  20. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    Science.gov (United States)

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  1. An Analysis of the Effect of Quantitative and Qualitative Admissions Factors in Determining Student Performance at the U.S. Naval Academy

    National Research Council Canada - National Science Library

    Phillips, Barton

    2004-01-01

    .... The Candidate Multiple (CM) is the quantitative input to the admissions process derived from a statistics-based scoring model anchored in proven high school performance measures such as the SAT and high school GPA...

  2. Quantitative analysis of glycated albumin in serum based on ATR-FTIR spectrum combined with SiPLS and SVM.

    Science.gov (United States)

    Li, Yuanpeng; Li, Fucui; Yang, Xinhao; Guo, Liu; Huang, Furong; Chen, Zhenqiang; Chen, Xingdan; Zheng, Shifu

    2018-08-05

    A rapid quantitative analysis model for determining the glycated albumin (GA) content, based on attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FTIR) combined with linear SiPLS and nonlinear SVM, has been developed. Firstly, the real GA content in human serum was determined by the GA enzymatic method, and the ATR-FTIR spectra of serum samples from a health-examination population were obtained. The spectral data of the whole mid-infrared region (4000-600 cm -1 ) and of GA's characteristic region (1800-800 cm -1 ) were used for the quantitative analysis. Secondly, several preprocessing steps, including first derivative, second derivative, variable standardization and spectral normalization, were performed. Lastly, quantitative regression models were established using SiPLS and SVM, respectively. The SiPLS modeling results are as follows: root mean square error of cross validation (RMSECV T ) = 0.523 g/L, calibration coefficient (R C ) = 0.937, root mean square error of prediction (RMSEP T ) = 0.787 g/L, and prediction coefficient (R P ) = 0.938. The SVM modeling results are as follows: RMSECV T  = 0.0048 g/L, R C  = 0.998, RMSEP T  = 0.442 g/L, and R p  = 0.916. The results indicated that the model performance was improved significantly after preprocessing and optimization of the characteristic regions, while the modeling performance of the nonlinear SVM was considerably better than that of the linear SiPLS. Hence, the quantitative analysis model for GA in human serum based on ATR-FTIR combined with SiPLS and SVM is effective. It does not need sample preprocessing, is simple to operate and is highly time-efficient, providing a rapid and accurate method for GA content determination. Copyright © 2018 Elsevier B.V. All rights reserved.
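
A minimal sketch of the two calibration families compared in this record, a linear PLS regression and a non-linear support-vector regression on preprocessed spectra, is given below with scikit-learn; the interval selection of SiPLS is not reproduced and the spectra are synthetic.

```python
# Minimal sketch comparing a linear PLS regression with a non-linear
# support-vector regression on (already preprocessed) absorbance spectra.
# The SiPLS interval-selection step is not reproduced; spectra and GA
# reference values are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_samples, n_wavenumbers = 80, 300
X = rng.normal(0, 1, (n_samples, n_wavenumbers)).cumsum(axis=1)   # smooth synthetic spectra
y = 2.5 + 0.02 * X[:, 120] + 0.01 * X[:, 180] ** 2 + rng.normal(0, 0.05, n_samples)  # GA, g/L

pls = PLSRegression(n_components=5)
svr = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01))

for name, model in [("PLS", pls), ("SVR", svr)]:
    rmse = -cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: cross-validated RMSE = {rmse:.3f} g/L")
```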

  3. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  4. Quantitative analysis of thorium-containing materials using an Industrial XRF analyzer

    International Nuclear Information System (INIS)

    Hasikova, J.; Titov, V.; Sokolov, A.

    2014-01-01

    Thorium (Th) as nuclear fuel is clean and safe and offers significant advantages over uranium. The technology for several types of thorium reactors is proven but still must be developed on a commercial scale. If thorium nuclear reactors are commercialized, thorium raw materials will be in demand. Accordingly, mining and processing companies producing Th and rare earth elements will require prompt and reliable methods and instrumentation for quantitative on-line analysis of Th. The potential applicability of the CON-X series X-ray fluorescence conveyor analyzer is discussed for quantitative or semi-quantitative on-line measurement of Th in several types of Th-bearing materials. A laboratory study of several minerals (zircon sands and limestone as unconventional Th resources; monazite concentrate as a Th-associated resource; and uranium ore residues after extraction as a waste product) was performed, and the analyzer was tested for on-line quantitative measurement of Th contents along with other major and minor components. The Th concentration range in zircon sand is 50-350 ppm; the detection limit at this level is estimated at 25-50 ppm in 5-minute measurements, depending on the type of material. An on-site test of the CON-X analyzer for continuous analysis of thorium traces along with other elements in zircon sand showed that the accuracy of Th measurements is within 20% relative. When the Th content is higher than 1%, as in the concentrate of monazite ore (5-8% ThO2), the accuracy of Th determination is within 1% relative. Although a preliminary on-site test is recommended to address system feasibility at large scale, the results show that the industrial conveyor XRF analyzer CON-X series can be used effectively for analytical control of mining and processing streams of Th-bearing materials. (author)

  5. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    International Nuclear Information System (INIS)

    Charland, P.; Peters, T.; McGill Univ., Montreal, Quebec

    1996-01-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue, associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions

  6. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard ( Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  7. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    International Nuclear Information System (INIS)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi; Somerville, Robert A.; Kitamoto, Tetsuyuki; Mohri, Shirou

    2013-01-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products

  8. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan); Somerville, Robert A. [The Roslin Institute and Royal (Dick) School of Veterinary Studies, Roslin, Midlothian, EH25 9PS (United Kingdom); Kitamoto, Tetsuyuki [Division of CJD Science and Technology, Department of Prion Research, Center for Translational and Advanced Animal Research on Human Diseases, Tohoku University Graduate School of Medicine, 2-1 Seiryo, Aoba, Sendai 980-8575 (Japan); Mohri, Shirou, E-mail: shirou@affrc.go.jp [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan)

    2013-03-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products.

  9. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over a 3-month period at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option

  10. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  11. Quantitative proteomic analysis of post-translational modifications of human histones

    DEFF Research Database (Denmark)

    Beck, Hans Christian; Nielsen, Eva C; Matthiesen, Rune

    2006-01-01

    , and H4 in a site-specific and dose-dependent manner. This unbiased analysis revealed that a relative increase in acetylated peptide from the histone variants H2A, H2B, and H4 was accompanied by a relative decrease of dimethylated Lys(57) from histone H2B. The dose-response results obtained...... by quantitative proteomics of histones from HDACi-treated cells were consistent with Western blot analysis of histone acetylation, cytotoxicity, and dose-dependent expression profiles of p21 and cyclin A2. This demonstrates that mass spectrometry-based quantitative proteomic analysis of post-translational...

  12. Hand Fatigue Analysis Using Quantitative Evaluation of Variability in Drawing Patterns

    Directory of Open Access Journals (Sweden)

    mohamadali Sanjari

    2015-02-01

    Full Text Available Background & aim: Muscle fatigue is defined as the reduced power-generation capacity of a muscle or muscle group after activity, which can lead to a variety of lesions. The purpose of the present study was to characterize hand fatigue by quantitative analysis of drawing patterns. Methods: This cross-sectional study was conducted on 37 healthy volunteers (6 men and 31 women) aged 18-30 years. Before and immediately after a fatigue protocol, quantitative assessment of hand drawing skill was performed by drawing repeated, overlapping, concentric circles. The test was conducted in three sessions with intervals of 48-72 hours, and drawing was recorded by a digital tablet. Data were statistically analyzed using paired t-tests and repeated-measures ANOVA. Results: In the analysis of the drawing time series at the 100% fatigue level, the standard deviation along the x axis (SDx), the standard deviations of velocity along the x and y axes (SDVx and SDVy), and the standard deviation of the resultant velocity vector (SDVR) showed significant differences after fatigue (P<0.05). In the comparison of variables across the three fatigue levels, SDx showed a significant difference (P<0.05). Conclusions: Full fatigue differed significantly from the other fatigue levels and contributed to significant variability in the drawing parameters. The method used in the present study also detected fatigue in high-frequency motion.

  13. Quantitative analysis and prediction of regional lymph node status in rectal cancer based on computed tomography imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Chunyan; Liu, Lizhi; Li, Li [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Imaging Diagnosis and Interventional Center, Cancer Center, Guangzhou, Guangdong (China); Cai, Hongmin; Tian, Haiying [Sun Yat-Sen University, Department of Automation, School of Science Information and Technology, Guangzhou (China); Li, Liren [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Department of Abdominal (colon and rectal) Surgery, Cancer Center, Guangzhou (China)

    2011-11-15

    To quantitatively evaluate regional lymph nodes in rectal cancer patients by using an automated, computer-aided approach, and to assess the accuracy of this approach in differentiating benign and malignant lymph nodes. Patients (228) with newly diagnosed rectal cancer, confirmed by biopsy, underwent enhanced computed tomography (CT). Patients were assigned to the benign node or malignant node group according to histopathological analysis of node samples. All CT-detected lymph nodes were segmented using the edge detection method, and seven quantitative parameters of each node were measured. To increase the prediction accuracy, a hierarchical model combining the merits of the support and relevance vector machines was proposed to achieve higher performance. Of the 220 lymph nodes evaluated, 125 were positive and 95 were negative for metastases. Fractal dimension obtained by the Minkowski box-counting approach was higher in malignant nodes than in benign nodes, and there was a significant difference in heterogeneity between metastatic and non-metastatic lymph nodes. The overall performance of the proposed model is shown to have accuracy as high as 88% using morphological characterisation of lymph nodes. Computer-aided quantitative analysis can improve the prediction of node status in rectal cancer. (orig.)
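
    The Minkowski box-counting estimate of fractal dimension mentioned above can be sketched as follows; the binary mask, box sizes and helper names are illustrative assumptions rather than the published implementation.

```python
# Sketch: box-counting (Minkowski) fractal dimension of a binary lymph-node mask.
# The synthetic mask and box sizes are illustrative assumptions.
import numpy as np

def box_count(mask, box_size):
    """Count boxes of side `box_size` containing at least one foreground pixel."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if mask[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(mask, s) for s in box_sizes]
    # Slope of log(count) vs log(1/size) estimates the fractal dimension.
    coeffs = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return coeffs[0]

# Toy example: a filled circle has dimension close to 2.
yy, xx = np.mgrid[0:128, 0:128]
mask = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2
print(f"estimated dimension ~ {fractal_dimension(mask):.2f}")
```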

  14. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  15. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and help define a proper number of replicates for future quantitative analysis.

  16. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ...

  17. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  18. Qualitative and quantitative analysis of an alkaloid fraction from Piper longum L. using ultra-high performance liquid chromatography-diode array detector-electrospray ionization mass spectrometry.

    Science.gov (United States)

    Li, Kuiyong; Fan, Yunpeng; Wang, Hui; Fu, Qing; Jin, Yu; Liang, Xinmiao

    2015-05-10

    In previous research, an alkaloid fraction and 18 alkaloid compounds were prepared from Piper longum L. by a series of purification processes. In this paper, a qualitative and quantitative analysis method using ultra-high performance liquid chromatography-diode array detector-mass spectrometry (UHPLC-DAD-MS) was developed to evaluate the alkaloid fraction. Qualitative analysis of the alkaloid fraction was first completed by the UHPLC-DAD method, and 18 amide alkaloid compounds were identified. A further qualitative analysis of the alkaloid fraction was accomplished by the UHPLC-MS/MS method, and another 25 amide alkaloids were identified according to their characteristic ions and neutral losses. Finally, a quantitative method for the alkaloid fraction was established using four marker compounds: piperine, pipernonatine, guineensine and N-isobutyl-2E,4E-octadecadienamide. After validation of this method, the contents of the above four marker compounds in the alkaloid fraction were 57.5 mg/g, 65.6 mg/g, 17.7 mg/g and 23.9 mg/g, respectively. Moreover, the response factors of the other three compounds relative to piperine were calculated, and a comparison of external standard quantification and relative response factor quantification showed no notable difference. The UHPLC-DAD-MS method was demonstrated to be a powerful tool for characterizing the alkaloid fraction from P. longum L., and the results showed that the quality of the alkaloid fraction was effectively improved after appropriate purification. Copyright © 2015. Published by Elsevier B.V.
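
    The relative-response-factor quantification referred to above amounts to a simple rescaling of an external piperine calibration; the sketch below illustrates the arithmetic with placeholder slope and response-factor values that are not taken from the paper.

```python
# Sketch: quantification via relative response factors (RRF) against piperine.
# All numeric values are placeholders, not data from the study.
piperine_slope = 1.25e4        # peak area per (mg/g), from an external piperine calibration
rrf = {"pipernonatine": 0.92, "guineensine": 1.10}   # assumed response factors vs. piperine

def content_by_rrf(peak_area, compound):
    """Content (mg/g) estimated from peak area using the piperine calibration and an RRF."""
    return peak_area / (piperine_slope * rrf[compound])

print(f"{content_by_rrf(7.5e5, 'pipernonatine'):.1f} mg/g")
```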

  19. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the network topology is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid binary chromosome of NAEP has three parts: the first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, component prediction models are built with partial least squares on the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network on the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method. Experimental results verify that the proposed method predicts more accurately and robustly, making it a practical spectral analysis tool.

  20. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Wu

    Full Text Available ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  1. Separation and quantitation of polyethylene glycols 400 and 3350 from human urine by high-performance liquid chromatography.

    Science.gov (United States)

    Ryan, C M; Yarmush, M L; Tompkins, R G

    1992-04-01

    Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.
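
    The 24-h excretion figures quoted above reduce to a simple percent-of-dose calculation; the sketch below shows that arithmetic with illustrative concentration and volume values, not the study's data.

```python
# Sketch: percent of the oral PEG dose recovered in a 24-h urine collection.
# Concentration, volume and dose values are illustrative assumptions only.
def percent_dose_excreted(urine_conc_mg_per_ml, urine_volume_ml, dose_mg):
    """Percentage of the administered dose recovered in the collection."""
    return 100.0 * urine_conc_mg_per_ml * urine_volume_ml / dose_mg

# e.g. PEG 400: 5 g oral dose, 1.5 L urine containing ~0.9 mg/mL
print(f"{percent_dose_excreted(0.9, 1500, 5000):.1f}% of the PEG 400 dose")
```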

  2. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of the recent developments in quantification of X-ray photoelectron spectroscopy or ESCA is presented. The basic equations are reminded. Each involved parameter (photoionisation, inelastic mean free paths, 'response function' of the instruments, intensity measurement) is separately discussed in relation with the accuracy and the precision of the method. Other topics are considered such as roughness, surface contamination, matrix effect and inhomogeneous composition. Some aspects of the quantitative ESCA analysis and AES analysis are compared [fr

  3. Quantitative analysis of Al-Si alloy using calibration free laser induced breakdown spectroscopy (CF-LIBS)

    Science.gov (United States)

    Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali

    2017-06-01

    Quantitative analysis of a standard aluminum-silicon alloy has been performed using calibration-free laser-induced breakdown spectroscopy (CF-LIBS). The plasma was produced with the fundamental harmonic (1064 nm) of an Nd:YAG laser, and the emission spectra were recorded at a detector gate delay of 3.5 μs. Qualitative analysis of the emission spectra confirmed the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background-subtracted and self-absorption-corrected emission spectra were used to estimate the plasma temperature as 10,100 ± 300 K. The plasma temperature and the self-absorption-corrected emission lines of each element were then used to determine the concentration of each species present in the alloy. The use of corrected emission intensities and accurate evaluation of the plasma temperature yielded reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentrations.
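
    CF-LIBS commonly derives the plasma temperature from a Boltzmann plot of emission line intensities; the sketch below shows that step, assuming a linear fit of ln(I·λ/(g·A)) against upper-level energy, with placeholder line data rather than the actual Al-Si lines used in the study.

```python
# Sketch: Boltzmann-plot estimate of plasma temperature, as commonly used in CF-LIBS.
# Line data (wavelengths, gA values, upper-level energies, intensities) are placeholders.
import numpy as np

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(intensity, wavelength_nm, g_upper, a_ki, e_upper_ev):
    """Fit ln(I*lambda/(g*A)) vs E_upper; the slope equals -1/(k_B*T)."""
    y = np.log(intensity * wavelength_nm / (g_upper * a_ki))
    slope, _ = np.polyfit(e_upper_ev, y, 1)
    return -1.0 / (K_B_EV * slope)

# Placeholder line set for a single species (not actual Al/Si line data):
intensity = np.array([1200.0, 800.0, 450.0, 300.0])
wavelength = np.array([394.4, 396.1, 308.2, 309.3])
g_upper = np.array([2.0, 4.0, 4.0, 6.0])
a_ki = np.array([4.99e7, 9.85e7, 5.87e7, 7.29e7])
e_upper = np.array([3.14, 3.14, 4.02, 4.02])

print(f"T ~ {boltzmann_temperature(intensity, wavelength, g_upper, a_ki, e_upper):.0f} K")
```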

  4. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  5. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044), the "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
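
    A minimal sketch of the centroid-based hierarchical clustering described above is given below; the coordinates are synthetic and the landmark-based alignment step is not reproduced, so the grouping is illustrative only.

```python
# Sketch: hierarchical clustering of tumour centroid coordinates into two location groups.
# Coordinates are synthetic; the study's landmark-based normalisation is not reproduced here.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# 38 centroids in a common (x, y, z) coordinate space after alignment (synthetic)
centroids = np.vstack([
    rng.normal(loc=[0.0, 0.0, 0.0], scale=0.3, size=(20, 3)),   # "central"-like group
    rng.normal(loc=[1.5, 0.5, 0.0], scale=0.3, size=(18, 3)),   # "lateral"-like group
])

Z = linkage(centroids, method="ward")            # agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into two clusters
print(np.bincount(labels)[1:])                   # sizes of the two location-based groups
```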

  6. Quantitative Analysis of Micro-CT Imaging and Histopathological Signatures of Experimental Arthritis in Rats

    Directory of Open Access Journals (Sweden)

    Matthew D. Silva

    2004-10-01

    Full Text Available Micro-computed tomographic (micro-CT) imaging provides a unique opportunity to capture 3-D architectural information in bone samples. In this study of pathological joint changes in a rat model of adjuvant-induced arthritis (AA), quantitative analyses of bone volume and roughness were performed by micro-CT imaging and compared with histopathology methods and paw swelling measurement. Micro-CT imaging of excised rat hind paws (n = 10) stored in formalin consisted of approximately 600 30-μm slices acquired on a 512 × 512 image matrix with isotropic resolution. Following imaging, the joints were scored from H&E-stained sections for cartilage/bone erosion, pannus development, inflammation, and synovial hyperplasia. From the micro-CT images, quantitative analysis of absolute bone volumes and bone roughness was performed. Bone erosion in the rat AA model is substantial, leading to a significant decline in tarsal volume (27%). The result of the custom bone roughness measurement indicated a 55% increase in surface roughness. Histological and paw volume analyses also demonstrated severe arthritic disease as compared to controls. Statistical analyses indicate correlations among bone volume, roughness, histology, and paw volume. These data demonstrate that the destructive progression of disease in a rat AA model can be quantified using 3-D micro-CT image analysis, which allows assessment of arthritic disease status and efficacy of experimental therapeutic agents.

  7. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  8. Quantitative analysis of trivalent uranium and lanthanides in a molten chloride by absorption spectrophotometry

    International Nuclear Information System (INIS)

    Toshiyuki Fujii; Akihiro Uehara; Hajimu Yamana

    2013-01-01

    As an analytical application for pyrochemical reprocessing using molten salts, quantitative analysis of uranium and lanthanides by UV/Vis/NIR absorption spectrophotometry was performed. Electronic absorption spectra of LiCl-KCl eutectic at 773 K including trivalent uranium and eight rare earth elements (Y, La, Ce, Pr, Nd, Sm, Eu, and Gd as fission product elements) were measured in the wavenumber region of 4,500-33,000 cm-1. The composition of the solutes was simulated for a reductive extraction condition in a pyroreprocessing process for spent nuclear fuels, that is, about 2 wt% U and 0.1-2 wt% rare earth elements. Since U(III) possesses strong absorption bands due to f-d transitions, an optical quartz cell with a short light path length of 1 mm was adopted in the analysis. The quantitative analysis of trivalent U, Nd, Pr, and Sm was possible with their f-f transition intensities in the NIR region. The analytical results agree with the prepared concentrations within 2σ experimental uncertainties. (author)

  9. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  10. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    Science.gov (United States)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images, using tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains information on both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account the cone sensitivity functions, and processed the charts with a cross-correlation technique to determine the visibility of latent symbols in the color deficiency plates. In this way a quantitative measure is obtained for each diagnostic plate for three types of color deficiency carriers - protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows the CIE xyz color coordinates of the pseudoisochromatic plate design elements to be determined and statistical analysis of these data to be performed to compare the color quality of available color deficiency test books.
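
    The cross-correlation visibility measure described above can be sketched roughly as below; the activity chart, symbol template and scoring details are assumptions for illustration, not the authors' processing chain.

```python
# Sketch: correlation of a cone-weighted "retinal activity" chart with a symbol template,
# used as a visibility score. Data and template here are synthetic assumptions.
import numpy as np

def correlation_score(chart, template):
    """Peak correlation between a normalised activity chart and a normalised template."""
    c = (chart - chart.mean()) / (chart.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    h, w = t.shape
    # Slide the template over the chart and keep the best match.
    scores = [
        np.mean(c[i:i + h, j:j + w] * t)
        for i in range(c.shape[0] - h + 1)
        for j in range(c.shape[1] - w + 1)
    ]
    return max(scores)

chart = np.random.default_rng(2).normal(size=(64, 64))       # synthetic activity chart
yy, xx = np.mgrid[0:8, 0:8]
template = ((xx - 3.5) ** 2 + (yy - 3.5) ** 2 < 9).astype(float)  # synthetic symbol patch
print(f"visibility score = {correlation_score(chart, template):.3f}")
```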

  11. Analysis of Economic Performance in Mergers and Acquisition

    Institute of Scientific and Technical Information of China (English)

    王立杰; 孙涛

    2003-01-01

    Based on the methods of financial analysis, the direct earnings in mergers and acquisitions (M&A), profit or loss from stock price fluctuations, the influence on earnings per share (EPS), and revenue growth after M&A were analyzed in detail, and several quantitative models were established for the relevant parts accordingly. These models can help to improve the currently low efficiency of M&A performance in the Chinese capital market.

  12. Improving Student Retention and Performance in Quantitative Courses Using Clickers

    Science.gov (United States)

    Liu, Wallace C.; Stengel, Donald N.

    2011-01-01

    Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…

  13. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in particular situations such as screening, in situ measurement, process monitoring and hostile environments. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high-level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element. Another is to use elemental emission intensities relative to the intensity of an internal standard element whose concentration in the specimen is already known. However, these methods are not applicable to unknown samples. In the present work, we introduce a new approach to quantitative LIBS analysis that uses the Hα (656.28 nm) emission line as an external standard

  14. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    Science.gov (United States)

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for the simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe-emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined with a compound diphenoxylate tablet-induced mouse constipation model; the blood-activating biopotency of different batches was determined with an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood-activating biopotencies. The results of the multi-component simultaneous quantitative analysis showed a great difference in chemical characterization and certain differences in purgative and blood-activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.05). The combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality differences among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  15. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    Science.gov (United States)

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
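
    A hedged sketch of the partial least squares regression and RMSEP comparison described above is given below; the spectra, mixture fractions and noise levels are synthetic assumptions, so the numbers only illustrate the workflow, not the reported results.

```python
# Sketch: PLS regression on two spectral regions with RMSEP comparison, in the spirit of
# the low- vs mid-frequency comparison above. Spectra and concentrations are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
y = rng.dirichlet(np.ones(3), size=60)   # ternary mixture fractions (beta, alpha2, hydrate)
low_freq = y @ rng.normal(size=(3, 200)) + 0.01 * rng.normal(size=(60, 200))
mid_freq = y @ rng.normal(size=(3, 200)) + 0.05 * rng.normal(size=(60, 200))  # noisier region

def rmsep_for_region(X, y):
    X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
    pred = pls.predict(X_val)
    return 100 * np.sqrt(mean_squared_error(y_val, pred))  # RMSEP in % of mixture fraction

print(f"low-frequency RMSEP ~ {rmsep_for_region(low_freq, y):.1f}%")
print(f"mid-frequency RMSEP ~ {rmsep_for_region(mid_freq, y):.1f}%")
```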

  16. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    Science.gov (United States)

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery, 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When the optimal visual grade was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of
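
    The ROC-based choice of a T/N cut-off described above can be sketched as follows; the ratios and labels are synthetic stand-ins rather than the study data, and the cut-off printed is simply the Youden-index optimum.

```python
# Sketch: ROC analysis of a semi-quantitative T/N ratio to pick a cut-off.
# The ratios and labels below are synthetic, not the study data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
labels = np.array([1] * 48 + [0] * 41)                      # 1 = malignant, 0 = benign
tn_ratio = np.where(labels == 1,
                    rng.normal(2.6, 0.5, size=89),
                    rng.normal(1.7, 0.4, size=89))

fpr, tpr, thresholds = roc_curve(labels, tn_ratio)
auc = roc_auc_score(labels, tn_ratio)
best = np.argmax(tpr - fpr)                                  # Youden index for the cut-off
print(f"AUC = {auc:.3f}, optimal T/N cut-off ~ {thresholds[best]:.2f}")
```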

  17. Network analysis of quantitative proteomics on asthmatic bronchi: effects of inhaled glucocorticoid treatment

    Directory of Open Access Journals (Sweden)

    Sihlbom Carina

    2011-09-01

    Full Text Available Abstract Background: Proteomic studies of respiratory disorders have the potential to identify protein biomarkers for diagnosis and disease monitoring. Utilisation of sensitive quantitative proteomic methods creates opportunities to determine individual patient proteomes. The aim of the current study was to determine if quantitative proteomics of bronchial biopsies from asthmatics can distinguish relevant biological functions and whether inhaled glucocorticoid treatment affects these functions. Methods: Endobronchial biopsies were taken from untreated asthmatic patients (n = 12) and healthy controls (n = 3). Asthmatic patients were randomised to double-blind treatment with either placebo or budesonide (800 μg daily) for 3 months and new biopsies were obtained. Proteins extracted from the biopsies were digested and analysed using isobaric tags for relative and absolute quantitation combined with a nanoLC-LTQ Orbitrap mass spectrometer. Spectra obtained were used to identify and quantify proteins. Pathways analysis was performed using Ingenuity Pathway Analysis to identify significant biological pathways in asthma and determine how the expression of these pathways was changed by treatment. Results: More than 1800 proteins were identified and quantified in the bronchial biopsies of subjects. The pathway analysis revealed acute phase response signalling, cell-to-cell signalling and tissue development associations with proteins expressed in asthmatics compared to controls. The functions and pathways associated with placebo and budesonide treatment showed distinct differences, including the decreased association with acute phase proteins as a result of budesonide treatment compared to placebo. Conclusions: Proteomic analysis of bronchial biopsy material can be used to identify and quantify proteins using highly sensitive technologies, without the need for pooling of samples from several patients. Distinct pathophysiological features of asthma can be

  18. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
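
    A minimal sketch of the ICA-plus-calibration idea described above is given below, assuming a FastICA decomposition followed by a linear map from ICA scores to known calibration concentrations; the spectra and concentrations are synthetic and the authors' preprocessing is not reproduced.

```python
# Sketch: ICA resolution of overlapped mixture spectra followed by a linear calibration
# between ICA scores and known concentrations. All spectra/concentrations are synthetic.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
pure = np.abs(rng.normal(size=(2, 300)))             # two "pure component" spectra
conc_cal = rng.uniform(0.1, 1.0, size=(15, 2))        # calibration concentrations
mix_cal = conc_cal @ pure + 0.01 * rng.normal(size=(15, 300))

ica = FastICA(n_components=2, random_state=0)
scores_cal = ica.fit_transform(mix_cal)               # ICA scores of calibration mixtures
reg = LinearRegression().fit(scores_cal, conc_cal)    # map scores -> concentrations

# "Unknown" sample resolved with the same ICA model, then quantified:
mix_unknown = np.array([[0.7, 0.3]]) @ pure
print(reg.predict(ica.transform(mix_unknown)))
```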

  19. Quantitative and qualitative analysis of the expert and non-expert opinion in fire risk in buildings

    International Nuclear Information System (INIS)

    Hanea, D.M.; Jagtman, H.M.; Alphen, L.L.M.M. van; Ale, B.J.M.

    2010-01-01

    Expert judgment procedure is a method very often used in the area of risk assessments of complex systems or processes to fill in quantitative data. Although it has been proved to be a very reliable source of information when no other data are available, the choice of experts is always questioned. When the available data are limited, the seed questions cover only partially the domains of expertise, which may cause problems. Expertise is assessed not covering the full object of study but only those topics for which seed questions can be formulated. The commonly used quantitative analysis of an expert judgment exercise is combined with a qualitative analysis. The latter adds more insights to the relation between the assessor's field and statistical knowledge and their performance in an expert judgment. In addition the qualitative analysis identifies different types of seed questions. Three groups of assessors with different levels of statistical and domain knowledge are studied. The quantitative analysis shows no differences between field experts and non-experts and no differences between having advanced statistical knowledge or not. The qualitative analysis supports these findings. In addition it is found that especially technical questions are answered with larger intervals. Precaution is required when using seed questions for which the real value can be calculated, which was the case for one of the seed questions.

  20. Performance values of nondestructive analysis techniques in safeguards and nuclear materials management

    International Nuclear Information System (INIS)

    Guardini, S.

    1989-01-01

    Nondestructive assay (NDA) techniques have, in the past few years, become more and more important in nuclear material accountancy and control. This is essentially due to two reasons: (1) The improvements made in most NDA techniques led some of them to have performances close to destructive analysis (DA) (e.g., calorimetry and gamma spectrometry). (2) The parallel improvement of statistical tools and procedural inspection approaches led to abandoning the following scheme: (a) NDA for semiqualitative or consistency checks only (b) DA for quantitative measurements. As a consequence, NDA is now frequently used in scenarios that involve quantitative (by variable) analysis. On the other hand, it also became evident that the performances of some techniques were different depending on whether they were applied in the laboratory or in the field. It has only recently been realized that, generally speaking, this is due to objective reasons rather than to an incorrect application of the instruments. Speaking of claimed and actual status of NDA performances might be in this sense misleading; one should rather say: performances in different conditions. This paper provides support for this assumption

  1. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    Science.gov (United States)

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
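
    The ERR-causality test itself is not reproduced here; as a hedged baseline sketch, the code below computes the lagged cross-correlation and magnitude-squared coherence against which the method is compared, on synthetic EEG-like channels.

```python
# Sketch: cross-correlation (with lag) and coherence baselines on two synthetic EEG-like
# channels, one a delayed noisy copy of the other. The ERR-causality test is not shown.
import numpy as np
from scipy.signal import coherence

fs = 256                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(6).normal(size=t.size)
y = np.roll(x, 13) + 0.5 * np.random.default_rng(7).normal(size=t.size)   # ~50 ms lag

# Cross-correlation: the peak position gives the time lag between channels.
xc = np.correlate(x - x.mean(), y - y.mean(), mode="full")
lag_samples = np.argmax(xc) - (len(x) - 1)
print(f"estimated lag magnitude: {1000 * abs(lag_samples) / fs:.1f} ms")

# Magnitude-squared coherence: frequency-domain coupling strength.
f, cxy = coherence(x, y, fs=fs, nperseg=512)
print(f"peak coherence {cxy.max():.2f} at {f[np.argmax(cxy)]:.1f} Hz")
```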

  2. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when pre-saved integration areas or custom scripting features are used. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis with spreadsheet programs or general analysis programs such as Matlab. The software is written in Java and should therefore run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  3. Quantitative analysis of exercise 201Tl myocardial emission CT in patients with coronary artery disease

    International Nuclear Information System (INIS)

    Okada, Mitsuhiro; Kawai, Naoki; Yamamoto, Shuhei

    1984-01-01

    The clinical usefulness of quantitative analysis of exercise thallium-201 myocardial emission computed tomography (ECT) was evaluated in coronary artery disease (CAD). The subjects consisted of 20 CAD patients and five normal controls. All CAD patients underwent coronary angiography. Tomographic thallium-201 myocardial imaging was performed with a rotating gamma camera, and long-axial and short-axial myocardial images of the left ventricle were reconstructed. The tomographic images were interpreted quantitatively using circumferential profile analysis. Based on features of regional myocardial thallium-201 kinetics, two types of abnormalities were studied: (1) diminished initial distribution (stress defect) and (2) slow washout of thallium-201, as evidenced by patients' initial thallium-201 uptake and 3-hour washout rate profiles which fell below the normal limits, respectively. Two diagnostic criteria including the stress defect and a combination of the stress defect and slow washout were used to detect coronary artery lesions of significance (>=75 % luminal narrowing). The ischemic volumes were also evaluated by quantitative analysis using thallium-201 ECT. The diagnostic accuracy of the stress defect criterion was 95 % for left anterior descending, 90 % for right, and 70 % for left circumflex coronary artery lesions. The combined criteria of the stress defect and slow washout increased detection sensitivity with a moderate loss of specificity for identifying individual coronary artery lesion. A relatively high diagnostic accuracy was obtained using the stress defect criterion for multiple vessel disease (75 %). Ischemic myocardial volume was significantly larger in triple vessel than in single vessel disease (p < 0.05) using the combined criteria. It was concluded that quantitative analysis of exercise thallium-201 myocardial ECT images proves useful for evaluating coronary artery lesions. (author)
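
    Circumferential profile analysis can be sketched roughly as below: maximal counts are sampled along radii of a short-axis slice and compared with a lower normal limit; the image, centre, angular sampling and limit are illustrative assumptions, not the clinical processing used in the study.

```python
# Sketch: circumferential profile analysis of a short-axis tomographic slice. Maximum
# counts are sampled along radii at fixed angular steps and compared with a normal
# profile; values below the normal limit flag a stress defect. All data are synthetic.
import numpy as np

def circumferential_profile(slice_img, center, n_angles=60, max_radius=30):
    """Maximum-count circumferential profile around the left-ventricular centre."""
    cy, cx = center
    profile = np.zeros(n_angles)
    for k, theta in enumerate(np.linspace(0, 2 * np.pi, n_angles, endpoint=False)):
        r = np.arange(0, max_radius)
        ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, slice_img.shape[0] - 1)
        xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, slice_img.shape[1] - 1)
        profile[k] = slice_img[ys, xs].max()        # peak myocardial count along this ray
    return profile / profile.max()                  # normalise to the profile maximum

img = np.random.default_rng(8).uniform(50, 100, size=(64, 64))   # synthetic slice
normal_limit = 0.7 * np.ones(60)                                  # assumed lower normal limit
defect_segments = circumferential_profile(img, (32, 32)) < normal_limit
print(f"segments below the normal limit: {defect_segments.sum()}")
```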

  4. A simultaneous screening and quantitative method for the multiresidue analysis of pesticides in spices using ultra-high performance liquid chromatography-high resolution (Orbitrap) mass spectrometry.

    Science.gov (United States)

    Goon, Arnab; Khan, Zareen; Oulkar, Dasharath; Shinde, Raviraj; Gaikwad, Suresh; Banerjee, Kaushik

    2018-01-12

    A novel screening and quantitation method is reported for non-target multiresidue analysis of pesticides using ultra-HPLC-quadrupole-Orbitrap mass spectrometry in spice matrices, including black pepper, cardamom, chili, coriander, cumin, and turmeric. The method involved sequential full-scan (resolution = 70,000), and variable data independent acquisition (vDIA) with nine consecutive fragmentation events (resolution = 17,500). Samples were extracted by the QuEChERS method. The introduction of an SPE-based clean-up step through hydrophilic-lipophilic-balance (HLB) cartridges proved advantageous in minimizing the false negatives. For coriander, cumin, chili, and cardamom, the screening detection limit was largely at 2 ng/g, while it was 5 ng/g for black pepper, and turmeric. When the method was quantitatively validated for 199 pesticides, the limit of quantification (LOQ) was mostly at 10 ng/g (excluding black pepper, and turmeric with LOQ = 20 ng/g) with recoveries within 70-120%, and precision-RSDs <20%. Furthermore, the method allowed the identification of suspected non-target analytes through retrospective search of the accurate mass of the compound-specific precursor and product ions. Compared to LC-MS/MS, the quantitative performance of this Orbitrap-MS method had agreements in residue values between 78-100%. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Methodology for quantitative evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.

    1981-01-01

    Of the various approaches that might be taken to the problem of evaluating diagnostic performance, Receiver Operating Characteristic (ROC) analysis holds great promise. The methodology for a unified, objective, and meaningful approach to evaluating the usefulness of medical imaging procedures is developed further by considering statistical significance testing, optimal sequencing of correlated studies, and analysis of observer performance.
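
    As a minimal numerical companion to the ROC methodology discussed above, the sketch below builds an empirical ROC curve and its area from rating scores and truth labels; it is illustrative only and does not reproduce the fitted binormal curves or significance tests of the paper.

        # Empirical ROC curve and area under the curve (AUC) from rating data.
        import numpy as np

        def roc_curve(scores, labels):
            """Return (false positive fraction, true positive fraction) arrays."""
            scores = np.asarray(scores, dtype=float)
            labels = np.asarray(labels, dtype=bool)
            thresholds = np.unique(scores)[::-1]
            tpf = [(scores[labels] >= t).mean() for t in thresholds]
            fpf = [(scores[~labels] >= t).mean() for t in thresholds]
            return (np.concatenate(([0.0], fpf, [1.0])),
                    np.concatenate(([0.0], tpf, [1.0])))

        def auc(fpf, tpf):
            """Trapezoidal area under the empirical ROC curve."""
            return float(np.sum(np.diff(fpf) * (tpf[:-1] + tpf[1:]) / 2.0))

        scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]   # reader confidence ratings
        labels = [1, 1, 0, 1, 0, 1, 0, 0]                    # 1 = diseased case
        fpf, tpf = roc_curve(scores, labels)
        print("AUC =", round(auc(fpf, tpf), 3))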

  6. Qualitative and quantitative laser-induced breakdown spectroscopy of bronze objects

    International Nuclear Information System (INIS)

    Tankova, V; Blagoev, K; Grozeva, M; Malcheva, G; Penkova, P

    2016-01-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique for qualitative and quantitative elemental analysis of solids, liquids and gases. In this work, the method was applied for investigation of archaeological bronze objects. The analytical information obtained by LIBS was used for qualitative determination of the elements in the material used for manufacturing of the objects under study. Quantitative chemical analysis was also performed after generating calibration curves with standard samples of similar matrix composition. Quantitative estimation of the elemental concentration of the bulk of the samples was performed, together with investigation of the surface layer of the objects. The results of the quantitative analyses gave indications about the manufacturing process of the investigated objects. (paper)
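
    The calibration-curve step lends itself to a short numerical illustration. The sketch below fits a linear calibration of line intensity against concentration for a set of standards and inverts it for an unknown; all numbers are hypothetical and do not come from the study.

        # Illustrative external-calibration sketch for LIBS (hypothetical data;
        # real calibration uses certified standards of similar matrix).
        import numpy as np

        # concentrations (wt%) of one element in standards and measured
        # background-corrected line intensities (arbitrary units)
        conc = np.array([1.0, 3.0, 5.0, 8.0, 12.0])
        intensity = np.array([210.0, 640.0, 1010.0, 1660.0, 2450.0])

        slope, intercept = np.polyfit(conc, intensity, 1)     # linear calibration
        r = np.corrcoef(conc, intensity)[0, 1]

        def predict(i_meas):
            """Invert the calibration line to estimate concentration."""
            return (i_meas - intercept) / slope

        print(f"calibration: I = {slope:.1f}*C + {intercept:.1f}, r = {r:.4f}")
        print("unknown with I = 1300 ->", round(predict(1300.0), 2), "wt% (illustrative)")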

  7. Quantitative Proteomic Analysis of the Response to Zinc, Magnesium, and Calcium Deficiency in Specific Cell Types of Arabidopsis Roots

    Directory of Open Access Journals (Sweden)

    Yoichiro Fukao

    2016-01-01

    Full Text Available The proteome profiles of specific cell types have recently been investigated using techniques such as fluorescence-activated cell sorting and laser capture microdissection. However, quantitative proteomic analysis of specific cell types has not yet been performed. In this study, to investigate the response of the proteome to zinc, magnesium, and calcium deficiency in specific cell types of Arabidopsis thaliana roots, we performed isobaric tags for relative and absolute quantification (iTRAQ)-based quantitative proteomics using GFP-expressing protoplasts collected by fluorescence-activated cell sorting. Protoplasts were collected from the pGL2-GFPer and pMGP-GFPer marker lines for the epidermis or inner cell lines (pericycle, endodermis, and cortex), respectively. To increase the number of proteins identified, iTRAQ-labeled peptides were separated into 24 fractions by OFFGEL electrophoresis prior to high-performance liquid chromatography coupled with mass spectrometry analysis. Overall, 1039 and 737 proteins were identified and quantified in the epidermal and inner cell lines, respectively. Interestingly, the expression of many proteins was decreased in the epidermis by mineral deficiency, although a weaker effect was observed in inner cell lines such as the pericycle, endodermis, and cortex. Here, we report for the first time the quantitative proteomics of specific cell types in Arabidopsis roots.

  8. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies with thermography. Frequency modulated thermal wave imaging, introduced earlier, provides a complete depth scan of the object by stimulating it with a suitable band of frequencies and then analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore yield a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first and unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
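
    The spectral-zooming idea can be illustrated with a short sketch: the chirp z-transform evaluates a narrow band on a far finer frequency grid than the plain FFT bin spacing. The example below uses scipy.signal.zoom_fft (assumed available in SciPy 1.8 or newer) on a synthetic two-tone signal; it illustrates the zooming principle only, not the thermal-wave processing chain of the paper.

        # Chirp z-transform "zoom" of a narrow band versus the plain FFT grid.
        import numpy as np
        from scipy.signal import zoom_fft

        fs = 100.0                              # sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)            # 10 s record -> 0.1 Hz FFT bins
        x = np.sin(2 * np.pi * 1.02 * t) + 0.8 * np.sin(2 * np.pi * 1.07 * t)

        # plain FFT grid
        freqs = np.fft.rfftfreq(len(x), 1 / fs)
        spectrum = np.abs(np.fft.rfft(x))

        # chirp z-transform: 2048 points over 0.9-1.2 Hz (~1.5e-4 Hz spacing)
        m = 2048
        zoomed = np.abs(zoom_fft(x, [0.9, 1.2], m=m, fs=fs))
        zoom_freqs = np.linspace(0.9, 1.2, m, endpoint=False)

        print("FFT bin spacing:", freqs[1], "Hz")
        print("zoomed grid spacing:", zoom_freqs[1] - zoom_freqs[0], "Hz")
        print("zoom peak near:", zoom_freqs[int(np.argmax(zoomed))], "Hz")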

  9. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course.

    Science.gov (United States)

    Flanagan, K M; Einarson, J

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre-post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student's math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student's grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide "instructor actions" from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings. © 2017 K. M. Flanagan and J. Einarson. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http

  10. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N2O4 as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be kept at their optimal conditions to eliminate errors, particularly when the computation is done by machine. A higher sweep velocity is preferable in view of the signal-to-noise ratio, but it may be limited to 30 Hz/s. The relative error in the measurement of peak areas is generally 1%, but when signals of dilute concentration are integrated, the error becomes smaller by about one order of magnitude. If impurities are treated carefully, the water content of N2O4 can be determined with an accuracy of about 0.002%. The comparison of peak heights is as accurate as the comparison of peak areas when the uniformity of the magnetic field and T2 are not in question. When the chemical shift moves with content, the substance can be determined from the position of the chemical shift. The oil and water contents of rape-seed, peanuts, and sunflower seed are determined by measuring T1 with 90 deg pulses.

  11. Quantitative Analysis and Comparison of Four Major Flavonol Glycosides in the Leaves of Toona sinensis (A. Juss.) Roemer (Chinese Toon) from Various Origins by High-Performance Liquid Chromatography-Diode Array Detector and Hierarchical Clustering Analysis

    Science.gov (United States)

    Sun, Xiaoxiang; Zhang, Liting; Cao, Yaqi; Gu, Qinying; Yang, Huan; Tam, James P.

    2016-01-01

    Background: Toona sinensis (A. Juss.) Roemer is an endemic species of Toona genus native to Asian area. Its dried leaves are applied in the treatment of many diseases; however, few investigations have been reported for the quantitative analysis and comparison of major bioactive flavonol glycosides in the leaves harvested from various origins. Objective: To quantitatively analyze four major flavonol glycosides including rutinoside, quercetin-3-O-β-D-glucoside, quercetin-3-O-α-L-rhamnoside, and kaempferol-3-O-α-L-rhamnoside in the leaves from different production sites and classify them according to the content of these glycosides. Materials and Methods: A high-performance liquid chromatography-diode array detector (HPLC-DAD) method for their simultaneous determination was developed and validated for linearity, precision, accuracy, stability, and repeatability. Moreover, the method established was then employed to explore the difference in the content of these four glycosides in raw materials. Finally, a hierarchical clustering analysis was performed to classify 11 voucher specimens. Results: The separation was performed on a Waters XBridge Shield RP18 column (150 mm × 4.6 mm, 3.5 μm) kept at 35°C, and acetonitrile and H2O containing 0.30% trifluoroacetic acid as mobile phase was driven at 1.0 mL/min during the analysis. Ten microliters of solution were injected and 254 nm was selected to monitor the separation. A strong linear relationship between the peak area and concentration of four analytes was observed. And, the method was also validated to be repeatable, stable, precise, and accurate. Conclusion: An efficient and reliable HPLC-DAD method was established and applied in the assays for the samples from 11 origins successfully. Moreover, the content of those flavonol glycosides varied much among different batches, and the flavonoids could be considered as biomarkers to control the quality of Chinese Toon. SUMMARY Four major flavonol glycosides in the leaves
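
    As an illustration of the final clustering step, the sketch below groups samples by their glycoside contents with SciPy's hierarchical clustering; the mg/g values are hypothetical, and Ward linkage on standardized contents is simply one common choice, not necessarily the procedure used by the authors.

        # Illustrative hierarchical clustering of samples by glycoside content.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        samples = ["S01", "S02", "S03", "S04", "S05", "S06"]
        contents = np.array([            # columns: compounds 1-4 (mg/g), rows: samples
            [0.18, 0.05, 0.09, 0.30],
            [0.17, 0.06, 0.08, 0.28],
            [0.07, 0.02, 0.03, 0.05],
            [0.06, 0.02, 0.04, 0.06],
            [0.12, 0.04, 0.06, 0.16],
            [0.11, 0.03, 0.05, 0.15],
        ])

        z = (contents - contents.mean(axis=0)) / contents.std(axis=0)   # standardize
        tree = linkage(z, method="ward")
        groups = fcluster(tree, t=3, criterion="maxclust")
        for name, g in zip(samples, groups):
            print(name, "-> cluster", g)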

  12. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification, so various chemical analyses can be performed from SEM images. It is therefore widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before the material is used in a nuclear system. In our previous study, the SEM was tried as a tool for the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method based on SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information in the SEM images.
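
    Although the record does not give the exact statistic, a simple grayscale-based homogeneity index can be sketched as follows: tile the image, take the mean grey level of each tile, and report the relative spread of those tile means (a larger value indicating a less homogeneous compound). The code below is an illustrative assumption of this kind of measure, tested on synthetic images, not the paper's method.

        # Sketch of a grayscale-based homogeneity index for an SEM image.
        import numpy as np

        def homogeneity_index(image, tile=32):
            """Coefficient of variation of tile-mean grey levels (smaller = more homogeneous)."""
            img = np.asarray(image, dtype=float)
            h = (img.shape[0] // tile) * tile
            w = (img.shape[1] // tile) * tile
            tiles = img[:h, :w].reshape(h // tile, tile, w // tile, tile)
            means = tiles.mean(axis=(1, 3))
            return means.std() / means.mean()

        rng = np.random.default_rng(0)
        uniform = rng.normal(128, 5, size=(256, 256))      # homogeneous test image
        patchy = uniform.copy()
        patchy[:128, :128] += 40                           # add a brighter phase
        print("uniform:", round(homogeneity_index(uniform), 4),
              "patchy:", round(homogeneity_index(patchy), 4))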

  13. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification, so various chemical analyses can be performed from SEM images. It is therefore widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before the material is used in a nuclear system. In our previous study, the SEM was tried as a tool for the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method based on SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information in the SEM images.

  14. Quantitative and pattern recognition analysis of five marker compounds in Raphani semen using high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Jung, Yeon Woo; Lee, Joo Sang; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun; Kim, Jeong Ah; Eun, Rhan Woo

    2015-01-01

    A rapid and simple high-performance liquid chromatography (HPLC)-photodiode array (PDA) analytical method was developed for the quantitative analysis of Raphani Semen (RS). This method was successfully used to determine the five main phenolic compounds found in RS specimens from different production regions. The compounds included sinapine thiocyanate (1), β-d-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (2), isorhamnetin 3,4′-di-O-β-d-glucoside (3), β-d-(3-O-sinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (4), and β-d-(3,4-O-disinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (5). The marker compounds were separated using an Agilent Eclipse XDB-C18 column (5.0 µm, 150 × 4.6 mm i.d.) by gradient elution with acetonitrile/water/0.1% trifluoroacetic acid (TFA) as the mobile phase (flow rate, 1.0 mL/min). This method was fully validated with respect to linearity, precision, accuracy, stability, and robustness. The HPLC analytical method was validated to conduct a pattern recognition analysis by repeatedly analyzing 56 seed samples including 55 RS (C01–C49 and K50–K55) and 1 Brassicae Semen samples. In addition, a content standard for RS was proposed. Compounds 1 and 4 were revealed as major components in the HPLC chromatogram, and their contents ranged from 0.06 to 0.20 and 0.02 to 0.35 mg/g, respectively. These results demonstrate the successful development of an analytical method suitable for evaluating the quality and distinguishing the origin of RS. In addition, we briefly describe the crucial liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical conditions for the precise simultaneous quantification of the marker compounds

  15. Quantitative and pattern recognition analysis of five marker compounds in Raphani semen using high-performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Yeon Woo; Lee, Joo Sang; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun [College of Pharmacy, Drug Research and Development Center, Catholic University of Daegu, Gyeongsan (Korea, Republic of); Kim, Jeong Ah [College of Pharmacy, Research Institute of Pharmaceutical Sciences, Kyungpook National University, Daegu (Korea, Republic of); Eun, Rhan Woo [College of Pharmacy, Chosun University, Gwangju (Korea, Republic of)

    2015-09-15

    A rapid and simple high-performance liquid chromatography (HPLC)-photodiode array (PDA) analytical method was developed for the quantitative analysis of Raphani Semen (RS). This method was successfully used to determine the five main phenolic compounds found in RS specimens from different production regions. The compounds included sinapine thiocyanate (1), β-d-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (2), isorhamnetin 3,4′-di-O-β-d-glucoside (3), β-d-(3-O-sinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (4), and β-d-(3,4-O-disinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (5). The marker compounds were separated using an Agilent Eclipse XDB-C18 column (5.0 µm, 150 × 4.6 mm i.d.) by gradient elution with acetonitrile/water/0.1% trifluoroacetic acid (TFA) as the mobile phase (flow rate, 1.0 mL/min). This method was fully validated with respect to linearity, precision, accuracy, stability, and robustness. The HPLC analytical method was validated to conduct a pattern recognition analysis by repeatedly analyzing 56 seed samples including 55 RS (C01–C49 and K50–K55) and 1 Brassicae Semen samples. In addition, a content standard for RS was proposed. Compounds 1 and 4 were revealed as major components in the HPLC chromatogram, and their contents ranged from 0.06 to 0.20 and 0.02 to 0.35 mg/g, respectively. These results demonstrate the successful development of an analytical method suitable for evaluating the quality and distinguishing the origin of RS. In addition, we briefly describe the crucial liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical conditions for the precise simultaneous quantification of the marker compounds.

  16. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including ... staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm ...

  17. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  18. Quantitative schemes in energy dispersive X-ray fluorescence implemented in AXIL

    International Nuclear Information System (INIS)

    Tchantchane, A.; Benamar, M.A.; Tobbeche, S.

    1995-01-01

    E.D.X.R.F. (energy dispersive X-ray fluorescence) has long been used for the quantitative analysis of many types of samples, including environmental samples. The software package AXIL (Analysis of X-ray spectra by Iterative Least squares) is extensively used for spectrum analysis and the quantification of X-ray spectra. It includes several quantitative schemes for evaluating element concentrations. We present the general theory behind each scheme implemented in the software package and the performance of each of these quantitative schemes. We have also investigated their performance relative to the uncertainties in the experimental parameters and in the sample description.

  19. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  20. Role of image analysis in quantitative characterisation of nuclear fuel materials

    International Nuclear Information System (INIS)

    Dubey, J.N.; Rao, T.S.; Pandey, V.D.; Majumdar, S.

    2005-01-01

    Image analysis is one of the important techniques widely used for materials characterization. It provides a quantitative estimate of the microstructural features present in the material. This information is valuable for establishing the criteria for taking the fuel to high burnup. The Radiometallurgy Division has been carrying out development and fabrication of plutonium-related fuels for different types of reactors, viz. Purnima, the Fast Breeder Test Reactor (FBTR), the Prototype Fast Breeder Reactor (PFBR), the Boiling Water Reactor (BWR), the Advanced Heavy Water Reactor (AHWR), the Pressurised Heavy Water Reactor (PHWR), and the KAMINI reactor. Image analysis has been carried out on microstructures of PHWR, AHWR, FBTR, and KAMINI fuels. Samples were prepared following standard ASTM metallographic procedures. Digital images of the microstructures of these specimens were obtained using a CCD camera attached to the optical microscope. These images are stored on a computer and used for the detection and analysis of features of interest with image analysis software. A quantitative image analysis technique has been standardised and used for determining the type, size, shape, and distribution of porosity in the above sintered oxide and carbide fuels. This technique has also been used for the quantitative estimation of the different phases present in KAMINI fuel. The image analysis results are summarised and presented in this paper. (author)

  1. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating the photographic emulsion to determine the intensity variations in terms of the incident radiation. In the emulsion calibration procedure, a least-squares fit is applied to the measured data to obtain a calibration graph, from which the density of a dark spectral line can be related to the incident light intensity indicated by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, a working curve must be found for each of them. 3. Analytical results. The calibration curve and the working curves are compared, and the concentration of the element of interest is determined. Automatic data acquisition, calculation, and reporting of results are done by means of a computer (PC) and a computer program. The signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program.
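
    A minimal numerical sketch of the three steps, with hypothetical readings: a least-squares emulsion calibration of density against log relative exposure, a working curve of line density against log concentration, and the read-back of an unknown concentration.

        # Illustrative least-squares sketch of the calibration steps above
        # (all readings are hypothetical).
        import numpy as np

        # step 1: emulsion calibration (density vs. log10 relative exposure)
        log_exposure = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
        density = np.array([0.10, 0.42, 0.75, 1.05, 1.38])
        gamma, fog = np.polyfit(log_exposure, density, 1)     # slope and intercept

        # step 2: working curve for one element (density vs. log10 concentration, %)
        log_conc = np.log10([0.01, 0.03, 0.10, 0.30, 1.00])
        line_density = np.array([0.20, 0.48, 0.80, 1.08, 1.40])
        slope, intercept = np.polyfit(log_conc, line_density, 1)

        # step 3: analytical result for an unknown sample
        unknown_density = 0.95
        concentration = 10 ** ((unknown_density - intercept) / slope)
        print(f"emulsion gamma ~ {gamma:.2f}; estimated concentration ~ {concentration:.3f} %")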

  2. Quantitative determination of reserpine, ajmaline, and ajmalicine in Rauvolfia serpentina by reversed-phase high-performance liquid chromatography.

    Science.gov (United States)

    Srivastava, A; Tripathi, A K; Pandey, R; Verma, R K; Gupta, M M

    2006-10-01

    A sensitive and reproducible reversed-phase high-performance liquid chromatography (HPLC) method using photodiode array detection is established for the simultaneous quantitation of important root alkaloids of Rauvolfia serpentina, namely, reserpine, ajmaline, and ajmalicine. A Chromolith Performance RP-18e column (100 x 4.6-mm i.d.) and a binary gradient mobile phase composed of 0.01 M (pH 3.5) phosphate buffer (NaH(2)PO(4)) containing 0.5% glacial acetic acid and acetonitrile are used. Analysis is run at a flow rate of 1.0 mL/min with the detector operated at a wavelength of 254 nm. The calibration curves are linear over a concentration range of 1-20 microg/mL (r = 1.000) for all the alkaloids. The various other aspects of analysis (i.e., peak purity, similarity, recovery, and repeatability) are also validated. For the three components, the recoveries are found to be 98.27%, 97.03%, and 98.38%, respectively. The limits of detection are 6, 4, and 8 microg/mL for ajmaline, ajmalicine, and reserpine, respectively, and the limits of quantitation are 19, 12, and 23 microg/mL for ajmaline, ajmalicine, and reserpine, respectively. The developed method is simple, reproducible, and easy to operate. It is useful for the evaluation of R. serpentina.

  3. Quantitative twoplex glycan analysis using 12C6 and 13C6 stable isotope 2-aminobenzoic acid labelling and capillary electrophoresis mass spectrometry.

    Science.gov (United States)

    Váradi, Csaba; Mittermayr, Stefan; Millán-Martín, Silvia; Bones, Jonathan

    2016-12-01

    Capillary electrophoresis (CE) offers excellent efficiency and orthogonality to liquid chromatographic (LC) separations for oligosaccharide structural analysis. Combination of CE with high resolution mass spectrometry (MS) for glycan analysis remains a challenging task due to the MS incompatibility of background electrolyte buffers and additives commonly used in offline CE separations. Here, a novel method is presented for the analysis of 2-aminobenzoic acid (2-AA) labelled glycans by capillary electrophoresis coupled to mass spectrometry (CE-MS). To ensure maximum resolution and excellent precision without the requirement for excessive analysis times, CE separation conditions including the concentration and pH of the background electrolyte, the effect of applied pressure on the capillary inlet and the capillary length were evaluated. Using readily available 12C6/13C6 stable isotopologues of 2-AA, the developed method can be applied for quantitative glycan profiling in a twoplex manner based on the generation of extracted ion electropherograms (EIE) for 12C6 'light' and 13C6 'heavy' 2-AA labelled glycan isotope clusters. The twoplex quantitative CE-MS glycan analysis platform is ideally suited for comparability assessment of biopharmaceuticals, such as monoclonal antibodies, for differential glycomic analysis of clinical material for potential biomarker discovery or for quantitative microheterogeneity analysis of different glycosylation sites within a glycoprotein. Additionally, due to the low injection volume requirements of CE, subsequent LC-MS analysis of the same sample can be performed facilitating the use of orthogonal separation techniques for structural elucidation or verification of quantitative performance.
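
    Once the light and heavy isotope clusters have been integrated, the twoplex readout reduces to a ratio per glycan. The sketch below shows that final step on hypothetical extracted ion electropherogram (EIE) peak areas; the glycan names and values are illustrative only.

        # Minimal twoplex quantitation from EIE peak areas: each glycan's 12C6
        # 'light' area is referenced to its 13C6 'heavy' counterpart
        # (hypothetical areas; real data first require isotope-cluster
        # integration and mass-shift matching).
        light_areas = {"FA2": 1.52e6, "FA2G1": 8.10e5, "FA2G2": 4.05e5}
        heavy_areas = {"FA2": 1.48e6, "FA2G1": 9.00e5, "FA2G2": 3.90e5}

        for glycan in sorted(light_areas):
            ratio = light_areas[glycan] / heavy_areas[glycan]
            print(f"{glycan}: light/heavy = {ratio:.2f}")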

  4. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA. The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear
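
    The BW-ratio described above is straightforward to re-implement for illustration: for each aligned data point, divide the between-group sum of squares by the within-group sum of squares and compare the observed values with a null distribution built from relabelled data. The sketch below is a simplified stand-in (it permutes the group labels rather than bootstrapping, as the paper does), not the published workflow.

        # Simplified BW-ratio per aligned spectral data point, with a
        # label-permutation null distribution (the original work bootstraps).
        import numpy as np

        def bw_ratio(spectra, groups):
            """spectra: (n_samples, n_points); groups: length-n_samples labels."""
            spectra = np.asarray(spectra, dtype=float)
            groups = np.asarray(groups)
            grand_mean = spectra.mean(axis=0)
            between = np.zeros(spectra.shape[1])
            within = np.zeros(spectra.shape[1])
            for g in np.unique(groups):
                block = spectra[groups == g]
                between += len(block) * (block.mean(axis=0) - grand_mean) ** 2
                within += ((block - block.mean(axis=0)) ** 2).sum(axis=0)
            return between / within

        def null_distribution(spectra, groups, n_iter=200, seed=0):
            rng = np.random.default_rng(seed)
            return np.array([bw_ratio(spectra, rng.permutation(groups))
                             for _ in range(n_iter)])

        rng = np.random.default_rng(1)
        data = rng.normal(size=(20, 50))
        data[10:, 25] += 2.0                      # one discriminating data point
        labels = np.array([0] * 10 + [1] * 10)
        observed = bw_ratio(data, labels)
        null = null_distribution(data, labels)
        p_values = (null >= observed).mean(axis=0)
        best = int(observed.argmax())
        print("most discriminating point:", best, "p =", p_values[best])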

  5. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down

  6. Aspects of quantitative secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Grauer, R.

    1982-05-01

    Parameters which have an influence on the formation of secondary ions by ion bombardment of a solid matrix are discussed. Quantitative SIMS-analysis with the help of calibration standards necessitates a stringent control of these parameters. This is particularly valid for the oxygen partial pressure which for metal analysis has to be maintained constant also under ultra high vacuum. The performance of the theoretical LTE-model (Local Thermal Equilibrium) using internal standards will be compared with the analysis with the help of external standards. The LTE-model does not satisfy the requirements for quantitative analysis. (Auth.)

  7. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  8. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    Science.gov (United States)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the HPLC identification and quantitation of glutathione in plasma. The method, which uses SBDF as a derivatizing agent, requires only 50 microl of sample volume. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve exhibited a correlation coefficient of 0.999. The limit of detection (LOD) and limit of quantitation (LOQ) were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility of the method. The applicability of the method to the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.

  9. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Science.gov (United States)

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet

  10. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the matrix effect, generally known as the interelemental effect, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method, the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelemental effects, if any, are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes, using a computer, for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
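
    The regression step can be illustrated with a small sketch: the certified concentration of one analyte in a set of standards is regressed on the measured intensities of all elements, and the fitted coefficients are then applied to an unknown. The standards data below are hypothetical, and the simple linear model is an illustrative empirical form rather than the exact one used in the paper.

        # Empirical interelement correction by multiple linear regression
        # (hypothetical standards; columns of 'intensities' are Cr, Ni, Fe).
        import numpy as np

        intensities = np.array([
            [12.1, 8.2, 70.5],
            [18.4, 9.0, 64.2],
            [16.2, 12.5, 62.8],
            [20.1, 10.4, 60.9],
            [14.8, 11.1, 66.3],
        ])
        cr_certified = np.array([12.5, 18.9, 16.7, 20.4, 15.2])   # wt% Cr in standards

        # design matrix with an intercept term; solve by linear least squares
        X = np.column_stack([np.ones(len(intensities)), intensities])
        coeffs, *_ = np.linalg.lstsq(X, cr_certified, rcond=None)

        unknown = np.array([1.0, 17.0, 10.0, 63.5])               # [1, I_Cr, I_Ni, I_Fe]
        print("correction coefficients:", np.round(coeffs, 4))
        print("predicted Cr (wt%):", round(float(unknown @ coeffs), 2))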

  11. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  12. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  13. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  14. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  15. Pilot study of quantitative analysis of background enhancement on breast MR images: association with menstrual cycle and mammographic breast density.

    Science.gov (United States)

    Scaranelo, Anabel M; Carrillo, Maria Claudia; Fleming, Rachel; Jacks, Lindsay M; Kulkarni, Supriya R; Crystal, Pavel

    2013-06-01

    To perform semiautomated quantitative analysis of the background enhancement (BE) in a cohort of patients with newly diagnosed breast cancer and to correlate it with mammographic breast density and menstrual cycle. Informed consent was waived after the research ethics board approved this study. Results of 177 consecutive preoperative breast magnetic resonance (MR) examinations performed from February to December 2009 were reviewed; 147 female patients (median age, 48 years; range, 26-86 years) were included. Ordinal values of BE and breast density were described by two independent readers by using the Breast Imaging Reporting and Data System lexicon. The BE coefficient (BEC) was calculated as (SI2 · 100/SI1) - 100, where SI is signal intensity, SI2 is the SI enhancement measured in the largest anteroposterior dimension in the axial plane 1 minute after the contrast agent injection, and SI1 is the SI before contrast agent injection. BEC was used for the quantitative analysis of BE. Menstrual cycle status was based on the last menstrual period. The Wilcoxon rank-sum or Kruskal-Wallis test was used to compare quantitative assessment groups. Cohen weighted κ was used to evaluate agreement. Of 147 patients, 68 (46%) were premenopausal and 79 (54%) were postmenopausal. The quantitative BEC was associated with menstrual status (BEC in premenopausal women, 31.48 ± 20.68 [standard deviation]; BEC in postmenopausal women, 25.65 ± 16.74; P = .02). The percentage of overall BE was higher when MR imaging was performed in women in the inadequate phase of the cycle; premenopausal women showed higher quantitative BE than postmenopausal women. No association was found between BE and breast density.
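
    The BEC definition quoted above translates directly into code; the sketch below applies it to a few hypothetical signal-intensity pairs.

        # Background-enhancement coefficient, BEC = (SI2 * 100 / SI1) - 100,
        # applied to hypothetical pre- and post-contrast signal intensities.
        def bec(si_pre, si_post_1min):
            """Percentage signal increase 1 minute after contrast injection."""
            return (si_post_1min * 100.0 / si_pre) - 100.0

        measurements = [(410.0, 545.0), (380.0, 468.0), (502.0, 623.0)]   # (SI1, SI2)
        for si1, si2 in measurements:
            print(f"SI1={si1:.0f}, SI2={si2:.0f} -> BEC={bec(si1, si2):.1f}%")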

  16. Critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P T; McCulloch, J [Glasgow Univ. (UK)

    1983-06-13

    Semi-quantitative analysis (e.g. optical density ratios) of (14C)2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of (14C)2-deoxyglucose autoradiograms is undertaken.

  17. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ... the residuals of a regression on centroid size produced ...

  18. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  19. Transportation and quantitative analysis of socio-economic development of relations

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that can measure the development of transportation and of the socio-economy and applies correlation analysis, regression analysis, transportation intensity analysis, and transport elasticity analysis to analyze the relationship between them quantitatively, so that the results can provide practical guidance for future national development planning.

  20. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred by the therapeutical relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  1. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  2. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development program together with a performance analysis, covering the division and definition of the program, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the results of the performance analysis by index, the results of the performance investigation, and the analysis and appraisal of the energy and resource technology development program in 2007.

  3. Quantitative analysis of perfumes in talcum powder by using headspace sorptive extraction.

    Science.gov (United States)

    Ng, Khim Hui; Heng, Audrey; Osborne, Murray

    2012-03-01

    Quantitative analysis of perfume dosage in talcum powder has been a challenge due to interference of the matrix and has so far not been widely reported. In this study, headspace sorptive extraction (HSSE) was validated as a solventless sample preparation method for the extraction and enrichment of perfume raw materials from talcum powder. Sample enrichment is performed on a thick film of poly(dimethylsiloxane) (PDMS) coated onto a magnetic stir bar incorporated in a glass jacket. Sampling is done by placing the PDMS stir bar in the headspace vial by using a holder. The stir bar is then thermally desorbed online with capillary gas chromatography-mass spectrometry. The HSSE method is based on the same principles as headspace solid-phase microextraction (HS-SPME). Nevertheless, a relatively larger amount of extracting phase is coated on the stir bar as compared to SPME. Sample amount and extraction time were optimized in this study. The method has shown good repeatability (with relative standard deviation no higher than 12.5%) and excellent linearity with correlation coefficients above 0.99 for all analytes. The method was also successfully applied in the quantitative analysis of talcum powder spiked with perfume at different dosages. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Quantum dots assisted laser desorption/ionization mass spectrometric detection of carbohydrates: qualitative and quantitative analysis.

    Science.gov (United States)

    Bibi, Aisha; Ju, Huangxian

    2016-04-01

    A quantum dot (QD) assisted laser desorption/ionization mass spectrometric (QDA-LDI-MS) strategy was proposed for the qualitative and quantitative analysis of a series of carbohydrates. The adsorption of carbohydrates on the modified surfaces of the different QDs used as matrices depended mainly on the formation of hydrogen bonds, which led to higher MS intensities than those obtained with a conventional organic matrix. The effects of QD concentration and of the sample preparation method were explored to improve the selective ionization process and the detection sensitivity. The proposed approach offers a new dimension to the application of QDs as matrices for MALDI-MS research on carbohydrates. It could be used for the quantitative measurement of glucose concentration in human serum with good performance. The QDs, serving as a matrix, offered the advantages of low background, higher sensitivity, convenient sample preparation, and excellent stability under vacuum. The QD-assisted LDI-MS approach has promising applications in the analysis of carbohydrates in complex biological samples. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Fibrous dysplasia of the cranial vault: quantitative analysis based on neural networks

    International Nuclear Information System (INIS)

    Arana, E.; Marti-Bonmati, L.; Paredes, R.; Molla, E.

    1998-01-01

    To assess the utility of statistical analysis and neural networks in the quantitative analysis of fibrous dysplasia of the cranial vault. Ten patients with fibrous dysplasia (six women and four men with a mean age of 23.60±17.85 years) were selected from a series of 167 patients with lesions of the cranial vault evaluated by plain radiography and computed tomography (CT). Nineteen variables were taken from their medical records and radiological studies. Their characterization was based on statistical analysis and a neural network, and was validated by means of the leave-one-out method. The performance of the neural network was estimated by means of receiver operating characteristic (ROC) curves, using the area under the curve (Az) as a parameter. Bivariate analysis identified age, duration of symptoms, lytic and sclerotic patterns, sclerotic margin, ovoid shape, soft-tissue mass and periosteal reaction as significant variables. The area under the neural network curve was 0.9601±0.0435. The network selected the matrix and soft-tissue mass as variables that were indispensable for diagnosis. The neural network presents a high performance in the characterization of fibrous dysplasia of the cranial vault, disclosing occult interactions among the variables. (Author) 24 refs

  6. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
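
    As a minimal illustration of the normalization step discussed above, the sketch below applies total-intensity normalization to a small, made-up metabolite intensity matrix; the data values, the scaling target (median total) and the use of numpy are illustrative assumptions, not the protocol recommended in the review.

    ```python
    import numpy as np

    # Hypothetical intensity matrix: rows = samples, columns = metabolite features.
    intensities = np.array([
        [120.0, 300.0,  80.0],
        [240.0, 610.0, 150.0],   # this sample has roughly twice the total amount
        [118.0, 295.0,  85.0],
    ])

    # Total-intensity normalization: rescale each sample so that its feature sum
    # equals the median total across samples, reducing the effect of total sample
    # amount variation on per-metabolite comparisons.
    totals = intensities.sum(axis=1, keepdims=True)
    normalized = intensities * (np.median(totals) / totals)

    print(normalized.sum(axis=1))  # every sample now has the same total intensity
    ```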

  7. Quantitative X-ray analysis of biological fluids: the microdroplet technique

    International Nuclear Information System (INIS)

    Roinel, N.

    1988-01-01

    X-ray microanalysis can be used to quantitatively determine the elemental composition of microvolumes of biological fluids. This article describes the various steps in the preparation of microdroplets for analysis: the manufacturing of micropipettes, the preparation of the specimen support, the deposition of droplets on the support, shock-freezing, and lyophilization. Examples of common artifacts (incomplete rehydration prior to freezing or partial rehydration after lyophilization) are demonstrated. Analysis can be carried out either by wavelength-dispersive analysis, which is the most sensitive method, or by energy-dispersive analysis, which is more commonly available. The minimum detectable concentration is 0.05 mmol·liter⁻¹ for 0.1-nl samples analyzed by wavelength-dispersive spectrometry and 0.5–1 mmol·liter⁻¹ for samples analyzed by energy-dispersive spectrometry. A major problem, especially in wavelength-dispersive analysis, where high beam currents are used, is radiation damage to the specimen; in particular chloride (but also other elements) can be lost. Quantitative analysis requires the use of standard solutions with elemental concentrations in the same range as those present in the specimen.

  8. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies at the same time that avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually applied to clinical practice variability analysis. To identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. Comparative economic analysis regarding the drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from €0.97/inh to €8.90/inh and from €208.45/HPU to €1,471.95/HPU. The Pearson correlation between drug expenditure and population was 0.25 and 0.35 for the correlation between expenditure and homogeneous production (p=0.32 and p=0.15, respectively), both Pearson coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation has confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.
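
    A minimal sketch of the correlation step described above, with invented per-department figures (the numbers and the use of scipy are assumptions for illustration, not the Servicio Andaluz de Salud data):

    ```python
    from scipy.stats import pearsonr

    # Hypothetical figures for six departments: adjusted drug expenditure (EUR/HPU)
    # and homogeneous production units (HPU).
    expenditure_per_hpu = [210.0, 480.0, 950.0, 1470.0, 320.0, 760.0]
    production_hpu = [5200, 4100, 6100, 3900, 5800, 4600]

    r, p_value = pearsonr(expenditure_per_hpu, production_hpu)
    print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")
    # A weak, non-significant correlation indicates that expenditure is not explained
    # by activity volume, i.e. there is unexplained variability between departments.
    ```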

  9. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  10. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  11. Quantitative analysis of biological fluids by electron probe and X ray spectrometry

    International Nuclear Information System (INIS)

    Girod, Chantal

    1986-01-01

    In order to know the kidney normal operation and to have an insight on cellular transport mechanisms and hormonal regulations at the nephron level, a technique based on the use of an electron probe has been developed for the elemental analysis of micro-volumes of biological fluids. This academic document reports applications of this technique on animals on which such fluids have been sampled at different levels of the nephron. As these samples are available in too small volumes to be dosed by conventional methods, they have been quantitatively analysed by using an electronic probe based analyser in order to determine concentrations of all elements with an atomic number greater than that of carbon. After a presentation of the implemented method and hardware, the author thus describes how an analysis is performed, and reports and discusses an example (analysis conditions, data acquisition, data processing, minimum detectable concentration, reasons for measurement scattering)

  12. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix with a total mass of 2253.7 grams. Several successive heating steps at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element, uranium-based matrix. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  13. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  14. Quantitative analysis of phases by x-ray diffraction and thermogravimetry in Cuban phosphorite ores

    International Nuclear Information System (INIS)

    Casanova Gomez, Abdel; Martinez Montalvo, Asor; Cilano Campos, Guillermo; Arostegui Aguirre, Miladys; Ferreiro Fernandez, Adalyz; Alonso Perez, Jose A.

    2016-01-01

    Phase analysis is performed by the instrumental techniques of X-ray diffraction and thermal analysis on two groups of samples of Cuban phosphorus-bearing minerals, which are candidate reference materials. To this end, structural refinement of the diffraction pattern by profile fitting is applied, using the FullProf program of Juan Rodriguez-Carvajal. This analysis is the first step towards developing the standard specification of these resources and classifying them as phosphate rock and/or phosphorite according to their mass content. The statistical evaluation of the uncertainty of the quantitative analysis (standard deviation) was carried out on ten replicate samples of phosphate rock and eight of phosphorite from the Trinidad de Guedes deposit. The qualitative phase analysis revealed the following phase composition: carbonate fluoroapatite (CFA), calcite, quartz and halloysite (present only in the clayey granular phosphorite ore, FGA). By the powder diffraction profile-fitting method, the quantitative phase composition of the FGA sample is reported as 87(2)% CFA, 4(1)% calcite, 1% quartz, and 8(3)% halloysite. For the granular limestone ore (FGC), the following contents were obtained: 87(3)% calcite, 8(3)% CFA and 5(1)% quartz. The values obtained are corroborated by thermogravimetric analysis (TG) through the calculation of the mass content of the thermally active phases (calcite and CFA) in the range 27-1000 °C, confirming the validity of the XRD results. (Author)

  15. Performance analysis, quality function deployment and structured methods

    Science.gov (United States)

    Maier, M. W.

    Quality function deployment (QFD), an approach to synthesizing several elements of system modeling and design into a single unit, is presented. Behavioral, physical, and performance modeling are usually considered as separate aspects of system design without explicit linkages. Structured methodologies have developed linkages between behavioral and physical models before, but have not considered the integration of performance models. QFD integrates performance models with traditional structured models. In this method, performance requirements such as cost, weight, and detection range are partitioned into matrices. Partitioning is done by developing a performance model, preferably quantitative, for each requirement. The parameters of the model become the engineering objectives in a QFD analysis and the models are embedded in a spreadsheet version of the traditional QFD matrices. The performance model and its parameters are used to derive part of the functional model by recognizing that a given performance model implies some structure to the functionality of the system.

  16. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    Science.gov (United States)

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of whole breast, the software calculated BPE using following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group showing significant difference (p = .009 for minimal vs. mild, p quantitative BPE (r = 0.63, p Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
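
    The record above states the voxel-wise BPE formula explicitly; the sketch below applies it to a pair of hypothetical image volumes (the array shapes, the simulated enhancement and the parenchyma mask are illustrative assumptions, not the authors' in-house software):

    ```python
    import numpy as np

    def bpe_percent(baseline, post_contrast, parenchyma_mask):
        """Voxel-wise BPE (%) = (SI_post - SI_baseline) / SI_baseline * 100,
        evaluated only inside the segmented breast parenchyma."""
        baseline = baseline.astype(float)
        bpe = (post_contrast - baseline) / np.clip(baseline, 1e-6, None) * 100.0
        return bpe[parenchyma_mask]

    # Hypothetical pre- and post-contrast volumes and a whole-breast mask.
    rng = np.random.default_rng(0)
    pre = rng.uniform(100, 200, size=(64, 64, 32))
    post = pre * rng.uniform(1.2, 1.8, size=pre.shape)  # simulated enhancement
    mask = np.ones(pre.shape, dtype=bool)

    print(f"mean BPE = {bpe_percent(pre, post, mask).mean():.1f}%")
    ```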

  17. Towards quantitative laser-induced breakdown spectroscopy analysis of soil samples

    International Nuclear Information System (INIS)

    Bousquet, B.; Sirven, J.-B.; Canioni, L.

    2007-01-01

    A quantitative analysis of chromium in soil samples is presented. Different emission lines related to chromium are studied in order to select the best one for quantitative features. Important matrix effects are demonstrated from one soil to the other, preventing any prediction of concentration in different soils on the basis of a univariate calibration curve. Finally, a classification of the LIBS data based on a series of Principal Component Analyses (PCA) is applied to a reduced dataset of selected spectral lines related to the major chemical elements in the soils. LIBS data of heterogeneous soils appear to be widely dispersed, which leads to a reconsideration of the sampling step in the analysis process
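
    As a rough sketch of the classification step mentioned above, the code below runs a principal component analysis on a matrix of simulated emission-line intensities for two soils; the data and the use of scikit-learn are assumptions for illustration, not the authors' processing chain.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical LIBS dataset: rows = laser shots, columns = selected emission-line
    # intensities for major elements; the two soils differ in matrix composition.
    rng = np.random.default_rng(1)
    soil_a = rng.normal(loc=[1.0, 0.5, 0.2, 0.8], scale=0.05, size=(50, 4))
    soil_b = rng.normal(loc=[0.6, 0.9, 0.4, 0.3], scale=0.05, size=(50, 4))
    spectra = np.vstack([soil_a, soil_b])

    pca = PCA(n_components=2)
    scores = pca.fit_transform(spectra)
    print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
    # Plotting the two score columns would show the soils as separate clusters,
    # which is the basis for classifying spectra before soil-specific calibration.
    ```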

  18. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred by the therapeutical relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  19. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighbouring the industrial facilities. The present study aims to support and guide the development of emergency plans using the data obtained from Quantitative Risk Analyses prepared according to Technical Standard P4.261/03 of CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study were the re-evaluation of hazard identification for the emergency plans, the consequence and vulnerability analyses for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from Quantitative Risk Analysis to develop emergency plans. (author)

  20. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  1. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  2. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
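
    A minimal sketch of the kind of focus quantitation described above, using a synthetic image and simple thresholding with scipy.ndimage (the image, threshold and labelling scheme are illustrative assumptions, not the FociQuant implementation):

    ```python
    import numpy as np
    from scipy import ndimage

    # Synthetic fluorescence image with two bright foci on a dim, noisy background.
    rng = np.random.default_rng(2)
    image = rng.normal(50.0, 5.0, size=(100, 100))
    image[20:24, 30:34] += 500.0
    image[60:66, 70:76] += 900.0

    # Segment foci by an intensity threshold, label connected regions, and
    # integrate the fluorescence within each labelled focus.
    labels, n_foci = ndimage.label(image > 200)
    integrated = ndimage.sum(image, labels, index=range(1, n_foci + 1))
    print(n_foci, np.round(integrated))
    ```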

  3. Quantitative Analysis of Bone Scintigrams at the Korle-Bu Teaching Hospital

    International Nuclear Information System (INIS)

    Huguette, E.Y.Y.

    2012-01-01

    The qualitative method of diagnosis has been the traditional means of diagnosing bone tumours at the Nuclear Medicine Department of the Korle-Bu Teaching Hospital over the years. Although this method is commendable, a more accurate diagnostic means is the quantitative approach. A study of ninety-five patients undergoing bone scans was performed quantitatively using ImageJ. The patients were administered activities ranging from 15 to 30 mCi depending on their weights, and were then scanned with an installed e.Cam SPECT system. A 256 x 1024 matrix size was used in acquiring the bone scans. Quantitative analyses performed with ImageJ revealed that uptake levels in all selected body parts were higher for metastatic tumours compared to non-metastatic tumours. The average normalised uptake in the recorded metastatic cases was 1.37332 cts/mm²/mCi and the corresponding uptake in the non-metastatic cases was 0.85230 cts/mm²/mCi. The relatively higher uptake in metastatic tumours is attributed to high osteoblastic activity and blood flow in metastatic cases compared to non-metastatic cases. Quantitative assessment of bone scintigrams is recommended for its high accuracy and quicker means of diagnosis. (author)
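
    A minimal sketch of the normalised-uptake figure quoted above (counts per mm² per mCi of administered activity); the region-of-interest numbers are invented for illustration and are not the study data:

    ```python
    def normalized_uptake(counts_in_roi, roi_area_mm2, administered_activity_mci):
        """Counts per unit ROI area per unit of administered activity (cts/mm2/mCi)."""
        return counts_in_roi / roi_area_mm2 / administered_activity_mci

    # Hypothetical region of interest drawn on a bone scintigram in ImageJ.
    print(round(normalized_uptake(counts_in_roi=61_800,
                                  roi_area_mm2=1_800,
                                  administered_activity_mci=25), 3))
    ```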

  4. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  5. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    Science.gov (United States)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  6. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers
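
    One of the indicators quoted above, the H-index, is simple to compute from a list of per-publication citation counts; the sketch below uses invented counts and is only a generic illustration of the metric, not the study's bibliometric pipeline.

    ```python
    def h_index(citation_counts):
        """Largest h such that at least h papers have at least h citations each."""
        h = 0
        for rank, citations in enumerate(sorted(citation_counts, reverse=True), start=1):
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for a small set of QC papers.
    print(h_index([25, 18, 12, 9, 6, 3, 1]))  # -> 5
    ```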

  7. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Directory of Open Access Journals (Sweden)

    Nicholas V Olijnyk

    Full Text Available This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology

  8. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique was described. The calculated indices are the washout factor, the vitality index and the redistribution factor. The washout factor is the ratio of the counts at a certain time after exercise to the counts immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial imaging, in order to correct for the Tl-201 washout from the myocardium under the assumption that the washout is constant over the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the counts in the region of interest in serial images after exercise to those immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty were presented. (author)
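
    A minimal sketch of the three indices defined above, using invented counts; exactly how the washout correction is applied in the original software is an assumption here, shown only as a final ratio for illustration.

    ```python
    def washout_factor(global_delayed, global_immediate):
        # Whole-myocardium counts at a delayed time vs. immediately after exercise.
        return global_delayed / global_immediate

    def vitality_index(roi_uptake, max_uptake):
        # ROI uptake relative to the maximal myocardial uptake.
        return roi_uptake / max_uptake

    def redistribution_factor(roi_delayed, roi_immediate):
        # ROI counts in the delayed image relative to the immediate image.
        return roi_delayed / roi_immediate

    wf = washout_factor(82_000, 100_000)
    rf = redistribution_factor(380, 400)
    print(round(wf, 2), round(vitality_index(420, 600), 2), round(rf / wf, 2))
    # Dividing the ROI ratio by the washout factor corrects it for global washout.
    ```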

  9. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    Directory of Open Access Journals (Sweden)

    Marielle Ernst

    Full Text Available We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  10. Pancreaticobiliary duct changes of periampullary carcinomas: Quantitative analysis at MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Dong Sheng, E-mail: victoryhope@163.com [Department of Radiology, West China Hospital of Sichuan University, Chengdu, Sichuan 610041 (China); Department of Radiology, No.4 West China Teaching Hospital of Sichuan University, Chengdu 610041 (China); Chen, Wei Xia, E-mail: wxchen25@126.com [Department of Radiology, West China Hospital of Sichuan University, Chengdu, Sichuan 610041 (China); Wang, Xiao Dong, E-mail: tyfs03yz@163.com [Department of Radiology, West China Hospital of Sichuan University, Chengdu, Sichuan 610041 (China); Acharya, Riwaz, E-mail: riwaz007@hotmail.com [Department of Radiology, West China Hospital of Sichuan University, Chengdu, Sichuan 610041 (China); Jiang, Xing Hua, E-mail: 13881865517@163.com [Department of Pathology, West China Hospital of Sichuan University, Chengdu, Sichuan 610041 (China)

    2012-09-15

    Purpose: To quantitatively analyse the pancreaticobiliary duct changes of periampullary carcinomas with volumetric interpolated breath-hold examination (VIBE) and true fast imaging with steady-state precession (true FISP) sequence, and investigate the value of these findings in differentiation and preoperative evaluation. Materials and methods: Magnetic resonance (MR) images of 71 cases of periampullary carcinomas (34 cases of pancreatic head carcinoma, 16 cases of intrapancreatic bile duct carcinoma and 21 cases of ampullary carcinoma) confirmed histopathologically were analysed. The maximum diameter of the common bile duct (CBD) and main pancreatic duct (MPD), dilated pancreaticobiliary duct angle and the distance from the end of the proximal dilated pancreaticobiliary duct to the major papilla were measured. Analysis of variance and the Chi-squared test were performed. Results: These findings showed significant differences among the three subtypes: the distance from the end of proximal dilated pancreaticobiliary duct to the major papilla and pancreaticobiliary duct angle. The distance and the pancreaticobiliary duct angle were least for ampullary carcinoma among the three subtypes. The percentage of dilated CBD was 94.1%, 93.8%, and 100% for pancreatic head carcinoma, intrapancreatic bile duct carcinoma and ampullary carcinoma, respectively. And that for the dilated MPD was 58.8%, 43.8%, and 42.9%, respectively. Conclusion: Quantitative analysis of the pancreaticobiliary ductal system can provide accurate and objective assessment of the pancreaticobiliary duct changes. Although benefit in differential diagnosis is limited, these findings are valuable in preoperative evaluation for both radical resection and palliative surgery.

  11. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    Science.gov (United States)

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  12. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  13. Quantitative assessment of finger motor performance: Normative data.

    Directory of Open Access Journals (Sweden)

    Alessio Signori

    Full Text Available Finger opposition movements are the basis of many daily living activities and are essential in general for manipulating objects; an engineered glove quantitatively assessing motor performance during sequences of finger opposition movements has been shown to be useful to provide reliable measures of finger motor impairment, even subtle, in subjects affected by neurological diseases. However, the obtained behavioral parameters lack published reference values. To determine mean values for different motor behavioral parameters describing the strategy adopted by healthy people in performing repeated sequences of finger opposition movements, examining associations with gender and age. Normative values for finger motor performance parameters were obtained on a sample of 255 healthy volunteers executing sequences of finger-to-thumb opposition movements, stratified by gender and over a wide range of ages. Touch duration, inter-tapping interval, movement rate, correct sequences (%), movements in advance compared with a metronome (%) and inter-hand interval were assessed. Increasing age resulted in decreased movement speed, advance movements with respect to a cue, correctness of sequences, and bimanual coordination. No significant performance differences were found between male and female subjects except for the duration of the finger touch, the interval between two successive touches and their ratio. We report age- and gender-specific normal mean values and ranges for different parameters objectively describing the performance of finger opposition movement sequences, which may serve as useful references for clinicians to identify possible deficits in subjects affected by diseases altering fine hand motor skills.
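
    The parameters listed above can be derived from touch onset/offset timestamps; the sketch below uses invented timestamps (seconds) and is not the glove's acquisition software.

    ```python
    # Hypothetical (onset, offset) timestamps of four consecutive finger touches.
    touches = [(0.00, 0.12), (0.45, 0.58), (0.92, 1.03), (1.38, 1.50)]

    touch_durations = [off - on for on, off in touches]
    inter_tapping = [touches[i + 1][0] - touches[i][1] for i in range(len(touches) - 1)]
    movement_rate = len(touches) / (touches[-1][1] - touches[0][0])  # touches per second

    print(round(sum(touch_durations) / len(touch_durations), 3),   # mean touch duration
          round(sum(inter_tapping) / len(inter_tapping), 3),       # mean inter-tapping interval
          round(movement_rate, 2))
    ```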

  14. QUANTITATIVE EEG COMPARATIVE ANALYSIS BETWEEN AUTISM SPECTRUM DISORDER (ASD) AND ATTENTION DEFICIT HYPERACTIVITY DISORDER (ADHD)

    Directory of Open Access Journals (Sweden)

    Plamen D. Dimitrov

    2017-01-01

    Full Text Available Background: Autism is a mental developmental disorder manifested in early childhood. Attention deficit hyperactivity disorder is another psychiatric condition of the neurodevelopmental type. Both disorders affect information processing in the nervous system, altering the mechanisms which control how neurons and their synapses are connected and organized. Purpose: To examine whether quantitative EEG assessment is sensitive and simple enough to differentiate autism from attention deficit hyperactivity disorder and neurologically typical children. Material and methods: Quantitative EEG is a type of electrophysiological assessment that uses computerized mathematical analysis to convert the raw waveform data into different frequency ranges. Each frequency range is averaged across a sample of data and quantified into mean amplitude (voltage in microvolts, mV). We performed quantitative EEG analysis and compared 4 cohorts of children (aged from 3 to 7 years): with autism (high [n=27] and low [n=52] functioning), with attention deficit hyperactivity disorder [n=34], and with typical behavior [n=75]. Results: Our preliminary results show that there are significant qEEG differences between the groups of patients and the control cohort. The changes affect the potential levels of the delta-, theta-, alpha-, and beta-frequency spectra. Conclusion: The present study shows some significant quantitative EEG findings in autistic patients. This is a step forward in our efforts to define specific neurophysiologic changes, in order to develop and refine strategies for early diagnosis of autism spectrum disorders, differentiation from other developmental conditions in childhood, detection of specific biomarkers and early initiation of treatment.

  15. A quantitative performance evaluation of the EM algorithm applied to radiographic images

    International Nuclear Information System (INIS)

    Brailean, J.C.; Sullivan, B.J.; Giger, M.L.; Chen, C.T.

    1991-01-01

    In this paper, the authors quantitatively evaluate the performance of the Expectation Maximization (EM) algorithm as a restoration technique for radiographic images. The perceived signal-to-noise ratios (SNRs) of simple radiographic patterns processed by the EM algorithm are calculated on the basis of a statistical decision theory model that includes both the observer's visual response function and a noise component internal to the eye-brain system. The relative SNR (the ratio of the processed SNR to the original SNR) is calculated and used as a metric to quantitatively compare the effects of the EM algorithm to two popular image enhancement techniques: contrast enhancement (windowing) and unsharp mask filtering.

  16. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  17. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given, with particular emphasis placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  18. Quantitative magnetometry analysis and structural characterization of multisegmented cobalt–nickel nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Cantu-Valle, Jesus [Department of Physics and Astronomy, University of Texas at San Antonio, One UTSA Circle, San Antonio, TX 78249 (United States); Díaz Barriga-Castro, Enrique [Centro de Investigación de Ciencias Físico Matemáticas/Facultad de Ciencias Físico Matemáticas, Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Vega, Víctor; García, Javier [Departamento de Física, Universidad de Oviedo, Calvo Sotelo s/n, Oviedo 33007 (Spain); Mendoza-Reséndez, Raquel [Facultad de Ingeniería Mecánica y Eléctrica. Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Luna, Carlos [Centro de Investigación de Ciencias Físico Matemáticas/Facultad de Ciencias Físico Matemáticas, Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Manuel Prida, Víctor [Departamento de Física, Universidad de Oviedo, Calvo Sotelo s/n, Oviedo 33007 (Spain); and others

    2015-04-01

    Understanding and measuring the magnetic properties of an individual nanowire and their relationship with its crystalline structure and geometry are of great scientific and technological interest. In this work, we report the localized study of the magnetic flux distribution and the undisturbed magnetization of a single ferromagnetic nanowire possessing a bar-code-like structure, using off-axis electron holography (EH) under Lorentz conditions. The nanowires were grown by template-assisted electrodeposition, using AAO templates. Electron holography allows the visualization of the magnetic flux distribution within the nanowire and its surroundings, as well as its quantification. The magnetic analysis performed on individual nanowires was correlated with the chemical composition and crystalline orientation of the nanowires. - Highlights: • The structure-magnetic property relationship of CoNi nanowires is determined. • Off axis electron holography for the magnetic nanowires is used for the analysis. • The magnetization is quantitatively obtained from the retrieved phase images. • These results lead to a better comprehension of the magneto-crystalline phenomena.

  19. Quantitative magnetometry analysis and structural characterization of multisegmented cobalt–nickel nanowires

    International Nuclear Information System (INIS)

    Cantu-Valle, Jesus; Díaz Barriga-Castro, Enrique; Vega, Víctor; García, Javier; Mendoza-Reséndez, Raquel; Luna, Carlos; Manuel Prida, Víctor

    2015-01-01

    Understanding and measuring the magnetic properties of an individual nanowire and their relationship with its crystalline structure and geometry are of great scientific and technological interest. In this work, we report the localized study of the magnetic flux distribution and the undisturbed magnetization of a single ferromagnetic nanowire possessing a bar-code-like structure, using off-axis electron holography (EH) under Lorentz conditions. The nanowires were grown by template-assisted electrodeposition, using AAO templates. Electron holography allows the visualization of the magnetic flux distribution within the nanowire and its surroundings, as well as its quantification. The magnetic analysis performed on individual nanowires was correlated with the chemical composition and crystalline orientation of the nanowires. - Highlights: • The structure-magnetic property relationship of CoNi nanowires is determined. • Off axis electron holography for the magnetic nanowires is used for the analysis. • The magnetization is quantitatively obtained from the retrieved phase images. • These results lead to a better comprehension of the magneto-crystalline phenomena.

  20. A novel quantitative approach for eliminating sample-to-sample variation using a hue saturation value analysis program.

    Science.gov (United States)

    Yabusaki, Katsumi; Faits, Tyler; McMullen, Eri; Figueiredo, Jose Luiz; Aikawa, Masanori; Aikawa, Elena

    2014-01-01

    As computing technology and image analysis techniques have advanced, the practice of histology has grown from a purely qualitative method to one that is highly quantified. Current image analysis software is imprecise and prone to wide variation due to common artifacts and histological limitations. In order to minimize the impact of these artifacts, a more robust method for quantitative image analysis is required. Here we present a novel image analysis software, based on the hue saturation value color space, to be applied to a wide variety of histological stains and tissue types. By using hue, saturation, and value variables instead of the more common red, green, and blue variables, our software offers some distinct advantages over other commercially available programs. We tested the program by analyzing several common histological stains, performed on tissue sections that ranged from 4 µm to 10 µm in thickness, using both a red green blue color space and a hue saturation value color space. We demonstrated that our new software is a simple method for quantitative analysis of histological sections, which is highly robust to variations in section thickness, sectioning artifacts, and stain quality, eliminating sample-to-sample variation.
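
    As a minimal sketch of the colour-space idea described above, the code below converts a tiny RGB image to HSV and counts pixels whose hue and saturation fall in a stain-like range; the image values and thresholds are illustrative assumptions, not the authors' software.

    ```python
    import numpy as np
    from matplotlib.colors import rgb_to_hsv

    # Hypothetical 2x2 RGB image (values in 0-1): two reddish "stained" pixels
    # and two pale background pixels.
    rgb = np.array([[[0.80, 0.15, 0.20], [0.85, 0.20, 0.25]],
                    [[0.95, 0.92, 0.90], [0.93, 0.94, 0.95]]])

    hsv = rgb_to_hsv(rgb)                    # channels: hue, saturation, value
    hue, sat = hsv[..., 0], hsv[..., 1]

    # Count pixels in a red hue range with sufficient saturation; working in HSV
    # makes the rule robust to brightness changes such as section-thickness variation.
    stained = ((hue < 0.05) | (hue > 0.95)) & (sat > 0.3)
    print(f"positive area fraction: {stained.mean():.2f}")
    ```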

  1. Quantitative and qualitative analysis of semantic verbal fluency in patients with temporal lobe epilepsy.

    Science.gov (United States)

    Jaimes-Bautista, A G; Rodríguez-Camacho, M; Martínez-Juárez, I E; Rodríguez-Agudelo, Y

    2017-08-29

    Patients with temporal lobe epilepsy (TLE) perform poorly on semantic verbal fluency (SVF) tasks. Completing these tasks successfully involves multiple cognitive processes simultaneously. Therefore, quantitative analysis of SVF (number of correct words in one minute), conducted in most studies, has been found to be insufficient to identify cognitive dysfunction underlying SVF difficulties in TLE. To determine whether a sample of patients with TLE had SVF difficulties compared with a control group (CG), and to identify the cognitive components associated with SVF difficulties using quantitative and qualitative analysis. SVF was evaluated in 25 patients with TLE and 24 healthy controls; the semantic verbal fluency test included 5 semantic categories: animals, fruits, occupations, countries, and verbs. All 5 categories were analysed quantitatively (number of correct words per minute and interval of execution: 0-15, 16-30, 31-45, and 46-60seconds); the categories animals and fruits were also analysed qualitatively (clusters, cluster size, switches, perseverations, and intrusions). Patients generated fewer words for all categories and intervals and fewer clusters and switches for animals and fruits than the CG (Psize and number of intrusions and perseverations (P>.05). Our results suggest an association between SVF difficulties in TLE and difficulty activating semantic networks, impaired strategic search, and poor cognitive flexibility. Attention, inhibition, and working memory are preserved in these patients. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  2. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. In standards like the German DIN, no quantitative analysis is demanded (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990). The analysis according to these standards is based on expert opinion and qualitative analysis techniques. New standards like IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform the analysis. Earlier publications of the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D.W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York, 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows, by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that the application of the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.

  3. Qualitative and Quantitative Analysis of Lignan Constituents in Caulis Trachelospermi by HPLC-QTOF-MS and HPLC-UV

    Directory of Open Access Journals (Sweden)

    Xiao-Ting Liu

    2015-05-01

    Full Text Available A high-performance liquid chromatography method coupled with quadrupole time-of-flight tandem mass spectrometry (HPLC-QTOF-MS) and ultraviolet spectrometry (HPLC-UV) was established for the simultaneous qualitative and quantitative analysis, respectively, of the major chemical constituents in Caulis Trachelospermi. The analysis was performed on an Agilent Zorbax Eclipse Plus C18 column (4.6 mm × 150 mm, 5 μm) using a binary gradient system of water and methanol, with ultraviolet absorption at 230 nm. Based on the high-resolution ESI-MS/MS fragmentation behaviors of the reference standards, the characteristic cleavage patterns of lignano-9,9'-lactones and lignano-8'-hydroxy-9,9'-lactones were obtained. The results demonstrated that these characteristic fragmentation patterns are valuable for identifying and differentiating lignano-9,9'-lactones and lignano-8'-hydroxy-9,9'-lactones. As such, a total of 25 compounds in Caulis Trachelospermi were unambiguously or tentatively identified via comparison with reference standards or literature data. In addition, 14 dibenzylbutyrolactone lignans were simultaneously quantified in Caulis Trachelospermi by the HPLC-UV method. The method is suitable for the qualitative and quantitative analysis of dibenzylbutyrolactone lignans in Caulis Trachelospermi.

  4. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD) and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a "bullseye" functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4 hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets.

  5. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    Science.gov (United States)

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

    Antibody microarrays have become widespread, but their use for quantitative analysis in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In the analysis of 80 plasma samples, the microarray platform performed well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay, which analyses several prostate cancer biomarkers simultaneously. PMID:22921878

  6. A critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    International Nuclear Information System (INIS)

    Kelly, P.T.; McCulloch, J.

    1983-01-01

    Semi-quantitative analysis (e.g. optical density ratios) of [14C]2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of [14C]2-deoxyglucose autoradiograms is undertaken. (Auth.)

  7. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  8. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  9. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  10. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    Full Text Available The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by drivers' drowsiness behind the steering wheel have a high fatality rate because of the marked decline in perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
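    A minimal sketch of the processing chain this record describes: log subband power features, PCA for dimensionality reduction, and a linear regression against the driving-error measure. The band edges, sampling rate, channel count, and the synthetic data are assumptions for illustration, not the authors' implementation.

```python
# Sketch of the described chain: log band power features, PCA, linear regression.
# Band edges, sampling rate, and array shapes are assumptions for illustration.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

FS = 250  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def log_band_power(epoch):
    """epoch: (n_channels, n_samples) -> log power in each band per channel."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        band = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, band].mean(axis=-1)))
    return np.concatenate(feats)

# X: one feature row per EEG epoch; y: lane deviation measured in the simulator.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 30, FS * 2))   # 200 epochs, 30 channels, 2 s each
X = np.array([log_band_power(e) for e in epochs])
y = rng.standard_normal(200)                      # placeholder drowsiness/deviation target

model = make_pipeline(PCA(n_components=10), LinearRegression()).fit(X, y)
print("R^2 on training data:", model.score(X, y))
```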

  11. Semi-quantitative evaluation of gallium-67 scintigraphy in lupus nephritis

    International Nuclear Information System (INIS)

    Lin Wanyu; Hsieh Jihfang; Tsai Shihchuan; Lan Joungliang; Cheng Kaiyuan; Wang Shyhjen

    2000-01-01

    Within nuclear medicine there is a trend towards quantitative analysis. Gallium renal scan has been reported to be useful in monitoring the disease activity of lupus nephritis. However, only visual interpretation using a four-grade scale has been performed in previous studies, and this method is not sensitive enough for follow-up. In this study, we developed a semi-quantitative method for gallium renal scintigraphy to find a potential parameter for the evaluation of lupus nephritis. Forty-eight patients with lupus nephritis underwent renal biopsy to determine World Health Organization classification, activity index (AI) and chronicity index (CI). A delayed 48-h gallium scan was also performed and interpreted by visual and semi-quantitative methods. For semi-quantitative analysis of the gallium uptake in both kidneys, regions of interest (ROIs) were drawn over both kidneys, the right forearm and the adjacent spine. The uptake ratios between these ROIs were calculated and expressed as the "kidney/spine ratio (K/S ratio)" or the "kidney/arm ratio (K/A ratio)". Spearman's rank correlation test and Mann-Whitney U test were used for statistical analysis. Our data showed a good correlation between the semi-quantitative gallium scan and the results of visual interpretation. K/S ratios showed a better correlation with AI than did K/A ratios. Furthermore, the left K/S ratio displayed a better correlation with AI than did the right K/S ratio. In contrast, CI did not correlate well with the results of semi-quantitative gallium scan. In conclusion, semi-quantitative gallium renal scan is easy to perform and shows a good correlation with the results of visual interpretation and renal biopsy. The left K/S ratio from semi-quantitative renal gallium scintigraphy displays the best correlation with AI and is a useful parameter in evaluating the disease activity in lupus nephritis. (orig.)
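    The semi-quantitative ratios described here reduce to the mean counts in one region of interest divided by the mean counts in a reference region. The sketch below illustrates that computation; the image and the ROI masks are synthetic placeholders standing in for the manually drawn kidney, spine, and forearm regions.

```python
# Minimal illustration of the semi-quantitative ROI ratios described above.
# `image` stands in for a 48-h gallium planar image; the boolean masks are
# placeholders for manually drawn kidney, spine, and forearm regions.
import numpy as np

def uptake_ratio(image, roi_mask, reference_mask):
    """Mean counts in the target ROI divided by mean counts in the reference ROI."""
    return image[roi_mask].mean() / image[reference_mask].mean()

def box_mask(shape, rows, cols):
    m = np.zeros(shape, dtype=bool)
    m[rows[0]:rows[1], cols[0]:cols[1]] = True
    return m

rng = np.random.default_rng(1)
image = rng.poisson(lam=50, size=(256, 256)).astype(float)
left_kidney = box_mask(image.shape, (100, 140), (60, 100))
spine       = box_mask(image.shape, (90, 170), (120, 136))
forearm     = box_mask(image.shape, (200, 240), (200, 230))

k_s_ratio = uptake_ratio(image, left_kidney, spine)     # kidney/spine ratio
k_a_ratio = uptake_ratio(image, left_kidney, forearm)   # kidney/arm ratio
print(f"K/S = {k_s_ratio:.2f}, K/A = {k_a_ratio:.2f}")
```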

  12. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) system allow us to analyze autoradiographic images quantitatively. In the 'whole-body autoradiography', a method clarify the distribution of radioisotope or labeled compounds in the tissues and organs in a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against a IP for exposure and the IP is scanned by Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd) and a digital autoradiographic image is given. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. Fading effect, application of IP system for distribution of receptor binding ARG, analysis of radio-spots on TLC and radioactive concentration in liquid such as blood are also discussed. (author)

  13. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    Full Text Available The aims of this study are to use quantitative analysis of the prostate-specific antigen (PSA in the seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided the more accurate and sensitive results. More importantly, individualized schemes based on quantitative PSA results of samples can be developed to improve the quality and procedural efficiency in the forensic seminal inspection of samples prior to DNA analysis.

  14. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  16. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  17. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific topographic configuration and pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the area of agricultural land in favour of built-up and grass land, as well as an accelerated decrease in forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes, landslides and soil erosion, which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of these two geomorphologic processes, determining a state of vulnerability (using the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003, for landslides) and their integration in a complex model of cumulated vulnerability identification. The modeling of risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories exposed to a high risk from geomorphologic processes with a high degree of occurrence, and represent a useful tool in the process of spatial planning.
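    For the soil-erosion part of such a model, the USLE is a cell-by-cell raster multiplication, A = R · K · LS · C · P, which can then be reclassified into vulnerability classes. The sketch below illustrates that raster algebra with synthetic factor grids and arbitrary class breaks; it does not reproduce the study's data or the HG 447/2003 landslide model.

```python
# Sketch of the raster algebra behind a USLE-type soil loss layer:
# A = R * K * LS * C * P, evaluated cell by cell. The factor grids are
# synthetic placeholders, and the class breaks are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(42)
shape = (100, 100)                       # one cell per raster pixel
R  = np.full(shape, 80.0)                # rainfall erosivity
K  = rng.uniform(0.2, 0.5, shape)        # soil erodibility
LS = rng.uniform(0.5, 4.0, shape)        # slope length/steepness
C  = rng.uniform(0.05, 0.4, shape)       # cover management
P  = np.ones(shape)                      # support practices

A = R * K * LS * C * P                   # predicted soil loss per cell

# Reclassify into vulnerability classes (low / medium / high) for a risk overlay.
vulnerability = np.digitize(A, bins=[10.0, 30.0])
print("cells per class:", np.bincount(vulnerability.ravel()))
```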

  18. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2018-06-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific topographic configuration and pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the area of agricultural land in favour of built-up and grass land, as well as an accelerated decrease in forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes, landslides and soil erosion, which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of these two geomorphologic processes, determining a state of vulnerability (using the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003, for landslides) and their integration in a complex model of cumulated vulnerability identification. The modeling of risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories exposed to a high risk from geomorphologic processes with a high degree of occurrence, and represent a useful tool in the process of spatial planning.

  19. Quantitative data analysis with SPSS release 8 for Windows a guide for social scientists

    CERN Document Server

    Bryman, Alan

    2002-01-01

    The latest edition of this best-selling introduction to Quantitative Data Analysis through the use of a computer package has been completely updated to accommodate the needs of users of SPSS Release 8 for Windows. Like its predecessor, it provides a non-technical approach to quantitative data analysis and a user-friendly introduction to the widely used SPSS for Windows. It assumes no previous familiarity with either statistics or computing but takes the reader step-by-step through the techniques, reinforced by exercises for further practice. Techniques explained in Quantitative Data Analysis with SPSS Release 8 for Windows include: * correlation * simple and multiple regression * multivariate analysis of variance and covariance * factor analysis The book also covers issues such as sampling, statistical significance, conceptualization and measurement and the selection of appropriate tests. For further information or to download the book's datasets, please visit the webstite: http://www.routledge.com/textbooks/...

  20. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  1. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving the relative sensitivity factors is most convenient to apply to high-Tc superconductor surfaces because this procedure does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a proposition is made to use for this purpose a modification of the relative sensitivity factor approach accounting for the matrix and the instrumental effects. The accuracy of this modification when applied to the binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. Slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature.

  2. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  3. Quantitative Clinical Diagnostic Analysis of Acetone in Human Blood by HPLC: A Metabolomic Search for Acetone as Indicator

    Directory of Open Access Journals (Sweden)

    Esin Akgul Kalkan

    2016-01-01

    Full Text Available Using high-performance liquid chromatography (HPLC) and 2,4-dinitrophenylhydrazine (2,4-DNPH) as a derivatizing reagent, an analytical method was developed for the quantitative determination of acetone in human blood. The determination was carried out at 365 nm using an ultraviolet-visible (UV-Vis) diode array detector (DAD). For acetone as its 2,4-dinitrophenylhydrazone derivative, a good separation was achieved with a ThermoAcclaim C18 column (15 cm × 4.6 mm × 3 μm) at a retention time (tR) of 12.10 min and a flow rate of 1 mL min−1 using a (methanol/acetonitrile)-water elution gradient. The methodology is simple, rapid, sensitive, and of low cost, exhibits good reproducibility, and allows the analysis of acetone in biological fluids. A calibration curve was obtained for acetone using its standard solutions in acetonitrile. Quantitative analysis of acetone in human blood was successfully carried out using this calibration graph. The applied method was validated in terms of linearity, limit of detection and quantification, accuracy, and precision. We also present acetone as a useful tool for the HPLC-based metabolomic investigation of endogenous metabolism and quantitative clinical diagnostic analysis.
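    The calibration and LOD/LOQ part of such a validation is commonly done by fitting peak area against standard concentration and using the residual standard deviation and the slope (LOD ≈ 3.3σ/slope, LOQ ≈ 10σ/slope). The sketch below illustrates this generic approach with invented peak areas; it is not the study's data or its exact validation protocol.

```python
# Illustration of a calibration-curve fit with LOD/LOQ estimated from the
# residual standard deviation and the slope. Peak areas below are invented.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])         # acetone standards (ug/mL)
area = np.array([10.2, 21.1, 40.6, 101.8, 203.5, 408.9])   # detector response (mAU*s)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                               # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.2f}, intercept={intercept:.2f}, LOD={lod:.3f}, LOQ={loq:.3f} ug/mL")

# Quantify an unknown sample from its measured peak area:
unknown_area = 150.0
print("estimated concentration:", (unknown_area - intercept) / slope, "ug/mL")
```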

  4. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    O' Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    2017-05-01

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.

  5. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance.

    Science.gov (United States)

    O'Daniel, Jennifer C; Yin, Fang-Fang

    2017-05-01

    To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number. Copyright © 2017 Elsevier Inc. All rights reserved.
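    A classic failure mode and effect analysis ranks items by a risk priority number, the product of occurrence, severity, and detectability scores. The toy example below illustrates that ranking for a few QA tests; the scores are invented placeholders, and the study's composite ranking additionally accounts for test frequency.

```python
# Toy illustration of ranking QA tests by a classic FMEA risk priority number
# (RPN = occurrence x severity x detectability). Scores are invented, not the
# study's values, and the paper's own ranking also weighs test frequency.
from dataclasses import dataclass

@dataclass
class QATest:
    name: str
    occurrence: int     # 1 (rare) .. 10 (frequent failures)
    severity: int       # 1 (negligible) .. 10 (severe dosimetric impact)
    detectability: int  # 1 (caught easily elsewhere) .. 10 (hard to detect)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detectability

tests = [
    QATest("Output constancy", 7, 6, 4),
    QATest("Lasers", 6, 4, 5),
    QATest("Imaging vs treatment isocenter", 3, 8, 6),
    QATest("Optical distance indicator", 4, 3, 3),
]

for t in sorted(tests, key=lambda t: t.rpn, reverse=True):
    print(f"{t.name:32s} RPN = {t.rpn}")
```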

  6. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  7. A quantitative analysis of municipal solid waste disposal charges in China.

    Science.gov (United States)

    Wu, Jian; Zhang, Weiqian; Xu, Jiaxuan; Che, Yue

    2015-03-01

    Rapid industrialization and economic development have caused a tremendous increase in municipal solid waste (MSW) generation in China. China began implementing a policy of MSW disposal fees for household waste management at the end of the last century. Three charging methods were implemented throughout the country: a fixed disposal fee, a potable water-based disposal fee, and a plastic bag-based disposal fee. To date, there has been little qualitative or quantitative analysis of the effectiveness of this relatively new policy. This paper provides a general overview of MSW fee policy in China, attempts to verify whether the policy has been successful in reducing the amount of waste collected, and proposes an improved charging system to address current problems. The paper presents an empirical statistical analysis of policy effectiveness derived from an environmental Kuznets curve (EKC) test on panel data for China. EKC tests on the different kinds of MSW charge systems were then examined for individual provinces or cities. A comparison of existing charging systems was conducted using environmental and economic criteria. The results indicate the following: (1) the MSW policies implemented over the study period were effective in reducing waste generation, (2) the household waste discharge fee policy did not act as a strong driver of waste prevention and reduction, and (3) the plastic bag-based disposal fee appeared to be performing well according to qualitative and quantitative analysis. Based on the current situation of waste discharge management in China, a three-stage transitional charging scheme is proposed and both its advantages and drawbacks are discussed. Evidence suggests that a transition from a fixed disposal fee to a plastic bag-based disposal fee involving various stakeholders should be the next objective of waste reduction efforts.
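    An environmental Kuznets curve test of the kind mentioned here typically regresses the environmental indicator on income and income squared, with an inverted-U shape indicated by a positive linear and a negative quadratic coefficient. The sketch below shows that basic specification on synthetic data; the study's actual panel model (fixed effects, controls, policy variables) is not reproduced.

```python
# Sketch of an environmental Kuznets curve (EKC) test: regress per-capita waste
# on income and income squared; an inverted U shows up as b1 > 0 and b2 < 0.
# The data are synthetic; the study's panel specification is not reproduced.
import numpy as np

rng = np.random.default_rng(7)
income = rng.uniform(1, 10, 300)                       # e.g. GDP per capita (10k CNY)
waste = 2.0 + 1.5 * income - 0.09 * income**2 + rng.normal(0, 0.3, 300)

X = np.column_stack([np.ones_like(income), income, income**2])
b, *_ = np.linalg.lstsq(X, waste, rcond=None)
b0, b1, b2 = b
print(f"b1 = {b1:.3f}, b2 = {b2:.3f}")

if b1 > 0 and b2 < 0:
    turning_point = -b1 / (2 * b2)
    print(f"Inverted-U shape; turning point at income = {turning_point:.1f}")
```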

  8. Using Performance Tasks to Improve Quantitative Reasoning in an Introductory Mathematics Course

    Science.gov (United States)

    Kruse, Gerald; Drews, David

    2013-01-01

    A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active learning projects than had been used. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant…

  9. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules. It is found that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. After that, the resistive loss of large-size modules with different interconnection schemes is analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
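    The benefit of halved cells follows from the I²R scaling of ribbon loss: each half cell carries half the current, so the loss per interconnect drops to a quarter, and even with twice as many interconnects the total ribbon loss is roughly halved. The back-of-envelope check below uses illustrative current and resistance values, not the cells or ribbons studied in this paper.

```python
# Back-of-envelope check of the halved-cell argument: each half cell delivers
# half the current, so the I^2*R loss per interconnect falls to one quarter;
# with twice as many interconnects the total ribbon loss is still halved.
# The current and ribbon resistance below are illustrative values only.
I_full = 9.0          # A, current from one full-size cell (assumed)
R_ribbon = 0.002      # ohm, effective resistance of one cell-to-cell ribbon (assumed)
n_cells = 60          # cells in the module

loss_full = n_cells * I_full**2 * R_ribbon                # full-size cells
loss_half = (2 * n_cells) * (I_full / 2)**2 * R_ribbon    # halved cells

print(f"ribbon loss, full cells : {loss_full:.2f} W")
print(f"ribbon loss, half cells : {loss_half:.2f} W  (x{loss_half / loss_full:.2f})")
```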

  10. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    Science.gov (United States)

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.

  11. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    International Nuclear Information System (INIS)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.; Hoffmann, R.G.; Palmer, D.W.; Glatt, S.L.; Antuono, P.G.; Isitman, A.T.; Papke, R.A.

    1989-01-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients with SDAT, respectively, and of all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.

  12. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  13. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, the flexibility of cloud computing comes at the cost of less predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
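    As a flavour of the discrete event simulation approach described here, the sketch below runs a minimal event-driven queue in plain Python: requests arrive at random, wait for one of a fixed pool of servers, and the mean time in system is reported. Arrival rate, service rate, and server count are placeholder assumptions, not the authors' workload model.

```python
# Minimal event-driven simulation of service requests contending for a fixed
# pool of servers, in the spirit of the demand/capacity model described above.
import heapq
import random

def simulate(arrival_rate=2.0, service_rate=0.5, servers=8, n_requests=10_000, seed=0):
    """Mean time in system for requests served by a fixed pool of servers."""
    random.seed(seed)
    free, waiting, sojourn = servers, [], []
    events = [(random.expovariate(arrival_rate), "arrive", 0.0)]
    arrivals = 1
    while events:
        t, kind, arrived_at = heapq.heappop(events)
        if kind == "arrive":
            waiting.append(t)                      # request joins the queue
            if arrivals < n_requests:              # schedule the next arrival
                heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrive", 0.0))
                arrivals += 1
        else:                                      # a service completed
            free += 1
            sojourn.append(t - arrived_at)
        while free and waiting:                    # start service for queued requests
            start = waiting.pop(0)
            free -= 1
            heapq.heappush(events, (t + random.expovariate(service_rate), "depart", start))
    return sum(sojourn) / len(sojourn)

print("mean time in system:", round(simulate(), 3))
```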

  14. Quantitative X ray analysis system. User's manual and guide to X ray fluorescence technique

    International Nuclear Information System (INIS)

    2009-01-01

    This guide covers the trimmed and re-arranged version 3.6 of the Quantitative X ray Analysis System (QXAS) software package, which includes the most frequently used methods of quantitative analysis. QXAS is a comprehensive quantitative analysis package that has been developed by the IAEA through research and technical contracts. Additional development has also been carried out in the IAEA Laboratories in Seibersdorf, where QXAS was extensively tested. New in this version of the manual are the descriptions of Voigt-profile peak fitting, the backscatter fundamental parameters and emission-transmission methods of chemical composition analysis, an expanded chapter on X ray fluorescence physics, and a completely revised and expanded set of practical examples of the utilization of the QXAS software package. The analytical data accompanying this manual were collected in the IAEA Seibersdorf Laboratories in the years 2006/2007.

  15. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  16. Quantitative diagnosis of bladder cancer by morphometric analysis of HE images

    Science.gov (United States)

    Wu, Binlin; Nebylitsa, Samantha V.; Mukherjee, Sushmita; Jain, Manu

    2015-02-01

    In clinical practice, histopathological analysis of biopsied tissue is the main method for bladder cancer diagnosis and prognosis. The diagnosis is performed by a pathologist based on the morphological features in the image of a hematoxylin and eosin (HE) stained tissue sample. This manuscript proposes algorithms to perform morphometric analysis on the HE images, quantify the features in the images, and discriminate bladder cancers of different grades, i.e. high grade and low grade. The nuclei are separated from the background and other types of cells, such as red blood cells (RBCs) and immune cells, using manual outlining, color deconvolution and image segmentation. A mask of nuclei is generated for each image for quantitative morphometric analysis. The features of the nuclei in the mask image, including size, shape, orientation, and their spatial distributions, are measured. To quantify local clustering and alignment of nuclei, we propose a 1-nearest-neighbor (1-NN) algorithm which measures nearest neighbor distance and nearest neighbor parallelism. The global distributions of the features are measured using statistics of the proposed parameters. A linear support vector machine (SVM) algorithm is used to classify the high grade and low grade bladder cancers. The results show that using a particular group of nuclei, such as large ones, and combining multiple parameters can achieve better discrimination. This study shows the proposed approach can potentially help expedite pathological diagnosis by triaging potentially suspicious biopsies.
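    The 1-NN measures described in this record can be sketched with a k-d tree query: for each nucleus centroid, find its nearest neighbour, record the distance, and compare the two major-axis orientations. The centroids and orientations below are synthetic placeholders, not segmented nuclei.

```python
# Sketch of 1-nearest-neighbour (1-NN) measures: for every nucleus, the
# distance to its nearest neighbour and how parallel the two major axes are.
# Centroids and orientations here are synthetic placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
centroids = rng.uniform(0, 512, size=(200, 2))      # nucleus centroids (pixels)
orientations = rng.uniform(0, np.pi, size=200)      # major-axis angle per nucleus

tree = cKDTree(centroids)
dists, idx = tree.query(centroids, k=2)             # k=2: first hit is the point itself
nn_dist = dists[:, 1]
nn_index = idx[:, 1]

# Parallelism: smallest angle between the major axes of a nucleus and its neighbour.
diff = np.abs(orientations - orientations[nn_index])
parallelism = np.minimum(diff, np.pi - diff)

print("mean NN distance (px):", nn_dist.mean())
print("mean NN angle difference (rad):", parallelism.mean())
```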

  17. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    Science.gov (United States)

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup of all patients, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered, in 35 cases (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium hexahydrate and xanthine stones. Fourier transform infrared spectroscopy allows an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities. This may prevent and/or delay stone recurrence.

  18. The health impact of trade and investment agreements: a quantitative systematic review and network co-citation analysis.

    Science.gov (United States)

    Barlow, Pepita; McKee, Martin; Basu, Sanjay; Stuckler, David

    2017-03-08

    Regional trade agreements are major international policy instruments that shape macro-economic and political systems. There is widespread debate as to whether and how these agreements pose risks to public health. Here we perform a comprehensive systematic review of quantitative studies of the health impact of trade and investment agreements. We identified studies from searches in PubMed, Web of Science, EMBASE, and Global Health Online. Research articles were eligible for inclusion if they were quantitative studies of the health impacts of trade and investment agreements or policy. We systematically reviewed study findings, evaluated quality using the Quality Assessment Tool from the Effective Public Health Practice Project, and performed network citation analysis to study disciplinary siloes. Seventeen quantitative studies met our inclusion criteria. There was consistent evidence that implementing trade agreements was associated with increased consumption of processed foods and sugar-sweetened beverages. Granting import licenses for patented drugs was associated with increased access to pharmaceuticals. Implementing trade agreements and associated policies was also correlated with higher cardiovascular disease incidence and higher Body Mass Index (BMI), whilst correlations with tobacco consumption, under-five mortality, maternal mortality, and life expectancy were inconclusive. Overall, the quality of studies is weak or moderately weak, and co-citation analysis revealed a relative isolation of public health from economics. We identified limitations in existing studies which preclude definitive conclusions of the health impacts of regional trade and investment agreements. Few address unobserved confounding, and many possible consequences and mechanisms linking trade and investment agreements to health remain poorly understood. Results from our co-citation analysis suggest scope for greater interdisciplinary collaboration. Notwithstanding these limitations, our

  19. Quantitative proteomic analysis of ibuprofen-degrading Patulibacter sp. strain I11

    DEFF Research Database (Denmark)

    Almeida, Barbara; Kjeldal, Henrik; Lolas, Ihab Bishara Yousef

    2013-01-01

    Ibuprofen is the third most consumed pharmaceutical drug in the world. Several isolates have been shown to degrade ibuprofen, but very little is known about the biochemistry of this process. This study investigates the degradation of ibuprofen by Patulibacter sp. strain I11 by quantitative proteomics. The proteome was identified and quantified by gel-based shotgun proteomics; in total, 251 unique proteins were quantitated using this approach. Biological process and pathway analysis indicated a number of proteins that were up-regulated in response to active degradation of ibuprofen, some of which are known to be involved in the degradation of aromatic compounds. Data analysis revealed that several of these proteins are likely involved in ibuprofen degradation by Patulibacter sp. strain I11.

  20. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  1. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    International Nuclear Information System (INIS)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H.

    1998-01-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information on a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior, lateral planar and supine SPECT imaging was performed on 75 female patients (mean age = 43.4 yr) with a breast mass (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. 45 malignant (invasive ductal ca (36), inv lobular ca (5), inv duc + lob (1), inv tubular ca (3)) and 30 benign (fibroadenoma (13), fib cyst (12), fat necrosis (3), papilloma (1), paraffinoma (1)) lesions were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast and right chest wall. Lesion to normal (L/NL) and lesion to chest wall (L/CW) ratios were calculated for each patient both on the planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among four semiquantitative indices and an average scintimammographic index (SMM(mean)) from the arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N and L/CW ratios provide comparably better diagnostic accuracies for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 0.78, 0.77, 0.84, 0.72 and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.

  2. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H. [Gachon Medical College, Gil Medical Center, Inchon (Korea, Republic of)

    1998-07-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information on a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior, lateral planar and supine SPECT imaging was performed on 75 female patients (mean age = 43.4 yr) with a breast mass (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. 45 malignant (invasive ductal ca (36), inv lobular ca (5), inv duc + lob (1), inv tubular ca (3)) and 30 benign (fibroadenoma (13), fib cyst (12), fat necrosis (3), papilloma (1), paraffinoma (1)) lesions were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast and right chest wall. Lesion to normal (L/NL) and lesion to chest wall (L/CW) ratios were calculated for each patient both on the planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among four semiquantitative indices and an average scintimammographic index (SMM(mean)) from the arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N and L/CW ratios provide comparably better diagnostic accuracies for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 0.78, 0.77, 0.84, 0.72 and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.
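    An ROC evaluation of a semi-quantitative index of this kind can be sketched as follows: treat the uptake ratio as the score, compute the area under the ROC curve, and read off sensitivity and specificity at a chosen cutoff. The ratios and the cutoff below are synthetic, not the study's measurements.

```python
# Sketch of the ROC evaluation of a semi-quantitative index: AUC for a
# lesion/normal uptake ratio plus sensitivity/specificity at a cutoff.
# The ratios and the cutoff are synthetic, not the study's data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
ratio_malignant = rng.normal(1.8, 0.4, 45)    # L/N ratios for malignant lesions
ratio_benign    = rng.normal(1.2, 0.3, 30)    # L/N ratios for benign lesions

y_true  = np.r_[np.ones(45), np.zeros(30)]
y_score = np.r_[ratio_malignant, ratio_benign]
print("AUC:", roc_auc_score(y_true, y_score))

cutoff = 1.5
pred = y_score >= cutoff
sens = (pred & (y_true == 1)).sum() / (y_true == 1).sum()
spec = (~pred & (y_true == 0)).sum() / (y_true == 0).sum()
print(f"at cutoff {cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```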

  3. Contribution of the surface contamination of uranium-materials on the quantitative analysis results by electron probe microbeam analysis

    International Nuclear Information System (INIS)

    Bonino, O.; Fournier, C.; Fucili, C.; Dugne, O.; Merlet, C.

    2000-01-01

    The analytical testing of uranium materials is necessary for quality research and development in nuclear industry applications (enrichment, safety studies, fuel, etc.). Electron Probe Microbeam Analysis with Wavelength Dispersive Spectrometry (EPMA-WDS) is a dependable non-destructive analytical technique. The characteristic X-ray signal is measured to identify and quantify the sample components, and the analyzed volume is about one cubic micron. The surface contamination of uranium materials modifies and contributes to the quantitative analysis results of EPMA-WDS, and this contribution is not representative of the bulk. A thin oxidized layer appears in the first instants after preparation (burnishing, cleaning), as does a carbon contamination layer due to metallographic preparation and carbon cracking under the impact of the electron probe. Several analytical difficulties subsequently arise, including an overlap between the carbon Kα line and the uranium N IV-O VI line. The sensitivity and accuracy of the quantification of light elements such as carbon and oxygen are also reduced by the presence of uranium. The aim of this study was to improve the accuracy of quantitative analysis of uranium materials by EPMA-WDS by taking into account the contribution of surface contamination. The first part of this paper is devoted to the study of the contaminated surface of the uranium materials U, UFe2 and U6Fe a few hours after preparation. These oxidation conditions were selected so as to reproduce the contaminated surfaces occurring under microprobe analytical conditions. The surface characterization techniques were SIMS and Auger spectroscopy. The contaminated surfaces consist of successive layers: a carbon layer, an oxidized iron layer, followed by an iron-depletion layer (only in UFe2 and U6Fe), and a ternary oxide layer (U-Fe-O for UFe2 and U6Fe, and UO2+x for uranium). The second part of the paper addresses the estimation of the errors in quantitative analysis.

  4. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. Accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contributions of this work are a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential to increase the accuracy of perfusion analysis. Evaluation of the diagnostic accuracy of quantitative perfusion analysis with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  5. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  6. PERFORMANCE INDICATORS: A COMPARATIVE ANALYSIS BETWEEN PUBLIC AND PRIVATE COLLEGES IN BRAZIL

    Directory of Open Access Journals (Sweden)

    Átila de Melo Lira

    2015-06-01

    Full Text Available A comparative analysis of the use of performance indicators by public and private organizations has long been needed to examine the scenario related to both. This study analyzes the use of the Balanced Scorecard (BSC) to identify and understand the main differences and similarities between public and private higher education institutions (HEIs) in Brazil with respect to the use of performance indicators. A quantitative and exploratory approach was adopted, based on the analysis of institutional documents. Data were collected from the websites of Brazilian public and private higher education organizations in order to carry out this comparative analysis. The results showed that, even though few public institutions were reviewed, their use of performance indicators appears to be more efficient than that of the private ones. Private universities should observe and improve their processes and performance indicators based on those used in Brazilian public universities. This initial research opens a horizon for further studies within this stream of thought.

  7. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method in which an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects the association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies of fixed cells and performs favorably in association studies of live cells.

  8. [Quantitative analysis of the corneal subbasal nerves in different degrees of dry eye with AutoCAD].

    Science.gov (United States)

    Cheng, Y; Wu, J; Zhu, H F; Cheng, Y; Zhu, X P

    2016-03-01

    To evaluate the practical value of AutoCAD in the quantitative analysis of corneal subbasal epithelial nerves in different degrees of dry eye. Ninety patients were divided into groups of mild, moderate, and severe dry eye, with 30 patients (60 eyes) in each group, and 30 healthy volunteers were recruited as the normal control group. Confocal microscopy was used to observe the length of the subbasal epithelial nerve plexus. The images were analyzed with AutoCAD software to determine the density (mm/mm²), the number of branches, and the curvature score of the subbasal epithelial nerves. These data for the patients with dry eye and the controls were compared statistically by analysis of variance (ANOVA). With AutoCAD software, quantitative analysis of the corneal subbasal epithelial nerves was performed successfully. The nerve density in the patients with mild dry eye [(16.70±3.43) mm/mm²] was not significantly different from that in the controls [(15.87±2.75) mm/mm²] (P=0.880), but the number of nerve branches (13.43±2.46) and the curvature score (3.10±0.80) increased significantly. The nerve density in moderate and severe dry eye was significantly different from that in the normal control group (F=114.739). The number of branches was higher in dry eye than in the controls, but there was no significant difference in the curvature scores between the two groups (P=0.557). AutoCAD software is useful in the quantitative analysis of corneal nerve images obtained under a confocal microscope. The corneal subbasal epithelial nerve density, the number of branches, and the curvature of the nerves are related to the degree of dry eye and may be used as clinical indicators.

  9. Optimal climate policy is a utopia. From quantitative to qualitative cost-benefit analysis

    International Nuclear Information System (INIS)

    Van den Bergh, Jeroen C.J.M.

    2004-01-01

    The dominance of quantitative cost-benefit analysis (CBA) and optimality concepts in the economic analysis of climate policy is criticised. Among other things, it is argued to be based on a misplaced interpretation of policy for a complex climate-economy system as being analogous to individual inter-temporal welfare optimisation. The transfer of quantitative CBA and optimality concepts reflects an overly ambitious approach that does more harm than good. An alternative approach is to focus attention on extreme events, structural change and complexity. It is argued that a qualitative rather than a quantitative CBA that takes account of these aspects can support the adoption of a minimax regret approach or precautionary principle in climate policy. This means: implement stringent GHG reduction policies as soon as possible.
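
    The minimax regret rule advocated above can be made concrete with a small worked example. The sketch below uses an invented cost table for three policy options under two states of the world; the numbers are purely illustrative and carry no claim about actual climate damages.

        import numpy as np

        # Rows: policy options; columns: states of the world (mild vs. severe damages).
        # Entries are total costs (lower is better); values are illustrative only.
        policies = ["no action", "moderate reduction", "stringent reduction"]
        costs = np.array([
            [ 1.0, 30.0],   # no action: cheap if damages are mild, disastrous if severe
            [ 5.0, 15.0],
            [10.0, 11.0],
        ])

        # Regret of a policy in a state = its cost minus the best achievable cost in that state.
        regret = costs - costs.min(axis=0)

        # Minimax regret: choose the policy whose worst-case regret is smallest.
        worst_regret = regret.max(axis=1)
        best = int(np.argmin(worst_regret))
        print(dict(zip(policies, worst_regret)), "->", policies[best])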

  10. Method of quantitative analysis of superconducting metal-conducting composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    A technique for the quantitative analysis of superconducting metal-containing composite materials, in particular SnO2-InSn, WO3-InW and ZnO-InZn, has been developed. The method of determining the metal content in a composite is based on the dependence of the superconducting transition temperature on the alloy composition. The sensitivity of the temperature determination is 0.02 K, and the error of analysis for the InSn system is 0.5%.

  11. The Impact of Situation-Based Learning to Students’ Quantitative Literacy

    Science.gov (United States)

    Latifah, T.; Cahya, E.; Suhendra

    2017-09-01

    Nowadays, the use of quantities can be seen almost everywhere, and there has been an increase in quantitative thinking, such as quantitative reasoning and quantitative literacy, within the context of daily life. However, many people today are still not fully equipped with knowledge of quantitative thinking, and a lot of individuals do not have the quantitative skills needed to perform well in today's society. Based on this issue, this research aims to improve students' quantitative literacy in junior high school. Qualitative analysis of written student work and video observations during the experiment reveals that situation-based learning has an impact on students' quantitative literacy.

  12. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
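
    Of the three quantitative methods discussed, multi-criteria decision analysis is the easiest to illustrate in a few lines. The sketch below scores two hypothetical imaging devices with a weighted sum over benefit and risk criteria; the criteria, weights and scores are invented, and real MCDA exercises elicit them from stakeholders.

        import numpy as np

        criteria = ["diagnostic accuracy", "patient comfort", "radiation dose", "adverse events"]
        weights  = np.array([0.4, 0.2, 0.25, 0.15])   # elicited stakeholder weights (illustrative)
        signs    = np.array([+1, +1, -1, -1])          # +1 benefit criterion, -1 risk criterion

        # Scores on a common 0-10 scale for two hypothetical devices.
        device_a = np.array([8.0, 6.0, 4.0, 2.0])
        device_b = np.array([7.0, 8.0, 2.0, 1.0])

        def mcda_value(scores):
            # Risks enter with a negative sign so the total trades benefits against risks.
            return float(np.sum(weights * signs * scores))

        print("device A:", mcda_value(device_a))
        print("device B:", mcda_value(device_b))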

  13. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and

  14. Validation of high-performance liquid chromatography (HPLC) method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    Full Text Available A high-performance liquid chromatography (HPLC) method for the quantitative determination of histamine in fish and fishery product samples is described and validated. Histamine is extracted from fish/fishery products by homogenization with trichloroacetic acid and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed-phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validated parameters are in good agreement with the requirements, and the method is a useful tool for determining histamine in fish and fishery products.

  15. Qualitative and quantitative analysis of an additive element in metal oxide nanometer film using laser induced breakdown spectroscopy.

    Science.gov (United States)

    Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili

    2018-01-20

    The photoelectric performance of metal-ion-doped TiO2 films varies with the composition and concentration of the additive elements. In this work, TiO2 films doped with different Sn concentrations were obtained by the hydrothermal method. Qualitative and quantitative analysis of the Sn element in the TiO2 films was achieved with laser-induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of the TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifts the optical absorption and improves the photoelectric properties of the TiO2 films. When the Sn doping concentration in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was the largest, indicating the best photoelectric performance. This shows that LIBS is a feasible measurement method that can be applied to the qualitative and quantitative analysis of additive elements in metal oxide nanometer films.

  16. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four approaches: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. The respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline position. Visual grading was done by inspecting the shape of the diagram and classifying it into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, and the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff value for SDx was 0.11 (AUC: 0.982). This study provides visual and quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
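
    The four quantitative indices named above can be computed from a sampled respiratory trace. The sketch below derives SDx and SDv from a position-velocity (Poincaré-style) plot, A1 from the Fourier spectrum, and MUD-MDD from the baseline drift, using a synthetic signal; the sampling rate, drift window, and exact index definitions are assumptions for illustration and may differ from those used in the study.

        import numpy as np

        fs = 25.0                                     # sampling rate in Hz (assumed)
        t = np.arange(0, 120, 1 / fs)                 # two minutes of a synthetic respiratory trace
        x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
        v = np.gradient(x, 1 / fs)                    # velocity axis of the phase-space plot

        sd_x, sd_v = float(np.std(x)), float(np.std(v))   # spread of the Poincare map coordinates

        spectrum = np.abs(np.fft.rfft(x - x.mean()))
        a1 = float(spectrum[1:].max())                # height of the fundamental (largest non-DC) peak

        # Baseline drift per nominal breath cycle; MUD-MDD is the spread of the drift steps.
        cycle = int(fs / 0.25)                        # samples per nominal 4-second breath
        baseline = np.array([seg.mean() for seg in np.array_split(x, x.size // cycle)])
        drift = np.diff(baseline)
        mud_mdd = float(drift.max() - drift.min())

        print(f"SDx={sd_x:.3f}  SDv={sd_v:.3f}  A1={a1:.1f}  MUD-MDD={mud_mdd:.3f}")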

  17. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  18. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    International Nuclear Information System (INIS)

    Fernandez-Ruiz, R.; Garcia-Heras, M.

    2008-01-01

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels: a first quantitative level, which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of material. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies presented here allows total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  19. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-Ruiz, R. [Servicio Interdepartamental de Investigacion, Facultad de Ciencias, Universidad Autonoma de Madrid, Modulo C-9, Laboratorio de TXRF, Crta. Colmenar, Km 15, Cantoblanco, E-28049, Madrid (Spain)], E-mail: ramon.fernandez@uam.es; Garcia-Heras, M. [Grupo de Arqueometria de Vidrios y Materiales Ceramicos, Instituto de Historia, Centro de Ciencias Humanas y Sociales, CSIC, C/ Albasanz, 26-28, 28037 Madrid (Spain)

    2008-09-15

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels: a first quantitative level, which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of material. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies presented here allows total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  20. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, the deconvolution of overlapping peaks, external radiation, and spectral aberrations. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research.

  1. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with the quantitative X-ray fractographic investigation of fatigue fractures of samples with sharp notches tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic-instability zones that restrain and control crack growth. In the fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect is singled out that is analogous to the zone in which the crack growth rate is controlled by microshifting mechanisms. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched-sample fractures. The possibility of determining one of the threshold values of the cyclic fracture toughness of a material from the results of fatigue testing and fractography of notched-sample fractures is also shown. A correlation between the size of the h(s) crack-effect zone in the notched sample, the material yield limit, and the cyclic fracture toughness characteristic K(s) has been found. Such a correlation widens the possibilities of quantitative diagnostics of fractures by X-ray fractography methods.

  2. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu

  3. Analysis of hepcidin expression: in situ hybridization and quantitative polymerase chain reaction from paraffin sections.

    Science.gov (United States)

    Sakuraoka, Yuhki; Sawada, Tokihiko; Shiraki, Takayuki; Park, Kyunghwa; Sakurai, Yuhichiro; Tomosugi, Naohisa; Kubota, Keiichi

    2012-07-28

    To establish methods for quantitative polymerase chain reaction (PCR) analysis of hepcidin using RNA isolated from paraffin-embedded sections and for in situ hybridization in hepatocellular carcinoma (HCC). Total RNA was isolated from 68 paraffin-embedded samples of HCC. Samples came from 54 male and 14 female patients with a mean age of 66.8 ± 7.8 years. Quantitative PCR was performed. Immunohistochemistry and in situ hybridization for hepcidin were also performed. Quantitative PCR for hepcidin using RNA isolated from paraffin-embedded sections of HCC was performed successfully. The expression level of hepcidin mRNA in cancer tissues was significantly higher than that in non-cancer tissues. A method of in situ hybridization for hepcidin was established successfully, and this demonstrated that hepcidin mRNA was expressed in non-cancerous tissue but absent in cancerous tissue. We have established novel methods for quantitative PCR for hepcidin using RNA isolated from paraffin-embedded sections and for in situ hybridization in HCC.

  4. Quantitative assessment of 201TlCl myocardial SPECT

    International Nuclear Information System (INIS)

    Uehara, Toshiisa

    1987-01-01

    Clinical evaluation of the quantitative analysis of Tl-201 myocardial tomography by SPECT (single photon emission computed tomography) was performed in comparison with visual evaluation. The method of quantitative analysis has already been reported in our previous paper. In this study, a program for re-standardization in the case of lateral myocardial infarction was added; this program was useful mainly for the evaluation of lesions in the left circumflex coronary artery. Regarding the diagnostic accuracy for myocardial infarction in general, quantitative evaluation of myocardial SPECT images was highest, followed by visual evaluation of myocardial SPECT images and visual evaluation of myocardial planar images. However, in the case of anterior myocardial infarction, visual evaluation of myocardial SPECT images had almost the same detectability as quantitative evaluation of myocardial SPECT images. In the case of infero-posterior myocardial infarction, quantitative evaluation was superior to visual evaluation. As for specificity, quantitative evaluation of SPECT images was slightly inferior to visual evaluation of SPECT images. An infarction map was made by quantitative analysis, and this enabled us to determine the infarction site, extent and degree according to easily recognizable patterns. As a result, the responsible coronary artery lesion could be inferred correctly, and the calculated infarction score could be correlated with the residual left ventricular function after myocardial infarction. (author)

  5. A framework for the quantitative assessment of performance-based system resilience

    International Nuclear Information System (INIS)

    Tran, Huy T.; Balchanos, Michael; Domerçant, Jean Charles; Mavris, Dimitri N.

    2017-01-01

    Increasing system complexity and threat uncertainty require the consideration of resilience in the design and analysis of engineered systems. While the resilience engineering community has begun to converge on a definition and set of characteristics for resilience, methods for quantifying the concept are still limited in their applicability to system designers. This paper proposes a framework for assessing resilience that focuses on the ability of a system to absorb disruptions, recover from them, and adapt over time. The framework extends current approaches by explicitly considering temporal aspects of system responses to disruptions, volatility in system performance data, and the possibility of multiple disruption events. Notional system performance data is generated using the logistic function, providing an experimental platform for a parametric comparison of the proposed resilience metric with an integration-based metric. An information exchange network model is used to demonstrate the applicability of the framework towards system design tradeoff studies using stochastic simulations. The presented framework is domain-agnostic and flexible, such that it can be applied to a variety of systems and adjusted to focus on specific aspects of resilience. - Highlights: • We propose a quantitative framework and metrics for assessing system resilience. • Metrics focus on absorption, recovery, and adaptation to disruptions. • The framework accepts volatile data and is easily automated for simulation studies. • The framework is applied to a model of adaptive information exchange networks. • Results show benefits of network adaptation against random and targeted threats.
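
    The integration-based metric used as the comparison baseline in this framework can be sketched in a few lines: notional performance is generated with the logistic function and the delivered performance is integrated against the nominal one. The disruption time, depth, and recovery rate below are invented for illustration.

        import numpy as np

        def logistic(t, t0, k):
            return 1.0 / (1.0 + np.exp(-k * (t - t0)))

        t = np.linspace(0, 100, 1001)
        nominal = np.ones_like(t)

        # Notional response: performance drops at t=20 and recovers along a logistic curve.
        absorbed = 0.4                                    # fraction of performance lost at the disruption
        performance = np.where(t < 20, 1.0, 1.0 - absorbed + absorbed * logistic(t, 45, 0.25))

        # Integration-based resilience: area under delivered performance / area under nominal.
        resilience = np.trapz(performance, t) / np.trapz(nominal, t)
        print(f"integration-based resilience = {resilience:.3f}")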

  6. Interlaboratory validation of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Takabatake, Reona; Koiwa, Tomohiro; Kasahara, Masaki; Takashima, Kaori; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Oguchi, Taichi; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    To reduce the cost and time required to routinely perform the genetically modified organism (GMO) test, we developed a duplex quantitative real-time PCR method for a screening analysis simultaneously targeting an event-specific segment for GA21 and the Cauliflower Mosaic Virus 35S promoter (P35S) segment [Oguchi et al., J. Food Hyg. Soc. Japan, 50, 117-125 (2009)]. To confirm the validity of the method, an interlaboratory collaborative study was conducted. In the collaborative study, conversion factors (Cfs), which are required to calculate the GMO amount (%), were first determined for two real-time PCR instruments, the ABI PRISM 7900HT and the ABI PRISM 7500. A blind test was then conducted. The limit of quantitation for both GA21 and P35S was estimated to be 0.5% or less. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively. The determined bias and RSDR were each less than 25%. We believe the developed method will be useful for the practical screening analysis of GM maize.
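
    In screening methods of this kind, the GMO amount is typically derived from the ratio of event-specific to endogenous-gene copy numbers divided by the instrument-specific conversion factor (Cf). The arithmetic is sketched below with invented copy numbers and an invented Cf; the actual Cf values and calculation details of the validated method are given in the cited reference.

        # Hypothetical real-time PCR results for one maize sample (copy numbers per reaction).
        ga21_copies  = 180.0      # event-specific target (GA21)
        taxon_copies = 36000.0    # endogenous maize reference gene

        # Conversion factor Cf: copy-number ratio measured for a 100 % GM reference material
        # on the same instrument (illustrative value, not from the study).
        cf = 0.40

        gmo_percent = (ga21_copies / taxon_copies) / cf * 100.0
        print(f"GA21 content = {gmo_percent:.2f} %")   # 1.25 % with these example numbers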

  7. Quantitative evaluation of regional blood flow in pulmonary sarcoidosis with Bull's eye analysis

    International Nuclear Information System (INIS)

    Akaki, Shiro

    1991-01-01

    Lung perfusion scintigraphy was performed in 23 patients with pulmonary sarcoidosis and in 11 normal volunteers. Bull's eye analysis was used to analyze regional pulmonary blood flow quantitatively. First, the whole-lung perfusion images were divided into three regions by three concentric circles. Then radial axes were projected from the center to define 36 sectors of 10 deg each. The counts for each sector were calculated and a Bull's eye image was displayed. The counts were compared with the lower limit of normal (mean - 2SD), and as indices of the reduction in perfusion, an extent score (ES) and a severity score (SS) were calculated. ES and SS showed a significant reduction in perfusion in 16 patients (70%) with sarcoidosis. In stage II sarcoidosis, both ES and SS were significantly higher than in stage I sarcoidosis, in agreement with the 67Ga scintigraphy findings. In comparison with clinical data, ES had a positive correlation with serum angiotensin-converting enzyme activity and with the CD4+/CD8+ ratio (p<0.05). Bull's eye analysis was considered useful for the quantitative evaluation of regional pulmonary blood flow in pulmonary sarcoidosis, and it was suggested that the reduction in perfusion might result mainly from alveolitis and angiitis. Ventilation abnormality, which may occur prior to the reduction in perfusion, may be an important contributing factor. (author)
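
    Extent and severity scores of this kind compare each sector of the Bull's eye map against a normal database. The sketch below computes one plausible version of such scores (the percentage of sectors below mean - 2SD, and the summed depth of the defect in SD units) on synthetic sector counts; the exact score definitions used in the study may differ.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sectors = 3 * 36                            # 3 concentric rings x 36 sectors of 10 deg

        # Normal database: mean and SD of normalized counts per sector (illustrative values).
        normal_mean = np.full(n_sectors, 100.0)
        normal_sd = np.full(n_sectors, 8.0)
        lower_limit = normal_mean - 2 * normal_sd     # lower limit of normal (mean - 2SD)

        # A hypothetical patient with reduced perfusion in one contiguous region.
        patient = normal_mean + rng.normal(0, 5, n_sectors)
        patient[40:60] -= 35.0

        defect = patient < lower_limit
        extent_score = defect.mean() * 100            # percent of sectors below the lower limit
        severity_score = np.sum((lower_limit - patient)[defect] / normal_sd[defect])

        print(f"ES = {extent_score:.1f} %  SS = {severity_score:.1f}")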

  8. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    International Nuclear Information System (INIS)

    Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing

    2015-01-01

    Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC

  9. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    Science.gov (United States)

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database searching based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than them in terms of both precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
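
    BCC-PLS itself embeds the baseline constraints into the PLS weight selection and is not reproduced here. For orientation only, the sketch below shows the simpler sequential alternative that such methods improve on: subtracting a fitted low-order polynomial baseline from each spectrum and then calibrating with an ordinary scikit-learn PLS model, all on synthetic spectra.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        wavenumbers = np.linspace(0, 1, 200)
        n_samples = 40

        # Synthetic ATR-FTIR-like spectra: one analyte band plus a random low-order baseline.
        concentration = rng.uniform(0, 1, n_samples)
        band = np.exp(-((wavenumbers - 0.5) ** 2) / 0.002)
        baseline = rng.uniform(-1, 1, (n_samples, 3)) @ np.vander(wavenumbers, 3).T
        spectra = np.outer(concentration, band) + baseline + 0.01 * rng.normal(size=(n_samples, 200))

        def remove_polynomial_baseline(X, degree=2):
            # Fit and subtract a low-order polynomial from every spectrum (least squares).
            V = np.vander(wavenumbers, degree + 1)
            coeffs, *_ = np.linalg.lstsq(V, X.T, rcond=None)
            return X - (V @ coeffs).T

        corrected = remove_polynomial_baseline(spectra)
        pls = PLSRegression(n_components=2).fit(corrected[:30], concentration[:30])
        pred = pls.predict(corrected[30:]).ravel()
        rmsep = np.sqrt(np.mean((pred - concentration[30:]) ** 2))
        print(f"RMSEP on held-out spectra: {rmsep:.4f}")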

  10. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    International Nuclear Information System (INIS)

    Tappin, Marcelo R.R.; Pereira, Jislaine F.G.; Lima, Lucilene A.; Siani, Antonio C.; Mazzei, Jose L.; Ramos, Monica F.S.

    2004-01-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. The analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of the sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar in the case of methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method was shown to be suitable for the classification and quality control of commercial samples of the oils. (author)
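
    External-standard quantification of this kind reduces to a linear calibration of detector response against standard concentration, followed by inversion for the unknown. A minimal sketch with invented peak areas is shown below; the concentrations, areas and units are placeholders, not the validation data of the study.

        import numpy as np

        # Calibration: peak areas of the external standard at known concentrations (mg/mL), illustrative.
        std_conc = np.array([0.1, 0.25, 0.5, 1.0, 2.0])
        std_area = np.array([1520, 3800, 7650, 15200, 30300])

        slope, intercept = np.polyfit(std_conc, std_area, 1)
        r = np.corrcoef(std_conc, std_area)[0, 1]
        print(f"slope={slope:.0f}  intercept={intercept:.0f}  r^2={r**2:.4f}")

        # Quantify an unknown copaiba-oil peak from its area using the fitted line.
        unknown_area = 9800.0
        unknown_conc = (unknown_area - intercept) / slope
        print(f"estimated concentration = {unknown_conc:.3f} mg/mL")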

  12. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker with both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing becomes common in genetic association studies, and the outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and F-test are often, respectively, used to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test by also incorporating the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
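
    The combination step described above (Fisher's method with a Gamma, i.e. scaled chi-squared, null to absorb the correlation between the two tests) can be sketched directly. The two p-values and the scale constant c below are placeholders; in the proposed procedure the scale is derived from the correlation of the component statistics rather than fixed by hand.

        import numpy as np
        from scipy import stats

        p_binary, p_quant = 0.012, 0.048          # placeholder p-values from the two component tests

        # Fisher's combination statistic; for independent tests T ~ chi-squared with 4 df.
        T = -2.0 * (np.log(p_binary) + np.log(p_quant))

        # With correlated tests, the null is approximated by a Gamma (scaled chi-squared)
        # distribution matching the first two moments; c = 1 recovers the chi-squared case.
        c = 1.2
        p_joint_gamma = stats.gamma.sf(T, a=2.0 / c, scale=2.0 * c)
        p_joint_indep = stats.chi2.sf(T, df=4)
        print(f"T={T:.2f}  p(independent)={p_joint_indep:.4f}  p(gamma)={p_joint_gamma:.4f}")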

  13. Optimization of Region of Interest Drawing for Quantitative Analysis: Differentiation Between Benign and Malignant Breast Lesions on Contrast-Enhanced Sonography.

    Science.gov (United States)

    Nakata, Norio; Ohta, Tomoyuki; Nishioka, Makiko; Takeyama, Hiroshi; Toriumi, Yasuo; Kato, Kumiko; Nogi, Hiroko; Kamio, Makiko; Fukuda, Kunihiko

    2015-11-01

    This study was performed to evaluate the diagnostic utility of quantitative analysis of benign and malignant breast lesions using contrast-enhanced sonography. Contrast-enhanced sonography using the perflubutane-based contrast agent Sonazoid (Daiichi Sankyo, Tokyo, Japan) was performed in 94 pathologically proven palpable breast mass lesions that could be depicted with B-mode sonography. Quantitative analyses using the time-intensity curve on contrast-enhanced sonography were performed for 5 region of interest (ROI) types (a manually traced ROI and circular ROIs of 5, 10, 15, and 20 mm in diameter). The peak signal intensity, initial slope, time to peak, positive enhancement integral, and wash-out ratio were investigated for each ROI. There were significant differences between benign and malignant lesions in the time to peak, indicating that quantitative analysis of contrast-enhanced sonography can contribute to the differentiation of benign and malignant breast lesions. © 2015 by the American Institute of Ultrasound in Medicine.
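
    The time-intensity-curve parameters listed above can all be read off the mean ROI intensity as a function of time. The sketch below computes them on a synthetic curve; the curve shape, baseline handling, and the particular wash-out-ratio definition are assumptions made for illustration.

        import numpy as np

        t = np.arange(0, 60, 0.5)                                    # seconds after contrast injection
        baseline = 10.0
        tic = baseline + 40.0 * (t / 12.0) * np.exp(1 - t / 12.0)    # synthetic mean ROI intensity

        enhancement = tic - baseline
        peak_idx = int(np.argmax(enhancement))

        peak_signal_intensity = float(tic[peak_idx])
        time_to_peak = float(t[peak_idx])
        initial_slope = float(np.max(np.gradient(enhancement[: peak_idx + 1], t[: peak_idx + 1])))
        positive_enhancement_integral = float(np.trapz(np.clip(enhancement, 0, None), t))
        wash_out_ratio = float((enhancement[peak_idx] - enhancement[-1]) / enhancement[peak_idx])

        print(time_to_peak, peak_signal_intensity, initial_slope,
              positive_enhancement_integral, wash_out_ratio)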

  14. Novel quantitative autophagy analysis by organelle flow cytometry after cell sonication.

    Directory of Open Access Journals (Sweden)

    Michael Degtyarev

    Full Text Available Autophagy is a dynamic process of bulk degradation of cellular proteins and organelles in lysosomes. Current methods of autophagy measurement include microscopy-based counting of autophagic vacuoles (AVs) in cells. We have developed a novel method to quantitatively analyze individual AVs using flow cytometry. This method, OFACS (organelle flow after cell sonication), takes advantage of efficient cell disruption with a brief sonication, generating cell homogenates with fluorescently labeled AVs that retain their integrity, as confirmed by light and electron microscopy analysis. These AVs could be detected directly in the sonicated cell homogenates on a flow cytometer as a distinct population of the expected organelle size on a cytometry plot. Treatment of cells with inhibitors of autophagic flux, such as chloroquine or lysosomal protease inhibitors, increased the number of particles in this population under autophagy-inducing conditions, while inhibition of autophagy induction with 3-methyladenine or knockdown of ATG proteins prevented this accumulation. This assay can easily be performed in a high-throughput format and opens up previously unexplored avenues for autophagy analysis.

  15. A meta-analysis of effects of post-hatch food and water deprivation on development, performance and welfare of chickens

    NARCIS (Netherlands)

    Jong, de I.C.; Riel, van J.W.; Bracke, M.B.M.; Brand, van den H.

    2017-01-01

    A ‘meta-analysis’ was performed to determine effects of post-hatch food and water deprivation (PHFWD) on chicken development, performance and welfare (including health). Two types of meta-analysis were performed on peer-reviewed scientific publications: a quantitative ‘meta-analysis’ (MA) and a

  16. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review

    Science.gov (United States)

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760

  17. Pulmonary nodule characterization: A comparison of conventional with quantitative and visual semi-quantitative analyses using contrast enhancement maps

    International Nuclear Information System (INIS)

    Petkovska, Iva; Shah, Sumit K.; McNitt-Gray, Michael F.; Goldin, Jonathan G.; Brown, Matthew S.; Kim, Hyun J.; Brown, Kathleen; Aberle, Denise R.

    2006-01-01

    Purpose: To determine whether conventional nodule densitometry or analysis based on contrast enhancement maps of indeterminate lung nodules imaged with contrast-enhanced CT can distinguish benign from malignant lung nodules. Materials and methods: Thin-section, contrast-enhanced CT (baseline and post-contrast series acquired at 45, 90, 180, and 360 s) was performed on 29 patients with indeterminate lung nodules (14 benign, 15 malignant). A thoracic radiologist identified the boundary of each nodule using semi-automated contouring to form a 3D region of interest (ROI) on each image series. The post-contrast series having the maximum mean enhancement was then volumetrically registered to the baseline series. The two series were subtracted volumetrically and the subtracted voxels were quantized into seven color-coded bins, forming a contrast enhancement map (CEM). Conventional nodule densitometry was performed to obtain the maximum difference in mean enhancement values for each nodule from a circular ROI. Three thoracic radiologists performed visual semi-quantitative analysis of each nodule, scoring each map for (a) the magnitude and (b) the heterogeneity of enhancement throughout the entire volume of the nodule on a five-point scale. Receiver operating characteristic (ROC) analysis was conducted on these features to evaluate their diagnostic efficacy. Finally, 14 quantitative texture features were calculated for each map. A statistical analysis was performed to combine the 14 texture features into a single factor, and ROC analysis of the derived aggregate factor was done as an indicator of malignancy. All features were analyzed for differences between benign and malignant nodules. Results: Using 15 HU as a threshold, 93% (14/15) of malignant and 79% (11/14) of benign nodules demonstrated enhancement. The ROC curve for which higher values of enhancement indicate malignancy was generated and the area under the curve (AUC) was 0.76. The visually scored magnitude of enhancement was found to be
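
    The contrast enhancement map construction described above (voxel-wise subtraction of the registered baseline from the peak-enhancement series, then quantization into seven bins) is sketched below on synthetic volumes; the bin edges and the summary statistic are assumptions, and the registration and texture-feature steps are not shown.

        import numpy as np

        rng = np.random.default_rng(2)
        shape = (32, 64, 64)                                   # synthetic nodule sub-volume (z, y, x)

        baseline = 40.0 + 10.0 * rng.normal(size=shape)        # pre-contrast attenuation (HU)
        enhanced = baseline + rng.uniform(0, 60, size=shape)   # registered post-contrast series

        subtraction = enhanced - baseline                      # voxel-wise contrast enhancement

        # Quantize enhancement into 7 color-coded bins (assumed HU edges).
        edges = np.array([0, 10, 20, 30, 40, 50, 60])
        cem = np.digitize(subtraction, edges)                  # bin label per voxel

        bins, counts = np.unique(cem, return_counts=True)
        heterogeneity = counts.std() / counts.mean()           # one crude summary of map heterogeneity
        print(dict(zip(bins.tolist(), counts.tolist())), f"heterogeneity={heterogeneity:.2f}")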

  18. Quantitative Analysis of Radionuclide for the Used Resin of the Primary Purification System in HANARO

    International Nuclear Information System (INIS)

    Lee, Mun; Kim, Myong Seop; Park, Se Il; Kim, Tae Whan; Kim, Dong Hun; Kim, Young Chil

    2005-01-01

    In HANARO, a 30 MW research reactor, ion exchange resin has been used for the purification of the primary coolant system. The resin used in the primary coolant purification system is replaced with new resin once every 3 months during 30 MW reactor operation. The resin extracted from the primary coolant purification system is temporarily stored in a shielded storage area of the reactor hall for radiation cooling. After the radiation level of the resin has decreased enough for it to be handled for waste disposal, it is put into a waste drum and delivered to the waste facility at KAERI. Recently, in this procedure, quantitative analysis of the radionuclides contained in the resin has been required in order to provide more quantitative data for disposal. Therefore, in this work, a preliminary study was performed to find a sampling method that represents the radionuclide characteristics of the spent resin.

  19. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-01-01

    -friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface...... such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical...... displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics...

  20. Quantitative evaluation of bone scintigraphy in prostate cancer

    International Nuclear Information System (INIS)

    Yamamoto, Yasushi

    2017-01-01

    This paper describes the quantitative evaluation of bone scintigraphy as used in the assessment of bone metastasis from prostate cancer. In advanced prostate cancer, bone scintigraphic examination with technetium-99m methylene diphosphonate is indispensable. Since bone metastasis hardly involves soft tissue, the morphological evaluation used for soft tissue cancers cannot serve as a reference, and quantitative evaluation specific to bone scintigraphy has therefore been developed. Following the visual evaluation that began in the 1980s, a technique considering highly integrated parts and areas of the images was proposed in the 1990s, and computer-aided diagnosis (CAD) software that automated the manual analysis of this technique was developed in the 2010s. In order to evaluate the usefulness of quantitative evaluation based on bone CAD, the authors performed bone scintigraphy in 42 patients who were diagnosed with castration-resistant prostate cancer (CRPC) between 2004 and 2011 and received DEC therapy for 4 months. Bone CAD analysis showed that the therapeutic effect could not be determined earlier than by the judgement based on the increase in PSA (prostate-specific antigen). Recently, quantitative analysis has shifted from bone scintigraphy to bone SPECT (single photon emission computed tomography), and papers on this approach have been published since the 2010s. Bone SPECT is equipped with a quantitative SUV (standardized uptake value) function, and in a clinical case using SUV, the SUV increase was seen earlier than the increase in PSA. Further evidence is expected to accumulate in the future. (A.O.)

  1. Brain wave correlates of attentional states: Event related potentials and quantitative EEG analysis during performance of cognitive and perceptual tasks

    Science.gov (United States)

    Freeman, Frederick G.

    1993-01-01

    presented target stimulus. In addition to the task requirements, irrelevant tones were presented in the background. Research has shown that even though these stimuli are not attended, ERPs to them can still be elicited. The amplitude of the ERP waves has been shown to change as a function of a person's level of alertness. ERPs were also collected and analyzed for the target stimuli for each task. Brain maps were produced based on the ERP voltages for the different stimuli. In addition to the ERPs, a quantitative EEG (QEEG) analysis was performed on the data using a fast Fourier technique to produce a power spectral analysis of the EEG. This analysis was conducted on the continuous EEG while the subjects were performing the tasks. Finally, a QEEG was performed on periods during the task when subjects indicated that they were in an altered state of awareness. During the tasks, subjects were asked to indicate by pressing a button when they realized their level of task awareness had changed. EEG epochs were collected for times just before and just after subjects made this response. The purpose of this final analysis was to determine whether or not subjective indices of level of awareness could be correlated with different patterns of EEG.

  2. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
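
    The modeling step described above amounts to fitting a Cox regression on CA19-9 plus image features and scoring it with the concordance index on held-out patients. The sketch below shows that step on synthetic data with the lifelines package; the feature names, data, and split sizes are invented, and the texture extraction itself is not reproduced.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(3)
        n = 160
        df = pd.DataFrame({
            "ca19_9":    rng.lognormal(3.0, 1.0, n),   # preoperative serum CA19-9 (illustrative)
            "texture_1": rng.normal(0, 1, n),          # stand-ins for CT texture features
            "texture_2": rng.normal(0, 1, n),
        })
        risk = 0.4 * np.log(df["ca19_9"]) + 0.6 * df["texture_1"]
        df["months"] = rng.exponential((np.exp(-risk) * 24).to_numpy())   # synthetic survival times
        df["event"] = rng.uniform(size=n) < 0.8                           # ~80 % observed deaths

        train, test = df.iloc[:112], df.iloc[112:]
        cph = CoxPHFitter().fit(train, duration_col="months", event_col="event")
        c_index = concordance_index(test["months"],
                                    -cph.predict_partial_hazard(test),
                                    test["event"])
        print(f"held-out c-index = {c_index:.2f}")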

  3. Calibration Phantom for Quantitative Tomography Analysis of Biodistribution of Magnetic Nanoparticles

    Science.gov (United States)

    Rahn, Helen; Kettering, Melanie; Richter, Heike; Hilger, Ingrid; Trahms, Lutz; Odenbach, Stefan

    2010-12-01

    Ferrofluids are being investigated for cancer treatments such as magnetic drug targeting (MDT) and magnetic heating treatments with the aim of treating the cancer locally, since magnetic nanoparticles with attached drugs are concentrated within the target region. Thus, the side effects are considerably reduced. One of the crucial factors for the success of these therapies is the magnetic nanoparticle distribution. Microcomputed X-ray tomography (XμCT) has been introduced as an adequate technique for non-destructive three-dimensional analysis of biological samples enriched with magnetic nanoparticles. The biological tissue specimens, in this case tumor-bearing mice after intra-tumoral magnetic nanoparticle injection, have been analyzed by means of XμCT. Complementary measurements have been performed by magnetorelaxometry (MRX). This technique enables a sensitive quantification of magnetic nanoparticles down to a few nanograms. For multi-phase samples, such as biological tissue enriched with magnetic nanoparticles, the polychromaticity and beam-hardening artifacts occurring in XμCT with conventional X-ray tubes cause severe problems for quantitative density determination. This problem requires an appropriate calibration of the polychromatic tomography equipment, enabling a semi-quantitative analysis of the data. For this purpose a phantom system has been implemented. These phantoms consist of a tissue substitute containing different amounts of magnetic nanoparticles. Since the attenuation of the beam also depends on the thickness, i.e. the path length of the beam transmitting the object, the reference sample has been given a cone shape. Thus, with one phantom, both the magnetic nanoparticle concentration and the attenuation as a function of path length can be determined. Two phantom systems will be presented, one based on agarose gel and one based on soap.

  4. Limitations for qualitative and quantitative neutron activation analysis using reactor neutrons

    International Nuclear Information System (INIS)

    El-Abbady, W.H.; El-Tanahy, Z.H.; El-Hagg, A.A.; Hassan, A.M.

    1999-01-01

    In this work, the most important limitations for qualitative and quantitative analysis using reactor neutrons for activation are reviewed. Each limitation is discussed using different examples of activated samples. Photopeak estimation, nuclear reactions interference and neutron flux measurements are taken into consideration. Solutions for high accuracy evaluation in neutron activation analysis applications are given. (author)

  5. Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.

    Science.gov (United States)

    Romeo, Elizabeth M

    2010-07-01

    The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena. Copyright 2010, SLACK Incorporated.

  6. Quantitative aspects of the clinical performance of transverse tripolar spinal cord stimulation.

    Science.gov (United States)

    Wesselink, W A; Holsheimer, J; King, G W; Torgerson, N A; Boom, H B

    1999-01-01

    A multicenter study was initiated to evaluate the performance of the transverse tripolar system for spinal cord stimulation. Computer modeling had predicted steering of paresthesia with a dual channel stimulator to be the main benefit of the system. The quantitative analysis presented here includes the results of 484 tests in 30 patients. For each test, paresthesia coverage as a function of voltage levels was stored in a computerized database, including a body map which enabled calculation of the degree of paresthesia coverage of separate body areas, as well as the overlap with the painful areas. The results show that with the transverse tripolar system steering of the paresthesia is possible, although optimal steering requires proper placement of the electrode with respect to the spinal cord. Therefore, with this steering ability as well as a larger therapeutic stimulation window as compared to conventional systems, we expect an increase of the long-term efficacy of spinal cord stimulation. Moreover, in view of the stimulation-induced paresthesia patterns, the system allows selective stimulation of the medial dorsal columns.

  7. Semi-quantitative evaluation of gallium-67 scintigraphy in lupus nephritis

    Energy Technology Data Exchange (ETDEWEB)

    Lin Wanyu [Dept. of Nuclear Medicine, Taichung Veterans General Hospital, Taichung (Taiwan); Dept. of Radiological Technology, Chung-Tai College of Medical Technology, Taichung (Taiwan); Hsieh Jihfang [Section of Nuclear Medicine, Chi-Mei Foundation Hospital, Yunk Kang City, Tainan (Taiwan); Tsai Shihchuan [Dept. of Nuclear Medicine, Show Chwan Memorial Hospital, Changhua (Taiwan); Lan Joungliang [Dept. of Internal Medicine, Taichung Veterans General Hospital, Taichung (Taiwan); Cheng Kaiyuan [Dept. of Radiological Technology, Chung-Tai College of Medical Technology, Taichung (Taiwan); Wang Shyhjen [Dept. of Nuclear Medicine, Taichung Veterans General Hospital, Taichung (Taiwan)

    2000-11-01

    Within nuclear medicine there is a trend towards quantitative analysis. Gallium renal scan has been reported to be useful in monitoring the disease activity of lupus nephritis. However, only visual interpretation using a four-grade scale has been performed in previous studies, and this method is not sensitive enough for follow-up. In this study, we developed a semi-quantitative method for gallium renal scintigraphy to find a potential parameter for the evaluation of lupus nephritis. Forty-eight patients with lupus nephritis underwent renal biopsy to determine World Health Organization classification, activity index (AI) and chronicity index (CI). A delayed 48-h gallium scan was also performed and interpreted by visual and semi-quantitative methods. For semi-quantitative analysis of the gallium uptake in both kidneys, regions of interest (ROIs) were drawn over both kidneys, the right forearm and the adjacent spine. The uptake ratios between these ROIs were calculated and expressed as the "kidney/spine ratio (K/S ratio)" or the "kidney/arm ratio (K/A ratio)". Spearman's rank correlation test and Mann-Whitney U test were used for statistical analysis. Our data showed a good correlation between the semi-quantitative gallium scan and the results of visual interpretation. K/S ratios showed a better correlation with AI than did K/A ratios. Furthermore, the left K/S ratio displayed a better correlation with AI than did the right K/S ratio. In contrast, CI did not correlate well with the results of semi-quantitative gallium scan. In conclusion, semi-quantitative gallium renal scan is easy to perform and shows a good correlation with the results of visual interpretation and renal biopsy. The left K/S ratio from semi-quantitative renal gallium scintigraphy displays the best correlation with AI and is a useful parameter in evaluating the disease activity in lupus nephritis. (orig.)
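
    The semi-quantitative step described above reduces to a ratio of mean ROI counts followed by a rank correlation against the biopsy activity index. A small illustrative sketch (Python with SciPy) is shown below; the ROI counts, ratios, and activity indices are hypothetical values, not data from the study.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def uptake_ratio(kidney_counts, reference_counts):
        """Mean-count ratio between a kidney ROI and a reference ROI (spine or forearm)."""
        return np.mean(kidney_counts) / np.mean(reference_counts)

    # Hypothetical ROI pixel counts for a single patient
    kidney_roi = np.array([532, 548, 561, 529])
    spine_roi = np.array([240, 251, 238, 247])
    print("K/S ratio:", round(uptake_ratio(kidney_roi, spine_roi), 2))

    # Hypothetical per-patient left-kidney K/S ratios vs. biopsy activity indices (AI)
    ks_ratio = np.array([1.8, 2.4, 1.5, 3.1, 2.2])
    activity_index = np.array([4, 9, 2, 12, 7])
    rho, p = spearmanr(ks_ratio, activity_index)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
    ```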

  8. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk from complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. The failures often are due to excessive polyethylene wear. The glenoid liners are complex and hemispherical in shape and present challenges while assessing the damage. Therefore, the study on the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop the methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners, retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by several observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way and they had engineering backgrounds.

  9. A temperature-controlled photoelectrochemical cell for quantitative product analysis

    Science.gov (United States)

    Corson, Elizabeth R.; Creel, Erin B.; Kim, Youngsang; Urban, Jeffrey J.; Kostecki, Robert; McCloskey, Bryan D.

    2018-05-01

    In this study, we describe the design and operation of a temperature-controlled photoelectrochemical cell for analysis of gaseous and liquid products formed at an illuminated working electrode. This cell is specifically designed to quantitatively analyze photoelectrochemical processes that yield multiple gas and liquid products at low current densities and exhibit limiting reactant concentrations that prevent these processes from being studied in traditional single chamber electrolytic cells. The geometry of the cell presented in this paper enables front-illumination of the photoelectrode and maximizes the electrode surface area to electrolyte volume ratio to increase liquid product concentration and hence enhances ex situ spectroscopic sensitivity toward them. Gas is bubbled through the electrolyte in the working electrode chamber during operation to maintain a saturated reactant concentration and to continuously mix the electrolyte. Gaseous products are detected by an in-line gas chromatograph, and liquid products are analyzed ex situ by nuclear magnetic resonance. Cell performance was validated by examining carbon dioxide reduction on a silver foil electrode, showing comparable results both to those reported in the literature and identical experiments performed in a standard parallel-electrode electrochemical cell. To demonstrate a photoelectrochemical application of the cell, CO2 reduction experiments were carried out on a plasmonic nanostructured silver photocathode and showed different product distributions under dark and illuminated conditions.

  10. Simultaneous HPLC quantitative analysis of mangostin derivatives in Tetragonula pagdeni propolis extracts

    Directory of Open Access Journals (Sweden)

    Sumet Kongkiatpaiboon

    2016-04-01

    Full Text Available Propolis has been used as indigenous medicine for curing numerous maladies. The one that is of ethnopharmacological use is stingless bee propolis from Tetragonula pagdeni. A simultaneous high-performance liquid chromatography (HPLC) method was developed and validated to determine the contents of the bioactive compounds 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin. HPLC analysis was effectively performed using a Hypersil BDS C18 column, with gradient elution of methanol–0.2% formic acid at a flow rate of 1 ml/min, at 25 °C, with detection at 245 nm. Parameters for the validation included accuracy, precision, linearity, and limits of quantitation and detection. The developed HPLC technique was precise, with a relative standard deviation lower than 2%. The recovery values of 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin in the extracts were 99.98%, 99.97%, 98.98% and 99.19%, respectively. The average contents of these compounds in the propolis extracts collected from different seasons were 0.127%, 1.008%, 0.323% and 2.703% (w/w), respectively. The developed HPLC technique was suitable and practical for the simultaneous analysis of these mangostin derivatives in T. pagdeni propolis and would provide valuable guidance for the standardization of its pharmaceutical products.
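
    Two of the validation figures reported above, percent recovery and relative standard deviation, follow directly from replicate measurements; the sketch below shows one way to compute them. The replicate peak areas and spike levels are hypothetical values, not data from the study.

    ```python
    import numpy as np

    def percent_rsd(values):
        """Relative standard deviation (%) of replicate injections."""
        values = np.asarray(values, dtype=float)
        return 100.0 * values.std(ddof=1) / values.mean()

    def percent_recovery(measured, spiked):
        """Recovery (%) of a spiked amount from a fortified sample."""
        return 100.0 * measured / spiked

    peak_areas = [10512, 10488, 10630, 10397, 10551, 10460]   # hypothetical replicate areas
    print(f"RSD = {percent_rsd(peak_areas):.2f} %")
    print(f"Recovery = {percent_recovery(0.997, 1.000):.1f} %")
    ```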

  11. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    Quantitative method of X-ray diffraction phase analysis of building materials, with use of internal standard, has been presented. The errors committed by determining the content of particular phases have been also given. (author)

  12. Quantitative and temporal proteome analysis of butyrate-treated colorectal cancer cells.

    Science.gov (United States)

    Tan, Hwee Tong; Tan, Sandra; Lin, Qingsong; Lim, Teck Kwang; Hew, Choy Leong; Chung, Maxey C M

    2008-06-01

    Colorectal cancer is one of the most common cancers in developed countries, and its incidence is negatively associated with high dietary fiber intake. Butyrate, a short-chain fatty acid fermentation by-product of fiber induces cell maturation with the promotion of growth arrest, differentiation, and/or apoptosis of cancer cells. The stimulation of cell maturation by butyrate in colonic cancer cells follows a temporal progression from the early phase of growth arrest to the activation of apoptotic cascades. Previously we performed two-dimensional DIGE to identify differentially expressed proteins induced by 24-h butyrate treatment of HCT-116 colorectal cancer cells. Herein we used quantitative proteomics approaches using iTRAQ (isobaric tags for relative and absolute quantitation), a stable isotope labeling methodology that enables multiplexing of four samples, for a temporal study of HCT-116 cells treated with butyrate. In addition, cleavable ICAT, which selectively tags cysteine-containing proteins, was also used, and the results complemented those obtained from the iTRAQ strategy. Selected protein targets were validated by real time PCR and Western blotting. A model is proposed to illustrate our findings from this temporal analysis of the butyrate-responsive proteome that uncovered several integrated cellular processes and pathways involved in growth arrest, apoptosis, and metastasis. These signature clusters of butyrate-regulated pathways are potential targets for novel chemopreventive and therapeutic drugs for treatment of colorectal cancer.

  13. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress that amalgamates numerous stress conditions resulting in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19) under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, our analysis suggests that optimal preconditioning toward drying was predicted to be alkaline and high-pressure stress preconditioning. Taken together, we believe the presented strategy may serve as a prototypic example for the analysis and utility of employing quantitative-mass-spectrometry-based proteomics to study bacterial physiology.

  14. Integrative Analysis of Subcellular Quantitative Proteomics Studies Reveals Functional Cytoskeleton Membrane-Lipid Raft Interactions in Cancer.

    Science.gov (United States)

    Shah, Anup D; Inder, Kerry L; Shah, Alok K; Cristino, Alexandre S; McKie, Arthur B; Gabra, Hani; Davis, Melissa J; Hill, Michelle M

    2016-10-07

    Lipid rafts are dynamic membrane microdomains that orchestrate molecular interactions and are implicated in cancer development. To understand the functions of lipid rafts in cancer, we performed an integrated analysis of quantitative lipid raft proteomics data sets modeling progression in breast cancer, melanoma, and renal cell carcinoma. This analysis revealed that cancer development is associated with increased membrane raft-cytoskeleton interactions, with ∼40% of elevated lipid raft proteins being cytoskeletal components. Previous studies suggest a potential functional role for the raft-cytoskeleton in the action of the putative tumor suppressors PTRF/Cavin-1 and Merlin. To extend the observation, we examined lipid raft proteome modulation by an unrelated tumor suppressor opioid binding protein cell-adhesion molecule (OPCML) in ovarian cancer SKOV3 cells. In agreement with the other model systems, quantitative proteomics revealed that 39% of OPCML-depleted lipid raft proteins are cytoskeletal components, with microfilaments and intermediate filaments specifically down-regulated. Furthermore, protein-protein interaction network and simulation analysis showed significantly higher interactions among cancer raft proteins compared with general human raft proteins. Collectively, these results suggest increased cytoskeleton-mediated stabilization of lipid raft domains with greater molecular interactions as a common, functional, and reversible feature of cancer cells.

  15. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  16. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
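
    Using the definition quoted above, velocity imparted to a suspended cell per unit of magnetic ponderomotive force, magnetophoretic mobility can be computed directly from particle-tracking velocimetry output. The sketch below is a minimal illustration; the velocity and force values are hypothetical.

    ```python
    def magnetophoretic_mobility(velocity_m_per_s, force_newton):
        """Mobility = cell velocity per unit magnetic ponderomotive force."""
        return velocity_m_per_s / force_newton

    # Hypothetical tracked cell: 12 um/s under a 2.5e-13 N ponderomotive force
    v = 12e-6
    f = 2.5e-13
    print(f"mobility = {magnetophoretic_mobility(v, f):.3e} m s^-1 N^-1")
    ```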

  17. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    Science.gov (United States)

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and
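
    One of the metrics used above, net reclassification improvement, can be illustrated with a short sketch. The function below implements the standard categorical NRI formula (events reclassified upward minus downward, plus non-events reclassified downward minus upward); the patient arrays are placeholders and this is not the study's own analysis code.

    ```python
    import numpy as np

    def categorical_nri(old_high, new_high, event):
        """Net reclassification improvement for a binary risk classification.

        old_high, new_high: boolean arrays, True = classified as high risk.
        event: boolean array, True = primary endpoint reached during follow-up.
        """
        old_high, new_high, event = map(np.asarray, (old_high, new_high, event))
        up = new_high & ~old_high          # reclassified upward by the new model
        down = old_high & ~new_high        # reclassified downward by the new model
        nri_events = up[event].mean() - down[event].mean()
        nri_nonevents = down[~event].mean() - up[~event].mean()
        return nri_events + nri_nonevents

    # Hypothetical reclassification of 8 patients (3 with events)
    old = np.array([1, 0, 0, 1, 0, 1, 0, 0], dtype=bool)
    new = np.array([1, 1, 0, 1, 0, 0, 0, 1], dtype=bool)
    evt = np.array([1, 1, 0, 1, 0, 0, 0, 0], dtype=bool)
    print("NRI =", categorical_nri(old, new, evt))
    ```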

  18. Quantitative analysis for the determination of aluminum percentage and detonation performance of aluminized plastic bonded explosives by laser-induced breakdown spectroscopy

    Science.gov (United States)

    Rezaei, A. H.; Keshavarz, M. H.; Kavosh Tehrani, M.; Darbani, S. M. R.

    2018-06-01

    The aluminized plastic-bonded explosive (PBX) is a composite material in which solid explosive particles are dispersed in a polymer matrix; it includes three major components, i.e. a polymeric binder, metal fuel (aluminum) and a nitramine explosive. This work introduces a new method based on the laser-induced breakdown spectroscopy (LIBS) technique in air and argon atmospheres to determine the aluminum content and detonation performance of aluminized PBXs. Plasma emissions of aluminized PBXs are recorded, in which atomic lines of Al, C and H as well as molecular bands of AlO and CN are identified. The experimental results demonstrate that good discrimination and separation between the aluminized PBXs is possible using LIBS and principal component analysis, although they have similar atomic compositions. The relative intensity of AlO/Al is used to determine the aluminum percentage of the aluminized PBXs. The quantitative calibration curve obtained using the relative intensity of AlO/Al is better than the calibration curve obtained using the intensity of Al alone. Using the LIBS method and the measured intensity ratio of CN/C, an Al content of 15% is found to be the optimum value in terms of the velocity of detonation of the RDX/Al/HTPB standard samples.
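
    A minimal sketch of the discrimination step mentioned above, principal component analysis of normalised LIBS spectra, is given below using scikit-learn. The spectra are random placeholders; in practice the PC1/PC2 scores of real spectra would be inspected for separation between sample classes.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    spectra = rng.random((30, 2048))                 # placeholder: 30 spectra x 2048 channels
    spectra /= spectra.sum(axis=1, keepdims=True)    # simple total-intensity normalisation

    scores = PCA(n_components=2).fit_transform(spectra)
    print(scores.shape)                              # (30, 2): scatter PC1 vs PC2 to inspect grouping
    ```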

  19. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    Science.gov (United States)

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology

  20. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    Science.gov (United States)

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis showed that the background spectrum and the characteristic spectrum follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for changes in spectral line intensity caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method could improve the stability and the accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. A data-fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectrum and similar effects, and provides a data-processing reference for real-time online quantitative LIBS analysis.
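
    The regression step described above, relating concentration to the signal-to-background ratio with a support vector machine, can be sketched as follows with scikit-learn. The calibration points and SVR hyperparameters are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    # Hypothetical calibration data: signal-to-background ratio vs. known concentration (mg/L)
    sb_ratio = np.array([0.8, 1.4, 2.1, 2.9, 3.6, 4.5, 5.1, 6.0]).reshape(-1, 1)
    conc = np.array([5, 10, 20, 30, 40, 50, 60, 70], dtype=float)

    x_tr, x_te, y_tr, y_te = train_test_split(sb_ratio, conc, test_size=0.25, random_state=0)
    model = SVR(kernel="rbf", C=100.0, epsilon=0.5).fit(x_tr, y_tr)
    pred = model.predict(x_te)
    rel_err = np.abs(pred - y_te) / y_te
    print("mean relative error:", rel_err.mean())
    ```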

  1. [Qualitative and quantitative diagnostic performance of 320-slice computed tomography for detecting coronary artery disease with respect to atherosclerotic plaque characteristics].

    Science.gov (United States)

    Li, Suhua; Liu, Jinlai; Peng, Long; Dong, Ruimin; Wu, Huilan; Wang, Chenlin; Ni, Qiongqiong; Luo, Yanting; Zhu, Jieming; Chen, Lin

    2014-10-28

    To investigate qualitatively and quantitatively the diagnostic performance of 320-slice CT for the detection of coronary artery disease with respect to different atherosclerotic plaque characteristics. A retrospective search was performed for inpatients who underwent both coronary CT and subsequent coronary angiography (CAG) from December 1, 2008 to December 31, 2012. The diagnostic performance of 320-slice CTA for detecting significant stenosis (≥50% diameter) with respect to atherosclerotic plaque characteristics was analyzed by calculating sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), accuracy, kappa index (κ), and area under the receiver operating characteristic curve (AUC). The chi-square test was used to evaluate whether there were significant differences in the true-case frequency (true positive + true negative) and false-case frequency (false positive + false negative) among groups. Bland-Altman analysis was used to determine limits of agreement between CTA and CAG. A total of 454 patients and 6779 segments were analyzed. Diagnostic accuracy was higher in non-calcified segments, whereas it decreased in the presence of both mild-moderately and heavily calcified plaques. Excellent agreement (κ = 0.810) between CT and CAG was observed for non-calcified segments, while good agreement was observed for both mild-moderately (κ = 0.701) and heavily calcified segments (κ = 0.750). Both mild-moderate (P = 0.000) and heavy (P = 0.000) calcification decreased the true-case frequency and increased the false-case frequency when compared to non-calcification. There was no significant underestimation or overestimation for non-calcified (P = 0.087) and mild-moderately calcified (P = 0.704) segments, while there was significant overestimation for heavily calcified segments (P = 0.001). Excellent qualitative and quantitative diagnostic performance of 320-slice CT was observed in non-calcified coronary segments. However, qualitative
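
    The per-segment indices reported above (sensitivity, specificity, PPV, NPV, accuracy, and Cohen's kappa) all follow from a 2x2 table of CTA results against the CAG reference. A small sketch with hypothetical counts is shown below.

    ```python
    def diagnostic_performance(tp, fp, tn, fn):
        """Diagnostic indices from a 2x2 table (index test vs. reference standard)."""
        n = tp + fp + tn + fn
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        acc = (tp + tn) / n
        # Cohen's kappa: observed agreement corrected for chance agreement
        p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (acc - p_chance) / (1 - p_chance)
        return sens, spec, ppv, npv, acc, kappa

    print(diagnostic_performance(tp=120, fp=15, tn=820, fn=25))   # hypothetical segment counts
    ```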

  2. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric ceramic with piezoelectric and pyroelectric properties, and is most widely used as a piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), which is a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put to practical use. The characteristics can be tailored by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls within crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have their demerits; therefore, the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, which has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate (EDTA), and fluoride ions are masked with boric acid. The apparatus, reagents, the experiments and the results are reported. (Kako, I.)

  3. Potential application of microfocus X-ray techniques for quantitative analysis of bone structure

    International Nuclear Information System (INIS)

    Takahashi, Kenta

    2006-01-01

    With the progress of micro-focused X-ray computed tomography (micro-CT), it has become possible to evaluate bone structure quantitatively and three-dimensionally. The advantages of micro-CT are that sample preparation is not required and that it provides not only two-dimensional parameters but also three-dimensional stereological indices. This study was carried out to evaluate the potential application of micro-focus X-ray techniques for quantitative analysis of the new bone produced inside the hollow chamber of an experimental titanium miniature implant. Twenty-five male Wistar rats (9 weeks of age) received an experimental titanium miniature implant containing a hollow chamber in the left femur. The rats were sacrificed, and the femurs were excised at 4 weeks or 8 weeks after implantation. Micro-CT analysis was performed on the femur samples and the volume of the new bone induced in the hollow chamber of the implant was calculated. The percentage of new bone area on the undecalcified histological slides was also measured, and linear regression analysis was carried out to evaluate the correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. New bone formation occurred in the experimental titanium miniature implant with a hollow chamber. The volume of new bone was measured by micro-CT and the percentage of new bone area within the hollow chamber was calculated on the undecalcified slides. Linear regression analysis showed a high correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. Consequently, the new bone produced inside the hollow chamber of the experimental titanium miniature implant could be quantified three-dimensionally and stereologically by micro-CT, and its precision was supported by the high correlation between the micro-CT measurements and the conventional two-dimensional measurements of the histological slides. (author)

  4. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai; Tian, Yuan; Liu, Tao; Thomas, Stefani N.; Chen, Li; Schnaubelt, Michael; Boja, Emily; Hiltket, Tara; Kinsinger, Christopher; Rodriguez, Henry; Davies, Sherri; Li, Shunqiang; Snider, Jacqueline E.; Erdmann-Gilmore, Petra; Tabb, David L.; Townsend, Reid; Ellis, Matthew; Rodland, Karin D.; Smith, Richard D.; Carr, Steven A.; Zhang, Zhen; Chan, Daniel W.; Zhang, Hui

    2017-09-21

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  5. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    Science.gov (United States)

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
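
    Of the performance indicators mentioned above, the H-index has a particularly simple definition: the largest h such that at least h publications each have at least h citations. A short illustrative implementation with hypothetical citation counts follows; it is a generic sketch, not the bibliometric pipeline used in the study.

    ```python
    def h_index(citations):
        """Largest h such that at least h publications have >= h citations each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([24, 18, 12, 9, 6, 3, 1]))   # hypothetical citation counts -> 5
    ```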

  6. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS

  7. Sigma metric analysis for performance of creatinine with fresh frozen serum.

    Science.gov (United States)

    Kang, Fengfeng; Zhang, Chuanbao; Wang, Wei; Wang, Zhiguo

    2016-01-01

    Six sigma provides an objective and quantitative methodology for describing laboratory testing performance. In this study, we conducted a national trueness verification scheme with fresh frozen serum (FFS) for serum creatinine to evaluate its performance in China. Two different concentration levels of FFS, with target values assigned by a reference method, were sent to 98 laboratories in China. The imprecision and bias of the measurement procedure were calculated for each participant to derive the sigma value. Quality goal index (QGI) analysis was used to investigate the reason for unacceptable performance in laboratories with low sigma values; the higher creatinine concentration yielded preferable sigma values. For the enzymatic method, 7.0% (5/71) to 45.1% (32/71) of the laboratories need to improve their measurement procedures, and only 3.1-5.3% of the laboratories should improve both precision and trueness. Sigma metric analysis of the serum creatinine assays is disappointing, mainly owing to unacceptable analytical bias according to the QGI analysis. Further effort is needed to enhance the trueness of creatinine measurement.
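
    A sketch of the sigma-metric and quality goal index calculations commonly used in this kind of study is given below, assuming the usual definitions sigma = (TEa − |bias|) / CV and QGI = |bias| / (1.5 × CV). The allowable total error, bias, and CV values are hypothetical and not taken from the scheme.

    ```python
    def sigma_metric(tea, bias, cv):
        """Six-sigma metric; tea (allowable total error), bias, and cv are all in percent."""
        return (tea - abs(bias)) / cv

    def quality_goal_index(bias, cv):
        """QGI = |bias| / (1.5 * CV); values well above 1 point to a trueness problem,
        values well below 1 to an imprecision problem (common interpretation)."""
        return abs(bias) / (1.5 * cv)

    # Hypothetical laboratory: allowable total error 12%, bias 3%, imprecision (CV) 2%
    print("sigma =", round(sigma_metric(12.0, 3.0, 2.0), 2))
    print("QGI   =", round(quality_goal_index(3.0, 2.0), 2))
    ```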

  8. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe2O3. The kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easy to prepare. Pressed samples with 0.150 /cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. For a known sample, χ can be conveniently calculated from the formula; for an unknown sample, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content of these samples was linear. This result indicates that the correction factor can be used to adjust the accuracy of the X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.

  9. Quantitative analysis of semivolatile organic compounds in selected fractions of air sample extracts by GC/MI-IR spectrometry

    International Nuclear Information System (INIS)

    Childers, J.W.; Wilson, N.K.; Barbour, R.K.

    1990-01-01

    The authors are currently investigating the capabilities of gas chromatography/matrix isolation infrared (GC/MI-IR) spectrometry for the determination of semivolatile organic compounds (SVOCs) in environmental air sample extracts. Their efforts are focused on the determination of SVOCs such as alkylbenzene positional isomers, which are difficult to separate chromatographically and to distinguish by conventional electron-impact ionization GC/mass spectrometry. They have performed a series of systematic experiments to identify sources of error in quantitative GC/MI-IR analyses. These experiments were designed to distinguish between errors due to instrument design or performance and errors that arise from some characteristic inherent to the GC/MI-IR technique, such as matrix effects. They have investigated repeatability as a function of several aspects of GC/MI-IR spectrometry, including sample injection, spectral acquisition, cryogenic disk movement, and matrix deposition. The precision, linearity, dynamic range, and detection limits of a commercial GC/MI-IR system for target SVOCs were determined and compared to those obtained with the system's flame ionization detector. The use of deuterated internal standards in the quantitative GC/MI-IR analysis of selected fractions of ambient air sample extracts will be demonstrated. They will also discuss the current limitations of the technique in quantitative analyses and suggest improvements for future consideration

  10. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  11. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code designated UNRAC (UNReliability Analysis Code) calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as point estimate and/or in time dependent form; (3) quantitative importance of each component involved; and, (4) error bound on the top event unavailability. UNRAC can analyze fault trees, with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code is benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall it is demonstrated that UNRAC is an efficient easy-to-use code and has the advantage of being able to do a complete fault tree analysis with this single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered
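
    The kind of top-event quantification performed by such codes can be illustrated with the usual minimal-cut-set upper-bound approximation, in which the top-event unavailability is one minus the product over cut sets of (1 − the cut-set probability). The sketch below is a generic illustration, not UNRAC's algorithm; the cut sets and component unavailabilities are hypothetical.

    ```python
    def top_event_unavailability(cut_sets, component_q):
        """Min-cut-set upper-bound approximation of top-event unavailability.

        cut_sets: list of minimal cut sets, each a set of component names.
        component_q: dict mapping component name -> unavailability.
        """
        prod = 1.0
        for cs in cut_sets:
            q_cs = 1.0
            for comp in cs:
                q_cs *= component_q[comp]      # cut set fails only if all its components fail
            prod *= (1.0 - q_cs)
        return 1.0 - prod

    cut_sets = [{"pump_A", "pump_B"}, {"valve_1"}]        # hypothetical minimal cut sets
    q = {"pump_A": 1e-2, "pump_B": 2e-2, "valve_1": 1e-4}
    print(top_event_unavailability(cut_sets, q))
    ```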

  12. A meta-analysis of the relationship of academic performance and Social Network Site use among adolescents and young adults

    NARCIS (Netherlands)

    Lui, Dong; Kirschner, Paul A.; Karpinski, Aryn

    2018-01-01

    This meta-analysis explores the relationship between SNS-use and academic performance. Examination of the literature containing quantitative measurements of both SNS-use and academic performance produced a sample of 28 effect sizes (N = 101,441) for review. Results indicated a significant

  13. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    Science.gov (United States)

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Quantitative analysis of 18F-NaF dynamic PET/CT cannot differentiate malignant from benign lesions in multiple myeloma.

    Science.gov (United States)

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-01-01

    A renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study is to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of the fractal dimension (FD). In total 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2. Respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3 and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM.

  15. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A., E-mail: clayton.pereira.silva@usp.b [Instituto de Pesquisas Energeticas e Nucleares (CQMA/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente

    2011-07-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements based on the interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements like S, Cl, Zn, Zr, Mo and others overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows the quantitative determination of impurities such as Zn, Zr, Mo and others in uranium compounds. The detection limits were lower than 50 μg·g⁻¹ and the uncertainty was less than 10% for the determined elements. (author)

  16. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    International Nuclear Information System (INIS)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A.

    2011-01-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements based on the interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements like S, Cl, Zn, Zr, Mo and others overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows the quantitative determination of impurities such as Zn, Zr, Mo and others in uranium compounds. The detection limits were lower than 50 μg·g⁻¹ and the uncertainty was less than 10% for the determined elements. (author)

  17. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    Science.gov (United States)

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins having only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summarization using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
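
    A minimal sketch of the classification idea described above, fitting a two-component Gaussian mixture to log-transformed protein ratios so that unchanged and regulated proteins fall into separate components, is shown below using scikit-learn. The simulated ratios, component count, and parameters are illustrative assumptions; the paper's own EM, Bayesian, and PSO implementations are not reproduced here.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Simulated log2(heavy/light) protein ratios: a large unchanged population
    # plus a smaller up-regulated subpopulation (illustrative only).
    log_ratios = np.concatenate([rng.normal(0.0, 0.3, 900),
                                 rng.normal(1.5, 0.4, 100)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(log_ratios)
    labels = gmm.predict(log_ratios)
    print("component means:", gmm.means_.ravel())
    print("proteins per component:", np.bincount(labels))
    ```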

  18. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described and the crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions.

  19. Time-Resolved Fluorescent Immunochromatography of Aflatoxin B1 in Soybean Sauce: A Rapid and Sensitive Quantitative Analysis.

    Science.gov (United States)

    Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen

    2016-07-14

    Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method, combining the advantages of time-resolved fluorescent sensing and immunochromatography, was developed for the detection of aflatoxin B1. The dynamic range of the competitive, portable immunoassay was 0.3-10.0 µg·kg⁻¹, with a limit of detection (LOD) of 0.1 µg·kg⁻¹ and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high-performance liquid chromatography (HPLC). Analysis of soybean sauce samples with the time-resolved fluorescent immunochromatographic strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg·kg⁻¹. The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique for food safety analysis.
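
    Competitive immunoassays such as this strip test are commonly calibrated with a four-parameter logistic (4PL) curve, and unknown samples are read off the inverted curve. The sketch below fits a 4PL model to made-up calibration points spanning the reported 0.3-10 µg·kg⁻¹ range; both the calibration signals and the choice of the 4PL model are assumptions for illustration, not the authors' data or necessarily their exact model.

```python
# Sketch of a 4PL calibration curve for a competitive immunoassay
# (calibration signals are invented; concentrations mimic the reported range).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # a: signal at zero analyte, d: signal at saturation,
    # c: inflection point (IC50), b: slope factor
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.3, 0.6, 1.25, 2.5, 5.0, 10.0])         # µg/kg standards
signal = np.array([0.95, 0.88, 0.74, 0.55, 0.33, 0.18])   # normalised T/C ratio

popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 1.0, 2.0, 0.05])

def invert(y, a, b, c, d):
    # Solve the 4PL for concentration given a measured signal
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(f"estimated concentration at signal 0.5: {invert(0.5, *popt):.2f} µg/kg")
```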

  20. Analysis of PET hypoxia imaging in the quantitative imaging for personalized cancer medicine program

    International Nuclear Information System (INIS)

    Yeung, Ivan; Driscoll, Brandon; Keller, Harald; Shek, Tina; Jaffray, David; Hedley, David

    2014-01-01

    Quantitative imaging is an important tool in clinical trials testing novel agents and strategies for cancer treatment. The Quantitative Imaging Personalized Cancer Medicine Program (QIPCM) provides clinicians and researchers participating in multi-center clinical trials with a central repository for their imaging data. In addition, a set of tools provides standards of practice (SOP) for end-to-end quality assurance of scanners and image analysis. The four components for data archiving and analysis are the Clinical Trials Patient Database, the Clinical Trials PACS, the data analysis engine(s) and the high-speed networks that connect them. The program provides a suite of software able to perform RECIST, dynamic MRI, CT and PET analysis. The imaging data can be accessed securely from remote sites and analyzed by researchers with these software tools, or with tools provided by the users and installed on the server. Alternatively, QIPCM provides a data analysis service according to the developed SOPs. An example of a clinical study in which patients with unresectable pancreatic adenocarcinoma were studied with dynamic PET-FAZA for hypoxia measurement is discussed. We successfully quantified the degree of hypoxia as well as tumor perfusion in a group of 20 patients in terms of SUV and hypoxic fraction. It was found that there is no correlation between bulk tumor perfusion and hypoxia status in this cohort. QIPCM also provides end-to-end QA testing of scanners used in multi-center clinical trials. Based on quality assurance data from multiple CT-PET scanners, we concluded that quality control of imaging was vital to the success of multi-center trials, as different imaging and reconstruction parameters in PET imaging could lead to very different results in hypoxia imaging. (author)
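
    The two quantities reported above, SUV and hypoxic fraction, are simple to compute once the tumour voxel activities and a blood reference are available. A minimal sketch follows; the voxel values are synthetic and the tumour-to-blood ratio threshold of 1.2, often used for FAZA hypoxia imaging, is an assumption rather than a value taken from the abstract.

```python
# Minimal sketch: SUV and hypoxic fraction from PET tumour voxel values.
# Voxel activities are synthetic; the TBR > 1.2 hypoxia threshold is an
# assumed convention, not a value from the study above.
import numpy as np

def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Standardised uptake value = tissue activity / (dose / body weight)."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

rng = np.random.default_rng(2)
tumour_voxels = rng.gamma(shape=4.0, scale=2000.0, size=5000)   # Bq/mL, synthetic
blood_activity = 5000.0                                         # Bq/mL, synthetic

dose = 370e6        # 370 MBq injected
weight = 75e3       # 75 kg expressed in grams

print(f"mean tumour SUV: {suv(tumour_voxels, dose, weight).mean():.2f}")

tbr = tumour_voxels / blood_activity
print(f"hypoxic fraction (TBR > 1.2): {np.mean(tbr > 1.2):.1%}")
```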

  1. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plants. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  2. Examination of quantitative accuracy of PIXE analysis for atmospheric aerosol particle samples. PIXE analysis of NIST air particulate on filter media

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Sera, Koichiro

    2005-01-01

    In order to confirm the accuracy of the direct PIXE analysis of filter samples containing atmospheric aerosol particles collected on a polycarbonate membrane filter, we carried out PIXE analysis of a National Institute of Standards and Technology (NIST, USA) air particulate on filter media standard (SRM 2783). For the 16 elements with NIST-certified values determined by PIXE analysis - Na, Mg, Al, Si, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn and Pb - quantitative values were 80-110% of the certified values, except for Na, Al, Si and Ni. Quantitative values of Na, Al and Si were 140-170% of the certified values, all high, while Ni was 64%. One possible reason why the quantitative values of Na, Al and Si were higher than the NIST-certified values could be a difference in the X-ray spectrum analysis method used. (author)

  3. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
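
    The core statistical test behind ASE analysis is whether the reads at a heterozygous site depart from a 1:1 allelic ratio. QuASAR goes further (joint genotype inference, base-call error and over-dispersion parameters), but the basic idea can be sketched with a plain binomial test; the read counts below are hypothetical.

```python
# Basic allele-specific expression test at one heterozygous site:
# the null hypothesis is a 1:1 allelic ratio of RNA-seq reads.
# (QuASAR performs a more elaborate joint inference; this only shows the
# underlying binomial idea, with made-up read counts.)
from scipy.stats import binomtest

ref_reads, alt_reads = 78, 42          # hypothetical read counts at one SNP
result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
      f"p-value = {result.pvalue:.3g}")
```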

  4. Genetic and Quantitative Trait Locus Analysis for Bio-Oil Compounds after Fast Pyrolysis in Maize Cobs.

    Directory of Open Access Journals (Sweden)

    Brandon Jeffrey

    Full Text Available Fast pyrolysis has been identified as one of the biorenewable conversion platforms that could be a part of an alternative energy future, but it has not yet received the same attention as cellulosic ethanol in the analysis of genetic inheritance within potential feedstocks such as maize. Ten bio-oil compounds were measured via pyrolysis/gas chromatography-mass spectrometry (Py/GC-MS) in maize cobs. 184 recombinant inbred lines (RILs) of the intermated B73 x Mo17 (IBM) Syn4 population were analyzed in two environments, using 1339 markers, for quantitative trait locus (QTL) mapping. QTL mapping was performed using composite interval mapping with significance thresholds established by 1000 permutations at α = 0.05. In total, 50 QTL were found across the ten traits, with R2 values ranging from 1.7 to 5.8%, indicating a complex quantitative inheritance of these traits.

  5. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    van Willigen, J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the “nonwetting” properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The

  6. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  7. Complex pedigree analysis to detect quantitative trait loci in dairy cattle

    NARCIS (Netherlands)

    Bink, M.C.A.M.

    1998-01-01

    In dairy cattle, many quantitative traits of economic importance show phenotypic variation. For breeding purposes the analysis of this phenotypic variation and uncovering the contribution of genetic factors is very important. Usually, the individual gene effects contributing to the

  8. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    International Nuclear Information System (INIS)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon

    2002-01-01

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing the Doppler waveform. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume in diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (P<0.05). Analysis of the Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and a biphasic Doppler waveform; these findings suggest neuropathy rather than ischemic change in diabetic feet.
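
    Volumetric flow on Doppler sonography is conventionally estimated as the time-averaged mean velocity multiplied by the vessel cross-sectional area. The sketch below shows that arithmetic with illustrative values; the velocity and vessel diameter are not taken from the study.

```python
# Sketch of Doppler flow volume estimation:
# flow volume = time-averaged mean velocity x vessel cross-sectional area.
# The velocity and vessel diameter below are illustrative, not study data.
import math

def flow_volume_ml_per_min(tamv_cm_per_s, diameter_mm):
    radius_cm = (diameter_mm / 10.0) / 2.0
    area_cm2 = math.pi * radius_cm ** 2
    return tamv_cm_per_s * area_cm2 * 60.0      # 1 cm^3 = 1 mL

print(f"posterior tibial artery: {flow_volume_ml_per_min(12.0, 2.5):.1f} mL/min")
```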

  9. Quantitative Risk Analysis of a Pervaporation Process for Concentrating Hydrogen Peroxide

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Ho Jin; Yoon, Ik Keun [Korea Gas Corporation, Ansan (Korea, Republic of); Choi, Soo Hyoung [Chonbuk National University, Jeonju (Korea, Republic of)

    2014-12-15

    Quantitative risk analysis has been performed for a pervaporation process for production of high test peroxide. Potential main accidents are explosion and fire caused by a decomposition reaction. As the target process has a laboratory scale, the consequence is considered to belong to Category 3. An event tree has been developed as a model for occurrence of a decomposition reaction in the target process. The probability functions of the accident causes have been established based on the frequency data of similar events. Using the constructed model, the failure rate has been calculated. The result indicates that additional safety devices are required in order to achieve an acceptable risk level, i.e. an accident frequency less than 10⁻⁴/yr. Therefore, a layer of protection analysis has been applied. As a result, it is suggested to introduce inherently safer design to avoid catalytic reaction, a safety instrumented function to prevent overheating, and a relief system that prevents explosion even if a decomposition reaction occurs. The proposed method is expected to contribute to developing safety management systems for various chemical processes including concentration of hydrogen peroxide.
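
    The frequency estimate in an event-tree / layer-of-protection analysis of this kind is simply the initiating-event frequency multiplied by the probabilities of failure on demand of each safeguard, compared against the acceptance target (here 10⁻⁴/yr). The numbers in the sketch below are invented for illustration and are not the values used in the study.

```python
# Sketch of an event-tree / LOPA-style frequency calculation for a runaway
# decomposition scenario. All frequencies and probabilities of failure on
# demand (PFDs) are illustrative, not values from the study above.
initiating_event_per_yr = 0.1      # e.g. loss of temperature control
safeguard_pfd = {
    "operator fails to respond": 0.1,
    "high-temperature interlock fails": 0.05,
    "relief system fails": 0.01,
}

frequency = initiating_event_per_yr
for cause, pfd in safeguard_pfd.items():
    frequency *= pfd

target = 1e-4   # acceptable accident frequency, per year
print(f"estimated accident frequency: {frequency:.1e}/yr")
print("meets target" if frequency < target else "additional safeguards needed")
```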

  10. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing the Doppler waveform. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume in diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (P<0.05). Analysis of the Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and a biphasic Doppler waveform; these findings suggest neuropathy rather than ischemic change in diabetic feet.

  11. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    International Nuclear Information System (INIS)

    Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.

    2014-01-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and
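
    The matrix-classification step described above, PCA on the spectra followed by grouping, can be sketched with scikit-learn; per-matrix calibration curves would then be built within each cluster. The "spectra" below are synthetic two-matrix data, not measured LIBS spectra of ores.

```python
# Sketch of matrix classification prior to calibration: PCA on spectra,
# then clustering of the scores (synthetic spectra, not measured ore data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_channels = 200

def synthetic_spectra(n, peak_channel):
    base = rng.normal(0.0, 0.02, size=(n, n_channels))
    base[:, peak_channel] += rng.uniform(0.5, 1.0, size=n)   # matrix-specific line
    return base

spectra = np.vstack([synthetic_spectra(15, 40),    # "matrix A" ores
                     synthetic_spectra(12, 120)])  # "matrix B" ores

scores = PCA(n_components=3).fit_transform(spectra)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("samples per matrix cluster:", np.bincount(clusters))
```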

  12. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    Energy Technology Data Exchange (ETDEWEB)

    Pořízka, P. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Demidov, A. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Kaiser, J. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Keivanian, J. [Institute for Mining, Technical University Clausthal, Erzstraße 18, 38678 Clausthal-Zellerfeld (Germany); Gornushkin, I. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Panne, U. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Chemistry Department, Humboldt Univerisät zu Berlin, Brook-Taylor-Straße 2, D-12489 Berlin (Germany); Riedel, J., E-mail: jens.riedel@bam.de [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany)

    2014-11-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and

  13. An international collaborative family-based whole genome quantitative trait linkage scan for myopic refractive error

    DEFF Research Database (Denmark)

    Abbott, Diana; Li, Yi-Ju; Guggenheim, Jeremy A

    2012-01-01

    To investigate quantitative trait loci linked to refractive error, we performed a genome-wide quantitative trait linkage analysis using single nucleotide polymorphism markers and family data from five international sites....

  14. iTRAQ-Based Quantitative Proteomic Analysis of the Initiation of Head Regeneration in Planarians.

    Directory of Open Access Journals (Sweden)

    Xiaofang Geng

    Full Text Available The planarian Dugesia japonica has an amazing ability to regenerate a head from the anterior end of the amputated stump while maintaining the original anterior-posterior polarity. Although planarians present an attractive system for molecular investigation of regeneration and research has focused on clarifying the molecular mechanism of regeneration initiation at the transcriptional level, the initiation mechanism of planarian head regeneration (PHR) remains unclear at the protein level. Here, a global analysis of proteome dynamics during the early stage of PHR was performed using an isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomics strategy, and our data are available via ProteomeXchange with identifier PXD002100. The results showed that 162 proteins were differentially expressed at 2 h and 6 h following amputation. Furthermore, analysis of the expression patterns and functional enrichment of the differentially expressed proteins showed that proteins involved in muscle contraction, oxidation-reduction and protein synthesis were up-regulated in the initiation of PHR. Moreover, ingenuity pathway analysis showed that predominant signaling pathways such as ILK, calcium, EIF2 and mTOR signaling, which are associated with cell migration, cell proliferation and protein synthesis, were likely to be involved in the initiation of PHR. The results demonstrated for the first time that muscle contraction and ILK signaling might play important roles in the initiation of PHR at the global protein level. The findings of this research provide a molecular basis for further unraveling the mechanism of head regeneration initiation in planarians.

  15. Quantitative Analysis of Tetramethylenedisulfotetramine ("Tetramine") Spiked into Beverages by Liquid Chromatography Tandem Mass Spectrometry with Validation by Gas Chromatography Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Owens, J; Hok, S; Alcaraz, A; Koester, C

    2008-11-13

    Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD50 = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water, cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered with recoveries ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.

  16. Quantitative Analysis of Tetramethylenedisulfotetramine ('Tetramine') Spiked into Beverages by Liquid Chromatography Tandem Mass Spectrometry with Validation by Gas Chromatography Mass Spectrometry

    International Nuclear Information System (INIS)

    Owens, J.; Hok, S.; Alcaraz, A.; Koester, C.

    2008-01-01

    Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD50 = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water, cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered with recoveries ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.

  17. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    International Nuclear Information System (INIS)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr; Pontuschka, W M; Mamani, J B; Costa-Filho, A J; Vieira, E D

    2008-01-01

    The aim of this work is to provide a quantitative method for analysis of the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC133), and thus to express the antigenic labeling evidence for CD133⁺ stem cells. The FMR efficiency and sensitivity were proven adequate for detecting and quantifying the low iron content in the CD133⁺ cells (∼6.16 × 10⁵ pg in a volume of 2 μl containing 4.5 × 10¹¹ SPION). The quantitative method yielded 1.70 × 10⁻¹³ mol of Fe (9.5 pg), or 7.0 × 10⁶ nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of the iron oxide-labeled cells, in order to ensure that the nanoparticles coupled to the antibodies are indeed bound to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of the method. The quantitative analysis by means of FMR is needed to determine the signal intensity for molecular imaging studies by magnetic resonance imaging (MRI).
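
    The iron-per-cell figure above can be checked directly: 1.70 × 10⁻¹³ mol of Fe corresponds to about 9.5 pg. Converting that mass into a nanoparticle count requires assumptions about the particle core (composition and size) that the abstract does not give; the sketch below assumes 10 nm magnetite (Fe3O4) cores purely for illustration.

```python
# Arithmetic check of the reported iron content per cell, plus an
# illustrative conversion to a particle count. The magnetite composition
# and 10 nm core diameter are assumptions, not values from the abstract.
import math

MOLAR_MASS_FE = 55.845            # g/mol
mol_fe_per_cell = 1.70e-13
mass_fe_pg = mol_fe_per_cell * MOLAR_MASS_FE * 1e12
print(f"iron per cell: {mass_fe_pg:.1f} pg")      # ~9.5 pg, as reported

core_diameter_cm = 10e-7                           # assumed 10 nm core
core_volume_cm3 = math.pi / 6.0 * core_diameter_cm ** 3
core_mass_g = core_volume_cm3 * 5.18               # magnetite density, g/cm^3
fe_fraction = 3 * MOLAR_MASS_FE / (3 * MOLAR_MASS_FE + 4 * 15.999)   # ~0.72
particles_per_cell = (mass_fe_pg * 1e-12) / (core_mass_g * fe_fraction)
print(f"particles per cell (10 nm cores assumed): {particles_per_cell:.1e}")
```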

  18. Qualitative and Quantitative Analyses of Glycogen in Human Milk.

    Science.gov (United States)

    Matsui-Yatsuhashi, Hiroko; Furuyashiki, Takashi; Takata, Hiroki; Ishida, Miyuki; Takumi, Hiroko; Kakutani, Ryo; Kamasaka, Hiroshi; Nagao, Saeko; Hirose, Junko; Kuriki, Takashi

    2017-02-22

    Neither the identification nor a detailed analysis of glycogen in human milk has previously been reported. The present study confirmed by qualitative and quantitative analyses that glycogen is present in human milk. High-performance anion exchange chromatography (HPAEC) and high-performance size exclusion chromatography with a multiangle laser light scattering detector (HPSEC-MALLS) were used for qualitative analysis of glycogen in human milk. Quantitative analysis was carried out using samples obtained from individual milks. The results revealed that the concentration of human milk glycogen varied depending on the mother's condition, such as the period postpartum and inflammation. The amounts of glycogen in human milk collected at 0 and 1-2 months postpartum were higher than in milk collected at 3-14 months postpartum. In milk from mothers with severe mastitis, the concentration of glycogen was about 40 times higher than that in normal milk.

  19. Spectroscopic and Chemometric Analysis of Binary and Ternary Edible Oil Mixtures: Qualitative and Quantitative Study.

    Science.gov (United States)

    Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica

    2016-04-19

    The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm⁻¹). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R² > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
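
    The quantitative step described above, predicting mixture composition from spectra by PLS with the root-mean-square error of cross-validation (RMSECV) as the figure of merit, can be sketched as follows. The spectra are simulated as linear mixtures of two pure-component profiles plus noise, so the numbers are purely illustrative.

```python
# Sketch of PLS calibration of mixture composition from spectra, scored by
# RMSECV (simulated linear two-component mixtures, not FTIR-ATR data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
n_channels = 300
pure_a = np.exp(-0.5 * ((np.arange(n_channels) - 80) / 15.0) ** 2)
pure_b = np.exp(-0.5 * ((np.arange(n_channels) - 200) / 25.0) ** 2)

fraction_a = rng.uniform(0.0, 1.0, 60)             # mass fraction of component A
spectra = (np.outer(fraction_a, pure_a)
           + np.outer(1.0 - fraction_a, pure_b)
           + rng.normal(0.0, 0.01, (60, n_channels)))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, fraction_a, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - fraction_a) ** 2))
print(f"RMSECV for fraction of component A: {rmsecv:.3f}")
```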

  20. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  1. Total quantitative recording of elemental maps and spectra with a scanning microprobe

    International Nuclear Information System (INIS)

    Legge, G.J.F.; Hammond, I.

    1979-01-01

    A system of data recording and analysis has been developed by means of which all data from a scanning instrument such as a microprobe, including the spectral outputs from several detectors, can be quantitatively recorded and permanently stored simultaneously. Only one scanning operation is required on the specimen. Analysis is then performed on the stored data, which contain quantitative information on the distributions of all elements and the spectra of all regions.

  2. Use of computed tomography and automated software for quantitative analysis of the vasculature of patients with pulmonary hypertension

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Danilo Tadao; Pádua, Adriana Ignácio de; Lima Filho, Moyses Oliveira; Marin Neto, José Antonio; Elias Júnior, Jorge; Baddini-Martinez, José; Santos, Marcel Koenigkam, E-mail: danilowada@yahoo.com.br [Universidade de São Paulo (HCFMRP/USP), Ribeirão Preto, SP (Brazil). Faculdade de Medicina. Hospital das Clínicas

    2017-11-15

    Objective: To perform a quantitative analysis of the lung parenchyma and pulmonary vasculature of patients with pulmonary hypertension (PH) on computed tomography angiography (CTA) images, using automated software. Materials And Methods: We retrospectively analyzed the CTA findings and clinical records of 45 patients with PH (17 males and 28 females), in comparison with a control group of 20 healthy individuals (7 males and 13 females); the mean age differed significantly between the two groups (53 ± 14.7 vs. 35 ± 9.6 years; p = 0.0001). Results: The automated analysis showed that, in comparison with the controls, the patients with PH showed lower 10th percentile values for lung density, higher vascular volumes in the right upper lung lobe, and higher vascular volume ratios between the upper and lower lobes. In our quantitative analysis, we found no differences among the various PH subgroups. We inferred that a difference in the 10th percentile values indicates areas of hypovolaemia in patients with PH and that a difference in pulmonary vascular volumes indicates redistribution of the pulmonary vasculature and an increase in pulmonary vasculature resistance. Conclusion: Automated analysis of pulmonary vessels on CTA images revealed alterations and could represent an objective diagnostic tool for the evaluation of patients with PH. (author)

  3. UK quantitative WB-DWI technical workgroup: consensus meeting recommendations on optimisation, quality control, processing and analysis of quantitative whole-body diffusion-weighted imaging for cancer.

    Science.gov (United States)

    Barnes, Anna; Alonzi, Roberto; Blackledge, Matthew; Charles-Edwards, Geoff; Collins, David J; Cook, Gary; Coutts, Glynn; Goh, Vicky; Graves, Martin; Kelly, Charles; Koh, Dow-Mu; McCallum, Hazel; Miquel, Marc E; O'Connor, James; Padhani, Anwar; Pearson, Rachel; Priest, Andrew; Rockall, Andrea; Stirling, James; Taylor, Stuart; Tunariu, Nina; van der Meulen, Jan; Walls, Darren; Winfield, Jessica; Punwani, Shonit

    2018-01-01

    Applications of whole-body diffusion-weighted MRI (WB-DWI) for oncology are rapidly increasing within both research and routine clinical domains. However, WB-DWI as a quantitative imaging biomarker (QIB) has seen significantly slower adoption. To date, challenges relating to accuracy and reproducibility, essential criteria for a good QIB, have limited widespread clinical translation. In recognition, a UK workgroup was established in 2016 to provide technical consensus guidelines (to maximise the accuracy and reproducibility of WB-MRI QIBs) and accelerate the clinical translation of quantitative WB-DWI applications for oncology. A panel of experts was convened from cancer centres around the UK with subspecialty expertise in quantitative imaging and/or the use of WB-MRI with DWI. A formal consensus method was used to obtain agreement regarding best practice. Questions were asked about the appropriateness or otherwise of scanner hardware and software, sequence optimisation, acquisition protocols, reporting, and ongoing quality control programs to monitor precision and accuracy, and agreement on quality control was sought. The consensus panel was able to reach consensus on 73% (255/351) of items and, based on the areas of consensus, made recommendations to maximise the accuracy and reproducibility of quantitative WB-DWI studies performed at 1.5T. The panel was unable to reach consensus on the majority of items related to quantitative WB-DWI performed at 3T. This UK Quantitative WB-DWI Technical Workgroup consensus provides guidance on maximising the accuracy and reproducibility of quantitative WB-DWI for oncology. The consensus guidance can be used by researchers and clinicians to harmonise WB-DWI protocols, which will accelerate clinical translation of WB-DWI-derived QIBs.

  4. Semi-quantitative analysis of salivary gland scintigraphy in Sjögren's syndrome diagnosis: a first-line tool.

    Science.gov (United States)

    Angusti, Tiziana; Pilati, Emanuela; Parente, Antonella; Carignola, Renato; Manfredi, Matteo; Cauda, Simona; Pizzigati, Elena; Dubreuil, Julien; Giammarile, Francesco; Podio, Valerio; Skanjeti, Andrea

    2017-09-01

    The aim of this study was the assessment of semi-quantified salivary gland dynamic scintigraphy (SGdS) parameters, independently and in an integrated way, in order to predict primary Sjögren's syndrome (pSS). Forty-six consecutive patients (41 females; age 61 ± 11 years) with sicca syndrome were studied by SGdS after injection of 200 MBq of pertechnetate. In sixteen patients, pSS was diagnosed according to American-European Consensus Group criteria (AECGc). Semi-quantitative parameters (uptake (UP) and excretion fraction (EF)) were obtained for each gland. ROC curves were used to determine the best cut-off value. The area under the curve (AUC) was used to estimate the accuracy of each semi-quantitative analysis. To assess the correlation between scintigraphic results and disease severity, semi-quantitative parameters were plotted versus the Sjögren's syndrome disease activity index (ESSDAI). A nomogram was built to perform an integrated evaluation of all the scintigraphic semi-quantitative data. Both UP and EF of salivary glands were significantly lower in pSS patients compared to those in non-pSS (p < 0.05). No significant correlation was found between the semi-quantitative parameters and ESSDAI. The proposed nomogram accuracy was 87%. SGdS is an accurate and reproducible tool for the diagnosis of pSS. ESSDAI was not shown to be correlated with SGdS data. SGdS should be the first-line imaging technique in patients with suspected pSS.

  5. Qualitative and quantitative analyses of flavonoids in Spirodela polyrrhiza by high-performance liquid chromatography coupled with mass spectrometry.

    Science.gov (United States)

    Qiao, Xue; He, Wen-ni; Xiang, Cheng; Han, Jian; Wu, Li-jun; Guo, De-an; Ye, Min

    2011-01-01

    Spirodela polyrrhiza (L.) Schleid. is a traditional Chinese herbal medicine for the treatment of influenza. Despite its wide use in Chinese medicine, no report on quality control of this herb is available so far. To establish qualitative and quantitative analytical methods by high-performance liquid chromatography (HPLC) coupled with mass spectrometry (MS) for the quality control of S. polyrrhiza. The methanol extract of S. polyrrhiza was analysed by HPLC/ESI-MS(n). Flavonoids were identified by comparing with reference standards or according to their MS(n) (n = 2-4) fragmentation behaviours. Based on LC/MS data, a standardised HPLC fingerprint was established by analysing 15 batches of commercial herbal samples. Furthermore, quantitative analysis was conducted by determining five major flavonoids, namely luteolin 8-C-glucoside, apigenin 8-C-glucoside, luteolin 7-O-glucoside, apigenin 7-O-glucoside and luteolin. A total of 18 flavonoids were identified by LC/MS, and 14 of them were reported from this herb for the first time. The HPLC fingerprints contained 10 common peaks, and could differentiate good quality batches from counterfeits. The total contents of five major flavonoids in S. polyrrhiza varied significantly from 4.28 to 19.87 mg/g. Qualitative LC/MS and quantitative HPLC analytical methods were established for the comprehensive quality control of S. polyrrhiza. Copyright © 2011 John Wiley & Sons, Ltd.

  6. In vivo intraoperative hypoglossal nerve stimulation for quantitative tongue motion analysis

    NARCIS (Netherlands)

    van Alphen, M.J.A.; Eskes, M.; Smeele, L.E.; Balm, A.J.M.; Balm, Alfonsus Jacobus Maria; van der Heijden, Ferdinand

    2017-01-01

    This is the first study quantitatively measuring tongue motion in 3D after in vivo intraoperative neurostimulation of the hypoglossal nerve and its branches during a neck dissection procedure. Firstly, this study is performed to show whether this set-up is suitable for innervating different muscles

  7. Accurate quantitation of D+ fetomaternal hemorrhage by flow cytometry using a novel reagent to eliminate granulocytes from analysis.

    Science.gov (United States)

    Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond

    2014-05-01

    Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Maternal blood samples (69) were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. Reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, and blended two-color reagents, BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L) and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients. Red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred with 71% of samples having less than 15 mL of D+ FMH (RBCs), and insufficient RhIG was calculated for 9%. Using the two-color reagents and anti-HbF-FITC, approximately 30% of patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% of patients had more than 15 mL of D+ FMH (RBCs), requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis. The latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.
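
    The clinical purpose of FMH quantitation is to size the RhIG dose. A commonly cited convention is that one 300 µg dose covers roughly 15 mL of D+ fetal red cells (about 30 mL of fetal whole blood), with an extra vial added as a safety margin; actual dosing rules vary by product and jurisdiction, so the sketch below only illustrates the arithmetic.

```python
# Illustration of translating an FMH estimate (mL of D+ fetal RBCs) into an
# RhIG dose. The rule used (one 300 ug vial per 15 mL of D+ RBCs, plus one
# extra vial as a safety margin) is a commonly cited convention; actual
# protocols vary by jurisdiction.
import math

def rhig_vials(fmh_rbc_ml, ml_covered_per_vial=15.0, safety_margin_vials=1):
    return math.ceil(fmh_rbc_ml / ml_covered_per_vial) + safety_margin_vials

for fmh in (4.0, 15.0, 22.5):
    print(f"FMH {fmh:4.1f} mL D+ RBCs -> {rhig_vials(fmh)} x 300 ug RhIG")
```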

  8. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  9. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the hexavalent to the tetravalent state was examined. The experiments showed that decomposition was faster in alkaline solution: constant absorption was reached after 30 minutes of boiling in alkaline solution and after 40 minutes in acid solution. A method for analyzing samples containing less than 5 ppm of tellurium was also studied. The experiments revealed that samples containing a very small amount of tellurium can be analyzed if the sample solution is divided into portions of one gram each and concentrated by extraction, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed good recovery of added tellurium and reproducibility within a relative error of 5%. Comparison of the calibration curve of the standard tellurium(IV) solution reacted with bismuthiol-2 and the calibration curve obtained after extraction of tellurium(IV) with MIBK indicated that the extraction is quantitative. The results obtained by the bismuthiol-2 method and by the atomic absorption method on the same samples agreed well. (Iwakiri, K.)

  10. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    Science.gov (United States)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method for quantitatively analyzing the concentrations of elements in a sample under local thermodynamic equilibrium conditions without using matrix-matched calibration standards. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. TiO2 powder was mixed with polyvinyl alcohol and formed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The plasma spectrum was recorded with a spectrometer and then compared to NIST spectral line data to determine energy levels and other parameters. The plasma temperature obtained from a Boltzmann plot is 8127.29 K and the calculated electron density is 2.49×10¹⁶ cm⁻³. Finally, the concentration of Ti in the TiO2 sample obtained in this study is 97%, which is in good agreement with the sample certificate.
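
    The temperature step in CF-LIBS is a Boltzmann plot: for each line, ln(I·λ/(g·A)) is plotted against the upper-level energy E, and the slope of the fitted line equals -1/(k_B·T). The sketch below applies that relation to synthetic line data; the wavelengths, g·A values and energies are invented, and the intensities are generated from an assumed 8000 K plasma rather than taken from the study.

```python
# Sketch of the Boltzmann-plot temperature determination used in CF-LIBS:
# ln(I*lambda/(g*A)) vs E_k is linear with slope -1/(k_B*T).
# All line data are synthetic, generated from an assumed 8000 K plasma.
import numpy as np

K_B_EV = 8.617333262e-5           # Boltzmann constant, eV/K
T_TRUE = 8000.0                   # K, used only to generate synthetic intensities

lam = np.array([498.2, 453.3, 399.9, 365.4])    # wavelengths, nm (illustrative)
g_k = np.array([11, 9, 7, 9])                   # upper-level degeneracies
A_k = np.array([6.6e7, 8.8e7, 4.1e7, 7.6e7])    # transition probabilities, 1/s
E_k = np.array([3.33, 3.58, 3.91, 4.30])        # upper-level energies, eV

rng = np.random.default_rng(5)
intensity = (g_k * A_k / lam) * np.exp(-E_k / (K_B_EV * T_TRUE))
intensity *= rng.normal(1.0, 0.03, size=lam.size)       # measurement noise

y = np.log(intensity * lam / (g_k * A_k))
slope, _ = np.polyfit(E_k, y, 1)
print(f"estimated plasma temperature: {-1.0 / (K_B_EV * slope):.0f} K")
```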

  11. Fiscal 1998 development report on the high-accuracy quantitative analysis technique of catalyst surfaces by electron spectroscopy; 1998 nendo denshi bunkoho ni yoru shokubai hyomen koseido teiryo bunseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This project aims at developing a high-accuracy quantitative analysis technique based on electron spectroscopy for the surface analysis of catalysts and semiconductors. Since conventional analysis using an energy-fixed X-ray excitation source cannot provide satisfactory surface sensitivity and quantitative accuracy for catalysts, the project performs experiments using energy-variable synchrotron radiation to refine the parameters describing the motion of low-speed electrons in solids obtained by Monte Carlo calculation. To establish a high-accuracy quantitative analysis technique for the surface composition of materials such as catalysts, whose performance is dominated by the outermost surface, the project studies the attenuation length of electrons in solids by electron spectroscopy using soft X-rays from synchrotron radiation. In this fiscal year, the project established the equipment and technique for high-accuracy quantitative analysis of the thickness of silicon oxide films on silicon wafers and of the electron attenuation length by electron spectroscopy. (NEDO)

  12. An easy and inexpensive method for quantitative analysis of endothelial damage by using vital dye staining and Adobe Photoshop software.

    Science.gov (United States)

    Saad, Hisham A; Terry, Mark A; Shamie, Neda; Chen, Edwin S; Friend, Daniel F; Holiman, Jeffrey D; Stoeger, Christopher

    2008-08-01

    We developed a simple, practical, and inexpensive technique to analyze areas of endothelial cell loss and/or damage over the entire corneal area after vital dye staining by using a readily available, off-the-shelf, consumer software program, Adobe Photoshop. The purpose of this article is to convey a method of quantifying areas of cell loss and/or damage. Descemet-stripping automated endothelial keratoplasty corneal transplant surgery was performed by using 5 precut corneas on a human cadaver eye. Corneas were removed and stained with trypan blue and alizarin red S and subsequently photographed. Quantitative assessment of endothelial damage was performed by using Adobe Photoshop 7.0 software. The average difference for cell area damage for analyses performed by 1 observer twice was 1.41%. For analyses performed by 2 observers, the average difference was 1.71%. Three masked observers were 100% successful in matching the randomized stained corneas to their randomized processed Adobe images. Vital dye staining of corneal endothelial cells can be combined with Adobe Photoshop software to yield a quantitative assessment of areas of acute endothelial cell loss and/or damage. This described technique holds promise for a more consistent and accurate method to evaluate the surgical trauma to the endothelial cell layer in laboratory models. This method of quantitative analysis can probably be generalized to any area of research that involves areas that are differentiated by color or contrast.

  13. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  14. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  15. Time-Gated Raman Spectroscopy for Quantitative Determination of Solid-State Forms of Fluorescent Pharmaceuticals.

    Science.gov (United States)

    Lipiäinen, Tiina; Pessi, Jenni; Movahedi, Parisa; Koivistoinen, Juha; Kurki, Lauri; Tenhunen, Mari; Yliruusi, Jouko; Juppo, Anne M; Heikkonen, Jukka; Pahikkala, Tapio; Strachan, Clare J

    2018-04-03

    Raman spectroscopy is widely used for quantitative pharmaceutical analysis, but a common obstacle to its use is sample fluorescence masking the Raman signal. Time-gating provides an instrument-based method for rejecting fluorescence through temporal resolution of the spectral signal and allows Raman spectra of fluorescent materials to be obtained. An additional practical advantage is that analysis is possible in ambient lighting. This study assesses the efficacy of time-gated Raman spectroscopy for the quantitative measurement of fluorescent pharmaceuticals. Time-gated Raman spectroscopy with a 128 × (2) × 4 CMOS SPAD detector was applied for quantitative analysis of ternary mixtures of solid-state forms of the model drug, piroxicam (PRX). Partial least-squares (PLS) regression allowed quantification, with Raman-active time domain selection (based on visual inspection) improving performance. Model performance was further improved by using kernel-based regularized least-squares (RLS) regression with greedy feature selection in which the data use in both the Raman shift and time dimensions was statistically optimized. Overall, time-gated Raman spectroscopy, especially with optimized data analysis in both the spectral and time dimensions, shows potential for sensitive and relatively routine quantitative analysis of photoluminescent pharmaceuticals during drug development and manufacturing.

  16. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    Science.gov (United States)

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely

  17. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  18. Quantitative phosphoproteomic analysis of early alterations in protein phosphorylation by 2,3,7,8-tetrachlorodibenzo-p-dioxin

    DEFF Research Database (Denmark)

    Schulz, Melanie; Brandner, Stefanie; Eberhagen, Carola

    2013-01-01

    A comprehensive quantitative analysis of changes in protein phosphorylation preceding or accompanying transcriptional activation by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in 5L rat hepatoma cells was performed using the SILAC approach. Following exposure of the cells to DMSO or 1 nM TCDD for 0......-induced gene activation, regulators of small GTPases of the Ras superfamily, UBX domain-containing proteins and the oncogenic protein LYRIC. The results open up new directions for research on the molecular mechanisms of dioxin action and toxicity....

  19. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For the quantitative phase analysis on highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both zirconium and aluminum sheet. The diffraction patterns of samples were measured by the neutron and refined by the Rietveld method. The preferred orientation correction of diffraction patterns was carried out by means of recalculated pole figures from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)
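
    In Rietveld-based quantitative phase analysis, refined scale factors are commonly converted to weight fractions through the Hill–Howard relation W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i. A minimal sketch follows; the scale factors and the two-phase Zr/Al values are illustrative placeholders, not the study's refined data.

```python
def weight_fractions(scale, Z, M, V):
    """Hill-Howard relation: W_p = S_p*(Z*M*V)_p / sum_i S_i*(Z*M*V)_i."""
    zmv = [z * m * v for z, m, v in zip(Z, M, V)]
    total = sum(s * x for s, x in zip(scale, zmv))
    return [s * x / total for s, x in zip(scale, zmv)]

# Illustrative two-phase example (hypothetical refined scale factors).
phases = ["Zr (hcp)", "Al (fcc)"]
S = [1.2e-4, 3.4e-4]   # Rietveld scale factors (placeholders)
Z = [2, 4]             # formula units per unit cell
M = [91.224, 26.982]   # molar mass per formula unit (g/mol)
V = [46.6, 66.4]       # unit-cell volume (A^3)
for name, w in zip(phases, weight_fractions(S, Z, M, V)):
    print(f"{name}: {100 * w:.1f} wt%")
```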

  20. Quantitative HPLC-ICP-MS analysis of antimony redox speciation in complex sample matrices: new insights into the Sb-chemistry causing poor chromatographic recoveries

    DEFF Research Database (Denmark)

    Hansen, Claus; Schmidt, Bjørn; Larsen, Erik Huusfeldt

    2011-01-01

    In solution antimony exists either in the pentavalent or trivalent oxidation state. As Sb(III) is more toxic than Sb(V), it is important to be able to perform a quantitative speciation analysis of Sb’s oxidation state. The most commonly applied chromatographic methods used for this redox speciation...

  1. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    Science.gov (United States)

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
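
    The PLS step of such a multivariate reference scheme can be sketched with scikit-learn, regressing laser power on the reference-region intensities and validating with leave-one-subject-out cross-validation. The arrays below are random placeholders standing in for the acquired probe spectra and recorded powers.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict

# Placeholders: X holds reference-band intensities (fused silica / sapphire regions),
# y the laser excitation power (mW), and groups the subject each spectrum came from.
rng = np.random.default_rng(0)
X = rng.random((50, 200))
y = rng.uniform(5, 65, 50)
groups = np.repeat(np.arange(10), 5)

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, y, groups=groups, cv=LeaveOneGroupOut())
rmsecv = float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))
print(f"leave-one-subject-out RMSECV = {rmsecv:.1f} mW")
```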

  2. Three-way methods for the analysis of qualitative and quantitative two-way data.

    NARCIS (Netherlands)

    Kiers, Hendrik Albert Lambertus

    1989-01-01

    A problem often occurring in exploratory data analysis is how to summarize large numbers of variables in terms of a smaller number of dimensions. When the variables are quantitative, one may resort to Principal Components Analysis (PCA). When qualitative (categorical) variables are involved, one may

  3. Two-way and three-way approaches to ultra high performance liquid chromatography-photodiode array dataset for the quantitative resolution of a two-component mixture containing ciprofloxacin and ornidazole.

    Science.gov (United States)

    Dinç, Erdal; Ertekin, Zehra Ceren; Büker, Eda

    2016-09-01

    Two-way and three-way calibration models were applied to ultra high performance liquid chromatography with photodiode array data with coeluted peaks in the same wavelength and time regions for the simultaneous quantitation of ciprofloxacin and ornidazole in tablets. The chromatographic data cube (tensor) was obtained by recording chromatographic spectra of the standard and sample solutions containing ciprofloxacin and ornidazole with sulfadiazine as an internal standard as a function of time and wavelength. Parallel factor analysis and trilinear partial least squares were used as three-way calibrations for the decomposition of the tensor, whereas three-way unfolded partial least squares was applied as a two-way calibration to the unfolded dataset obtained from the data array of ultra high performance liquid chromatography with photodiode array detection. The validity and ability of two-way and three-way analysis methods were tested by analyzing validation samples: synthetic mixture, interday and intraday samples, and standard addition samples. Results obtained from two-way and three-way calibrations were compared to those provided by traditional ultra high performance liquid chromatography. The proposed methods, parallel factor analysis, trilinear partial least squares, unfolded partial least squares, and traditional ultra high performance liquid chromatography were successfully applied to the quantitative estimation of the solid dosage form containing ciprofloxacin and ornidazole. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
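
    The two-way (unfolded PLS) branch of such a comparison amounts to reshaping each sample's time x wavelength matrix into a single row before ordinary PLS calibration; the three-way models (PARAFAC, trilinear PLS) need a dedicated multiway library and are not shown. Array names and sizes below are placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Placeholder UHPLC-PDA data cube: (samples, elution times, wavelengths),
# with y holding the known ciprofloxacin and ornidazole concentrations.
rng = np.random.default_rng(1)
tensor = rng.random((20, 300, 60))
y = rng.random((20, 2))

# Unfold the cube sample-wise, then calibrate with two-way PLS ("unfolded PLS").
X = tensor.reshape(tensor.shape[0], -1)
upls = PLSRegression(n_components=4).fit(X, y)
predicted = upls.predict(X)   # predicted concentrations for the calibration set
```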

  4. Analysis of Performance Factors for Accounting and Finance Related Business Courses in A Distance Education Environment

    Directory of Open Access Journals (Sweden)

    Serdar BENLIGIRAY

    2017-07-01

    Full Text Available The objective of this study is to explore business courses performance factors with a focus on accounting and finance. Course score interrelations are assumed to represent interpretable constructs of these factors. Factor analysis is proposed to identify the constructs that explain the correlations. Factor analysis results identify three sub-groups of business core courses. The first group is labeled as management-oriented courses. Accounting, finance and economics courses are separated in two groups: the prior courses group and the subsequent courses group. The clustering order of these three groups was attributed to underlying performance factor similarities. Then, the groups are compared by the pre-assessed ratings of course specific skills and knowledge. The comparison suggests that course requirements for skills and knowledge were the latent variables for the factor analysis. Moreover, multivariate regression analyses are employed to reveal the required level of verbal and quantitative skills for the groups. Management-oriented courses are differentiated from others with requiring verbal skills, managerial skills and knowledge more. Introductory courses require quantitative and analytical reasoning skills more than the subsequent courses in accounting, finance and economics. Mathematics course score fails to be a suitable proxy of numerical processing skills as an accounting course performance factor.

  5. High-performance liquid chromatographic quantitation of desmosine plus isodesmosine in elastin and whole tissue hydrolysates

    International Nuclear Information System (INIS)

    Soskel, N.T.

    1987-01-01

    Quantitation of desmosine and isodesmosine, the major crosslinks in elastin, has been of interest because of their uniqueness and use as markers of that protein. Accurate measurement of these crosslinks may allow determination of elastin degradation in vivo and elastin content in tissues, obviating lengthy extraction procedures. We have developed a method of quantitating desmosine plus isodesmosine in hydrolysates of tissue and insoluble elastin using high-performance liquid chromatographic separation and absorbance detection that is rapid (21-35 min) and sensitive (accurate linearity from 100 pmol to 5 nmol). This method has been used to quantitate desmosines in elastin from bovine nuchal ligament and lung and in whole aorta from hamsters. The ability to completely separate [3H]lysine from desmosine plus isodesmosine allows the method to be used to study incorporation of lysine into crosslinks in elastin.

  6. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  7. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    Science.gov (United States)

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (by 6.7%) than that measured using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (by 16.5%) than those obtained with the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  8. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Full Text Available Chemical cross-linking mass spectrometry (XL-MS provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open source software package Skyline, for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that is supported by and validates previously published data on quantified cross-linked peptide pairs. This advance provides an easy to use resource so that any lab with access to a LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  9. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out in a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case was integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. Comparison was made of the QEM*SEM results to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  10. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    Science.gov (United States)

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  11. A framework to assess management performance in district health systems: a qualitative and quantitative case study in Iran.

    Science.gov (United States)

    Tabrizi, Jafar Sadegh; Gholipour, Kamal; Iezadi, Shabnam; Farahbakhsh, Mostafa; Ghiasi, Akbar

    2018-01-01

    The aim was to design a district health management performance framework for Iran's healthcare system. The mixed-method study was conducted between September 2015 and May 2016 in Tabriz, Iran. In this study, the indicators of district health management performance were obtained by analyzing the 45 semi-structured surveys of experts in the public health system. Content validity of performance indicators which were generated in qualitative part were reviewed and confirmed based on content validity index (CVI). Also content validity ratio (CVR) was calculated using data acquired from a survey of 21 experts in quantitative part. The result of this study indicated that, initially, 81 indicators were considered in framework of district health management performance and, at the end, 53 indicators were validated and confirmed. These indicators were classified in 11 categories which include: human resources and organizational creativity, management and leadership, rules and ethics, planning and evaluation, district managing, health resources management and economics, community participation, quality improvement, research in health system, health information management, epidemiology and situation analysis. The designed framework model can be used to assess the district health management and facilitates performance improvement at the district level.

  12. A framework to assess management performance in district health systems: a qualitative and quantitative case study in Iran

    Directory of Open Access Journals (Sweden)

    Jafar Sadegh Tabrizi

    2018-04-01

    Full Text Available The aim was to design a district health management performance framework for Iran’s healthcare system. The mixed-method study was conducted between September 2015 and May 2016 in Tabriz, Iran. In this study, the indicators of district health management performance were obtained by analyzing the 45 semi-structured surveys of experts in the public health system. Content validity of performance indicators which were generated in qualitative part were reviewed and confirmed based on content validity index (CVI. Also content validity ratio (CVR was calculated using data acquired from a survey of 21 experts in quantitative part. The result of this study indicated that, initially, 81 indicators were considered in framework of district health management performance and, at the end, 53 indicators were validated and confirmed. These indicators were classified in 11 categories which include: human resources and organizational creativity, management and leadership, rules and ethics, planning and evaluation, district managing, health resources management and economics, community participation, quality improvement, research in health system, health information management, epidemiology and situation analysis. The designed framework model can be used to assess the district health management and facilitates performance improvement at the district level.

  13. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    Science.gov (United States)

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
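
    The two agreement statistics named here are easy to reproduce: scipy supplies Kendall's tau for pairwise rank agreement, and Kendall's W follows directly from the rank sums. The reviewer rankings below are placeholders.

```python
import numpy as np
from scipy.stats import kendalltau

def kendalls_w(rankings):
    """Kendall's coefficient of concordance for an (m raters x n items) rank matrix
    without ties: W = 12*S / (m^2 * (n^3 - n))."""
    m, n = rankings.shape
    rank_sums = rankings.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Placeholder rankings of eight transillumination images by three reviewers.
r = np.array([[1, 2, 3, 4, 5, 6, 7, 8],
              [1, 3, 2, 4, 5, 6, 8, 7],
              [2, 1, 3, 4, 6, 5, 7, 8]])
tau, _ = kendalltau(r[0], r[1])   # pairwise agreement between two reviewers
print(f"Kendall's tau = {tau:.2f}, Kendall's W = {kendalls_w(r):.2f}")
```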

  14. Quantitative security analysis for programs with low input and noisy output

    NARCIS (Netherlands)

    Ngo, Minh Tri; Huisman, Marieke

    Classical quantitative information flow analysis often considers a system as an information-theoretic channel, where private data are the only inputs and public data are the outputs. However, for systems where an attacker is able to influence the initial values of public data, these should also be

  15. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation on the potentials of neutron diffraction to characterize archaeological bronze artifacts. The preliminary feasibility of phase and structural analysis was demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The Bragg peak widths that resulted were strictly dependent on the working treatment, thus providing an important analytical element to investigate ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some peculiar perspective of the neutron diffraction diagnostics in archeometric applications. (orig.)

  16. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr [Instituto Israelita de Ensino e Pesquisa Albert Einstein, IIEPAE, Sao Paulo 05651-901 (Brazil); Pontuschka, W M; Mamani, J B [Instituto de Fisica, Universidade de Sao Paulo, Sao Paulo 05315-970 (Brazil); Costa-Filho, A J; Vieira, E D [Instituto de Fisica de Sao Carlos, Universidade de Sao Paulo, Sao Carlos 13560-970 (Brazil)], E-mail: lgamarra@einstein.br

    2008-05-21

    The aim of this work is to provide a quantitative method for analysis of the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC 133), and thus to express the antigenic labeling evidence for the stem cells CD 133+. The FMR efficiency and sensitivity were proven adequate for detecting and quantifying the low amounts of iron content in the CD 133+ cells (~6.16 x 10^5 pg in the volume of 2 μl containing 4.5 x 10^11 SPION). The quantitative method led to the result of 1.70 x 10^-13 mol of Fe (9.5 pg), or 7.0 x 10^6 nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of iron oxide-labeled cells in order to ensure that the nanoparticles coupled to the antibodies are indeed tied to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of this method. The quantitative analysis by means of FMR is necessary for determining the signal intensity for the study of molecular imaging by means of magnetic resonance imaging (MRI).
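
    The per-cell figures quoted above can be cross-checked with a simple unit conversion, using only the numbers in the record plus the molar mass of iron.

```python
AVOGADRO = 6.022e23
M_FE = 55.845                 # g/mol

mol_fe_per_cell = 1.70e-13    # from the FMR quantification
mass_pg = mol_fe_per_cell * M_FE * 1e12
print(f"Fe per cell = {mass_pg:.1f} pg")   # ~9.5 pg, matching the reported value

particles_per_cell = 7.0e6
atoms_per_particle = mol_fe_per_cell * AVOGADRO / particles_per_cell
print(f"implied Fe atoms per nanoparticle = {atoms_per_particle:.2e}")
```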

  17. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues...... that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy...... consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker....

  18. Quantitative analysis of rat Ig (sub)classes binding to cell surface antigens

    International Nuclear Information System (INIS)

    Nilsson, R.; Brodin, T.; Sjoegren, H.-O.

    1982-01-01

    An indirect 125I-labeled protein A assay for detection of cell surface-bound rat immunoglobulins is presented. The assay is quantitative and rapid and detects as little as 1 ng of cell surface-bound Ig. It discriminates between antibodies belonging to different IgG subclasses, IgM and IgA. The authors describe the production and specificity control of the reagents used and show that the test can be used for quantitative analysis. A large number of sera from untreated rats are tested to evaluate the frequency of falsely positive responses and variation due to age, sex and strain of rat. With this test it is relatively easy to quantitate the binding of classes and subclasses of rat immunoglobulins in a small volume (6 μl) of untreated serum. (Auth.)

  19. Performance measurement in transport sector analysis

    Directory of Open Access Journals (Sweden)

    M. Išoraitė

    2004-06-01

    Full Text Available The article analyses the following issues: 1. Performance measurement in literature. The performance measurement has an important role to play in the efficient and effective management of organizations. Kaplan and Johnson highlighted the failure of the financial measures to reflect changes in the competitive circumstances and strategies of modern organizations. Many authors have focused attention on how organizations can design more appropriate measurement systems. Based on literature, consultancy experience and action research, numerous processes have been developed that organizations can follow in order to design and implement systems. Many frameworks have been proposed that support these processes. The objective of such frameworks is to help organizations define a set of measures that reflect their objectives and assess their performance appropriately. 2. Transport sector performance and its impacts measuring. The purpose of transport measurement is to identify opportunities enhancing transport performance. Successful transport sector management requires a system to analyze its efficiency and effectiveness as well as plan interventions if transport sector performance needs improvement. Transport impacts must be measurable and monitorable so that the person responsible for the project intervention can decide when and how to influence them. Performance indicators provide a means to measure and monitor impacts. These indicators essentially reflect quantitative and qualitative aspects of impacts at given time and places. 3. Transport sector output and input. Transport sector inputs are the resources required to deliver transport sector outputs. Transport sector inputs are typically: human resources, particularly skilled resources (including specialists consulting inputs; technology processes such as equipment and work; and finance, both public and private. 4. Transport sector policy and institutional framework; 5. Cause – effect linkages; 6

  20. Texture analysis of pulmonary parenchymateous changes related to pulmonary thromboembolism in dogs - a novel approach using quantitative methods

    DEFF Research Database (Denmark)

    Marschner, Clara Büchner; Kokla, Marietta; Amigo Rubio, Jose Manuel

    2017-01-01

    include dual energy computed tomography (DECT) as well as computer assisted diagnosis (CAD) techniques. The purpose of this study was to investigate the performance of quantitative texture analysis for detecting dogs with PTE using grey-level co-occurrence matrices (GLCM) and multivariate statistical...... classification analyses. CT images from healthy (n = 6) and diseased (n = 29) dogs with and without PTE confirmed on CTPA were segmented so that only tissue with CT numbers between −1024 and −250 Hounsfield Units (HU) was preserved. GLCM analysis and subsequent multivariate classification analyses were...... using GLCM is an effective tool for distinguishing healthy from abnormal lung. Furthermore, the texture of pulmonary parenchyma in dogs with PTE is altered, when compared to the texture of pulmonary parenchyma of healthy dogs. The models’ poorer performance in classifying dogs within the diseased group
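
    A GLCM texture-feature step of the kind described here can be sketched with scikit-image; the HU window follows the segmentation quoted above, while the grey-level quantisation and the feature set are illustrative choices rather than the study's exact protocol.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(ct_slice, hu_min=-1024, hu_max=-250, levels=32):
    """A few GLCM texture features from a 2-D lung CT slice given in HU."""
    img = np.clip(ct_slice, hu_min, hu_max)
    img = ((img - hu_min) / (hu_max - hu_min) * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return {p: float(graycoprops(glcm, p).mean())
            for p in ("contrast", "homogeneity", "energy", "correlation")}

# Toy example on a synthetic parenchyma patch (placeholder data).
patch = np.random.default_rng(2).integers(-1000, -400, (128, 128))
print(glcm_features(patch))
```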

  1. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  2. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    Science.gov (United States)

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  3. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent related to the failure reports and work order sheets at the reactor sites together with SKI's "Safety Related Occurrences". In general there has been an improvement compared to previous years. (Auth.)

  4. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  5. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  6. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector worldwide, using keyword-based patent searches. An overview of patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into trends, and the patents were analyzed within their classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in another section. Patents were classified by the solution they provide through separate readings of their claims, titles, abstracts and full texts. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars to patentability. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, the US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts; studying it shows that it addresses the problem of friction in the engine. One classification is based on the automobile part and the other on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, a matrix was built after studying all the patents.

  7. Quantitative assessment of Heart Rate Dynamics during meditation: An ECG based study with Multi-fractality and visibility graph

    Directory of Open Access Journals (Sweden)

    Anirban eBhaduri

    2016-02-01

    Full Text Available The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation.

  8. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    Science.gov (United States)

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation.
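
    The visibility-network part of this analysis can be illustrated with a direct O(n^2) construction of the natural visibility graph from a short heart-rate series; the series below is a placeholder, not PhysioNet data.

```python
import numpy as np

def natural_visibility_edges(y):
    """Natural visibility graph: samples a and b are linked when every sample
    between them lies strictly below the straight line joining (a, y[a]) and (b, y[b])."""
    n = len(y)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

hr = np.array([72, 75, 71, 78, 74, 70, 76, 73], dtype=float)  # placeholder series
edges = natural_visibility_edges(hr)
degrees = np.bincount(np.array(edges).ravel(), minlength=len(hr))
print(edges, degrees)   # graph-based measures (e.g. degree distribution) follow from here
```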

  9. QUAIL: A Quantitative Security Analyzer for Imperative Code

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Wasowski, Andrzej; Traonouez, Louis-Marie

    2013-01-01

    Quantitative security analysis evaluates and compares how effectively a system protects its secret data. We introduce QUAIL, the first tool able to perform an arbitrary-precision quantitative analysis of the security of a system depending on private information. QUAIL builds a Markov Chain model...... of the system’s behavior as observed by an attacker, and computes the correlation between the system’s observable output and the behavior depending on the private information, obtaining the expected amount of bits of the secret that the attacker will infer by observing the system. QUAIL is able to evaluate...... the safety of randomized protocols depending on secret data, allowing to verify a security protocol’s effectiveness. We experiment with a few examples and show that QUAIL’s security analysis is more accurate and revealing than results of other tools...
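
    Quantitative information-flow analysers of this kind typically report leakage as the mutual information, in bits, between the secret and the observable output. The sketch below computes that quantity for a small discrete channel; it is a didactic stand-in, not QUAIL's Markov-chain algorithm.

```python
import numpy as np

def leakage_bits(prior, channel):
    """Expected leakage I(S;O) = H(S) - H(S|O) in bits.

    prior   : (n_secrets,) prior over secret values.
    channel : (n_secrets, n_outputs) conditional distribution P(output | secret).
    """
    def entropy(p):
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    joint = prior[:, None] * channel        # P(secret, output)
    p_out = joint.sum(axis=0)
    h_cond = sum(po * entropy(joint[:, j] / po) for j, po in enumerate(p_out) if po > 0)
    return entropy(prior) - h_cond

# Toy example: a one-bit secret observed through a noisy check (illustrative numbers).
prior = np.array([0.5, 0.5])
channel = np.array([[0.9, 0.1],    # P(output | secret = 0)
                    [0.2, 0.8]])   # P(output | secret = 1)
print(f"expected leakage = {leakage_bits(prior, channel):.3f} bits")
```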

  10. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The size of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  11. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    Science.gov (United States)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging for patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion by using DTI and SPM analysis, we were able to not only determine the structural state of the regions affected by brain disorders but also quantitatively analyze and assess brain function.
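
    The group comparison mentioned here (a two-sample t-test on diffusion metrics) can be sketched per region with scipy; in practice SPM performs this voxel-wise with appropriate corrections, and the FA values below are placeholders.

```python
import numpy as np
from scipy.stats import ttest_ind

# Placeholder regional fractional anisotropy values for a patient group and a comparison group.
rng = np.random.default_rng(3)
fa_patients = rng.normal(0.38, 0.04, 20)
fa_controls = rng.normal(0.45, 0.04, 20)

t_stat, p_value = ttest_ind(fa_patients, fa_controls, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```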

  12. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    Science.gov (United States)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

    In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis ability of laser-induced breakdown spectroscopy (LIBS) for liquid sample detection, using solutions of the heavy metal element Pb as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal. A series of experiments were carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the pulse energy influence and temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm the LTE state of the plasma. Normalizing the intensities by background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0–4,150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD of Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparison of the real and predicted concentrations of the samples. The lowest relative error was 0.043% and the highest was no more than 7.1%.
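
    The calibration step behind such figures is a straight-line fit of background-normalised line intensity against concentration, with the detection limit commonly taken as three standard deviations of the blank divided by the slope. All numbers in the sketch are placeholders, not the measured data.

```python
import numpy as np

# Placeholder calibration set: Pb concentration (ppm) vs background-normalised intensity.
conc = np.array([0, 250, 500, 1000, 2000, 3000, 4150], dtype=float)
intensity = np.array([0.02, 0.35, 0.68, 1.33, 2.71, 4.02, 5.60])

slope, intercept = np.polyfit(conc, intensity, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)

sigma_blank = 0.01                         # assumed std. dev. of repeated blank measurements
lod = 3 * sigma_blank / slope              # 3-sigma limit of detection (ppm)
unknown_conc = (1.85 - intercept) / slope  # concentration predicted for an unknown intensity
print(f"R^2 = {r2:.4f}, LOD = {lod:.1f} ppm, unknown = {unknown_conc:.0f} ppm")
```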

  13. Attenuated total internal reflection Fourier transform infrared spectroscopy: a quantitative approach for kidney stone analysis.

    Science.gov (United States)

    Gulley-Stahl, Heather J; Haas, Jennifer A; Schmidt, Katherine A; Evan, Andrew P; Sommer, André J

    2009-07-01

    The impact of kidney stone disease is significant worldwide, yet methods for quantifying stone components remain limited. A new approach requiring minimal sample preparation for the quantitative analysis of kidney stone components has been investigated utilizing attenuated total internal reflection Fourier transform infrared spectroscopy (ATR-FT-IR). Calcium oxalate monohydrate (COM) and hydroxylapatite (HAP), two of the most common constituents of urinary stones, were used for quantitative analysis. Calibration curves were constructed using integrated band intensities of four infrared absorptions versus concentration (weight %). The correlation coefficients of the calibration curves range from 0.997 to 0.93. The limits of detection range from 0.07 +/- 0.02% COM/HAP where COM is the analyte and HAP is the matrix, to 0.26 +/- 0.07% HAP/COM where HAP is the analyte and COM is the matrix. This study shows that linear calibration curves can be generated for the quantitative analysis of stone mixtures provided the system is well understood especially with respect to particle size.

  14. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  15. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow 29Si NMR is also presented. 10 references, 4 figures, 3 tables

  16. Impact of quantitative feedback and benchmark selection on radiation use by cardiologists performing cardiac angiography

    International Nuclear Information System (INIS)

    Smith, I. R.; Cameron, J.; Brighouse, R. D.; Ryan, C. M.; Foster, K. A.; Rivers, J. T.

    2013-01-01

    Audit of and feedback on both group and individual data provided immediately after the point of care and compared with realistic benchmarks of excellence have been demonstrated to drive change. This study sought to evaluate the impact of immediate benchmarked quantitative case-based performance feedback on the clinical practice of cardiologists practicing at a private hospital in Brisbane, Australia. The participating cardiologists were assigned to one of two groups: Group 1 received patient and procedural details for review and Group 2 received Group 1 data plus detailed radiation data relating to the procedures and comparative benchmarks. In Group 2, Linear-by-Linear Association analysis suggested a link between change in radiation use and initial radiation dose category (p = 0.014), with only those initially 'challenged' by the benchmarks showing improvement. Those not 'challenged' by the benchmarks deteriorated in performance, with those starting well below the benchmarks showing the greatest increase in radiation use. Conversely, those blinded to their radiation use (Group 1) showed general improvement in radiation use throughout the study, with those performing initially close to the benchmarks showing the greatest improvement. This study shows that use of non-challenging benchmarks in case-based radiation risk feedback does not promote a reduction in radiation use; indeed, it may contribute to increased doses. Paradoxically, cardiologists who are aware of performance monitoring but blinded to individual case data appear to maintain, if not reduce, their radiation use. (authors)

  17. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. Applying Fourier analysis additionally yields criteria for evaluating the volume curve as a whole, so that the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal smoothing method because it converges with minimum quadratic error for the type of function concerned. (orig./MG)
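
    Harmonic analysis and resynthesis of a ventricular time-activity curve reduces, in practice, to keeping the first few Fourier coefficients of one representative cycle. A minimal sketch on a synthetic curve follows; the three-harmonic cut-off and the curve itself are illustrative assumptions.

```python
import numpy as np

def fourier_smooth(counts, n_harmonics=3):
    """Resynthesise a cyclic time-activity curve from its DC term and first harmonics."""
    spectrum = np.fft.rfft(counts)
    spectrum[n_harmonics + 1:] = 0
    return np.fft.irfft(spectrum, n=len(counts))

# Synthetic left-ventricular volume curve over one representative cycle (arbitrary counts).
t = np.linspace(0, 1, 32, endpoint=False)
curve = 100 - 35 * np.sin(np.pi * t) ** 2
curve += np.random.default_rng(4).normal(0, 2, t.size)   # statistical noise
smooth = fourier_smooth(curve, n_harmonics=3)
ejection_fraction = (smooth.max() - smooth.min()) / smooth.max()
print(f"ejection fraction estimate = {100 * ejection_fraction:.0f}%")
```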

  18. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards

  19. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Even at a comparable DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  20. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
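
    The fuzzy c-means step used to split per-cell intensities into target and non-target clusters can be written in a few lines of numpy; this is a generic textbook implementation on placeholder intensity data, not the Visilog pipeline described above.

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, n_iter=100, seed=None):
    """Basic fuzzy c-means on x (n_samples, n_features); returns centres and memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        w = u ** m
        centres = (w.T @ x) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# Placeholder per-cell (maximum, mean) FISH intensities for a mixed population.
rng = np.random.default_rng(5)
cells = np.vstack([rng.normal(200, 20, (60, 2)), rng.normal(60, 15, (40, 2))])
centres, u = fuzzy_cmeans(cells, c=2, seed=5)
positive_cluster = centres.sum(axis=1).argmax()        # brighter cluster = probe-positive
positive = u.argmax(axis=1) == positive_cluster
print(f"{100 * positive.mean():.1f}% of cells classified as positive")
```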

  1. Multivariate data analysis as a semi-quantitative tool for interpretive evaluation of comparability or equivalence of aerodynamic particle size distribution profiles.

    Science.gov (United States)

    Shi, Shuai; Hickey, Anthony J

    2009-01-01

    The purpose of this article is to investigate the performance of multivariate data analysis, especially orthogonal partial least squares (OPLS) analysis, as a semi-quantitative tool to evaluate the comparability or equivalence of aerodynamic particle size distribution (APSD) profiles of orally inhaled and nasal drug products (OINDP). Monte Carlo simulation was employed to reconstitute APSD profiles based on 55 realistic scenarios proposed by the Product Quality Research Institute (PQRI) working group. OPLS analyses with different data pretreatment methods were performed on each of the reconstituted profiles. Compared to unit-variance scaling, equivalence determined based on OPLS analysis with Pareto scaling was shown to be more consistent with the working group assessment. Chi-square statistics were employed to compare the performance of OPLS analysis (Pareto scaling) with that of the combination test (i.e., chi-square ratio statistics and population bioequivalence test for impactor-sized mass) in terms of achieving greater consistency with the working group evaluation. A p value of 0.036 suggested that OPLS analysis with Pareto scaling may be more predictive than the combination test with respect to consistency. Furthermore, OPLS analysis may also be employed to analyze the part of the APSD profiles that contributes to the calculation of the mass median aerodynamic diameter. Our results show that OPLS analysis performed on partial deposition sites does not interfere with the performance on all deposition sites.
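
    Pareto scaling, the pretreatment that performed best here, divides each mean-centred variable by the square root of its standard deviation, so high-variance deposition stages are down-weighted less aggressively than under unit-variance scaling. A minimal sketch follows; the function name and the toy profile values are assumptions for illustration only.

    ```python
    import numpy as np

    def pareto_scale(X):
        """Mean-centre each column and divide by the square root of its standard
        deviation (cf. unit-variance scaling, which divides by the std itself)."""
        X = np.asarray(X, dtype=float)
        return (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0, ddof=1))

    # rows = simulated APSD profiles, columns = impactor deposition sites (toy numbers)
    profiles = np.array([[0.5, 3.2, 8.1, 4.0, 1.1],
                         [0.6, 3.0, 7.9, 4.2, 1.0],
                         [0.4, 3.4, 8.3, 3.9, 1.2]])
    print(pareto_scale(profiles).round(2))
    ```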

  2. New quantitative safety standards : Different techniques, different results?

    NARCIS (Netherlands)

    Rouvroye, J.L.; Brombacher, A.C.; Lydersen, S.; Hansen, G.K.; Sandtor, H.

    1998-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many parameters can influence the safety of a SIS, such as system layout, diagnostics, testing, and repair. In standards like the German DIN [DIN19250, DIN0801] no quantitative analysis was demanded. The

  3. Quantitative HPLC Analysis of an Analgesic/Caffeine Formulation: Determination of Caffeine

    Science.gov (United States)

    Ferguson, Glenda K.

    1998-04-01

    A modern high performance liquid chromatography (HPLC) laboratory experiment which entails the separation of acetaminophen, aspirin, and caffeine and the quantitative assay of caffeine in commercial mixtures of these compounds has been developed. Our HPLC protocol resolves these compounds in only three minutes with a straightforward chromatographic apparatus which consists of a C-18 column, an isocratic mobile phase, UV detection at 254 nm, and an integrator; an expensive, sophisticated system is not required. The separation is both repeatable and rapid. Moreover, the experiment can be completed in a single three-hour period. The experiment is appropriate for any chemistry student who has completed a minimum of one year of general chemistry and is ideal for an analytical or instrumental analysis course. The experiment detailed herein involves the determination of caffeine in Goody's Extra Strength Headache Powders, a commercially available medication which contains acetaminophen, aspirin, and caffeine as active ingredients. However, the separation scheme is not limited to this brand of medication nor is it limited to caffeine as the analyte. With only minor procedural modifications, students can simultaneously quantitate all of these compounds in a commercial mixture. In our procedure, students prepare a series of four caffeine standard solutions as well as a solution from a pharmaceutical analgesic/caffeine mixture, chromatographically analyze each solution in quadruplicate, and plot relative average caffeine standard peak area versus concentration. From the mathematical relationship that results, the concentration of caffeine in the commercial formulation is obtained. Finally, the absolute standard deviation of the mean concentration is calculated.
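
    The data treatment the experiment asks of students, plotting average peak area against standard concentration and reading the unknown off the fitted line, is a single linear regression. A minimal sketch with made-up numbers follows; all concentrations and peak areas below are hypothetical, not values from the published experiment.

    ```python
    import numpy as np

    # Hypothetical caffeine standards (mg/mL) and their averaged peak areas
    conc = np.array([0.05, 0.10, 0.15, 0.20])
    area = np.array([1.02e5, 2.05e5, 3.01e5, 4.10e5])

    slope, intercept = np.polyfit(conc, area, 1)     # linear calibration curve
    sample_area = 2.60e5                             # quadruplicate mean for the unknown
    sample_conc = (sample_area - intercept) / slope  # read unknown off the fitted line
    print(f"caffeine in sample: {sample_conc:.3f} mg/mL")
    ```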

  4. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottoms-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases, it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by the portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
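
    The Monte-Carlo approach described above amounts to assigning distributions to the uncertain inputs of the savings calculation and propagating them by sampling. A minimal sketch for a hypothetical lighting retrofit follows; the measure, the distributions, and every number are assumptions for illustration, not values from the case study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical ECM: lighting retrofit. Savings = hours * (kW_before - kW_after) * price.
    hours     = rng.normal(3500, 200, n)   # annual operating hours (metered, uncertain)
    kw_before = rng.normal(120, 5, n)      # baseline demand, kW
    kw_after  = rng.normal(80, 4, n)       # post-retrofit demand, kW
    price     = rng.normal(0.11, 0.01, n)  # electricity price, $/kWh

    savings = hours * (kw_before - kw_after) * price
    lo, hi = np.percentile(savings, [5, 95])
    print(f"median savings ${np.median(savings):,.0f}; 90% interval ${lo:,.0f} to ${hi:,.0f}")
    ```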

  5. Isolation and quantitation of metallothionein isoforms using reversed-phase high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Richards, M.P.; Darcey, S.E.; Steele, N.C.

    1986-01-01

    Reversed-phase HPLC (RP-HPLC) was used to isolate and quantify metallothionein (MT) isoforms from a variety of animal species and tissues. Separations were performed on C18 radially compressed cartridge columns, eluted with a 2-step linear gradient of acetonitrile in 10 mM sodium phosphate, pH 7.0. Isoforms were detected by UV absorbance (214 nm) and by on-line interfacing with an atomic absorption spectrophotometer (HPLC-AA) to determine bound Zn, Cd and Cu. Rabbit liver and horse kidney MTs exhibited 7 distinct peaks on RP-HPLC, 2 of which were predominant (MT1 and 2). Pig liver and kidney MT2 yielded 2 subspecies on RP-HPLC, while MT1 yielded a single peak. Avian liver MT differed from mammalian MTs in that MT2 was about tenfold more abundant than MT1. RP-HPLC and HPLC-AA were used to isolate and quantitate MT isoforms and their Zn content directly from cytosol. Quantitation was achieved by peak area integration and extrapolation from a standard curve of purified avian liver MT2. Both RP-HPLC and HPLC-AA had a lower detection limit of 1 μg of peptide and 0.1 μg of Zn. Recoveries (92-98%) were determined with labeled (³⁵S) MT and MT of known Zn content. Cytoplasmic MT-Zn in avian embryo hepatocytes cultured with added Zn was quantitated using HPLC-AA. In conclusion, both RP-HPLC and HPLC-AA are rapid and powerful separation techniques for the isolation, quantitation and characterization of the isoproteins comprising the MT gene family.

  6. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  7. Miniaturization of Fresnel lenses for solar concentration: a quantitative investigation.

    Science.gov (United States)

    Duerr, Fabian; Meuret, Youri; Thienpont, Hugo

    2010-04-20

    Sizing down the dimensions of solar concentrators for photovoltaic applications offers a number of promising advantages. It provides thinner modules and smaller solar cells, which reduces thermal issues. In this work a plane Fresnel lens design is introduced that is first analyzed with geometrical optics. Because of miniaturization, pure ray tracing may no longer be valid to determine the concentration performance. Therefore, a quantitative wave optical analysis of the miniaturization's influence on the obtained concentration performance is presented. This better quantitative understanding of the impact of diffraction in microstructured Fresnel lenses might help to optimize the design of several applications in nonimaging optics.
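
    One quick back-of-the-envelope indicator of when pure ray tracing starts to break down, not the wave-optical analysis the authors actually perform, is the Fresnel number of an individual lens feature: when it approaches unity, diffraction can no longer be ignored. A hedged sketch follows; all dimensions are invented.

    ```python
    def fresnel_number(half_aperture_m, wavelength_m, distance_m):
        """Fresnel number N_F = a^2 / (lambda * L). N_F >> 1 suggests the
        geometrical-optics picture is adequate; N_F near or below 1 suggests
        diffraction effects become significant."""
        return half_aperture_m ** 2 / (wavelength_m * distance_m)

    # hypothetical miniaturized facet: 50 um half-width, 600 nm light, 5 mm focal distance
    print(f"N_F = {fresnel_number(50e-6, 600e-9, 5e-3):.2f}")
    ```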

  8. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease.

    Science.gov (United States)

    van Gilst, Merel M; van Mierlo, Petra; Bloem, Bastiaan R; Overeem, Sebastiaan

    2015-10-01

    Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night's sleep and an afternoon nap, subjects performed a timed pegboard dexterity task and a quantified finger-tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. On both tasks, patients were overall slower than healthy controls (night: F(2,55) = 16.938, P Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. © 2015 Associated Professional Sleep Societies, LLC.

  9. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO₃ electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of slopes of two linear plots, designated as the analysis and reference lines, passing through their origins using the least squares method.
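
    As described, the weight fraction comes from the ratio of two least-squares slopes constrained to pass through the origin (the "analysis" and "reference" lines). A minimal numeric sketch follows; the axis definitions and every data point are hypothetical and intended only to show the slope-ratio arithmetic, not the authors' experimental values.

    ```python
    import numpy as np

    def slope_through_origin(x, y):
        """Least-squares slope of a line constrained to pass through the origin."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        return np.sum(x * y) / np.sum(x * x)

    # Hypothetical intensity-ratio data vs. added weight fraction of the phase:
    # a reference line (pure-phase mixtures) and an analysis line (the unknown body)
    ref_w, ref_i = [0.1, 0.2, 0.3, 0.4], [0.21, 0.43, 0.62, 0.85]
    ana_w, ana_i = [0.1, 0.2, 0.3, 0.4], [0.11, 0.22, 0.31, 0.42]

    weight_fraction = slope_through_origin(ana_w, ana_i) / slope_through_origin(ref_w, ref_i)
    print(f"estimated weight fraction: {weight_fraction:.2f}")
    ```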

  10. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product, is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production and to find out the attributes that govern much of variation in sensory scores of this product using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4 % of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.
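
    The PCA step in this study, reducing the panel's attribute scores to a few components and reporting the variance each explains, maps directly onto a standard PCA call. A minimal sketch on synthetic scores follows; the attribute set, sample count, and numbers are placeholders, not the cham-cham data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical QDA panel scores: rows = market samples, columns = sensory attributes
    # (sweetness, interior dryness, surface appearance, rancidity, firmness, ...)
    rng = np.random.default_rng(1)
    scores = rng.normal(6, 1.5, size=(24, 10))

    pca = PCA(n_components=4)
    factor_scores = pca.fit_transform(scores)       # sample coordinates on PC1..PC4
    print("variance explained:", pca.explained_variance_ratio_.round(3))
    print("cumulative:", pca.explained_variance_ratio_.cumsum().round(3))
    ```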

  11. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work the quantitative risk analysis for the external public of the 180-km Cabiunas-REDUC pipeline (GASDUC III), linking the municipalities of Macae and Duque de Caxias - RJ, was performed by the companies PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), and carries natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, obtaining as a result an individual risk contour with frequencies of 1×10⁻⁶ per year involving sensitive occupations, and therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas; their respective FN-curves were situated below the advised limit, except for two areas that required the proposal of additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN-curve should lie below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures; the results fell below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)
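
    Societal risk is summarised here as an FN-curve: for each consequence level N, the cumulative annual frequency F of accidents causing N or more fatalities, compared against a criterion line. A minimal sketch with invented scenario frequencies follows; these are illustrative numbers, not the GASDUC III data.

    ```python
    import numpy as np

    def fn_curve(scenarios):
        """Build an FN-curve: cumulative frequency F of accidents causing N or
        more fatalities, from (frequency per year, expected fatalities) pairs."""
        freqs = np.array([s[0] for s in scenarios])
        ns = np.array([s[1] for s in scenarios])
        return [(n, freqs[ns >= n].sum()) for n in np.sort(np.unique(ns))]

    # Hypothetical rupture/ignition scenarios along one populated segment
    scenarios = [(2e-6, 1), (8e-7, 5), (3e-7, 20), (5e-8, 80)]
    for n, f in fn_curve(scenarios):
        print(f"N >= {n:>3d}: F = {f:.1e} /yr")
    ```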

  12. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  13. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
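
    The pooled standardized mean difference, its confidence interval, and the I² heterogeneity statistic reported here come from a random-effects model; the DerSimonian-Laird estimator is the most common way to fit one, so it is used in the sketch below. The per-study effects and variances are hypothetical, not the effects extracted in this meta-analysis.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Pool standardized mean differences with the DerSimonian-Laird
        random-effects model; returns pooled effect, 95% CI, and I^2 (%)."""
        y, v = np.asarray(effects, float), np.asarray(variances, float)
        w = 1 / v
        fixed_mean = np.sum(w * y) / w.sum()
        q = np.sum(w * (y - fixed_mean) ** 2)                      # Cochran's Q
        df = len(y) - 1
        tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w ** 2) / w.sum()))
        w_star = 1 / (v + tau2)                                     # random-effects weights
        pooled = np.sum(w_star * y) / w_star.sum()
        se = np.sqrt(1 / w_star.sum())
        i2 = 100 * max(0.0, (q - df) / q) if q > 0 else 0.0
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

    # Hypothetical per-study SMDs and variances for simulator-trained vs. control
    print(dersimonian_laird([0.5, 1.1, 0.9, 1.4], [0.10, 0.08, 0.12, 0.20]))
    ```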

  14. Quantitative Determination of Compounds from Akebia quinata by High-Performance Liquid Chromatography

    International Nuclear Information System (INIS)

    Yen, Nguyen; Thu, Nguyen; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun; Lee, Jae Hyun; Kim, Jeong Ah; Son, Jong Keun; Choi, Jae Sui; Woo, Eun Rhan

    2014-01-01

    To provide the scientific corroboration of the traditional uses of Akebia quinata (Thunb.) Decne., a detailed analytical examination of A. quinata stems was carried out using a reversed-phase high performance liquid chromatography (RP-HPLC) method coupled to a photodiode array detector (PDA) for the simultaneous determination of four phenolic substances: cuneataside D, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside, 3-caffeoylquinic acid and calceolarioside B. Particular attention was focused on the main compound, 3-caffeoylquinic acid, which has a range of biological functions. In addition, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside was considered as a discernible marker for distinguishing A. quinata from plants with which it is easily confused. The contents of compounds 2 and 3 ranged from 0.72 to 2.68 mg/g and from 1.66 to 5.64 mg/g, respectively. The validation data indicated that this HPLC/PDA assay was used successfully to quantify the four phenolic compounds in A. quinata from different locations using relatively simple conditions and procedures. The pattern-recognition analysis data from 53 samples classified them into two groups, allowing discrimination between A. quinata and comparable herbs. The results suggest that the established HPLC/PDA method is suitable for quantitation and pattern-recognition analyses for a quality evaluation of this medicinal herb.

  15. Quantitative Determination of Compounds from Akebia quinata by High-Performance Liquid Chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Yen, Nguyen; Thu, Nguyen; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun [Catholic Univ. of Daegu, Gyeongsan (Korea, Republic of); Lee, Jae Hyun [Dongguk Univ., Yongin (Korea, Republic of); Kim, Jeong Ah [Kyungpook National Univ., Daegu (Korea, Republic of); Son, Jong Keun [Yeungnam Univ., Gyeongsan (Korea, Republic of); Choi, Jae Sui [Pukyung National Univ., Busan (Korea, Republic of); Woo, Eun Rhan [Chosun Univ., Gwangju (Korea, Republic of)

    2014-07-15

    To provide the scientific corroboration of the traditional uses of Akebia quinata (Thunb.) Decne., a detailed analytical examination of A. quinata stems was carried out using a reversed-phase high performance liquid chromatography (RP-HPLC) method coupled to a photodiode array detector (PDA) for the simultaneous determination of four phenolic substances: cuneataside D, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside, 3-caffeoylquinic acid and calceolarioside B. Particular attention was focused on the main compound, 3-caffeoylquinic acid, which has a range of biological functions. In addition, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside was considered as a discernible marker for distinguishing A. quinata from plants with which it is easily confused. The contents of compounds 2 and 3 ranged from 0.72 to 2.68 mg/g and from 1.66 to 5.64 mg/g, respectively. The validation data indicated that this HPLC/PDA assay was used successfully to quantify the four phenolic compounds in A. quinata from different locations using relatively simple conditions and procedures. The pattern-recognition analysis data from 53 samples classified them into two groups, allowing discrimination between A. quinata and comparable herbs. The results suggest that the established HPLC/PDA method is suitable for quantitation and pattern-recognition analyses for a quality evaluation of this medicinal herb.

  16. Quantitative analysis of Esophageal Transit of Radionuclide in Patients with Dermatomyositis-Polymyositis

    International Nuclear Information System (INIS)

    Chung, June Key; Lee, Myung Chul; Koh, Chang Soon; Lee, Myung Hae

    1989-01-01

    Esophageal transit of radionuclide was quantitatively analyzed in 29 patients with dermatomyositis-polymyositis. Fourteen patients (48.3%) showed retention of tracer in the oropharynx. The mean value of percent retention of the oropharynx was 15.5 ± 16.6%. Esophageal dysfunction was found in 19 patients (65.5%). Among them, 4 showed mild, 12 showed moderate, and 3 showed severe esophageal dysfunction. Dysphagia was found in 11 patients (37.9%), which was closely related to percent retention of the oropharynx. Quantitative analysis of esophageal transit of radionuclide seemed to be a useful technique for evaluation of dysphagia in patients with dermatomyositis-polymyositis.
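
    The oropharyngeal retention figure reported here is, at its core, the residual activity in the oropharyngeal region of interest expressed as a percentage of the counts delivered by the bolus. The sketch below is a minimal illustration; the choice of denominator, the timing convention, and the time-activity curve are all assumptions, not the study's acquisition protocol or patient data.

    ```python
    import numpy as np

    def percent_retention(roi_counts, t_complete=10):
        """Percent retention in an oropharyngeal ROI: residual counts at the
        frame index t_complete (after the swallow) relative to the peak counts."""
        roi_counts = np.asarray(roi_counts, float)
        return 100.0 * roi_counts[t_complete] / roi_counts.max()

    # Hypothetical time-activity curve (counts per frame) after a bolus swallow
    curve = [5, 950, 900, 600, 380, 240, 180, 160, 150, 148, 147]
    print(f"oropharyngeal retention: {percent_retention(curve):.1f}%")
    ```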

  17. Quantitative Analysis of Adulterations in Oat Flour by FT-NIR Spectroscopy, Incomplete Unbalanced Randomized Block Design, and Partial Least Squares

    Directory of Open Access Journals (Sweden)

    Ning Wang

    2014-01-01

    This paper developed a rapid and nondestructive method for quantitative analysis of a cheaper adulterant (wheat flour in oat flour by NIR spectroscopy and chemometrics. Reflectance FT-NIR spectra in the range of 4000 to 12000 cm−1 of 300 oat flour objects adulterated with wheat flour were measured. The doping levels of wheat flour ranged from 5% to 50% (w/w. To ensure the generalization performance of the method, both the oat and the wheat flour samples were collected from different producing areas and an incomplete unbalanced randomized block (IURB design was performed to include the significant variations that may be encountered in future samples. Partial least squares regression (PLSR was used to develop calibration models for predicting the levels of wheat flour. Different preprocessing methods including smoothing, taking second-order derivative (D2, and standard normal variate (SNV transformation were investigated to improve the model accuracy of PLS. The root mean squared error of Monte Carlo cross-validation (RMSEMCCV and root mean squared error of prediction (RMSEP were 1.921 and 1.975 (%, w/w by D2-PLS, respectively. The results indicate that NIR and chemometrics can provide a rapid method for quantitative analysis of wheat flour in oat flour.
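
    The preprocessing-plus-PLSR pipeline evaluated here (SNV, a second-derivative transform, then PLS regression scored by cross-validated error) can be assembled from standard tools. The sketch below uses purely synthetic spectra, so the printed error is meaningless and only the workflow is illustrative; the Savitzky-Golay window, the number of PLS components, and the use of k-fold rather than Monte Carlo cross-validation are assumptions, not the paper's settings.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def snv(spectra):
        """Standard normal variate: centre and scale each spectrum individually."""
        return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

    # Synthetic stand-ins for NIR spectra (rows) and wheat-flour doping levels in % (w/w)
    rng = np.random.default_rng(7)
    X = rng.normal(size=(300, 500)).cumsum(axis=1)
    y = rng.uniform(5, 50, 300)

    # SNV followed by a Savitzky-Golay second derivative along the wavelength axis
    X_pre = savgol_filter(snv(X), window_length=15, polyorder=2, deriv=2, axis=1)

    pls = PLSRegression(n_components=8)
    y_cv = cross_val_predict(pls, X_pre, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"cross-validated RMSE: {rmsecv:.2f} % (w/w)")
    ```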

  18. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  19. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom's Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite finding limited or no evidence for design principles. We describe the inherent challenges, in large-N comparative analysis, of coding complex and dynamically changing common pool resource systems for the presence or absence of design principles and of determining "success". Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of absent design principles explained inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  20. Smile line assessment comparing quantitative measurement and visual estimation.

    Science.gov (United States)

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
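
    Agreement between the quantitative measurements and the visual grade-scale estimates is quantified here with kappa statistics. A minimal sketch of that comparison for a 3-grade scale follows; the ratings below are invented for illustration, not study data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical 3-grade smile-line ratings (low / avg / high) for the same teeth:
    # one grade derived from quantitative lip-line measurements, one estimated visually.
    measured  = ["avg", "high", "low", "avg", "avg", "high", "low", "avg", "high", "avg"]
    estimated = ["avg", "high", "avg", "avg", "avg", "high", "low", "avg", "high", "high"]

    print(f"kappa: {cohen_kappa_score(measured, estimated):.2f}")
    ```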