WorldWideScience

Sample records for perform quantitative analysis

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory …
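
    As a reminder of how the underlying accuracy measures are formed from the extracted counts (the numbers below are hypothetical, not taken from the meta-analysis):

    ```python
    # Hypothetical 2x2 counts for one study arm (not from the meta-analysis).
    tp, fp, tn, fn = 88, 28, 72, 12

    sensitivity = tp / (tp + fn)   # proportion of diseased correctly detected
    specificity = tn / (tn + fp)   # proportion of non-diseased correctly ruled out

    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
    # Pooling an AUC across studies would additionally require fitting a summary
    # ROC model (e.g. a bivariate random-effects model) over all included studies.
    ```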

  2. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms, are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  3. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    A comprehensive experimental study to analyze the security performance of a WLAN based on IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. It establishes the fact that there is always a tradeoff between the security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented the statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing the robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to the assessment of the suitability of security protocols for given real-time applications.

  4. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing the older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.
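
    As a rough illustration of why measured speedup falls short of the ideal when part of a benchmark remains serial (this is Amdahl's law with an arbitrary parallel fraction, not the paper's quantitative model):

    ```python
    # Hypothetical illustration: ideal vs. Amdahl's-law speedup for a code that is
    # 95% parallel (fraction chosen arbitrarily, not measured from SPEC OMPM2001).
    def amdahl_speedup(parallel_fraction: float, threads: int) -> float:
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

    for threads in (2, 4, 8, 16):
        print(threads, "threads: ideal", threads,
              "Amdahl", round(amdahl_speedup(0.95, threads), 2))
    ```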

  5. The Quantitative Analysis of a team game performance made by men's basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of a team game performance made by men's basketball teams at the Olympic Games 2008 Aims: Find the reasons for the successes and failures of teams in the Olympic Games play-off using quantitative (numerical) observation of selected game statistics. Method: The thesis was based on quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained described the most essential events behind a team's win or loss. Keywords: basketball, team...

  6. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  7. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, J., E-mail: takahashi.3ct.jun@jp.nssmc.com [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Kawakami, K. [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Raabe, D. [Max-Planck Institut für Eisenforschung GmbH, Department for Microstructure Physics and Alloy Design, Max-Planck-Str. 1, 40237 Düsseldorf (Germany)

    2017-04-15

    Highlights:
    • Quantitative analysis in Fe-Cu alloy was investigated in voltage and laser atom probe.
    • In voltage-mode, apparent Cu concentration exceeded actual concentration at 20–40 K.
    • In laser-mode, the concentration never exceeded the actual concentration even at 20 K.
    • Detection loss was prevented due to the rise in tip surface temperature in laser-mode.
    • Preferential evaporation of solute Cu was reduced in laser-mode.

    Abstract: The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions.
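
    To make the notion of an 'apparent' versus 'actual' concentration concrete, the toy calculation below uses invented detection efficiencies (not values from the study) to show how a detection loss of matrix Fe inflates the measured Cu fraction while preferential evaporation of Cu depresses it:

    ```python
    # Hypothetical atom-probe counting example (numbers are illustrative only).
    true_cu_fraction = 0.01          # actual solute Cu concentration (1 at.%)
    atoms_total = 1_000_000
    cu_atoms = true_cu_fraction * atoms_total
    fe_atoms = atoms_total - cu_atoms

    # Case 1: some matrix Fe ions are lost at the detector while Cu is fully
    # detected -> the apparent Cu concentration rises above the true value.
    fe_detected = fe_atoms * 0.90    # 10% detection loss of Fe (assumed)
    cu_detected = cu_atoms * 1.00
    apparent_1 = cu_detected / (cu_detected + fe_detected)

    # Case 2: part of the solute Cu field-evaporates between pulses and is never
    # counted -> the apparent Cu concentration falls below the true value.
    cu_detected = cu_atoms * 0.70    # 30% of Cu lost to preferential evaporation (assumed)
    fe_detected = fe_atoms * 1.00
    apparent_2 = cu_detected / (cu_detected + fe_detected)

    print(f"true {true_cu_fraction:.3%}, with Fe loss {apparent_1:.3%}, "
          f"with Cu preferential evaporation {apparent_2:.3%}")
    ```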

  8. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR-3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such an imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, these data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. Continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making.

  9. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    Science.gov (United States)

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  10. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe.

    Science.gov (United States)

    Takahashi, J; Kawakami, K; Raabe, D

    2017-04-01

    The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    Science.gov (United States)

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. As variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution could bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of these main drug categories. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for the four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation … forensic toxicology laboratory. PMID:27635251
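
    As a sketch of how such a calibration is applied in practice, with hypothetical peak areas rather than data from the study, a straight line fitted over the validated 30–3000 ng/mL range converts a sample's peak area into a concentration:

    ```python
    import numpy as np

    # Hypothetical calibration points for one benzodiazepine (peak area vs. ng/mL).
    conc = np.array([30, 100, 300, 1000, 3000], dtype=float)
    area = np.array([1.6, 5.1, 15.2, 50.8, 151.9])

    slope, intercept = np.polyfit(conc, area, 1)      # least-squares line
    r = np.corrcoef(conc, area)[0, 1]                 # should exceed 0.99

    sample_area = 23.4
    sample_conc = (sample_area - intercept) / slope   # back-calculate concentration
    print(f"r = {r:.4f}, estimated concentration = {sample_conc:.0f} ng/mL")
    ```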

  12. High-performance hybrid Orbitrap mass spectrometers for quantitative proteome analysis

    DEFF Research Database (Denmark)

    Williamson, James C; Edwards, Alistair V G; Verano-Braga, Thiago

    2016-01-01

    We present basic workups and quantitative comparisons for two current generation Orbitrap mass spectrometers, the Q Exactive Plus and Orbitrap Fusion Tribrid, which are widely considered two of the highest performing instruments on the market. We assessed the performance of two quantitative methods on both instruments, namely label-free quantitation and stable isotope labeling using isobaric tags, for studying the heat shock response in Escherichia coli. We investigated the recently reported MS3 method on the Fusion instrument and the potential of MS3-based reporter ion isolation, Synchronous Precursor Selection (SPS), and its impact on quantitative accuracy. We confirm that the label-free approach offers a more linear response with a wider dynamic range than MS/MS-based isobaric tag quantitation and that the MS3/SPS approach alleviates but does not eliminate dynamic range compression. We …

  13. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). By using the single Bragg-edge analysis mode of RITS, evaluations of the crystal lattice plane spacing (d-spacing) relating to macro-strain and of the d-spacing distribution's FWHM (full width at half maximum) relating to micro-strain can be achieved. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast determination method of grain orientation for Bragg-dip neutron transmission spectra. In this paper, these imaging examples with the spectrum analysis methods, together with their reliabilities evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.
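
    The macro-strain evaluation described above follows from Bragg's law in transmission geometry; as a brief reminder (these are the standard textbook relations, not equations quoted from the RITS code), the Bragg-edge wavelength gives the lattice plane spacing, and the strain is its relative deviation from a stress-free reference spacing d_0:

    ```latex
    % Bragg edge position (backscattering limit) and macro-strain, standard definitions
    \lambda_{hkl} = 2 d_{hkl} \sin\theta
    \;\xrightarrow{\;\theta \to 90^\circ\;}\;
    \lambda_{\text{edge}} = 2 d_{hkl},
    \qquad
    \varepsilon = \frac{d_{hkl} - d_0}{d_0}
    ```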

  14. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that make it possible to estimate the total number of conductions based on the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study on simulated experiments was performed and analyzed using graphical user interfaces developed in this study.
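
    A minimal numerical sketch of the estimation idea described above, with made-up counts (the real schema, procedure lists and error counts live in the KAERI database and are not reproduced here): the number of conductions of a generic task is inferred from how often it appears in executed procedures, and the human error probability is the error count divided by that opportunity count.

    ```python
    # Hypothetical HEP estimate for one generic task (illustrative numbers only).
    occurrences_per_procedure = 4      # times the generic task appears in a procedure
    procedure_executions = 250         # how often the procedure was carried out
    observed_errors = 2                # human errors recorded for this task

    task_conductions = occurrences_per_procedure * procedure_executions
    hep = observed_errors / task_conductions
    print(f"estimated HEP = {hep:.4f}")   # 2 / 1000 = 0.0020
    ```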

  15. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that make it possible to estimate the total number of conductions based on the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study on simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  16. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    Science.gov (United States)

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https
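
    The following sketch shows the general form of such a knowledge-based Hotelling-type T2 test with an externally supplied covariance matrix; the data and confidence scores are invented for illustration, and this is not the T2GA implementation:

    ```python
    import numpy as np

    # Hypothetical log2 expression ratios for a 3-protein pathway, 4 replicates.
    X = np.array([[0.8, 0.6, 0.9, 0.7],
                  [0.5, 0.4, 0.6, 0.5],
                  [0.2, 0.1, 0.3, 0.2]]).T          # shape: samples x proteins
    n, p = X.shape

    # Knowledge-based covariance, e.g. scaled interaction confidence scores
    # (values are made up; in the paper they come from STRING/HitPredict).
    sigma = np.array([[1.0, 0.7, 0.3],
                      [0.7, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])

    xbar = X.mean(axis=0)
    t2 = n * xbar @ np.linalg.inv(sigma) @ xbar     # T2 against a null mean of 0
    print(f"T2 = {t2:.2f}")
    ```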

  17. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry, SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration with an introduction, titration curves, and diazotization titration, precipitation titration, electrometric titration and quantitative analysis.

  18. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  19. Quantitative analysis and design of a spray aerosol inhaler. Part 2: improvements in mouthpiece performance.

    Science.gov (United States)

    Hindle, Michael; Longest, P Worth

    2013-10-01

    The objective of this study was to utilize previously identified critical design attributes for the capillary aerosol generator as a model spray inhaler in order to develop a second-generation device that minimized aerosol drug deposition in the mouthpiece. Computational fluid dynamics (CFD) predictive analysis of the critical design attributes indicated that turbulence intensity should be reduced and the effective mouthpiece diameter should be increased. Two second-generation inhaler mouthpieces meeting these specifications were manufactured and tested. The first device (Design 1) implemented a larger cross-sectional area in the mouthpiece and streamlined flow, whereas the second device (Design 2) used a perforated mouthpiece wall. An in vitro deposition study was performed to quantify the deposition of drug mass in the mouthpieces and connected induction ports, and the results were compared with the CFD predictions. The two second-generation mouthpieces reduced in vitro aerosol deposition from the original value of 7.8% to values of 2.1% (Device 1) and 4.3% (Device 2), without largely altering the induction port deposition. This was achieved by design alterations aimed at reducing turbulence intensity and increasing the effective mouthpiece diameter. CFD model predictions were in good agreement with the in vitro experimental data. A second-generation spray inhaler mouthpiece with low drug deposition was developed using a predictive CFD model and in vitro experiments. Applying this quantitative analysis and design methodology to medical devices, which is similar to the Quality by Design paradigm, could provide significant advantages compared with traditional approaches.

  20. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance than the conventional approach using a square wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, in which the identified Earth impulse response is obtained by measuring the system output with the voltage response as input. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by a field experiment. The quantitative analysis method proposed in this paper provides new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
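
    For readers unfamiliar with m-sequences, the sketch below generates one with a generic linear-feedback shift register and shows the two-valued circular autocorrelation that underlies their noise robustness; the register length and taps are arbitrary choices, not the coding parameters studied in the paper.

    ```python
    import numpy as np

    def m_sequence(taps, n_bits):
        """Generate a maximal-length sequence from an LFSR with the given taps."""
        state = [1] * n_bits                       # any non-zero seed works
        seq = []
        for _ in range(2 ** n_bits - 1):
            seq.append(state[-1])
            feedback = 0
            for t in taps:
                feedback ^= state[t - 1]
            state = [feedback] + state[:-1]
        return np.array(seq)

    # x^4 + x + 1 is primitive, so taps (4, 1) give a length-15 m-sequence.
    s = 2 * m_sequence(taps=(4, 1), n_bits=4) - 1   # map {0,1} -> {-1,+1}
    # Circular autocorrelation: 15 at zero lag, -1 at every other lag.
    acf = [int(np.dot(s, np.roll(s, k))) for k in range(len(s))]
    print(acf)
    ```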

  1. A Quantitative Analysis for the Correlation Between Corporate Financial and Social Performance

    Directory of Open Access Journals (Sweden)

    Wafaa Salah

    2016-12-01

    Recently, corporate social performance (CSP) has become no less important than corporate financial performance (CFP). Debate still exists about the nature of the relationship between CSP and CFP, whether it is a positive, negative or neutral correlation. The objective of this study is to explore the relationship between corporate social responsibility (CSR) reports and CFP. The study uses accounting-based and market-based quantitative measures to quantify the financial performance of seven organizations listed on the Egyptian Stock Exchange in 2007-2014. It then uses information retrieval technologies to quantify the contribution of each of the three dimensions of the corporate social responsibility report (environmental, social and economic). Finally, the correlation between these two sets of variables is examined in a single model to detect the correlations between them. This model is applied to seven firms that generate social responsibility reports. The results show a positive correlation between earnings per share (a market-based measure) and the economic dimension of the CSR report. On the other hand, total assets and property, plant and equipment (accounting-based measures) are positively correlated with the environmental and social dimensions of the CSR reports, while there is no significant relationship between ROA, ROE, operating income and corporate social responsibility. This study contributes to the literature by providing more clarification of the relationship between CFP and isolated CSR activities in a developing country.

  2. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of the Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  3. Quantitative evaluation of visual detection performance in medicine: ROC analysis and determination of diagnostic benefit

    International Nuclear Information System (INIS)

    Metz, C.E.; Starr, S.J.; Lusted, L.B.

    1976-01-01

    An ROC curve provides an empirical description of the trade-offs which are possible among the various types of correct and incorrect decisions as the human decision-maker varies one or more confidence thresholds. Conventional ROC curves measured in simple decision-making situations can, in some cases, be used to predict human decision performance in more complex situations. By considering both the consequences of the various types of diagnostic decisions and the overhead cost of a diagnostic study, one can use the ROC curve to evaluate the diagnostic usefulness of a study in any particular clinical context. Since the ROC curve describes the possible relationships among the probabilities of the various types of correct and incorrect decisions, it plays a central role in optimizing diagnostic strategies using the general techniques of decision analysis. Applications in radiographic image evaluation are described

  4. Quantitative analysis of total retronecine esters-type pyrrolizidine alkaloids in plant by high performance liquid chromatography

    International Nuclear Information System (INIS)

    Zhang Fang; Wang Changhong; Xiong Aizhen; Wang Wan; Yang Li; Branford-White, Christopher J.; Wang Zhengtao; Bligh, S.W. Annie

    2007-01-01

    Pyrrolizidine alkaloids (PAs) are alkaloids which typically contain a necine (7-hydroxy-1-hydroxymethyl-6,7-dihydro-5H-pyrrolizidine) base unit, and they can be found in one third of the higher plants around the world. They are hepatotoxic, mutagenic and carcinogenic and pose a threat to human health and safety. A specific, quick and sensitive method is therefore needed to detect and quantify the PAs, sometimes present in trace amounts, in herbs, tea or food products. Based on high performance liquid chromatography with prior derivatization of the alkaloids using o-chloranil and Ehrlich's reagent, we report an improved method for quantitative analysis of the total amount of retronecine esters-type pyrrolizidine alkaloids (RET-PAs) in a plant extract. The total quantitation of RET-PAs is achieved because a common colored retronecine marker, a 7-ethoxy-1-ethoxylmethyl retronecine derivative, is produced from all the different RET-PAs during the derivatization reaction. The chemical identity of the common retronecine marker was characterized on-line by positive-mode electrospray ionization mass spectrometry and nuclear magnetic resonance spectroscopy. The limit of detection using the improved method is 0.26 nmol mL⁻¹ and the limit of quantitation is 0.79 nmol mL⁻¹. The advantages of this method are much enhanced sensitivity in detection and quantitation, and no restriction on the choice of RET-PA as a calibration standard. Application of the developed method to the quantitation of total RET esters-type PAs in Senecio scandens from different regions of China is also reported.

  5. Quantitative analysis of total retronecine esters-type pyrrolizidine alkaloids in plant by high performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Fang; Wang Changhong; Xiong Aizhen; Wang Wan; Yang Li [Key Laboratory of Standardization of Chinese Medicines of Ministry of Education, Shanghai University of Traditional Chinese Medicine, 1200 Cai Lun Road, Zhangjiang Hi-Tech Park, Shanghai 201203 (China); Branford-White, Christopher J. [Institute for Health Research and Policy, London Metropolitan University, 166-220 Holloway Road, London N7 8DB (United Kingdom); Wang Zhengtao [Key Laboratory of Standardization of Chinese Medicines of Ministry of Education, Shanghai University of Traditional Chinese Medicine, 1200 Cai Lun Road, Zhangjiang Hi-Tech Park, Shanghai 201203 (China); School of Chinese Pharmacy, China Pharmaceutical University, Nanjing 210038 (China)], E-mail: wangzt@shutcm.edu.cn; Bligh, S.W. Annie [Institute for Health Research and Policy, London Metropolitan University, 166-220 Holloway Road, London N7 8DB (United Kingdom)], E-mail: a.bligh@londonmet.ac.uk

    2007-12-12

    Pyrrolizidine alkaloids (PAs) are alkaloids which typically contain a necine (7-hydroxy-1-hydroxymethyl-6,7-dihydro-5H-pyrrolizidine) base unit, and they can be found in one third of the higher plants around the world. They are hepatotoxic, mutagenic and carcinogenic and pose a threat to human health and safety. A specific, quick and sensitive method is therefore needed to detect and quantify the PAs, sometimes present in trace amounts, in herbs, tea or food products. Based on high performance liquid chromatography with prior derivatization of the alkaloids using o-chloranil and Ehrlich's reagent, we report an improved method for quantitative analysis of the total amount of retronecine esters-type pyrrolizidine alkaloids (RET-PAs) in a plant extract. The total quantitation of RET-PAs is achieved because a common colored retronecine marker, a 7-ethoxy-1-ethoxylmethyl retronecine derivative, is produced from all the different RET-PAs during the derivatization reaction. The chemical identity of the common retronecine marker was characterized on-line by positive-mode electrospray ionization mass spectrometry and nuclear magnetic resonance spectroscopy. The limit of detection using the improved method is 0.26 nmol mL⁻¹ and the limit of quantitation is 0.79 nmol mL⁻¹. The advantages of this method are much enhanced sensitivity in detection and quantitation, and no restriction on the choice of RET-PA as a calibration standard. Application of the developed method to the quantitation of total RET esters-type PAs in Senecio scandens from different regions of China is also reported.

  6. WE-G-207-05: Relationship Between CT Image Quality, Segmentation Performance, and Quantitative Image Feature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J; Nishikawa, R [University of Pittsburgh, Pittsburgh, PA (United States); Reiser, I [The University of Chicago, Chicago, IL (United States); Boone, J [UC Davis Medical Center, Sacramento, CA (United States)

    2015-06-15

    Purpose: Segmentation quality can affect quantitative image feature analysis. The objective of this study is to examine the relationship between computed tomography (CT) image quality, segmentation performance, and quantitative image feature analysis. Methods: A total of 90 pathology proven breast lesions in 87 dedicated breast CT images were considered. An iterative image reconstruction (IIR) algorithm was used to obtain CT images with different quality. With different combinations of 4 variables in the algorithm, this study obtained a total of 28 different qualities of CT images. Two imaging tasks/objectives were considered: 1) segmentation and 2) classification of the lesion as benign or malignant. Twenty-three image features were extracted after segmentation using a semi-automated algorithm and 5 of them were selected via a feature selection technique. Logistic regression was trained and tested using leave-one-out-cross-validation and its area under the ROC curve (AUC) was recorded. The standard deviation of a homogeneous portion and the gradient of a parenchymal portion of an example breast were used as an estimate of image noise and sharpness. The DICE coefficient was computed using a radiologist’s drawing on the lesion. Mean DICE and AUC were used as performance metrics for each of the 28 reconstructions. The relationship between segmentation and classification performance under different reconstructions were compared. Distributions (median, 95% confidence interval) of DICE and AUC for each reconstruction were also compared. Results: Moderate correlation (Pearson’s rho = 0.43, p-value = 0.02) between DICE and AUC values was found. However, the variation between DICE and AUC values for each reconstruction increased as the image sharpness increased. There was a combination of IIR parameters that resulted in the best segmentation with the worst classification performance. Conclusion: There are certain images that yield better segmentation or classification
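
    As a reminder of the segmentation metric used above (the standard definition, illustrated with toy masks rather than the study's breast CT data):

    ```python
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """DICE = 2|A∩B| / (|A| + |B|) for binary masks of equal shape."""
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    # Toy 1-D 'masks' standing in for a lesion segmentation and a reference drawing.
    seg = np.array([0, 1, 1, 1, 1, 0, 0, 0])
    ref = np.array([0, 0, 1, 1, 1, 1, 0, 0])
    print(f"DICE = {dice(seg, ref):.2f}")   # 2*3 / (4+4) = 0.75
    ```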

  7. Utility of DWI with quantitative ADC values in ovarian tumors: a meta-analysis of diagnostic test performance.

    Science.gov (United States)

    Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui

    2018-01-01

    Background: Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose: To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods: PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results: From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion: Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.

  8. Qualitative and quantitative analysis of anthraquinones in rhubarbs by high performance liquid chromatography with diode array detector and mass spectrometry.

    Science.gov (United States)

    Wei, Shao-yin; Yao, Wen-xin; Ji, Wen-yuan; Wei, Jia-qi; Peng, Shi-qi

    2013-12-01

    Rhubarb is well known in traditional Chinese medicine (TCM), mainly because of its effective purgative activity. Anthraquinones, including anthraquinone derivatives and their glycosides, are thought to be the major active components in rhubarb. To improve the quality control method for rhubarb, we studied the extraction method and performed qualitative and quantitative analysis of the widely used rhubarbs Rheum tanguticum Maxim. ex Balf. and Rheum palmatum L. by HPLC with photodiode array detection (HPLC-DAD) and HPLC-mass spectrometry (HPLC-MS) on a Waters SymmetryShield RP18 column (250 mm × 4.6 mm i.d., 5 μm). The amount of five anthraquinones was used as the evaluation standard. A standardized characteristic fingerprint of rhubarb was provided. The quantitative analysis demonstrated the rationality of the ancestral practice of using these two species of rhubarb interchangeably. Under modern extraction methods, the amount of the five anthraquinones in Rheum tanguticum Maxim. ex Balf. is higher than that in Rheum palmatum L. Among the various extraction methods, ultrasonication with 70% methanol for 30 min is the most promising. For HPLC analysis, a mobile phase consisting of methanol and 0.1% phosphoric acid in water with a gradient program, with detection wavelengths of 280 nm for fingerprint analysis and 254 nm for quantitative analysis, is a good choice. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Quantitative and pattern recognition analysis of five marker compounds in Raphani semen using high-performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Yeon Woo; Lee, Joo Sang; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun [College of Pharmacy, Drug Research and Development Center, Catholic University of Daegu, Gyeongsan (Korea, Republic of); Kim, Jeong Ah [College of Pharmacy, Research Institute of Pharmaceutical Sciences, Kyungpook National University, Daegu (Korea, Republic of); Eun, Rhan Woo [College of Pharmacy, Chosun University, Gwangju (Korea, Republic of)

    2015-09-15

    A rapid and simple high-performance liquid chromatography (HPLC)-photodiode array (PDA) analytical method was developed for the quantitative analysis of Raphani Semen (RS). This method was successfully used to determine the five main phenolic compounds found in RS specimens from different production regions. The compounds included sinapine thiocyanate (1), β-d-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (2), isorhamnetin 3,4′-di-O-β-d-glucoside (3), β-d-(3-O-sinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (4), and β-d-(3,4-O-disinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (5). The marker compounds were separated using an Agilent Eclipse XDB-C18 column (5.0 µm, 150 × 4.6 mm i.d.) by gradient elution with acetonitrile/water/0.1% trifluoroacetic acid (TFA) as the mobile phase (flow rate, 1.0 mL/min). This method was fully validated with respect to linearity, precision, accuracy, stability, and robustness. The HPLC analytical method was validated to conduct a pattern recognition analysis by repeatedly analyzing 56 seed samples including 55 RS (C01–C49 and K50–K55) and 1 Brassicae Semen samples. In addition, a content standard for RS was proposed. Compounds 1 and 4 were revealed as major components in the HPLC chromatogram, and their contents ranged from 0.06 to 0.20 and 0.02 to 0.35 mg/g, respectively. These results demonstrate the successful development of an analytical method suitable for evaluating the quality and distinguishing the origin of RS. In addition, we briefly describe the crucial liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical conditions for the precise simultaneous quantification of the marker compounds.

  10. Quantitative and pattern recognition analysis of five marker compounds in Raphani semen using high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Jung, Yeon Woo; Lee, Joo Sang; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun; Kim, Jeong Ah; Eun, Rhan Woo

    2015-01-01

    A rapid and simple high-performance liquid chromatography (HPLC)-photodiode array (PDA) analytical method was developed for the quantitative analysis of Raphani Semen (RS). This method was successfully used to determine the five main phenolic compounds found in RS specimens from different production regions. The compounds included sinapine thiocyanate (1), β-d-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (2), isorhamnetin 3,4′-di-O-β-d-glucoside (3), β-d-(3-O-sinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (4), and β-d-(3,4-O-disinapoyl)-fructofuranosyl-α-d-(6-O-sinapoyl)-glucopyranoside (5). The marker compounds were separated using an Agilent Eclipse XDB-C18 column (5.0 µm, 150 × 4.6 mm i.d.) by gradient elution with acetonitrile/water/0.1% trifluoroacetic acid (TFA) as the mobile phase (flow rate, 1.0 mL/min). This method was fully validated with respect to linearity, precision, accuracy, stability, and robustness. The HPLC analytical method was validated to conduct a pattern recognition analysis by repeatedly analyzing 56 seed samples including 55 RS (C01–C49 and K50–K55) and 1 Brassicae Semen samples. In addition, a content standard for RS was proposed. Compounds 1 and 4 were revealed as major components in the HPLC chromatogram, and their contents ranged from 0.06 to 0.20 and 0.02 to 0.35 mg/g, respectively. These results demonstrate the successful development of an analytical method suitable for evaluating the quality and distinguishing the origin of RS. In addition, we briefly describe the crucial liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical conditions for the precise simultaneous quantification of the marker compounds

  11. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    … value are useful for the differentiation of various renal diseases; however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  12. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    … are useful for the differentiation of various renal diseases; however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  13. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing the quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  14. Quantitative analysis of phylloquinone (vitamin K1) in soy bean oils by high-performance liquid chromatography.

    Science.gov (United States)

    Zonta, F; Stancher, B

    1985-07-19

    A high-performance liquid chromatographic method for determining phylloquinone (vitamin K1) in soy bean oils is described. Resolution of vitamin K1 from interfering peaks of the matrix was obtained after enzymatic digestion, extraction and liquid-solid chromatography on alumina. Isocratic reversed-phase chromatography with UV detection was used in the final stage. The quantitation was carried out by the standard addition method, and the recovery of the whole procedure was 88.2%.
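
    A brief sketch of the standard addition quantitation mentioned above, using invented peak areas rather than the soy bean oil data: known spikes are added to aliquots of the extract, the response is regressed on the spike level, and the original amount is read from the x-intercept of the fitted line.

    ```python
    import numpy as np

    # Hypothetical standard-addition series: spike added (ng) vs. peak area.
    added = np.array([0.0, 5.0, 10.0, 20.0])
    area = np.array([3.1, 5.2, 7.0, 11.1])

    slope, intercept = np.polyfit(added, area, 1)
    original_amount = intercept / slope      # magnitude of the x-intercept
    print(f"vitamin K1 in the unspiked aliquot ≈ {original_amount:.1f} ng")
    ```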

  15. Brain wave correlates of attentional states: Event related potentials and quantitative EEG analysis during performance of cognitive and perceptual tasks

    Science.gov (United States)

    Freeman, Frederick G.

    1993-01-01

    … presented target stimulus. In addition to the task requirements, irrelevant tones were presented in the background. Research has shown that even though these stimuli are not attended, ERPs to them can still be elicited. The amplitude of the ERP waves has been shown to change as a function of a person's level of alertness. ERPs were also collected and analyzed for the target stimuli for each task. Brain maps were produced based on the ERP voltages for the different stimuli. In addition to the ERPs, a quantitative EEG (QEEG) analysis was performed on the data using a fast Fourier technique to produce a power spectral analysis of the EEG. This analysis was conducted on the continuous EEG while the subjects were performing the tasks. Finally, a QEEG was performed on periods during the task when subjects indicated that they were in an altered state of awareness. During the tasks, subjects were asked to indicate by pressing a button when they realized their level of task awareness had changed. EEG epochs were collected for times just before and just after subjects made this response. The purpose of this final analysis was to determine whether or not subjective indices of level of awareness could be correlated with different patterns of EEG.
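
    A minimal sketch of the fast-Fourier-transform power spectral step described above, using a synthetic signal and an assumed sampling rate rather than the study's recordings:

    ```python
    import numpy as np

    fs = 256                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)              # one 2-s EEG epoch
    # Synthetic epoch: 10 Hz 'alpha' rhythm plus noise.
    x = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

    spectrum = np.fft.rfft(x * np.hanning(x.size))
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    power = np.abs(spectrum) ** 2

    alpha = power[(freqs >= 8) & (freqs <= 12)].sum()
    total = power[(freqs >= 1) & (freqs <= 30)].sum()
    print(f"relative alpha power = {alpha / total:.2f}")
    ```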

  16. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development business together with a performance analysis. It covers the division and definition of the business, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the results of the performance analysis by index, the results of the investigation, and the analysis and appraisal of the energy and resource technology development business in 2007.

  17. Chemical Fingerprint and Quantitative Analysis for the Quality Evaluation of Platycladi cacumen by Ultra-performance Liquid Chromatography Coupled with Hierarchical Cluster Analysis.

    Science.gov (United States)

    Shan, Mingqiu; Li, Sam Fong Yau; Yu, Sheng; Qian, Yan; Guo, Shuchen; Zhang, Li; Ding, Anwei

    2018-01-01

    Platycladi cacumen (the dried twigs and leaves of Platycladus orientalis (L.) Franco) is a frequently utilized Chinese medicinal herb. To evaluate the quality of the phytomedicine, an ultra-performance liquid chromatographic method with diode array detection was established for chemical fingerprinting and quantitative analysis. In this study, 27 batches of P. cacumen from different regions were collected for analysis. A chemical fingerprint with 20 common peaks was obtained using the Similarity Evaluation System for Chromatographic Fingerprint of Traditional Chinese Medicine (Version 2004A). Among these 20 components, seven flavonoids (myricitrin, isoquercitrin, quercitrin, afzelin, cupressuflavone, amentoflavone and hinokiflavone) were identified and determined simultaneously. In the method validation, the seven analytes showed good regressions (R ≥ 0.9995) within their linear ranges and good recoveries from 96.4% to 103.3%. Furthermore, using the contents of these seven flavonoids, hierarchical clustering analysis was applied to divide the 27 batches into five groups. The chemometric results showed that these groups were largely consistent with the geographical positions and climatic conditions of the production regions. Integrating fingerprint analysis, simultaneous determination and hierarchical clustering analysis, the established method is rapid, sensitive, accurate and readily applicable, and provides a significant foundation for the efficient quality control of P. cacumen. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
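
    A short sketch of the chemometric step described above, using SciPy and made-up flavonoid contents in place of the 27 real batches:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical contents (mg/g) of three flavonoids for six batches.
    contents = np.array([
        [1.2, 0.8, 0.5],
        [1.1, 0.9, 0.6],
        [0.4, 0.3, 0.2],
        [0.5, 0.2, 0.3],
        [2.0, 1.5, 1.1],
        [1.9, 1.6, 1.0],
    ])

    Z = linkage(contents, method="ward")        # agglomerative clustering
    groups = fcluster(Z, t=3, criterion="maxclust")
    print(groups)                               # e.g. [2 2 1 1 3 3]
    ```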

  18. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  19. Quantitative Concept Analysis

    NARCIS (Netherlands)

    Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas

    2012-01-01

    Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all

  20. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  1. Validation of high-performance liquid chromatography (HPLC) method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    A high-performance liquid chromatography method for the quantitative determination of histamine in fish and fishery product samples is described and validated. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid, and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed-phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validation parameters were satisfactory, and the method is a useful tool for determining histamine in fish and fishery products.

  2. Fingerprint analysis, multi-component quantitation, and antioxidant activity for the quality evaluation of Salvia miltiorrhiza var. alba by high-performance liquid chromatography and chemometrics.

    Science.gov (United States)

    Zhang, Danlu; Duan, Xiaoju; Deng, Shuhong; Nie, Lei; Zang, Hengchang

    2015-10-01

    Salvia miltiorrhiza Bge. var. alba C.Y. Wu and H.W. Li has wide prospects in clinical practice. A useful comprehensive method was developed for the quality evaluation of S. miltiorrhiza var. alba by three quantitative parameters: high-performance liquid chromatography fingerprint, ten-component contents, and antioxidant activity. The established method was validated for linearity, precision, repeatability, stability, and recovery. Principal components analysis and hierarchical clustering analysis were both used to evaluate the quality of the samples from different origins. The results showed that there were category discrepancies in quality of S. miltiorrhiza var. alba samples according to the three quantitative parameters. Multivariate linear regression was adopted to explore the relationship between components and antioxidant activity. Three constituents, namely, danshensu, rosmarinic acid, and salvianolic acid B, significantly correlated with antioxidant activity, and were successfully elucidated by the optimized multivariate linear regression model. The combined use of high-performance liquid chromatography fingerprint analysis, simultaneous multicomponent quantitative analysis, and antioxidant activity for the quality evaluation of S. miltiorrhiza var. alba is a reliable, comprehensive, and promising approach, which might provide a valuable reference for other herbal products in general to improve their quality control. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. If compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades starting from planar scintigraphy, this paper discusses the methods used today incorporating the changes due to tomographic image acquisitions. Finally, the limitations of these approaches as well as consequences from most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  4. Qualitative and quantitative analysis of branches in dextran using high-performance anion exchange chromatography coupled to quadrupole time-of-flight mass spectrometry.

    Science.gov (United States)

    Yi, Lin; Ouyang, Yilan; Sun, Xue; Xu, Naiyu; Linhardt, Robert J; Zhang, Zhenqing

    2015-12-04

    Dextran, a family of natural polysaccharides, consists of an α (1→6) linked-glucose main (backbone) chain having a number of branches. The determination of the types and the quantities of branches in dextran is important in understanding its various biological roles. In this study, a hyphenated method using high-performance anion exchange chromatography (HPAEC) in parallel with pulsed amperometric detection (PAD) and mass spectrometry (MS) was applied to qualitative and quantitative analysis of dextran branches. A rotary cation-exchange cartridge array desalter was used for removal of salt from the HPAEC eluent making it MS compatible. MS and MS/MS were used to provide structural information on the enzymatically prepared dextran oligosaccharides. PAD provides quantitative data on the ratio of enzyme-resistant, branched dextran oligosaccharides. Both the types and degree of branching found in a variety of dextrans could be simultaneously determined online using this method. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Rapid quantitative analysis of individual anthocyanin content based on high-performance liquid chromatography with diode array detection with the pH differential method.

    Science.gov (United States)

    Wang, Huayin

    2014-09-01

    A new quantitative technique for the simultaneous quantification of the individual anthocyanins based on the pH differential method and high-performance liquid chromatography with diode array detection is proposed in this paper. The six individual anthocyanins (cyanidin 3-glucoside, cyanidin 3-rutinoside, petunidin 3-glucoside, petunidin 3-rutinoside, and malvidin 3-rutinoside) from mulberry (Morus rubra) and Liriope platyphylla were used for demonstration and validation. The elution of anthocyanins was performed using a C18 column with stepwise gradient elution and individual anthocyanins were identified by high-performance liquid chromatography with tandem mass spectrometry. Based on the pH differential method, the high-performance liquid chromatography peak areas at the maximum and reference absorption wavelengths of anthocyanin extracts were used to quantify individual anthocyanins. The calibration curves for these anthocyanins were linear within the range of 10-5500 mg/L. The correlation coefficients (r(2)) all exceeded 0.9972, and the limits of detection were in the range of 1-4 mg/L at a signal-to-noise ratio ≥5 for these anthocyanins. The proposed quantitative analysis was reproducible, with good accuracy for all individual anthocyanins ranging from 96.3 to 104.2%, and relative recoveries were in the range 98.4-103.2%. The proposed technique is performed without anthocyanin standards and is a simple, rapid, accurate, and economical method to determine individual anthocyanin contents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
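
    For background, the classical spectrophotometric pH differential calculation that the peak-area adaptation builds on can be sketched as follows (the absorbances and dilution factor are illustrative, and the paper's own procedure works from HPLC peak areas rather than cuvette absorbances):

      # Classical pH differential estimate of monomeric anthocyanin content,
      # expressed as cyanidin-3-glucoside equivalents (illustrative values).
      MW = 449.2       # g/mol, cyanidin-3-glucoside
      EPSILON = 26900  # L/(mol*cm), molar absorptivity
      PATH = 1.0       # cm, cuvette path length
      DILUTION = 10

      A_ph1 = 0.512 - 0.015    # A(lambda_vis-max) - A(700 nm) at pH 1.0
      A_ph45 = 0.078 - 0.012   # same difference at pH 4.5
      A = A_ph1 - A_ph45

      mg_per_L = A * MW * DILUTION * 1000 / (EPSILON * PATH)
      print(f"anthocyanin content: {mg_per_L:.1f} mg/L")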

  6. Quantitative Analysis on the Influence Factors of the Sustainable Water Resource Management Performance in Irrigation Areas: An Empirical Research from China

    Directory of Open Access Journals (Sweden)

    Hulin Pan

    2018-01-01

    Full Text Available Performance evaluation and influence factor analysis are vital to sustainable water resources management (SWRM) in irrigation areas. Based on the objectives and the implementation framework of modern integrated water resources management (IWRM), this research systematically developed an index system of the performances of SWRM in irrigation areas and their influence factors. Using the method of multivariate regression combined with correlation analysis, this study quantitatively estimated the effect of multiple factors on the water resources management performances of irrigation areas in the Ganzhou District of Zhangye, Gansu, China. The results are presented below. The overall performance is mainly affected by the management enabling environment and management institutions, with regression coefficients of 0.0117 and 0.0235, respectively. The performance of ecological sustainability is mainly influenced by the local economic development level and the enabling environment, with regression coefficients of 0.08642 and −0.0118, respectively. The performance of water use equity is mainly influenced by information publicity, administrators’ education level and ordinary water users’ participation level, with correlation coefficients of 0.637, 0.553 and 0.433, respectively. The performance of water use economic efficiency is mainly influenced by the management institutions and instruments, with regression coefficients of −0.07844 and 0.01808, respectively. In order to improve the overall performance of SWRM in irrigation areas, it is necessary to strengthen public participation, improve the managers’ ability and provide sufficient financial support to management organizations.

  7. Quantitative analysis of cocaine and its metabolites in whole blood and urine by high-performance liquid chromatography coupled with tandem mass spectrometry.

    Science.gov (United States)

    Johansen, Sys Stybe; Bhatia, Helle Merete

    2007-06-01

    In forensic toxicology it is important to have specific and sensitive analyses for quantification of illicit drugs in biological matrices. This paper describes a quantitative method for determination of cocaine and its major metabolites (ecgonine methyl ester, benzoylecgonine, norcocaine and ethylene cocaine) in whole blood and urine by liquid chromatography coupled with tandem mass spectrometry LC/MS/MS. The sample pre-treatment (0.20 g) consisted of acid precipitation, followed by centrifugation and solid phase extraction of the supernatant using mixed mode sorbent columns (SPEC MP1 Ansys Diag. Inc.). Chromatographic separation was performed at 30 degrees C on a reverse phase Zorbax C18 column with a gradient system consisting of formic acid, water and acetonitrile. The analysis was performed by positive electrospray ionisation with a triple quadrupole mass spectrometer operating in multiple reaction monitoring (MRM) mode. Two MRM transitions of each analyte were established and identification criteria were set up based on the retention time and the ion ratio. The quantification was performed using deuterated analogues of cocaine, benzoylecgonine and ecgonine methyl ester as internal standards. The calibration curves of extracted standards were linear over a working range of 0.001-2.00 mg/kg whole blood for all analytes. The limit of quantification was 0.008 mg/kg; the interday precision (measured by relative standard deviation-%RSD) was less than 10% and the accuracy (BIAS) less than 12% for all analytes in whole blood. Urine samples were estimated semi-quantitatively at a cut-off level of 0.15 mg/kg with an interday precision of 15%. A liquid chromatography mass spectrometric (LC/MS/MS) method has been developed for confirmation and quantification of cocaine and its metabolites (ecgonine methyl ester, benzoylecgonine, norcocaine and ethylene cocaine) in whole blood and semi-quantitative in urine. The method is specific and sensitive and offers thereby an excellent alternative to

  8. Quantitative analysis of multiple fatty acid ethanolamides using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Lin, Lin; Yang, Haifeng; Jones, Peter J H

    2012-12-01

    Fatty acid ethanolamides (FAE) represent a group of lipid signaling molecules associated with many physiological and pharmacological actions; however, low FAE tissue levels pose challenges in terms of analytical characterization. The objective was to develop a competent ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method for analysis of multiple FAE in animal and human tissue samples. Analytes were extracted using lipid-phase and solid-phase extraction procedures. Chromatographic separation was achieved using a gradient elution in 8 min. FAE were quantified by MS/MS in positive electrospray ionization mode. Linearity was shown in lower and higher FAE concentration ranges, with a limit of quantification (LOQ) ≤0.2 ng/ml for FAE including alpha-linolenoylethanolamide (ALEA), arachidonoylethanolamide (AEA), docosahexaenoylethanolamide (DHEA), linoleoylethanolamide (LEA), oleoylethanolamide (OEA) and palmitoylethanolamide (PEA). Accuracy was shown to be between 92.4% and 108.8%, and precision was <10% for all FAE species. In sum, this sensitive and reproducible method can be used to simultaneously determine multiple FAE at low concentrations in order to facilitate further study of the role of FAE on physiological state. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    International Nuclear Information System (INIS)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-01-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation ("CT edema") on VNC DECT images; specificity, sensitivity, predictive values, intra- and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for "CT edema" on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)
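
    A minimal sketch of how such a CT-number cut-off can be chosen by ROC analysis (illustrative values, scikit-learn assumed available; not the authors' software):

      # Choose a CT-number cut-off by maximising Youden's J on illustrative data.
      import numpy as np
      from sklearn.metrics import roc_curve

      ct_numbers = np.array([-45, -30, -10, 5, 12, 20, 28, 35, 40, 55], dtype=float)
      edema_on_mr = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])  # MRI reference standard

      fpr, tpr, thresholds = roc_curve(edema_on_mr, ct_numbers)
      j = tpr - fpr
      best = int(np.argmax(j))
      print(f"cut-off >= {thresholds[best]:.1f} HU, "
            f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")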

  10. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    Energy Technology Data Exchange (ETDEWEB)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis [University Hospital of Strasbourg, Department of Radiology, Strasbourg (France)

    2014-04-15

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation ("CT edema") on VNC DECT images; specificity, sensitivity, predictive values, intra- and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for "CT edema" on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)

  11. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI.

    Science.gov (United States)

    Bierry, Guillaume; Venkatasamy, Aïna; Kremer, Stéphane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-04-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation ("CT edema") on VNC DECT images; specificity, sensitivity, predictive values, intra- and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84%, specificity of 97%, and accuracy of 95%, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85% (77%) and specificity of 82% (74%) for "CT edema" on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation.

  12. A simultaneous screening and quantitative method for the multiresidue analysis of pesticides in spices using ultra-high performance liquid chromatography-high resolution (Orbitrap) mass spectrometry.

    Science.gov (United States)

    Goon, Arnab; Khan, Zareen; Oulkar, Dasharath; Shinde, Raviraj; Gaikwad, Suresh; Banerjee, Kaushik

    2018-01-12

    A novel screening and quantitation method is reported for non-target multiresidue analysis of pesticides using ultra-HPLC-quadrupole-Orbitrap mass spectrometry in spice matrices, including black pepper, cardamom, chili, coriander, cumin, and turmeric. The method involved sequential full-scan (resolution = 70,000), and variable data independent acquisition (vDIA) with nine consecutive fragmentation events (resolution = 17,500). Samples were extracted by the QuEChERS method. The introduction of an SPE-based clean-up step through hydrophilic-lipophilic-balance (HLB) cartridges proved advantageous in minimizing the false negatives. For coriander, cumin, chili, and cardamom, the screening detection limit was largely at 2 ng/g, while it was 5 ng/g for black pepper, and turmeric. When the method was quantitatively validated for 199 pesticides, the limit of quantification (LOQ) was mostly at 10 ng/g (excluding black pepper, and turmeric with LOQ = 20 ng/g) with recoveries within 70-120%, and precision-RSDs <20%. Furthermore, the method allowed the identification of suspected non-target analytes through retrospective search of the accurate mass of the compound-specific precursor and product ions. Compared to LC-MS/MS, the quantitative performance of this Orbitrap-MS method had agreements in residue values between 78-100%. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the Reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability. This is the first book to take a quantitative approach to the subject of software performance and scalability

  14. An Analysis of the Effect of Quantitative and Qualitative Admissions Factors in Determining Student Performance at the U.S. Naval Academy

    National Research Council Canada - National Science Library

    Phillips, Barton

    2004-01-01

    .... The Candidate Multiple (CM) is the quantitative input to the admissions process derived from a statistics-based scoring model anchored in proven high school performance measures such as the SAT and high school GPA...

  15. Quantitative analysis of pharmaceuticals in biological fluids using high-performance liquid chromatography coupled to mass spectrometry: a review.

    Science.gov (United States)

    Plumb, R S; Dear, G J; Mallett, D N; Higton, D M; Pleasance, S; Biddlecombe, R A

    2001-01-01

    1. The development of bio-analysis of drug molecules over the last 10 years is reviewed, focusing on advances in sample preparation, liquid chromatography and detection. 2. Developments have led to improvements in detection sensitivity, enhancements in specificity and increased capacity. 3. Emerging technologies such as monolithic column chromatography and miniaturized chip-based systems are discussed.

  16. Quantitative Analysis of Ingenol in Euphorbia species via Validated Isotope Dilution Ultra-high Performance Liquid Chromatography Tandem Mass Spectrometry

    Czech Academy of Sciences Publication Activity Database

    Béres, T.; Dragull, K.; Pospíšil, Jiří; Tarkowská, Danuše; Dančák, M.; Bíba, Ondřej; Tarkowski, P.; Doležal, K.; Strnad, Miroslav

    2018-01-01

    Vol. 29, No. 1 (2018), p. 23-29 ISSN 0958-0344 R&D Projects: GA ČR GA17-14007S; GA MŠk(CZ) LO1204 Institutional support: RVO:61389030 Keywords: Euphorbia genus * ingenol * isotope-dilution method * mass spectrometry * ultra-high performance liquid chromatography Subject RIV: FD - Oncology ; Hematology OBOR OECD: Analytical chemistry Impact factor: 2.292, year: 2016

  17. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  18. Methodology for quantitative evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.

    1981-01-01

    Of various approaches that might be taken to the diagnostic performance evaluation problem, Receiver Operating Characteristic (ROC) analysis holds great promise. The methodology for a unified, objective, and meaningful approach to evaluating the usefulness of medical imaging procedures is developed further through consideration of statistical significance testing, optimal sequencing of correlated studies, and analysis of observer performance.

  19. Qualitative and Quantitative Analysis of Rhizoma Smilacis glabrae by Ultra High Performance Liquid Chromatography Coupled with LTQ OrbitrapXL Hybrid Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Shao-Dan Chen

    2014-07-01

    Full Text Available Rhizoma Smilacis glabrae, a traditional Chinese medicine (TCM) as well as a functional food, has been commonly used for detoxification treatments, relieving dampness and as a diuretic. In order to quickly define the chemical profiles and control the quality of Smilacis glabrae, ultra high performance liquid chromatography coupled with electrospray ionization hybrid linear trap quadrupole orbitrap mass spectrometry (UHPLC-ESI/LTQ-Orbitrap-MS) was applied for simultaneous identification and quantification of its bioactive constituents. A total of 56 compounds, including six new compounds, were identified or tentatively deduced on the basis of their retention behaviors, mass spectra, or by comparison with reference substances and literature data. The identified compounds belonged to flavonoids, phenolic acids and phenylpropanoid glycosides. In addition, an optimized UHPLC-ESI/LTQ-Orbitrap-MS method was established for quantitative determination of six marker compounds from five batches. The validation of the method, including linearity, sensitivity (LOQ), precision, repeatability and spike recoveries, was carried out and demonstrated to satisfy the requirements of quantitative analysis. The results suggested that the established method would be a powerful and reliable analytical tool for the characterization of multiple constituents in complex chemical systems and for the quality control of TCM.

  20. Qualitative and quantitative analysis of an alkaloid fraction from Piper longum L. using ultra-high performance liquid chromatography-diode array detector-electrospray ionization mass spectrometry.

    Science.gov (United States)

    Li, Kuiyong; Fan, Yunpeng; Wang, Hui; Fu, Qing; Jin, Yu; Liang, Xinmiao

    2015-05-10

    In previous research, an alkaloid fraction and 18 alkaloid compounds were prepared from Piper longum L. by a series of purification processes. In this paper, a qualitative and quantitative analysis method using ultra-high performance liquid chromatography-diode array detector-mass spectrometry (UHPLC-DAD-MS) was developed to evaluate the alkaloid fraction. Qualitative analysis of the alkaloid fraction was first completed by the UHPLC-DAD method and 18 amide alkaloid compounds were identified. A further qualitative analysis of the alkaloid fraction was accomplished by the UHPLC-MS/MS method. Another 25 amide alkaloids were identified according to their characteristic ions and neutral losses. Finally, a quantitative method for the alkaloid fraction was established using four marker compounds including piperine, pipernonatine, guineensine and N-isobutyl-2E,4E-octadecadienamide. After the validation of this method, the contents of the above four marker compounds in the alkaloid fraction were 57.5 mg/g, 65.6 mg/g, 17.7 mg/g and 23.9 mg/g, respectively. Moreover, the relative response factors of the other three compounds to piperine were calculated. A comparative study between external standard quantification and relative response factor quantification showed no remarkable difference. The UHPLC-DAD-MS method was demonstrated to be a powerful tool for the characterization of the alkaloid fraction from P. longum L. and the result proved that the quality of the alkaloid fraction was efficiently improved after appropriate purification. Copyright © 2015. Published by Elsevier B.V.
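
    A brief sketch of quantification via relative response factors as described (the calibration slope, response factors and peak areas below are illustrative, not the paper's values):

      # Quantify analytes against a single calibrated standard (piperine) using
      # relative response factors (RRF); illustrative numbers only.
      piperine_slope = 1.8e4            # peak area per (mg/g), from the piperine calibration curve
      rrf = {"pipernonatine": 0.92,     # detector response relative to piperine
             "guineensine": 1.05,
             "N-isobutyl-2E,4E-octadecadienamide": 0.88}
      peak_areas = {"pipernonatine": 1.10e6, "guineensine": 3.3e5,
                    "N-isobutyl-2E,4E-octadecadienamide": 3.8e5}

      for name, area in peak_areas.items():
          content = area / (piperine_slope * rrf[name])   # mg/g, via the piperine curve
          print(f"{name}: {content:.1f} mg/g")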

  1. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for the design of rapid methods for quality control of suppositories with different components.

  2. [Qualitative and quantitative analysis of amygdalin and its metabolite prunasin in plasma by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry].

    Science.gov (United States)

    Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi

    2014-06-01

    A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma sample was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 μm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time of flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma sample was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 μm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer utilizing an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis results showed that amygdalin and its metabolite prunasin were detected in the plasma sample. The quantitative analysis results showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990 and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had a good precision with the relative standard deviations (RSDs) lower than 9.20% and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.

  3. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few untreated hairs with a proton beam, and quantitative analysis was carried out by means of the standard-free method developed by the authors. First, quantitative values of the concentration of zinc were derived, then the concentration of other elements was obtained by regarding zinc as an internal standard. As a result, values of the concentration of sulphur for 40 samples agree well with the average value for a typical Japanese subject and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method.
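
    The internal-standard step mentioned above can be sketched as follows (the standard-free derivation of the zinc concentration itself is not shown; the yields and relative sensitivity factors are illustrative):

      # Estimate elemental concentrations from PIXE X-ray yields, taking zinc as an
      # internal standard; relative sensitivity factors are illustrative only.
      c_zn = 180.0   # microgram/g, zinc concentration from the standard-free step
      yields = {"Zn": 5.2e4, "S": 9.5e5, "Fe": 2.1e4}    # background-corrected peak counts
      sensitivity = {"Zn": 1.00, "S": 0.11, "Fe": 0.85}  # yield per unit concentration, relative to Zn

      for el, y in yields.items():
          c = c_zn * (y / yields["Zn"]) * (sensitivity["Zn"] / sensitivity[el])
          print(f"{el}: {c:.0f} microgram/g")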

  4. Quantitative analysis for the determination of aluminum percentage and detonation performance of aluminized plastic bonded explosives by laser-induced breakdown spectroscopy

    Science.gov (United States)

    Rezaei, A. H.; Keshavarz, M. H.; Kavosh Tehrani, M.; Darbani, S. M. R.

    2018-06-01

    The aluminized plastic-bonded explosive (PBX) is a composite material in which solid explosive particles are dispersed in a polymer matrix, which includes three major components, i.e. polymeric binder, metal fuel (aluminum) and nitramine explosive. This work introduces a new method on the basis of the laser-induced breakdown spectroscopy (LIBS) technique in air and argon atmospheres to investigate the determination of aluminum content and detonation performance of aluminized PBXs. Plasma emissions of aluminized PBXs are recorded where atomic lines of Al, C and H as well as molecular bands of AlO and CN are identified. The experimental results demonstrate that a good discrimination and separation between the aluminized PBXs is possible using LIBS and principal component analysis, although they have similar atomic composition. The relative intensity of the AlO/Al is used to determine the aluminum percentage of the aluminized PBXs. The obtained quantitative calibration curve using the relative intensity of the AlO/Al is better than the resulting calibration curve using only the intensity of Al. By using the LIBS method and the measured intensity ratio of CN/C, an Al content of 15% is found to be the optimum value in terms of velocity of detonation of the RDX/Al/HTPB standard samples.
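
    A minimal sketch of building and inverting a linear calibration between aluminum content and the AlO/Al intensity ratio (the ratios and percentages are invented for illustration):

      # Linear calibration of aluminum content against the AlO/Al intensity ratio.
      import numpy as np

      al_percent = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
      alo_al_ratio = np.array([0.21, 0.39, 0.61, 0.78, 1.02])   # illustrative ratios

      slope, intercept = np.polyfit(al_percent, alo_al_ratio, 1)
      r2 = np.corrcoef(al_percent, alo_al_ratio)[0, 1] ** 2
      print(f"ratio = {slope:.3f} * Al% + {intercept:.3f}, r^2 = {r2:.4f}")

      unknown_ratio = 0.55
      print(f"predicted Al content: {(unknown_ratio - intercept) / slope:.1f} %")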

  5. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis workflow for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
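
    A simplified sketch of the series and parallel reliability combinations that such connection models build on (crisp probabilities for illustration, not the fuzzy formulation of the paper):

      # Combine per-process reliabilities for series and parallel connection models.
      from functools import reduce

      def series(reliabilities):    # all processes must succeed
          return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

      def parallel(reliabilities):  # at least one path must succeed
          return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

      installation_steps = [0.98, 0.95, 0.99]   # illustrative process reliabilities
      print(f"series model:   {series(installation_steps):.3f}")
      print(f"parallel model: {parallel(installation_steps):.5f}")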

  6. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their own characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for certain samples. Neutron diffraction has a strong capability especially for oxides, owing to the scattering cross-section of oxygen, and it can become an even stronger tool for the analysis of industrial materials with these quantitative phase analysis techniques. By doing this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  7. Simultaneous determination of linagliptin and metformin by reverse phase-high performance liquid chromatography method: An application in quantitative analysis of pharmaceutical dosage forms

    Directory of Open Access Journals (Sweden)

    Prathyusha Vemula

    2015-01-01

    Full Text Available To enhance patient compliance toward treatment in diseases like diabetes, usually a combination of drugs is prescribed. Therefore, an anti-diabetic fixed-dose combination of 2.5 mg of linagliptin and 500 mg of metformin was taken for simultaneous estimation of both drugs by a reverse phase-high performance liquid chromatography (RP-HPLC) method. The present study aimed to develop a simple and sensitive RP-HPLC method for the simultaneous determination of linagliptin and metformin in pharmaceutical dosage forms. The chromatographic separation was designed and evaluated by using linagliptin and metformin working standard and sample solutions in the linearity range. Chromatographic separation was performed on a C18 column using a mobile phase of a 70:30 (v/v) mixture of methanol and 0.05 M potassium dihydrogen orthophosphate (pH adjusted to 4.6 with orthophosphoric acid) delivered at a flow rate of 0.6 mL/min and UV detection at 267 nm. Linagliptin and metformin showed linearity in the ranges of 2-12 μg/mL and 400-2400 μg/mL respectively, with correlation coefficients of 0.9996 and 0.9989. The resultant findings were analyzed for standard deviation (SD) and relative standard deviation to validate the developed method. The retention times of linagliptin and metformin were found to be 6.3 and 4.6 min and separation was complete in <10 min. The method was validated for linearity, accuracy and precision, which were found to be acceptable over the linearity range of linagliptin and metformin. The method was found suitable for the routine quantitative analysis of linagliptin and metformin in pharmaceutical dosage forms.

  8. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author deduces the equation relating coupler frequency deviation Δf and coupling coefficient β, instead of only giving the adjustment direction in the coupler matching process. According to this equation, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  9. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  10. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de
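
    The attenuation relation underlying such a measurement can be sketched as follows (a simplified monoenergetic model that ignores matrix attenuation; the cross-section value is approximate):

      # Estimate boron areal density from neutron transmission, I/I0 = exp(-N*sigma),
      # ignoring attenuation by the matrix (simplified illustration).
      import math

      sigma_b10 = 3840e-24     # cm^2, approximate thermal absorption cross-section of B-10
      transmission = 0.62      # measured I/I0, illustrative

      areal_density = -math.log(transmission) / sigma_b10   # B-10 atoms per cm^2
      print(f"B-10 areal density: {areal_density:.2e} atoms/cm^2")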

  11. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with offset values greater than 1 mm were identified. In addition, when the developed method was compared with the previously studied one, the data obtained were observed to be very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
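
    A minimal sketch of the quantitative step, computing the offset between the radiation field centre and the ball-bearing (mechanical) centre on a digitised portal image (coordinates and pixel size are illustrative):

      # Offset between radiation and mechanical isocentre projections on one portal image.
      import math

      field_centre = (512.4, 509.8)   # pixel coordinates of the radiation field centre
      ball_centre = (510.9, 511.2)    # pixel coordinates of the ball-bearing centre
      pixel_size_mm = 0.5             # illustrative detector calibration

      dx = (field_centre[0] - ball_centre[0]) * pixel_size_mm
      dy = (field_centre[1] - ball_centre[1]) * pixel_size_mm
      print(f"isocentre deviation: {math.hypot(dx, dy):.2f} mm")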

  12. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
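
    A compact sketch of a risk-adjusted net present value in this spirit (all probabilities, losses and rates are invented for illustration):

      # Risk-adjusted NPV of a fire safety investment: discounted reduction in
      # expected annual fire loss minus the investment cost (illustrative figures).
      investment = 120_000.0
      annual_loss_without = 0.02 * 1_000_000   # p(fire) * consequence, no system
      annual_loss_with = 0.02 * 250_000        # reduced consequence with the system
      rate, years = 0.05, 15

      risk_reduction = annual_loss_without - annual_loss_with
      npv = -investment + sum(risk_reduction / (1 + rate) ** t for t in range(1, years + 1))
      print(f"risk-adjusted NPV: {npv:,.0f}")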

  13. Quantitative analysis of flavonols, flavones, and flavanones in fruits, vegetables and beverages by high-performance liquid chromatography with photo-diode array and mass spectrometric detection

    DEFF Research Database (Denmark)

    Justesen, U.; Knuthsen, Pia; Leth, Torben

    1998-01-01

    A high-performance liquid chromatographic (HPLC) separation method with photo-diode array (PDA) and mass spectrometric (MS) detection was developed to determine and quantify flavonols, flavones, and flavanones in fruits, vegetables and beverages. The compounds were analysed as aglycones, obtained after acid hydrolysis of freeze-dried food material. Identification was based on retention time, UV and mass spectra by comparison with commercial standards, and the UV peak areas were used for quantitation of the flavonoid contents. Examples of HPLC-MS analyses of orange pulp, tomato, and apple are given.

  14. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  15. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  16. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is made by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can take one of 256 (2^8) different levels. The human eye seems to be quite content with a display of 5-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
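
    A small sketch of the shading-correction and grey-value histogram steps described above (NumPy, with placeholder arrays standing in for real images):

      # Shading correction by background division and grey-value histogram (8-bit image).
      import numpy as np

      rng = np.random.default_rng(0)
      image = rng.integers(60, 200, size=(256, 256)).astype(float)   # placeholder specimen image
      background = np.full((256, 256), 180.0)                        # "white" reference image

      corrected = np.clip(image / background * 255.0, 0, 255).astype(np.uint8)
      hist, _ = np.histogram(corrected, bins=256, range=(0, 256))

      print("mean grey value:", corrected.mean().round(1))
      print("most frequent grey value:", int(hist.argmax()))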

  17. Academic and Occupational Performance: A Quantitative Synthesis.

    Science.gov (United States)

    Samson, Gordon E.; And Others

    1984-01-01

    A synthesis of results from 35 studies of the association between academic and occupational performance in various fields indicated that academic indicators such as grades and test scores account for only 2.4 percent of the variance in occupational performance criteria such as income, job satisfaction, and effectiveness ratings. (Author/BW)

  18. Quantitative analysis of a novel antimicrobial peptide in rat plasma by ultra performance liquid chromatography–tandem mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ruo-Wen Zhang

    2011-08-01

    Full Text Available We described the first results of a quantitative ultra performance liquid chromatography–tandem mass spectrometry method for a novel antimicrobial peptide (phylloseptin, PSN-1). Chromatographic separation was accomplished on a Waters bridged ethyl hybrid (BEH) C18 (50 mm×2.1 mm, 1.7 μm) column with acetonitrile–water (25:75, v/v) as isocratic mobile phase. Mass spectrometry detection was performed in the positive electrospray ionization mode and by monitoring of the transitions at m/z 679.6/120, 509.6/120 (PSN-1) and m/z 340.7/165 (Thymopentin, IS). Protein precipitation was investigated and the recovery was satisfactory (above 82%). The method was shown to be reproducible and reliable with intra-day precision below 5.3%, inter-day precision below 14.2%, and linear range from 0.02 to 2 μg/mL with r>0.994. The method was successfully applied to a pharmacokinetic study of PSN-1 in rats after intravenous administration. Keywords: Antimicrobial peptide, Phylloseptin, Ultra performance liquid chromatography–tandem mass spectrometry, Pharmacokinetic

  19. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described by which a quantitative measure of the robustness of a given neutron transport theory code for coarse-network calculations can be obtained. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  20. Quantitative Analysis and Comparison of Four Major Flavonol Glycosides in the Leaves of Toona sinensis (A. Juss.) Roemer (Chinese Toon) from Various Origins by High-Performance Liquid Chromatography-Diode Array Detector and Hierarchical Clustering Analysis

    Science.gov (United States)

    Sun, Xiaoxiang; Zhang, Liting; Cao, Yaqi; Gu, Qinying; Yang, Huan; Tam, James P.

    2016-01-01

    Background: Toona sinensis (A. Juss.) Roemer is an endemic species of Toona genus native to Asian area. Its dried leaves are applied in the treatment of many diseases; however, few investigations have been reported for the quantitative analysis and comparison of major bioactive flavonol glycosides in the leaves harvested from various origins. Objective: To quantitatively analyze four major flavonol glycosides including rutinoside, quercetin-3-O-β-D-glucoside, quercetin-3-O-α-L-rhamnoside, and kaempferol-3-O-α-L-rhamnoside in the leaves from different production sites and classify them according to the content of these glycosides. Materials and Methods: A high-performance liquid chromatography-diode array detector (HPLC-DAD) method for their simultaneous determination was developed and validated for linearity, precision, accuracy, stability, and repeatability. Moreover, the method established was then employed to explore the difference in the content of these four glycosides in raw materials. Finally, a hierarchical clustering analysis was performed to classify 11 voucher specimens. Results: The separation was performed on a Waters XBridge Shield RP18 column (150 mm × 4.6 mm, 3.5 μm) kept at 35°C, and acetonitrile and H2O containing 0.30% trifluoroacetic acid as mobile phase was driven at 1.0 mL/min during the analysis. Ten microliters of solution were injected and 254 nm was selected to monitor the separation. A strong linear relationship between the peak area and concentration of four analytes was observed. And, the method was also validated to be repeatable, stable, precise, and accurate. Conclusion: An efficient and reliable HPLC-DAD method was established and applied in the assays for the samples from 11 origins successfully. Moreover, the content of those flavonol glycosides varied much among different batches, and the flavonoids could be considered as biomarkers to control the quality of Chinese Toon. SUMMARY Four major flavonol glycosides in the leaves

  1. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained have shown the usefulness of these methods, but their accuracy still needs to be improved. (Author) [pt

  2. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  3. Qualitative and quantitative analysis of the saponins in Panax notoginseng leaves using ultra-performance liquid chromatography coupled with time-of-flight tandem mass spectrometry and high performance liquid chromatography coupled with UV detector.

    Science.gov (United States)

    Liu, Fang; Ma, Ni; He, Chengwei; Hu, Yuanjia; Li, Peng; Chen, Meiwan; Su, Huanxing; Wan, Jian-Bo

    2018-04-01

    Panax notoginseng leaves (PNL) exhibit extensive activities, but few analytical methods have been established to exclusively determine the dammarane triterpene saponins in PNL. Ultra-performance liquid chromatography coupled with time-of-flight mass spectrometry (UPLC/Q-TOF MS) and HPLC-UV methods were developed for the qualitative and quantitative analysis of ginsenosides in PNL, respectively. Extraction conditions, including solvents and extraction methods, were optimized, which showed that ginsenosides Rc and Rb3, the main components of PNL, are transformed to notoginsenosides Fe and Fd, respectively, in the presence of water, by removing a glucose residue from position C-3 via possible enzymatic hydrolysis. A total of 57 saponins were identified in the methanolic extract of PNL by UPLC/Q-TOF MS. Among them, 19 components were unambiguously characterized by their reference substances. Additionally, seven saponins of PNL (ginsenosides Rb1, Rc, Rb2, and Rb3, and notoginsenosides Fc, Fe, and Fd) were quantified using the HPLC-UV method after extraction with methanol. The separation of analytes, particularly the separation of notoginsenoside Fc and ginsenoside Rc, was achieved on a Zorbax ODS C8 column at a temperature of 35°C. The developed HPLC-UV method provides adequate linearity (r2 > 0.999) and repeatability (relative standard deviation, RSD) for the determination of the saponins in PNL. These findings are beneficial to the quality control of PNL and its relevant products.

  4. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of the recent developments in quantification of X-ray photoelectron spectroscopy or ESCA is presented. The basic equations are reminded. Each involved parameter (photoionisation, inelastic mean free paths, 'response function' of the instruments, intensity measurement) is separately discussed in relation with the accuracy and the precision of the method. Other topics are considered such as roughness, surface contamination, matrix effect and inhomogeneous composition. Some aspects of the quantitative ESCA analysis and AES analysis are compared [fr

  5. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases, whereas all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.

  6. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk from complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. The failures often are due to excessive polyethylene wear. The glenoid liners are complex and hemispherical in shape and present challenges while assessing the damage. Therefore, the study on the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop the methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners, retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by several observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way and they had engineering backgrounds.

  7. Quantitative analysis of the eight major compounds in the Samsoeum using a high-performance liquid chromatography coupled with diode array detection and electrospray ionization mass spectrometer

    Science.gov (United States)

    Weon, Jin Bae; Yang, Hye Jin; Lee, Bohyoung; Ma, Jin Yeul; Ma, Choong Je

    2015-01-01

    Background: Samsoeum was traditionally used for the treatment of respiratory disease. Objective: The simultaneous determination of eight major compounds, ginsenoside Rg3, caffeic acid, puerarin, costunolide, hesperidin, naringin, glycyrrhizin, and 6-gingerol, in Samsoeum using high-performance liquid chromatography (HPLC) coupled with diode array detection (DAD) and an electrospray ionization mass spectrometer was developed for an accurate and reliable quality assessment. Materials and Methods: Eight compounds were qualitatively identified based on their mass spectra and by comparing with standard compounds, and quantitatively analyzed by HPLC-DAD. Separation of the eight compounds was carried out on a LUNA C18 column (S-5 μm, 4.6 mm i.d. ×250 mm) with gradient elution composed of acetonitrile and 0.1% trifluoroacetic acid. Results: The data showed good linearity (R2 > 0.9996). The limits of detection and the limits of quantification were <0.53 μg and 1.62 μg, respectively. Inter- and intra-day precisions (expressed as relative standard deviation values) were within 1.94% and 1.91%, respectively. The recovery of the method was in the range of 94.24–107.90%. Conclusion: The established method is effective and could be applied to quality control of Samsoeum. PMID:25829771

  8. Is Diagnostic Performance of Quantitative 2D-Shear Wave Elastography Optimal for Clinical Classification of Benign and Malignant Thyroid Nodules?: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Nattabi, Haliimah A; Sharif, Norhafidzah M; Yahya, Noorazrul; Ahmad, Rozilawati; Mohamad, Mazlyfarina; Zaki, Faizah M; Yusoff, Ahmad N

    2017-10-17

    This study is a dedicated 2D-shear wave elastography (2D-SWE) review aimed at systematically eliciting up-to-date evidence of its clinical value in differential diagnosis of benign and malignant thyroid nodules. PubMed, Web of Science, and Scopus databases were searched for studies assessing the diagnostic value of 2D-SWE for thyroid malignancy risk stratification published until December 2016. The retrieved titles and abstracts were screened and evaluated according to the predefined inclusion and exclusion criteria. Methodological quality of the studies was assessed using the Quality Assessment of Studies of Diagnostic Accuracy included in Systematic Review 2 (QUADAS-2) tool. Extracted 2D-SWE diagnostic performance data were meta-analyzed to assess the summary sensitivity, specificity, and area under the receiver operating characteristic curve. After stepwise review, 14 studies in which 2D-SWE was used to evaluate 2851 thyroid nodules (1092 malignant, 1759 benign) from 2139 patients were selected for the current study. Study quality on QUADAS-2 assessment was moderate to high. The summary sensitivity, specificity and area under the receiver operating characteristic curve of 2D-SWE for differential diagnosis of benign and malignant thyroid nodules were 0.66 (95% confidence interval [CI]: 0.64-0.69), 0.78 (CI: 0.76-0.80), and 0.851 (Q* = 0.85), respectively. The pooled diagnostic odds ratio, negative likelihood ratio, and positive likelihood ratio were 12.73 (CI: 8.80-18.43), 0.31 (CI: 0.22-0.44), and 3.87 (CI: 2.83-5.29), respectively. Diagnostic performance of quantitative 2D-SWE for malignancy risk stratification of thyroid nodules is suboptimal with mediocre sensitivity and specificity, contrary to earlier reports of excellence. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
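
    As a simplified illustration of how such summary estimates are formed, the Python sketch below pools hypothetical per-study 2x2 counts by simple summation; the meta-analysis above would normally use a bivariate random-effects model, so this is a sketch of the idea only, not the study's method or data.

      import numpy as np

      # Hypothetical per-study counts (TP, FP, FN, TN); not the data of the 14 studies
      studies = np.array([
          [60, 20, 30, 90],
          [45, 10, 25, 70],
          [80, 30, 35, 120],
      ])

      tp, fp, fn, tn = studies.sum(axis=0)
      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      dor = (tp * tn) / (fp * fn)                    # diagnostic odds ratio
      lr_pos = sensitivity / (1 - specificity)       # positive likelihood ratio
      lr_neg = (1 - sensitivity) / specificity       # negative likelihood ratio

      print(f"Pooled sensitivity: {sensitivity:.2f}, specificity: {specificity:.2f}")
      print(f"DOR: {dor:.1f}, LR+: {lr_pos:.2f}, LR-: {lr_neg:.2f}")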

  9. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever-increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
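
    The classification rule used by the study is not given in this abstract; the Python sketch below only illustrates the general idea of labelling pixels from two autofluorescence channels with assumed intensity thresholds (all names, images and thresholds here are hypothetical).

      import numpy as np

      def classify_components(ch1, ch2, t1, t2):
          """Toy pixel classification from two autofluorescence channels.
          Returns labels: 0 = background/extracellular, 1 = bright in channel 1 only,
          2 = bright in channel 2 only, 3 = bright in both (e.g. lipofuscin)."""
          labels = np.zeros(ch1.shape, dtype=np.uint8)
          labels[(ch1 >= t1) & (ch2 < t2)] = 1
          labels[(ch1 < t1) & (ch2 >= t2)] = 2
          labels[(ch1 >= t1) & (ch2 >= t2)] = 3
          return labels

      rng = np.random.default_rng(0)
      ch1, ch2 = rng.random((256, 256)), rng.random((256, 256))   # stand-in images
      labels = classify_components(ch1, ch2, t1=0.6, t2=0.6)
      area_fractions = np.bincount(labels.ravel(), minlength=4) / labels.size
      print("Area fractions per class:", area_fractions.round(3))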

  10. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. In particular, a "standard-free method" for quantitative analysis made it possible to analyze infinitesimal samples, powdered samples and untreated biological samples, which could not be analyzed quantitatively well in the past. The "standard-free method" and a "powdered internal standard method" made the process of target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  11. Qualitative and quantitative analysis of the saponins in Panax notoginseng leaves using ultra-performance liquid chromatography coupled with time-of-flight tandem mass spectrometry and high performance liquid chromatography coupled with UV detector

    Directory of Open Access Journals (Sweden)

    Fang Liu

    2018-04-01

    Full Text Available Background: Panax notoginseng leaves (PNL exhibit extensive activities, but few analytical methods have been established to exclusively determine the dammarane triterpene saponins in PNL. Methods: Ultra-performance liquid chromatography coupled with time-of-flight mass spectrometry (UPLC/Q-TOF MS and HPLC-UV methods were developed for the qualitative and quantitative analysis of ginsenosides in PNL, respectively. Results: Extraction conditions, including solvents and extraction methods, were optimized, which showed that ginsenosides Rc and Rb3, the main components of PNL, are transformed to notoginsenosides Fe and Fd, respectively, in the presence of water, by removing a glucose residue from position C-3 via possible enzymatic hydrolysis. A total of 57 saponins were identified in the methanolic extract of PNL by UPLC/Q-TOF MS. Among them, 19 components were unambiguously characterized by their reference substances. Additionally, seven saponins of PNL—ginsenosides Rb1, Rc, Rb2, and Rb3, and notoginsenosides Fc, Fe, and Fd—were quantified using the HPLC-UV method after extraction with methanol. The separation of analytes, particularly the separation of notoginsenoside Fc and ginsenoside Rc, was achieved on a Zorbax ODS C8 column at a temperature of 35°C. This developed HPLC-UV method provides an adequate linearity (r2>0.999, repeatability (relative standard deviation, RSD < 2.98%, and inter- and intraday variations (RSD < 4.40% with recovery (98.7–106.1% of seven saponins concerned. This validated method was also conducted to determine seven components in 10 batches of PNL. Conclusion: These findings are beneficial to the quality control of PNL and its relevant products. Keywords: ginsenoside transformation, notoginsenoside Fd, notoginsenoside Fe, Panax notoginseng leaves, UPLC/Q-TOF MS

  12. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Because bone is a hard tissue, it has long been difficult to observe its interior in the living state. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the various activities of the cells that make up bone tissue. On the other hand, the growing volume of data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection, and a methodology for processing microscopic images and analyzing the data has been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline the basic image processing technology for quantitative analysis of live imaging data of bone.

  13. Multiplex real-time quantitative PCR, microscopy and rapid diagnostic immuno-chromatographic tests for the detection of Plasmodium spp: performance, limit of detection analysis and quality assurance

    Directory of Open Access Journals (Sweden)

    Ralevski Filip

    2009-12-01

    Full Text Available Abstract Background Accurate laboratory diagnosis of malaria species in returning travelers is paramount in the treatment of this potentially fatal infectious disease. Materials and methods A total of 466 blood specimens from returning travelers to Africa, Asia, and South/Central America with suspected malaria infection were collected between 2007 and 2009 at the reference public health laboratory. These specimens were assessed by reference microscopy, multiplex real-time quantitative polymerase chain reaction (QPCR), and two rapid diagnostic immuno-chromatographic tests (ICT) in a blinded manner. Key clinical laboratory parameters such as limit of detection (LOD) analysis on clinical specimens by parasite stage, inter-reader variability of ICTs, staffing implications, quality assurance and cost analysis were evaluated. Results QPCR is the most analytically sensitive method (sensitivity 99.41%), followed by CARESTART (sensitivity 88.24%) and BINAXNOW (sensitivity 86.47%), for the diagnosis of malaria in returning travelers when compared to reference microscopy. However, microscopy was unable to specifically identify Plasmodium spp. in 18 out of 170 samples positive by QPCR. Moreover, the 17 samples that were negative by microscopy and positive by QPCR were also positive by ICTs. Quality assurance was achieved for QPCR by exchanging a blinded proficiency panel with another reference laboratory. The kappa value of inter-reader variability among three readers for BINAXNOW and CARESTART was calculated to be 0.872 and 0.898, respectively. Serial dilution studies demonstrated that the QPCR cycle threshold correlates linearly with parasitemia (R2 = 0.9746) in a clinically relevant dynamic range and retains a LOD of 11 rDNA copies/μl for P. falciparum, which was several logs lower than reference microscopy and ICTs. The LOD for QPCR is affected not only by parasitemia but also by the parasite stage distribution of each clinical specimen. QPCR was approximately 6-fold more
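
    The reported linear relation between cycle threshold and parasitemia is the usual QPCR standard curve. A minimal Python sketch with hypothetical serial-dilution values (not the study's data) shows how such a curve is fitted and used for quantitation:

      import numpy as np

      # Hypothetical serial dilutions: parasitemia (parasites/uL) and measured Ct values
      parasitemia = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
      ct = np.array([18.1, 21.5, 24.8, 28.3, 31.6])

      # Standard curve: Ct = slope * log10(parasitemia) + intercept
      slope, intercept = np.polyfit(np.log10(parasitemia), ct, 1)
      predicted = slope * np.log10(parasitemia) + intercept
      r_squared = 1 - np.sum((ct - predicted) ** 2) / np.sum((ct - ct.mean()) ** 2)
      efficiency = 10 ** (-1 / slope) - 1        # amplification efficiency from the slope
      print(f"slope={slope:.2f}, R2={r_squared:.4f}, efficiency={efficiency:.1%}")

      # Quantify an unknown sample by inverting the standard curve
      ct_unknown = 26.0
      print(f"Estimated parasitemia: {10 ** ((ct_unknown - intercept) / slope):.0f} /uL")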

  14. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain time after exercise to the counts immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial images, to correct for Tl-201 washout from the myocardium under the assumption that the washout is constant over the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial images after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
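
    Taking the definitions above literally, the three indices are simple count ratios. The Python sketch below uses hypothetical counts, and the washout correction applied in the redistribution factor is one possible reading of the definition given here, not necessarily the authors' exact implementation.

      def washout_factor(counts_delayed, counts_stress):
          """Ratio of myocardial counts on the delayed image to counts
          immediately after exercise (global washout)."""
          return counts_delayed / counts_stress

      def vitality_index(roi_uptake, max_uptake):
          """Tl-201 uptake in a region of interest relative to the maximum uptake."""
          return roi_uptake / max_uptake

      def redistribution_factor(roi_delayed, roi_stress, washout):
          """ROI counts on delayed imaging relative to the post-exercise image,
          corrected for the assumed uniform washout."""
          return (roi_delayed / washout) / roi_stress

      # Hypothetical counts for one myocardial segment
      wf = washout_factor(counts_delayed=7.2e5, counts_stress=9.0e5)
      vi = vitality_index(roi_uptake=410, max_uptake=650)
      rf = redistribution_factor(roi_delayed=380, roi_stress=360, washout=wf)
      print(f"washout={wf:.2f}, vitality index={vi:.2f}, redistribution={rf:.2f}")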

  15. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
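
    Two of the characteristics mentioned (n-gram distributions and a Zipf's-law coefficient) are easy to sketch in plain Python. The snippet below is a generic illustration and deliberately does not use Quantiprot's own API; the sequence is a made-up example.

      from collections import Counter
      import numpy as np

      def ngram_distribution(sequence, n=2):
          """Relative frequencies of overlapping n-grams in a protein sequence."""
          grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
          counts = Counter(grams)
          total = sum(counts.values())
          return {g: c / total for g, c in counts.most_common()}

      def zipf_exponent(frequencies):
          """Zipf's-law exponent from a log-log fit of frequency against rank."""
          freqs = np.sort(np.array(list(frequencies)))[::-1]
          ranks = np.arange(1, len(freqs) + 1)
          slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
          return -slope

      seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
      dist = ngram_distribution(seq, n=2)
      print("Most common bigrams:", list(dist.items())[:5])
      print("Zipf exponent estimate:", round(zipf_exponent(dist.values()), 2))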

  16. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N2O4 used as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed, with attention paid to accuracy. The sweep rate and RF level, among other factors, must be set to optimal conditions to eliminate errors, particularly when the computation is done by machine. A higher sweep rate is preferable from the point of view of the signal-to-noise ratio, but it may be limited to about 30 Hz/s. The relative error in the measurement of peak area is generally 1%, but for dilute samples with signal integration the error becomes roughly an order of magnitude smaller. If impurities are treated carefully, the water content in N2O4 can be determined with an accuracy of about 0.002%. Comparison of peak heights is as accurate as comparison of areas when the uniformity of the magnetic field and T2 are not in question. When the chemical shift varies with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rapeseed, peanuts, and sunflower seed are determined by measuring T1 with 90-degree pulses.

  17. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. Quantitative analysis of the constructed scenario is studied in terms of annual effective dose equivalent. The study proceeds through the steps of a performance assessment of a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The program chain VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. Detailed data used in these modules come from experimental data for Korean territory and from default data supplied with the modules. Where data needed for code execution are missing, they are estimated through reasonable engineering judgment.

  18. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage

  19. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37/sup 0/C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  20. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the composition of a sample and the concentration of its constituents can be determined; the analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration, a least-squares fit to the measured data is applied to obtain a calibration graph, so that the density of a dark spectral line can be related to the incident light intensity indicated by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the element of interest is determined. Automatic data acquisition, calculation and reporting of results are done by means of a computer (PC) and a computer program. Signal conditioning circuits deliver TTL (Transistor-Transistor Logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
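
    The three steps above map directly onto two least-squares fits and an inverse read-off. The Python sketch below, with entirely hypothetical densities, intensities and concentrations, illustrates the flow of the calculation performed by a program of this kind.

      import numpy as np

      # Step 1: emulsion calibration - least-squares fit of line density vs. log exposure
      log_exposure = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
      density = np.array([0.08, 0.35, 0.70, 1.05, 1.32])
      gamma, offset = np.polyfit(log_exposure, density, 1)
      print(f"Emulsion calibration slope: {gamma:.2f}")

      # Step 2: working curve - log(intensity ratio) vs. log(concentration) for standards
      log_conc = np.log10(np.array([0.01, 0.03, 0.10, 0.30, 1.00]))   # wt% standards
      log_ratio = np.array([-1.10, -0.62, -0.12, 0.36, 0.88])
      slope, intercept = np.polyfit(log_conc, log_ratio, 1)

      # Step 3: analytical result - read the unknown concentration off the working curve
      log_ratio_unknown = -0.30
      conc_unknown = 10 ** ((log_ratio_unknown - intercept) / slope)
      print(f"Estimated concentration: {conc_unknown:.3f} wt%")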

  1. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  2. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  3. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  4. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material Quantitative Proteomics and Data Analysis Course. 4 - 5 April 2016, Queen Hotel, Chester, UK Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  5. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Therefore chemical analysis, mostly performed by X-ray fluorescence (XRF), forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as microscopic point counting are well known. With XRD and Rietveld analysis a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made an integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software such as complexity, low performance and severe numerical instability were prohibitive for automated use. The latest developments of on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction.

  6. Improving Student Retention and Performance in Quantitative Courses Using Clickers

    Science.gov (United States)

    Liu, Wallace C.; Stengel, Donald N.

    2011-01-01

    Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…

  7. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)
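
    One way to recover the two normal-mode frequencies from the recorded acceleration is to inspect the spectrum of the signal. The Python sketch below uses a synthetic two-mode signal (assumed frequencies of 1.0 Hz and 1.4 Hz plus noise) in place of real smartphone data.

      import numpy as np

      fs = 100.0                                   # sensor sampling rate (Hz), assumed
      t = np.arange(0, 20, 1 / fs)
      rng = np.random.default_rng(1)

      # Synthetic coupled oscillation: superposition of two normal modes plus sensor noise
      signal = 0.8 * np.cos(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 1.4 * t)
      signal += 0.05 * rng.normal(size=t.size)

      spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
      freqs = np.fft.rfftfreq(t.size, d=1 / fs)

      # The two largest peaks (ignoring the DC bin) give the normal-mode frequencies
      order = np.argsort(spectrum[1:])[::-1] + 1
      print("Estimated normal-mode frequencies (Hz):", np.sort(freqs[order[:2]]))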

  8. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  9. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  10. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  11. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  12. Analysis of genetic variants of coat colour loci and their influence on the coat colour phenotype and quantitative performance traits in the pig

    OpenAIRE

    Siebel, Krista

    2010-01-01

    The influence of four single coat colour loci (KIT, MC1R, TYR, ASP) on the coat colour phenotype and performance traits in the pig has been investigated in a resource population. The research revealed a previously unknown genotype for the white phenotype in the pig. An influence of the Agouti locus on the coat colour phenotype has been suggested. An influence of the coat colour locus KIT on growth performance traits and of MC1R on body fatness could be demonstrated.

  13. Comparison of performance and quantitative descriptive analysis sensory profiling and its relationship to consumer liking between the artisanal cheese producers panel and the descriptive trained panel.

    Science.gov (United States)

    Ramírez-Rivera, Emmanuel de Jesús; Díaz-Rivera, Pablo; Guadalupe Ramón-Canul, Lorena; Juárez-Barrientos, José Manuel; Rodríguez-Miranda, Jesús; Herman-Lara, Erasmo; Prinyawiwatkul, Witoon; Herrera-Corredor, José Andrés

    2018-04-25

    The aim of this research was to compare the performance and sensory profiling of a panel of artisanal cheese producers against a trained panel and their relationship to consumer liking (external preference mapping). Performance was analyzed statistically at an individual level using the Fisher's test (F) for discrimination, the mean square error for repeatability, and Manhattan plots for visualizing the intra-panel homogeneity. At group level, performance was evaluated using ANOVA. External preference mapping technique was applied to determine the efficiency of each sensory profile. Results showed that the producers panel was discriminant and repetitive with a performance similar to that of the trained panel. Manhattan plots showed that the performance of artisanal cheese producers was more homogeneous than trained panelists. The correlation between sensory profiles (Rv = 0.95) demonstrated similarities in the generation and use of sensory profiles. The external preference maps generated individually with the profiles of each panel were also similar. Recruiting individuals familiar with the production of artisanal cheeses as panelists is a viable strategy for sensory characterization of artisanal cheeses within their context of origin because their results were similar to those from the trained panel and can be correlated with consumer liking data. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  15. Determination of the Antibiotic Oxytetracycline in Commercial Milk by Solid-Phase Extraction: A High-Performance Liquid Chromatography (HPLC) Experiment for Quantitative Instrumental Analysis

    Science.gov (United States)

    Mei-Ratliff, Yuan

    2012-01-01

    Trace levels of oxytetracylcine spiked into commercial milk samples are extracted, cleaned up, and preconcentrated using a C[subscript 18] solid-phase extraction column. The extract is then analyzed by a high-performance liquid chromatography (HPLC) instrument equipped with a UV detector and a C[subscript 18] column (150 mm x 4.6 mm x 3.5 [mu]m).…

  16. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and general public safety can be affected by the operation of a gas pipeline. Where the individual or societal risks are considered intolerable when compared with international standards, measures are recommended to mitigate the risk associated with the operation to levels compatible with best practice in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and requires a complex mathematical treatment. The objective of the present work is to develop a calculation methodology based on the previously mentioned publication. This methodology centers on defining the frequencies of occurrence of events according to databases representative of each case under study. It also establishes the consequences according to the particular characteristics of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical ignition probability curve is developed as a function of the distance to the pipe. (author)

  17. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  18. Application of Silver Ion High-Performance Liquid Chromatography for Quantitative Analysis of Selected n-3 and n-6 PUFA in Oil Supplements.

    Science.gov (United States)

    Czajkowska-Mysłek, Anna; Siekierko, Urszula; Gajewska, Magdalena

    2016-04-01

    The aim of this study was to develop a simple method for the simultaneous determination of selected cis/cis PUFA-LNA (18:2), ALA (18:3), GLA (18:3), EPA (20:5), and DHA (22:6) by silver ion high-performance liquid chromatography coupled to a diode array detector (Ag-HPLC-DAD). The separation was performed on three Luna SCX Silver Loaded columns connected in series, maintained at 10 °C, with isocratic elution by 1% acetonitrile in n-hexane. The applied chromatographic system allowed baseline separation of a standard mixture of n-3 and n-6 fatty acid methyl esters containing LNA, DHA, and EPA, and partial separation of the ALA and GLA positional isomers. The method was validated by means of linearity, precision, stability, and recovery. Limits of detection (LOD) for the PUFA standard solutions considered ranged from 0.27 to 0.43 mg L(-1). The developed method was used to evaluate n-3 and n-6 fatty acid contents in plant and fish softgel oil capsules, and the results were compared with a reference GC-FID-based method.

  19. Qualitative and Quantitative Analysis of Volatile Components of Zhengtian Pills Using Gas Chromatography Mass Spectrometry and Ultra-High Performance Liquid Chromatography.

    Science.gov (United States)

    Liu, Cui-Ting; Zhang, Min; Yan, Ping; Liu, Hai-Chan; Liu, Xing-Yun; Zhan, Ruo-Ting

    2016-01-01

    Zhengtian pills (ZTPs) are a traditional Chinese medicine (TCM) that has been commonly used to treat headaches. Volatile components of ZTPs extracted with ethyl acetate by an ultrasonic method were analyzed by gas chromatography mass spectrometry (GC-MS). Twenty-two components were identified, accounting for 78.884% of the total components of the volatile oil. The three main volatile components, protocatechuic acid, ferulic acid, and ligustilide, were simultaneously determined using ultra-high performance liquid chromatography coupled with diode array detection (UHPLC-DAD). Baseline separation was achieved on an XB-C18 column with linear gradient elution of methanol-0.2% acetic acid aqueous solution. The UHPLC-DAD method provided good linearity (R2 ≥ 0.9992) and precision, and was applied to determine the three main volatile components, protocatechuic acid, ferulic acid, and ligustilide, in 13 batches of ZTPs; it is therefore suitable for discrimination and quality assessment of ZTPs.

  20. Performance of the linear ion trap Orbitrap mass analyzer for qualitative and quantitative analysis of drugs of abuse and relevant metabolites in sewage water.

    Science.gov (United States)

    Bijlsma, Lubertus; Emke, Erik; Hernández, Félix; de Voogt, Pim

    2013-03-20

    This work illustrates the potential of liquid chromatography coupled to a hybrid linear ion trap Fourier Transform Orbitrap mass spectrometer for the simultaneous identification and quantification of 24 drugs of abuse and relevant metabolites in sewage water. The developed methodology consisted of automatic solid-phase extraction using Oasis HLB cartridges, chromatographic separation of the targeted drugs, full-scan accurate mass data acquisition under positive electrospray ionization mode over an m/z range of 50-600 Da at a resolution of 30,000 FWHM, and simultaneous MS(n) measurements to obtain information on fragment ions generated in the linear ion trap. The accurate mass of the protonated molecule, together with at least one nominal mass product ion and the retention time, allowed the confident identification of the compounds detected in these complex matrices. In addition to the highly reliable qualitative analysis, the Orbitrap analyzer also proved to have satisfactory potential for quantification at sub-ppb analyte levels, a possibility that has been very little explored in the literature until now. The limits of quantification ranged from 4 to 68 ng L(-1) in influent sewage water, and from 2 to 35 ng L(-1) in effluent, with the exception of MDA, morphine and THC, which presented higher values as a consequence of the high ionization suppression in this type of sample. Satisfactory recoveries (70-120%) and precision were obtained. Several drugs of abuse could be identified and quantified, mainly MDMA, benzoylecgonine, codeine, oxazepam and temazepam. Orbitrap also showed potential for retrospective investigation of ketamine metabolites in the samples without the need for additional analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who must make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  2. Quantitative Analysis of Bioactive Compounds In Extract and Fraction of Star Fruit (Averrhoa carambola L. Leaves Using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    Nanang Yunarto

    2017-05-01

    Full Text Available Starfruit (Averrhoa carambola L.), native to tropical areas including Indonesia, is a potential raw material for medicine. Previous studies have reported that starfruit leaves contain the flavonoids apigenin and quercetin, which are potential anti-inflammatory and anticancer agents. Most raw materials for drugs in Indonesia are obtained through imports from other countries. To support self-sufficiency in raw materials for traditional medicine, it is important to standardize their quality, in this case star fruit leaves, by a high performance liquid chromatography (HPLC) method. The samples used were a star fruit leaf extract obtained by maceration with 70% ethanol, and the water, ethyl acetate and hexane fractions obtained by fractionation of the ethanolic extract. Physical parameters analyzed include appearance, color, odor, taste, extract yield, water content, loss on drying, total ash content and residual solvent. Chemical parameters analyzed include apigenin and quercetin contents. The results show that the star fruit leaves used in this study meet the standards of the Indonesian Herbal Pharmacopoeia, with the highest apigenin and quercetin contents found in the ethyl acetate fraction.

  3. Simultaneous qualitative assessment and quantitative analysis of flavonoids in various tissues of lotus (Nelumbo nucifera) using high performance liquid chromatography coupled with triple quad mass spectrometry.

    Science.gov (United States)

    Chen, Sha; Fang, Linchuan; Xi, Huifen; Guan, Le; Fang, Jinbao; Liu, Yanling; Wu, Benhong; Li, Shaohua

    2012-04-29

    Flavonoid composition and concentration were investigated in 12 different tissues of 'Ti-1' lotus (Nelumbo nucifera) by high performance liquid chromatography equipped with photodiode array detection tandem electrospray ionization mass spectrometry (HPLC-DAD-ESI-MS(n)). A total of 20 flavonoids belonging to six groups (myricetin, quercetin, kaempferol, isorhamnetin, diosmetin derivatives) were separated and identified. Myricetin 3-O-galactoside, myricetin 3-O-glucuronide, isorhamnetin 3-O-glucuronide and the free aglycone diosmetin (3',5,7-trihydroxy-4'-methoxyflavone) were first reported in lotus. Flavonoid composition varied largely with tissue type, and diverse compounds (5-15) were found in leaf and flower stalks, flower pistils, seed coats and embryos. Flower tissues including flower petals, stamens, pistils, and, especially, reproductive tissue fruit coats had more flavonoid compounds (15-17) than leaves (12), while no flavonoids were detectable in seed kernels. The flavonoid content of seed embryos was high, 730.95 mg 100 g(-1) DW (dry weight). As regards the other tissues, mature leaf pulp (771.79 mg 100 g(-1) FW (fresh weight)) and young leaves (650.67 mg 100 g(-1) FW) had higher total flavonoid amounts than flower stamens (359.45 mg 100 g(-1) FW) and flower petals (342.97 mg 100 g(-1) FW), while leaf stalks, flower stalks and seed coats had much less total flavonoid (less than 40 mg 100 g(-1) FW). Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Quantitative analysis of the major constituents in Chinese medicinal preparation SuoQuan formulae by ultra fast high performance liquid chromatography/quadrupole tandem mass spectrometry.

    Science.gov (United States)

    Chen, Feng; Li, Hai-Long; Li, Yong-Hui; Tan, Yin-Feng; Zhang, Jun-Qing

    2013-07-30

    The SuoQuan formulae, containing Fructus Alpiniae Oxyphyllae, have been used to combat urinary incontinence symptoms including frequency, urgency and nocturia for hundreds of years in China. However, their chemical composition has not been well characterized, and the quality control marker focuses on only a single compound in the current Chinese Pharmacopeia. Hence it is prudent to identify and quantify the main constituents in this herbal product. This study aimed to analyze the main constituents using ultra-fast performance liquid chromatography coupled to tandem mass spectrometry (UFLC-MS/MS). Fourteen phytochemicals from five chemical classes were identified by comparing the molecular mass, fragmentation pattern and retention time with those of the reference standards. The newly developed UFLC-MS/MS assay was validated and shown to be reproducible and reliable, and was successfully applied to simultaneously quantify the fourteen phytochemicals. Notably, the content of these constituents showed significant differences among three pharmaceutical preparations. The major constituents of the respective chemical classes were isolinderalactone, norisoboldine, nootkatone, yakuchinone A and apigenin-4',7-dimethylether; the variation among these compounds was more than 1000-fold. Furthermore, significant content variation between the two different SuoQuan pills was also observed. The proposed method is sensitive and reliable; hence it can be used to analyze a variety of SuoQuan formulae products produced by different pharmaceutical manufacturers.

  5. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm-1 as a function of pressure for 100% HF; (2) absorbance at 3877 cm-1 as a function of increasing HF partial pressure, with the total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm-1 at constant HF partial pressure, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm-1 can be quantitatively analyzed via infrared methods.

  6. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in an elastic recovery test, and a normalized elastic recovery factor was defined in terms of the measured deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method.

  7. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated that there was a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
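
    For readers unfamiliar with the combination of logistic regression and the Hosmer-Lemeshow check described above, the Python sketch below reproduces the general workflow on simulated data (the predictors, coefficients and sample are invented, not the survey data).

      import numpy as np
      from scipy.stats import chi2
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      n = 100
      X = rng.normal(size=(n, 3))    # threat motivation, vulnerability, control weakness
      logit = 0.8 * X[:, 0] + 0.6 * X[:, 1] + 0.9 * X[:, 2]
      y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated binary risk outcome

      model = LogisticRegression().fit(X, y)
      p = model.predict_proba(X)[:, 1]

      # Hosmer-Lemeshow test: decile groups of predicted probability,
      # observed vs. expected event counts within each group
      groups = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))
      hl = 0.0
      for g in np.unique(groups):
          idx = groups == g
          n_g, obs, exp = idx.sum(), y[idx].sum(), p[idx].sum()
          hl += (obs - exp) ** 2 / exp + ((n_g - obs) - (n_g - exp)) ** 2 / (n_g - exp)
      p_value = chi2.sf(hl, df=len(np.unique(groups)) - 2)
      print(f"Hosmer-Lemeshow X2 = {hl:.3f}, p = {p_value:.3f}")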

  8. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  9. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples measured with a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Quantitative analysis is performed by recursive least-squares Gaussian curve fitting, and the elements are identified using a polynomial relation between energy and channel obtained by calibration with a standard sample.
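
    As a sketch of the two numerical steps the record mentions, the example below fits a Gaussian peak on a first-degree polynomial background by least squares and converts the fitted peak position to energy with a polynomial channel-energy calibration. The spectrum, calibration points and peak parameters are synthetic placeholders, not output of the described code.

```python
# Illustrative sketch: polynomial energy-channel calibration and least-squares
# fitting of a Gaussian peak on a linear background. All numbers are synthetic.
import numpy as np
from scipy.optimize import curve_fit

cal_channels = np.array([120.0, 480.0, 910.0])
cal_energies = np.array([121.8, 487.0, 923.5])          # keV, hypothetical calibration lines
cal_coeffs = np.polyfit(cal_channels, cal_energies, 1)  # E = a*channel + b

def peak_on_background(ch, area, centre, sigma, b0, b1):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centre) / sigma) ** 2)
    return gauss + b0 + b1 * ch                          # first-degree background

channels = np.arange(400, 560, dtype=float)
true_counts = peak_on_background(channels, 5000, 480, 4.0, 50, -0.02)
counts = np.random.default_rng(1).poisson(true_counts).astype(float)

p0 = [(counts.max() - counts.min()) * 3.0 * np.sqrt(2 * np.pi),   # rough area guess
      channels[np.argmax(counts)], 3.0, counts.min(), 0.0]
popt, _ = curve_fit(peak_on_background, channels, counts, p0=p0)
energy = np.polyval(cal_coeffs, popt[1])
print(f"peak at channel {popt[1]:.1f} -> {energy:.1f} keV, net area {popt[0]:.0f} counts")
```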

  10. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which most strongly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes, and high values for both traits together with oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance; this facilitated the choice of variables on which the genotypes' clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits, with the final number of clusters determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes: genotypes with similar performance for the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, while those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
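
    A minimal sketch of the multivariate workflow described above: standardize a genotype-by-trait matrix, extract principal components, and cluster both genotypes and traits hierarchically (a simple stand-in for a two-way cluster analysis with a heatmap). The data are synthetic and the cluster counts are arbitrary.

```python
# Sketch: PCA plus row/column hierarchical clustering of a synthetic
# genotype-by-trait matrix; not the authors' data or exact settings.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
traits = rng.normal(size=(60, 9))              # 60 hypothetical genotypes x 9 traits

Z = StandardScaler().fit_transform(traits)
pca = PCA(n_components=5).fit(Z)
print("variance explained by first 3 PCs:", round(float(pca.explained_variance_ratio_[:3].sum()), 2))

genotype_clusters = fcluster(linkage(Z, method="ward"), t=9, criterion="maxclust")
trait_clusters = fcluster(linkage(Z.T, method="ward"), t=3, criterion="maxclust")
print("genotype cluster sizes:", np.bincount(genotype_clusters)[1:])
print("trait cluster sizes:", np.bincount(trait_clusters)[1:])
```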

  11. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.
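
    The record does not reproduce its image-processing pipeline; the sketch below shows one generic way to estimate local filament orientation from a 2D fluorescence image with a structure tensor and compare it with a given growth-axis angle. The image and the growth-axis angle are placeholders, and this is not the authors' original method.

```python
# Generic structure-tensor sketch for local filament orientation versus a
# growth-axis angle; a plausible stand-in, not the published pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

def filament_orientation(image, sigma=2.0):
    gy, gx = np.gradient(image.astype(float))
    Jxx = gaussian_filter(gx * gx, sigma)
    Jyy = gaussian_filter(gy * gy, sigma)
    Jxy = gaussian_filter(gx * gy, sigma)
    grad_dir = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)    # dominant gradient direction
    return grad_dir + np.pi / 2                        # filaments run perpendicular to it

rng = np.random.default_rng(0)
img = rng.random((128, 128))                           # placeholder for a confocal image
growth_axis = np.deg2rad(30.0)                         # hypothetical growth-axis angle

theta = filament_orientation(img)
rel = np.abs((theta - growth_axis + np.pi / 2) % np.pi - np.pi / 2)   # fold to [0, pi/2]
print(f"mean angle between filament signal and growth axis: {np.degrees(rel.mean()):.1f} deg")
```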

  12. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is carried out by means of analysis methodologies specified for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures used to analyze the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code must be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode and the transitions among them, together with the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. This report, in which the overall details of the SMART performance analysis are specified based on the current SMART design, can therefore be used as a guide for the detailed performance analysis.

  13. Quantitative aspects of the clinical performance of transverse tripolar spinal cord stimulation

    NARCIS (Netherlands)

    Wesselink, W.A.; Holsheimer, J.; King, Gary W.; Torgerson, Nathan A.; Boom, H.B.K.

    1999-01-01

    A multicenter study was initiated to evaluate the performance of the transverse tripolar system for spinal cord stimulation. Computer modeling had predicted steering of paresthesia with a dual channel stimulator to be the main benefit of the system. The quantitative analysis presented here includes

  14. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
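
    The dose-difference/distance-to-agreement criterion recommended above is usually evaluated with a gamma index. The brute-force sketch below computes a 2D gamma pass rate for a 3%/3 mm criterion on synthetic dose planes with an assumed 1 mm grid; clinical tools use more efficient and more carefully normalized implementations.

```python
# Brute-force 2D gamma-index sketch for a dose-difference / distance-to-agreement
# criterion such as 3%/3 mm. Dose planes, grid spacing and normalization are
# hypothetical placeholders.
import numpy as np

def gamma_pass_rate(reference, measured, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    ny, nx = reference.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dose_norm = dose_crit * reference.max()            # global dose criterion
    gammas = np.empty_like(reference, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            ddose2 = (measured - reference[iy, ix]) ** 2
            gammas[iy, ix] = np.sqrt((dist2 / dist_crit_mm ** 2 + ddose2 / dose_norm ** 2).min())
    return (gammas <= 1.0).mean()

rng = np.random.default_rng(0)
ref = np.clip(rng.normal(1.0, 0.1, (32, 32)), 0.0, None)    # placeholder planned dose plane
meas = ref * (1 + rng.normal(0, 0.02, ref.shape))            # placeholder measured plane
print(f"gamma (3%/3 mm) pass rate: {100 * gamma_pass_rate(ref, meas, spacing_mm=1.0):.1f}%")
```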

  16. Performance Analysis of MYSEA

    Science.gov (United States)

    2012-09-01

    Services, FSD Federated Services Daemon, I&A Identification and Authentication, IKE Internet Key Exchange, KPI Key Performance Indicator, LAN Local Area... inspection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be... application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties...

  17. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near-field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
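
    The record reports a surface-averaged enhancement factor; one common way to compute such a value is EF = (I_SERS/N_SERS)/(I_ref/N_ref), i.e. SERS signal per adsorbed reporter molecule relative to normal Raman signal per molecule in the probed reference volume. The numbers below are hypothetical placeholders chosen only to land in the reported range, not values from the paper.

```python
# Surface-averaged SERS enhancement factor, EF = (I_SERS/N_SERS) / (I_ref/N_ref).
# All numbers are hypothetical placeholders, not measurements from the paper.
I_sers = 1.2e5     # counts/s from reporter molecules on one dimer
N_sers = 4.0e4     # estimated number of reporter molecules on the dimer surface
I_ref = 3.0e3      # counts/s from the non-enhanced reference measurement
N_ref = 5.0e6      # estimated number of molecules in the probed reference volume

EF = (I_sers / N_sers) / (I_ref / N_ref)
print(f"surface-averaged enhancement factor ~ {EF:.0f}")
```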

  18. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements.

  19. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  20. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  1. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  2. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  3. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established before in an extensive work on thick and intermediate samples were employed. For these samples, the excitation functions of nuclear reactions, induced by the incident protons on target's light elements, were used as input for a code that evaluates the gamma-ray yield integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with sodium values, PIXE results related to chlorine are also presented, giving support to the reliability of this PIGE method for thin film analysis
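
    The yield calculation described above amounts to integrating the reaction excitation function over the proton energy loss in the sample. The sketch below evaluates the standard thick/intermediate-target form, Y ∝ ∫ σ(E)/S(E) dE between the exit and beam energies; the σ(E) and S(E) functions are smooth placeholders rather than evaluated nuclear or stopping-power data.

```python
# Schematic yield integral Y ~ integral of sigma(E)/S(E) dE over the proton
# energy loss in the sample; sigma and S below are illustrative placeholders.
import numpy as np

def sigma(E_keV):
    """Placeholder excitation function (arbitrary units) with one broad resonance."""
    return np.exp(-0.5 * ((E_keV - 1500.0) / 300.0) ** 2)

def stopping_power(E_keV):
    """Placeholder stopping power, roughly decreasing with energy."""
    return 50.0 / np.sqrt(E_keV / 1000.0)

def relative_yield(E_beam_keV, E_exit_keV, n=2000):
    E = np.linspace(E_exit_keV, E_beam_keV, n)
    dE = E[1] - E[0]
    return np.sum(sigma(E) / stopping_power(E)) * dE    # up to constant factors

# For a thin sample, an "effective energy" approximation replaces the integral
# by sigma(E_eff) times the areal thickness of the sample.
print("relative yield (2.4 MeV beam, 2.1 MeV exit):", round(relative_yield(2400.0, 2100.0), 2))
```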

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure caption: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface

  6. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  7. Quantitative assessment of finger motor performance: Normative data.

    Directory of Open Access Journals (Sweden)

    Alessio Signori

    Full Text Available Finger opposition movements are the basis of many daily living activities and are essential in general for manipulating objects; an engineered glove quantitatively assessing motor performance during sequences of finger opposition movements has been shown to provide reliable measures of finger motor impairment, even when subtle, in subjects affected by neurological diseases. However, the obtained behavioral parameters lack published reference values. The aim was to determine mean values for different motor behavioral parameters describing the strategy adopted by healthy people in performing repeated sequences of finger opposition movements, examining associations with gender and age. Normative values for finger motor performance parameters were obtained on a sample of 255 healthy volunteers executing sequences of finger-to-thumb opposition movements, stratified by gender and over a wide range of ages. Touch duration, inter-tapping interval, movement rate, correct sequences (%), movements in advance compared with a metronome (%) and inter-hand interval were assessed. Increasing age resulted in decreased movement speed, fewer advance movements with respect to a cue, and reduced correctness of sequences and bimanual coordination. No significant performance differences were found between male and female subjects except for the duration of the finger touch, the interval between two successive touches and their ratio. We report age- and gender-specific normal mean values and ranges for different parameters objectively describing the performance of finger opposition movement sequences, which may serve as useful references for clinicians to identify possible deficits in subjects affected by diseases altering fine hand motor skills.

  8. Prediction of Molar Extinction Coefficients of Proteins and Peptides Using UV Absorption of the Constituent Amino Acids at 214 nm To Enable Quantitative Reverse Phase High-Performance Liquid Chromatography-Mass Spectrometry Analysis

    NARCIS (Netherlands)

    Kuipers, B.J.H.; Gruppen, H.

    2007-01-01

    The molar extinction coefficients of 20 amino acids and the peptide bond were measured at 214 nm in the presence of acetonitrile and formic acid to enable quantitative comparison of peptides eluting from reversed-phase high-performance liquid chromatography, once identified with mass spectrometry
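
    The prediction method named in the title is additive: the peptide's extinction coefficient at 214 nm is approximated by summing the contributions of its residues and of each peptide bond. The sketch below shows the arithmetic; the coefficient values are rough illustrative placeholders, not the published numbers.

```python
# Sketch of the additive estimate: epsilon_214(peptide) ~ sum of residue
# contributions + one peptide-bond contribution per bond. Coefficient values
# below are rough illustrative placeholders, not the published ones.
EPS_214 = {  # M^-1 cm^-1, illustrative only
    "W": 29000.0, "F": 5200.0, "Y": 5400.0, "H": 5100.0,
    "M": 980.0, "R": 100.0, "C": 220.0, "P": 2700.0,
}
EPS_PEPTIDE_BOND = 920.0   # illustrative value per peptide bond
EPS_DEFAULT = 0.0          # residues with negligible side-chain absorbance at 214 nm

def epsilon_214(sequence):
    side_chains = sum(EPS_214.get(aa, EPS_DEFAULT) for aa in sequence)
    bonds = (len(sequence) - 1) * EPS_PEPTIDE_BOND
    return side_chains + bonds

peptide = "ASDFWK"   # hypothetical peptide
print(f"predicted epsilon_214({peptide}) ~ {epsilon_214(peptide):.0f} M^-1 cm^-1")
```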

  9. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    OpenAIRE

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them...

  10. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently being used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044), "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
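
    A compact sketch of the location analysis described above: estimate a similarity transform (rotation, uniform scale, translation) from each patient's landmarks to a reference landmark set, map the tumor centroid into the common space, and cluster the transformed centroids hierarchically. Landmarks and centroids are synthetic 2D stand-ins for the MRI-derived 3D data.

```python
# Sketch: landmark-based similarity alignment of tumor centroids into a common
# coordinate space followed by hierarchical clustering; synthetic 2D stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def similarity_transform(src, dst):
    """Least-squares s, R, t with dst ~ s * R @ src + t (points as rows)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, S, Vt = np.linalg.svd(dst_c.T @ src_c)
    R = U @ Vt
    if np.linalg.det(R) < 0:          # guard against reflections
        U[:, -1] *= -1
        R = U @ Vt
    s = S.sum() / (src_c ** 2).sum()
    t = dst.mean(0) - s * R @ src.mean(0)
    return s, R, t

rng = np.random.default_rng(0)
reference_landmarks = rng.normal(size=(5, 2))            # defines the common space
patients = []
for _ in range(38):                                      # 38 hypothetical patients
    a = rng.uniform(-0.3, 0.3)
    Rp = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    landmarks = reference_landmarks @ Rp.T * rng.uniform(0.8, 1.2) + rng.normal(0, 0.05, (5, 2))
    patients.append((landmarks, rng.normal(size=2)))     # (landmarks, tumor centroid)

aligned = []
for landmarks, centroid in patients:
    s, R, t = similarity_transform(landmarks, reference_landmarks)
    aligned.append(s * R @ centroid + t)

clusters = fcluster(linkage(np.array(aligned), method="ward"), t=2, criterion="maxclust")
print("centroid-location cluster sizes:", np.bincount(clusters)[1:])
```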

  11. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease.

    Science.gov (United States)

    van Gilst, Merel M; van Mierlo, Petra; Bloem, Bastiaan R; Overeem, Sebastiaan

    2015-10-01

    Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and a quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. On both tasks, patients were overall slower than healthy controls (night: F(2,55) = 16.938, P Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. © 2015 Associated Professional Sleep Societies, LLC.

  12. Quantitative analysis of flavanones from citrus fruits by using mesoporous molecular sieve-based miniaturized solid phase extraction coupled to ultrahigh-performance liquid chromatography and quadrupole time-of-flight mass spectrometry.

    Science.gov (United States)

    Cao, Wan; Ye, Li-Hong; Cao, Jun; Xu, Jing-Jing; Peng, Li-Qing; Zhu, Qiong-Yao; Zhang, Qian-Yun; Hu, Shuai-Shuai

    2015-08-07

    An analytical procedure based on miniaturized solid phase extraction (SPE) and ultrahigh-performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry was developed and validated for the determination of six flavanones in Citrus fruits. The mesoporous molecular sieve SBA-15, used as the solid sorbent, was characterised by Fourier transform-infrared spectroscopy and scanning electron microscopy. Compared with reported extraction techniques, the mesoporous SBA-15-based SPE method offered the advantages of shorter analysis time and higher sensitivity. Furthermore, considering the different nature of the tested compounds, all of the parameters, including the SBA-15 amount, solution pH, elution solvent, and sorbent type, were investigated in detail. Under the optimum conditions, the instrumental detection and quantitation limits were less than 4.26 and 14.29 ng mL(-1), respectively. The recoveries obtained for all the analytes ranged from 89.22% to 103.46%. The experimental results suggested that SBA-15 is a promising material for the purification and enrichment of target flavanones from complex citrus fruit samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
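
    The probabilistic model described above is not specified in the excerpt; the Monte-Carlo sketch below is a simple stand-in that samples candidate compositions consistent with a descending rank order and a minimum reporting cutoff, then summarizes each rank's weight fraction. It is an illustration of the idea, not the actual ExpoCast model.

```python
# Illustrative Monte-Carlo sketch: predict weight fractions from a rank-ordered
# ingredient list by sampling compositions that respect the descending-order
# constraint and a reporting cutoff. Not the EPA/ExpoCast model.
import numpy as np

def predict_weight_fractions(n_reported, min_fraction=0.01, unreported_total=0.05,
                             n_samples=20000, seed=0):
    rng = np.random.default_rng(seed)
    reported_total = 1.0 - unreported_total
    kept = []
    while len(kept) < n_samples:
        w = rng.dirichlet(np.ones(n_reported)) * reported_total
        w = np.sort(w)[::-1]                       # impose descending rank order
        if w[-1] >= min_fraction:                  # all reported ingredients above cutoff
            kept.append(w)
    kept = np.array(kept)
    return kept.mean(axis=0), np.percentile(kept, [5, 95], axis=0)

means, (lo, hi) = predict_weight_fractions(n_reported=5)
for rank, (m, l, h) in enumerate(zip(means, lo, hi), start=1):
    print(f"rank {rank}: mean {m:.3f}  (5th-95th pct {l:.3f}-{h:.3f})")
```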

  15. Learning from Past Classification Errors: Exploring Methods for Improving the Performance of a Deep Learning-based Building Extraction Model through Quantitative Analysis of Commission Errors for Optimal Sample Selection

    Science.gov (United States)

    Swan, B.; Laverdiere, M.; Yang, L.

    2017-12-01

    In the past five years, deep Convolutional Neural Networks (CNN) have been increasingly favored for computer vision applications due to their high accuracy and ability to generalize well in very complex problems; however, details of how they function and in turn how they may be optimized are still imperfectly understood. In particular, their complex and highly nonlinear network architecture, including many hidden layers and self-learned parameters, as well as their mathematical implications, presents open questions about how to effectively select training data. Without knowledge of the exact ways the model processes and transforms its inputs, intuition alone may fail as a guide to selecting highly relevant training samples. Working in the context of improving a CNN-based building extraction model used for the LandScan USA gridded population dataset, we have approached this problem by developing a semi-supervised, highly scalable approach to select training samples from a dataset of identified commission errors. Due to the large scope of this project, tens of thousands of potential samples could be derived from identified commission errors. To efficiently trim those samples down to a manageable and effective set for creating additional training samples, we statistically summarized the spectral characteristics of areas with rates of commission errors at the image tile level and grouped these tiles using affinity propagation. Highly representative members of each commission error cluster were then used to select sites for training sample creation. The model will be incrementally re-trained with the new training data to allow for an assessment of how the addition of different types of samples affects the model performance, such as precision and recall rates. By using quantitative analysis and data clustering techniques to select highly relevant training samples, we hope to improve model performance in a manner that is resource efficient, both in terms of training process
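
    One way to read the selection step described above is: summarize each error-prone image tile with a feature vector, cluster the summaries with affinity propagation, and take each cluster's exemplar as a site for new training samples. The sketch below does this on synthetic tile features; it illustrates the clustering step only, not the LandScan USA pipeline.

```python
# Sketch: affinity-propagation clustering of per-tile feature summaries and
# selection of exemplar tiles; tile features here are synthetic placeholders.
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.datasets import make_blobs

# One row per tile flagged with commission errors, e.g. per-band spectral
# statistics plus an error-rate summary (hypothetical feature layout).
tile_features, _ = make_blobs(n_samples=500, n_features=9, centers=12, random_state=0)

ap = AffinityPropagation(damping=0.9, random_state=0).fit(tile_features)
exemplars = ap.cluster_centers_indices_        # indices of representative tiles
print(f"{len(exemplars)} exemplar tiles selected from {len(tile_features)} candidates")
print("first exemplar tile indices:", exemplars[:5])
```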

  16. Application of Ultra-High-Performance Liquid Chromatography Coupled with LTQ-Orbitrap Mass Spectrometry for the Qualitative and Quantitative Analysis of Polygonum multiflorum Thunb. and Its Processed Products

    Directory of Open Access Journals (Sweden)

    Teng-Hua Wang

    2015-12-01

    Full Text Available In order to quickly and simultaneously obtain the chemical profiles and control the quality of the root of Polygonum multiflorum Thunb. and its processed form, a rapid qualitative and quantitative method using ultra-high-performance liquid chromatography coupled with electrospray ionization-linear ion trap-Orbitrap hybrid mass spectrometry (UHPLC-LTQ-Orbitrap MSn) has been developed. The analysis was performed within 10 min on an AcQuity UPLC™ BEH C18 column with a gradient elution of 0.1% formic acid-acetonitrile at a flow rate of 400 μL/min. According to the fragmentation mechanisms and high-resolution MSn data, a diagnostic ion searching strategy was used for rapid and tentative identification of the main phenolic components, and 23 compounds were simultaneously identified or tentatively characterized. Differences in the chemical profiles between P. multiflorum and its processed preparation were observed by comparing the ion abundances of the main constituents in the MS spectra, and significant changes of eight metabolite biomarkers were detected in the P. multiflorum samples and their preparations. In addition, four of the representative phenols, namely gallic acid, trans-2,3,5,4′-tetrahydroxystilbene-2-O-β-d-glucopyranoside, emodin and emodin-8-O-β-d-glucopyranoside, were quantified by the validated UHPLC-MS/MS method. These phenols are considered to be major bioactive constituents in P. multiflorum and are generally regarded as the index for quality assessment of this herb. The method was successfully used to quantify 10 batches of P. multiflorum and 10 batches of processed P. multiflorum. The results demonstrate that the method is simple, rapid, and suitable for the discrimination and quality control of this traditional Chinese herb.

  17. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i

  18. Quantitative measures of CRE and its link with organizational performance

    NARCIS (Netherlands)

    Appel - Meulenbroek, H.A.J.A.

    2007-01-01

    For CREM (Corporate Real Estate Management) to be able to deliver added value to its client organisation, it has to know 1) which CRE aspects influence its employees, processes, machinery, visitors, etc, 2) how these aspects influence performance and 3) how to measure and manage them. Analysis of

  19. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  1. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  2. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  3. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  4. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331 nm for chloroquine. ...

  5. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    Science.gov (United States)

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, using typically reverse phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.

  6. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  7. Implementing quantitative analysis and its complement

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Nelson, W.R.; Shepherd, J.C.

    1982-01-01

    This paper presents an application of risk analysis for the evaluation of nuclear reactor facility operation. Common cause failure analysis (CCFA) techniques to identify potential problem areas are discussed. Integration of CCFA and response trees, a particular form of the path sets of a success tree, to gain significant insight into the operation of the facility is also demonstrated. An example illustrating the development of the risk analysis methodology, development of the fault trees, generation of response trees, and evaluation of the CCFA is presented to explain the technique

  8. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity

  9. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
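
    A minimal discrete-event sketch of this point follows: storage times are produced by a FIFO shelf that is replenished in batches and depleted by random demand, so their distribution (and in particular its tail) emerges from the queueing rules rather than being assumed directly. The delivery and demand rates are hypothetical placeholders.

```python
# Minimal discrete-event simulation of a FIFO storage step: batch deliveries,
# random unit demand, and the resulting distribution of storage times.
# Rates and batch size are hypothetical placeholders.
import heapq
import random
from collections import deque

random.seed(0)
events = [(0.0, "delivery"), (0.0, "demand")]   # (time in hours, event kind)
heapq.heapify(events)
shelf = deque()                                  # arrival times of units currently stored
storage_times = []
horizon = 10_000.0

while events:
    t, kind = heapq.heappop(events)
    if t > horizon:
        break
    if kind == "delivery":
        shelf.extend([t] * 50)                                   # a batch of 50 units arrives
        heapq.heappush(events, (t + random.expovariate(1 / 24.0), "delivery"))  # ~daily
    else:                                                        # one unit demanded (FIFO)
        if shelf:
            storage_times.append(t - shelf.popleft())
        heapq.heappush(events, (t + random.expovariate(2.0), "demand"))         # ~2 units/hour

storage_times.sort()
mean = sum(storage_times) / len(storage_times)
p95 = storage_times[int(0.95 * len(storage_times))]
print(f"{len(storage_times)} units left storage; mean {mean:.1f} h, 95th percentile {p95:.1f} h")
```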

  10. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
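
    The three measures are not defined in the excerpt, so the sketch below uses plausible proxies for two of them: color variety as the Shannon entropy of a quantized color histogram, and brightness roughness as a Hurst-like exponent from the scaling of brightness increments with pixel separation. The "painting" is a random placeholder image, and these definitions are stand-ins rather than the paper's exact ones.

```python
# Hedged proxies: color variety via Shannon entropy of a quantized color
# histogram, and a roughness exponent from the scaling of brightness increments.
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((256, 256, 3))                   # placeholder RGB "painting" in [0, 1]

# Color variety: entropy of colors quantized to a 16x16x16 cube.
q = np.clip((img * 16).astype(int), 0, 15)
codes = q[..., 0] * 256 + q[..., 1] * 16 + q[..., 2]
counts = np.bincount(codes.ravel(), minlength=16 ** 3).astype(float)
p = counts[counts > 0] / counts.sum()
color_entropy = -(p * np.log2(p)).sum()

# Roughness exponent: slope of log E[(B(x+r) - B(x))^2] versus log r along rows.
brightness = img.mean(axis=2)
lags = np.array([1, 2, 4, 8, 16, 32])
msd = [np.mean((brightness[:, lag:] - brightness[:, :-lag]) ** 2) for lag in lags]
alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0] / 2.0   # Hurst-like exponent

print(f"color entropy: {color_entropy:.2f} bits; roughness exponent: {alpha:.2f}")
```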

  11. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard ( Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  12. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  13. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  14. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  15. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis was applied to CO, NO, C2H2, C6H6 and similar species, and quantitative analysis to C3H5N, C10H10, C8H8N2 and similar species. The method used in this article is feasible. The results show that the detonation composition in this study has a negative oxygen balance and that the detonation products contain many pollutants. (authors)

  16. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  17. Quantitation and gompertzian analysis of tumor growth

    DEFF Research Database (Denmark)

    Rygaard, K; Spang-Thomsen, M

    1998-01-01

    to transform the experimental data into useful growth curves. A transformed Gompertz function is used as the basis for calculating relevant parameters pertaining to tumor growth and response to therapy. The calculations are facilitated by use of a computer program which performs the necessary calculations and presents the growth data in graphic form.
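
    As an illustration of the Gompertz-based calculations mentioned above, the sketch below fits a common Gompertz parameterization to synthetic tumor-volume data and derives an initial doubling time and plateau volume from the fitted parameters; it is not the referenced computer program.

```python
# Sketch: fit a Gompertz growth function to synthetic tumor-volume data and
# derive growth parameters. Data and parameter values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, a, b):
    """V(t) = v0 * exp((a/b) * (1 - exp(-b*t))): v0 initial volume, a initial
    specific growth rate, b rate of growth deceleration."""
    return v0 * np.exp((a / b) * (1.0 - np.exp(-b * t)))

t = np.arange(0, 40, 2.0)                               # days, hypothetical schedule
true = gompertz(t, 50.0, 0.35, 0.08)                    # mm^3
volumes = true * np.exp(np.random.default_rng(1).normal(0, 0.05, t.size))

popt, _ = curve_fit(gompertz, t, volumes, p0=[volumes[0], 0.3, 0.1])
v0, a, b = popt
doubling_time = np.log(2) / a                           # doubling time at the initial growth rate
plateau = v0 * np.exp(a / b)                            # asymptotic (plateau) volume
print(f"a={a:.3f}/day, b={b:.3f}/day, initial doubling ~{doubling_time:.1f} d, plateau ~{plateau:.0f} mm^3")
```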

  18. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students' to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  19. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exhaust the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  20. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  1. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is often called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  2. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating the multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. Using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. Extensive simulation analysis shows that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) in most cases for three scenarios: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and of similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  3. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations… of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow… for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest…

  4. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    Science.gov (United States)

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
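
    A minimal sketch of the calibration-curve route to such a determination is shown below; the retinol concentrations and peak areas are invented for illustration and do not come from the cited experiment.

      import numpy as np

      # Hypothetical calibration data: retinol standards (µg/mL) vs HPLC peak area.
      conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      area_std = np.array([1250.0, 2480.0, 5010.0, 9950.0, 20100.0])

      # Least-squares calibration line: area = slope * conc + intercept.
      slope, intercept = np.polyfit(conc_std, area_std, 1)

      # Quantify an unknown food extract from its measured peak area.
      area_sample = 7300.0
      conc_sample = (area_sample - intercept) / slope
      print(f"Retinol in extract: {conc_sample:.2f} µg/mL")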

  5. Performance analysis in saber.

    Science.gov (United States)

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women with 3 weapons: foil, épée, and saber. In general, few scientific studies are available in the international literature; they are limited to the performance analysis of fencing bouts, and none addresses saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and "A." The aim of this study was to carry out a saber performance analysis to obtain useful indicators for the definition of a performance model, and to verify whether performance is influenced by the type of competition and whether there are differences between men and women. Sixty bouts were analyzed: 33 from FIE GP and 27 from "A" competitions (35 men's and 25 women's saber bouts). The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is used most (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women, and the ratio between action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 kinds of competitions.

  6. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set up with the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element, uranium-based matrix. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying…

  7. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  8. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  9. Quantitative analysis of deuterium by gas chromatography

    International Nuclear Information System (INIS)

    Isomura, Shohei; Kaetsu, Hayato

    1981-01-01

    An analytical method for the determination of deuterium concentration in water and hydrogen gas by gas chromatography is described. HD and D2 in a hydrogen gas sample were separated from H2 by a column packed with Molecular Sieve 13X, using extra pure hydrogen gas as carrier. A thermal conductivity detector was used. Concentrations of deuterium were determined by comparison with standard samples. The error inherent to the present method was less than 1% on the basis of the calibration curves obtained with the standard samples. The average time required for the analysis was about 3 minutes. (author)

  10. Influence of corrosion layers on quantitative analysis

    International Nuclear Information System (INIS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Roehrich, J.; Strub, E.

    2005-01-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed

  11. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, and quantitation data are presented. Linearity, bias and other metrics are presented along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  12. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    International Nuclear Information System (INIS)

    Charland, P.; Peters, T.; McGill Univ., Montreal, Quebec

    1996-01-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue, associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions

  13. Photoacoustic image reconstruction: a quantitative analysis

    Science.gov (United States)

    Sperl, Jonathan I.; Zell, Karin; Menzenbach, Peter; Haisch, Christoph; Ketzer, Stephan; Marquart, Markus; Koenig, Hartmut; Vogel, Mika W.

    2007-07-01

    Photoacoustic imaging is a promising new way to generate unprecedented contrast in ultrasound diagnostic imaging. It differs from other medical imaging approaches, in that it provides spatially resolved information about optical absorption of targeted tissue structures. Because the data acquisition process deviates from standard clinical ultrasound, choice of the proper image reconstruction method is crucial for successful application of the technique. In the literature, multiple approaches have been advocated, and the purpose of this paper is to compare four reconstruction techniques. Thereby, we focused on resolution limits, stability, reconstruction speed, and SNR. We generated experimental and simulated data and reconstructed images of the pressure distribution using four different methods: delay-and-sum (DnS), circular backprojection (CBP), generalized 2D Hough transform (HTA), and Fourier transform (FTA). All methods were able to depict the point sources properly. DnS and CBP produce blurred images containing typical superposition artifacts. The HTA provides excellent SNR and allows a good point source separation. The FTA is the fastest and shows the best FWHM. In our study, we found the FTA to show the best overall performance. It allows a very fast and theoretically exact reconstruction. Only a hardware-implemented DnS might be faster and enable real-time imaging. A commercial system may also perform several methods to fully utilize the new contrast mechanism and guarantee optimal resolution and fidelity.
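
    To make the delay-and-sum idea concrete, here is a minimal, unoptimized Python sketch of DnS reconstruction for a 2-D geometry; it is not the authors' implementation, and all names and parameters are illustrative.

      import numpy as np

      def delay_and_sum(signals, det_pos, grid_x, grid_y, fs, c=1500.0):
          """Naive delay-and-sum reconstruction.

          signals : (n_detectors, n_samples) recorded pressure traces
          det_pos : (n_detectors, 2) detector coordinates in metres
          grid_x, grid_y : 1-D arrays of image-pixel coordinates in metres
          fs : sampling frequency in Hz; c : assumed speed of sound in m/s
          """
          n_det, n_samp = signals.shape
          image = np.zeros((grid_y.size, grid_x.size))
          for iy, y in enumerate(grid_y):
              for ix, x in enumerate(grid_x):
                  # Time of flight from pixel (x, y) to each detector.
                  dist = np.hypot(det_pos[:, 0] - x, det_pos[:, 1] - y)
                  idx = np.round(dist / c * fs).astype(int)
                  valid = idx < n_samp
                  # Sum the samples corresponding to each detector's delay.
                  image[iy, ix] = signals[np.arange(n_det)[valid], idx[valid]].sum()
          return image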

  14. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  15. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involves image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which requires manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used in existing studies, we propose two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experimental results show that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects and can potentially be used for eye disease diagnosis.

  16. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution-fitting procedure and compare several statistical tests, outlining their potential advantages and disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporate the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  17. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time… of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview…

  18. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies accomplishing complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a similar way as genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material.
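
    The following is a minimal sketch of a genetic search of the kind described, with a placeholder fitness function standing in for the full PIXE recomputation; the population size, mutation scale and target values are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      def fitness(params):
          # Placeholder: negative mismatch between the concentrations recomputed
          # with these detector-efficiency parameters and the certified reference
          # values. A real implementation would rerun the PIXE quantitation here.
          target = np.array([1.0, 0.5, 2.0])
          return -np.sum((params - target) ** 2)

      def genetic_search(n_params=3, pop_size=40, generations=200, mutation_scale=0.1):
          pop = rng.uniform(0.0, 3.0, size=(pop_size, n_params))
          for _ in range(generations):
              scores = np.array([fitness(p) for p in pop])
              order = np.argsort(scores)[::-1]
              parents = pop[order[: pop_size // 2]]                    # selection
              cut = rng.integers(1, n_params, size=pop_size // 2)
              children = np.where(np.arange(n_params) < cut[:, None],
                                  parents, np.roll(parents, 1, axis=0))  # crossover
              children = children + rng.normal(0, mutation_scale, children.shape)  # mutation
              pop = np.vstack([parents, children])
          return pop[np.argmax([fitness(p) for p in pop])]

      best = genetic_search()
      print("Best-fit parameters:", best)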

  19. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies, while avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually included in analyses of clinical practice variability. The objective was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed of drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inh to 8.90 €/inh and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and 0.35 for the correlation between expenditure and homogeneous production (p = 0.32 and p = 0.15, respectively), both coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.
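
    For readers unfamiliar with the statistic used, the following minimal sketch (SciPy, with invented figures rather than the Inforcoan data) shows how such a Pearson correlation and its p-value are computed.

      from scipy import stats

      # Hypothetical adjusted drug expenditure (€/inhabitant) and catchment
      # population (thousands) for a handful of departments.
      expenditure = [0.97, 2.4, 3.1, 5.6, 8.90, 4.2]
      population  = [310, 280, 450, 190, 220, 360]

      r, p_value = stats.pearsonr(expenditure, population)
      print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")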

  20. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In an intercomparison test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network ranked it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. Main results are presented together with a statistical model allowing prediction of the network error.
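
    A minimal sketch of the general idea, training a small neural network to map scaled spectra to nuclide quantities, is given below using scikit-learn on synthetic data; it is not the authors' network, and all dimensions and values are illustrative.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)

      # Synthetic stand-in for training spectra: each row is a channel histogram
      # of a calibration mixture, each target row the known nuclide quantities.
      n_train, n_channels, n_nuclides = 200, 128, 7
      responses = rng.random((n_nuclides, n_channels))            # per-nuclide shapes
      activities = rng.uniform(0.1, 10.0, (n_train, n_nuclides))  # known quantities
      spectra = activities @ responses + rng.normal(0, 0.05, (n_train, n_channels))

      scaler = StandardScaler().fit(spectra)                       # input scaling
      model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
      model.fit(scaler.transform(spectra), activities)

      # Analysing a new spectrum then takes a fraction of a second.
      unknown = activities[:1] @ responses
      print(model.predict(scaler.transform(unknown)))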

  1. [Quantitative analysis of blood loss in liposuction].

    Science.gov (United States)

    Schor, N; Zatz, R M; Mendonça, A R; Takatu, P M; Patto, G S

    1989-01-01

    This study was performed in 15 female patients submitted to suction lipectomy as an isolated procedure, to establish the blood loss associated with the procedure. A wide variation of blood-to-fat ratios was observed (17 to 59%), with a mean blood loss in the lipoaspirates of 34 +/- 3%. Internal blood losses occurring in the first 72 post-operative hours were as important as or more important than external losses, and were responsible for a mean 7% fall in the level of hemoglobin. Further internal blood losses occurred between 72 hours and the 7th to 10th post-operative day and were responsible for a mean 3% fall in the level of hemoglobin. The blood losses observed in this study were demonstrated to be greater than usually assumed. Some prophylactic measures are recommended to provide safer treatment of these patients: iron supplementation during the pre-operative period; careful clinical and laboratory screening for bleeding disorders and for the intake of drugs that can interfere with coagulation; use of smaller-diameter cannulas for aspiration; auto-transfusion when aspirating in excess of 1,000 ml; and limiting the aspiration to 1,500 ml.

  2. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon-in-plutonium standard is described; its carbon content was then determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. With this standard available, we were then able to study the different parameters involved in plutonium combustion so that the reaction could be made complete: temperature reached during combustion, role of the flux, metal surface in contact with oxygen and, finally, the method of cleaning plutonium samples.

  3. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
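
    As a purely illustrative note on combining uncertainty components, the sketch below adds hypothetical relative uncertainties in quadrature (a GUM-style root-sum-of-squares); the figures are invented, and the paper's own combination may differ in detail.

      import math

      # Illustrative relative standard uncertainties (as fractions) for the main
      # contributions named above; the values are made up for the example.
      u_microorganism = 0.20   # type of test microorganism
      u_product       = 0.15   # pharmaceutical product analysed
      u_reading       = 0.18   # individual reading / interpreting errors

      # Combined relative uncertainty by root-sum-of-squares.
      u_combined = math.sqrt(u_microorganism**2 + u_product**2 + u_reading**2)
      print(f"Combined relative uncertainty: {u_combined:.0%}")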

  5. Quantitative motor performance and sleep benefit in Parkinson disease

    NARCIS (Netherlands)

    van Gilst, Merel; van Mierlo, P.; Bloem, B.R.; Overeem, S.

    2015-01-01

    STUDY OBJECTIVES: Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. DESIGN: Eighteen Parkinson patients with and 20 without

  6. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  7. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  8. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  9. Performance of isobaric and isotopic labeling in quantitative plant proteomics

    DEFF Research Database (Denmark)

    Nogueira, Fábio C S; Palmisano, Giuseppe; Schwämmle, Veit

    2012-01-01

    …and quantitation. In the present work, we have used LC-MS to compare an isotopic (ICPL) and an isobaric (iTRAQ) chemical labeling technique to quantify proteins in the endosperm of Ricinus communis seeds at three developmental stages (IV, VI, and X). Endosperm proteins of each stage were trypsin-digested in…

  10. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started from a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Graphs were then produced to give insight into the trends, and the patents were analysed within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of nanotechnology inventions. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting related inventions. For example, the US patent document US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under automobile parts; after studying it, it was deduced that it solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classes, reduction in friction and engine, were created. Similarly, after studying all the patents, a similar classification matrix was created.

  11. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples… foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions…

  12. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules, and find that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
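
    The driver of the improvement can be illustrated with a back-of-the-envelope calculation: ribbon loss scales with the square of the cell current, so halving the cell current sharply reduces the per-ribbon loss. The sketch below uses assumed, illustrative values rather than the module parameters of the paper.

      # Resistive loss in an interconnect ribbon scales with the square of the
      # current it carries. Halving a cell halves the cell current, which is the
      # effect exploited in halved-cell modules. Illustrative numbers only.
      i_full = 9.0        # A, assumed full-cell operating current
      r_ribbon = 0.005    # ohm, assumed ribbon resistance per cell connection

      loss_full_cell = i_full ** 2 * r_ribbon                # one full cell
      loss_two_halves = 2 * (i_full / 2) ** 2 * r_ribbon     # two half cells in series
      print(f"full cell: {loss_full_cell*1e3:.0f} mW, two halves: {loss_two_halves*1e3:.0f} mW")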

  13. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method where an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching can robustly detect association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies of fixed cells and performs favorably in association studies of live cells.
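
    A minimal sketch of an object-based association of this kind, using optimal one-to-one assignment with a distance gate (not necessarily the authors' exact point-pattern matching algorithm), could look as follows; the coordinates and tolerance are invented.

      import numpy as np
      from scipy.optimize import linear_sum_assignment
      from scipy.spatial.distance import cdist

      # Hypothetical vesicle centroids detected in the two colour channels (pixels).
      channel_a = np.array([[10.0, 12.0], [40.5, 33.2], [70.1, 68.9]])
      channel_b = np.array([[12.1, 13.5], [71.0, 70.2], [39.8, 35.0], [90.0, 5.0]])

      # Optimal one-to-one matching that minimises total distance, then a distance
      # gate so far-apart pairs are not counted as associated.
      cost = cdist(channel_a, channel_b)
      rows, cols = linear_sum_assignment(cost)
      max_shift = 5.0  # pixels, tolerated channel shift / movement
      pairs = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_shift]
      print(f"Associated {len(pairs)} of {len(channel_a)} objects: {pairs}")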

  14. A temperature-controlled photoelectrochemical cell for quantitative product analysis

    Science.gov (United States)

    Corson, Elizabeth R.; Creel, Erin B.; Kim, Youngsang; Urban, Jeffrey J.; Kostecki, Robert; McCloskey, Bryan D.

    2018-05-01

    In this study, we describe the design and operation of a temperature-controlled photoelectrochemical cell for analysis of gaseous and liquid products formed at an illuminated working electrode. This cell is specifically designed to quantitatively analyze photoelectrochemical processes that yield multiple gas and liquid products at low current densities and exhibit limiting reactant concentrations that prevent these processes from being studied in traditional single chamber electrolytic cells. The geometry of the cell presented in this paper enables front-illumination of the photoelectrode and maximizes the electrode surface area to electrolyte volume ratio to increase liquid product concentration and hence enhances ex situ spectroscopic sensitivity toward them. Gas is bubbled through the electrolyte in the working electrode chamber during operation to maintain a saturated reactant concentration and to continuously mix the electrolyte. Gaseous products are detected by an in-line gas chromatograph, and liquid products are analyzed ex situ by nuclear magnetic resonance. Cell performance was validated by examining carbon dioxide reduction on a silver foil electrode, showing comparable results both to those reported in the literature and identical experiments performed in a standard parallel-electrode electrochemical cell. To demonstrate a photoelectrochemical application of the cell, CO2 reduction experiments were carried out on a plasmonic nanostructured silver photocathode and showed different product distributions under dark and illuminated conditions.

  15. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD), as well as nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique-angle short-axis slices were subjected to maximal-count circumferential profile analysis. Data were displayed as a "bullseye" functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft-tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well-defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets.

  16. The Performance of Indian Equity Funds in the Era of Quantitative Easing

    Directory of Open Access Journals (Sweden)

    Ömer Faruk Tan

    2015-10-01

    This study aims to evaluate the performance of Indian equity funds between January 2009 and October 2014. The study period coincides with the era of quantitative easing, which influenced financial markets in developing economies. After the global financial crisis of 2008 came a period of quantitative easing (QE), creating an increase in the money supply and leading to a capital flow from developed countries to developing countries. During this 5-year, 10-month period of quantitative easing, the Indian CNX500 price index yielded approximately 21% compounded on average per annum. In this study, Indian equity funds are examined in order to compare their performance within this period. Within this scope, 12 Indian equity funds were chosen. To measure their performance, the Sharpe ratio (1966), the Treynor ratio (1965) and Jensen's alpha (1968) are used. Jensen's alpha is also used to identify the selectivity skills of fund managers. Additionally, the Treynor & Mazuy (1966) regression analysis is applied to assess the market timing ability of fund managers.
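
    For reference, the three classical measures cited can be computed as in the following sketch; the monthly returns and risk-free rate are invented for illustration, and the Treynor & Mazuy timing regression is omitted.

      import numpy as np

      # Hypothetical monthly returns (decimal) for one fund and the market index,
      # plus an assumed monthly risk-free rate.
      fund   = np.array([0.021, -0.010, 0.035, 0.012, 0.028, -0.004])
      market = np.array([0.018, -0.012, 0.030, 0.010, 0.025, -0.006])
      rf     = 0.005

      beta = np.cov(fund, market, ddof=1)[0, 1] / np.var(market, ddof=1)
      excess_fund, excess_mkt = fund.mean() - rf, market.mean() - rf

      sharpe  = excess_fund / fund.std(ddof=1)          # Sharpe (1966)
      treynor = excess_fund / beta                      # Treynor (1965)
      alpha   = excess_fund - beta * excess_mkt         # Jensen's alpha (1968)
      print(f"beta={beta:.2f} sharpe={sharpe:.2f} treynor={treynor:.3f} alpha={alpha:.4f}")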

  17. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to carry out qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Qualitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  18. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  19. Evaluation of breast lesions by contrast enhanced ultrasound: Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Wan Caifeng; Du Jing; Fang Hua; Li Fenghua; Wang Lin

    2012-01-01

    Objective: To evaluate and compare the diagnostic performance of qualitative, quantitative and combined analysis for the characterization of breast lesions on contrast-enhanced ultrasound (CEUS), with histological results used as the reference standard. Methods: Ninety-one patients with 91 breast lesions categorized BI-RADS 3–5 at US or mammography underwent CEUS. All lesions underwent qualitative and quantitative enhancement evaluation. Receiver operating characteristic (ROC) curve analysis was performed to evaluate the diagnostic performance of each analytical method for discrimination between benign and malignant breast lesions. Results: Histopathologic analysis of the 91 lesions revealed 44 benign and 47 malignant. For qualitative analysis, benign and malignant lesions differed significantly in enhancement patterns. The areas under the ROC curve were … (Az1), 0.768 (Az2) and 0.926 (Az3), respectively. The values of Az1 and Az3 were significantly higher than that for Az2 (p = 0.024 and p = 0.008, respectively), but there was no significant difference between Az1 and Az3 (p = 0.625). Conclusions: The diagnostic performance of qualitative and combined analysis was significantly higher than that of quantitative analysis. Although quantitative analysis has the potential to differentiate benign from malignant lesions, it has not yet improved the final diagnostic accuracy.

  20. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  1. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  2. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in quantitative analysis associated with the different sampling strategies to be evaluated, and a proper number of replicates for future quantitative analyses to be defined. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  3. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  4. Methodology for quantitative evalution of diagnostic performance. Project III

    International Nuclear Information System (INIS)

    Metz, C.E.

    1985-01-01

    Receiver Operating Characteristic (ROC) methodology is now widely recognized as the most satisfactory approach to the problem of measuring and specifying the performance of a diagnostic procedure. The primary advantage of ROC analysis over alternative methodologies is that it separates differences in diagnostic accuracy that are due to actual differences in discrimination capacity from those that are due to decision-threshold effects. Our effort during the past year has been devoted to developing digital computer programs for fitting ROC curves to diagnostic data by maximum likelihood estimation and to developing meaningful and valid statistical tests for assessing the significance of apparent differences between measured ROC curves. FORTRAN programs previously written here for ROC curve fitting and statistical testing have been refined to make them easier to use and to allow them to be run on a large variety of computer systems. We have also attempted to develop two new curve-fitting programs: one for conventional ROC data that assumes a different functional form for the ROC curve, and one that can be used for "free-response" ROC data. Finally, we have cooperated with other investigators to apply our techniques to ROC data generated in clinical studies, and we have sought to familiarize the medical community with the advantages of ROC methodology. 36 ref
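
    Although the project fits binormal ROC curves by maximum likelihood in FORTRAN, the underlying idea of threshold-independent accuracy can be illustrated with a simple empirical ROC computed in Python on simulated confidence ratings.

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(2)

      # Hypothetical confidence ratings: diseased cases score higher on average.
      labels = np.concatenate([np.zeros(50), np.ones(50)])
      scores = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.2, 1.0, 50)])

      fpr, tpr, thresholds = roc_curve(labels, scores)
      auc = roc_auc_score(labels, scores)
      print(f"Empirical AUC = {auc:.2f} over {len(thresholds)} decision thresholds")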

  5. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  6. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  7. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  8. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work the quantitative risk analysis for the external public of the pipeline Cabiunas - REDUC (GASDUC III), 180 km long, linking the municipalities of Macae and Duque de Caxias - RJ, was performed by the companies PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), transporting natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, resulting in an individual risk contour with frequencies of 1x10⁻⁶ per year involving sensitive occupations, and therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas; their respective FN-curves were situated below the advised limit established by INEA, except for two areas that required additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN-curve should be below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures, the results were situated below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)

  9. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. DAS performance analysis

    International Nuclear Information System (INIS)

    Bates, G.; Bodine, S.; Carroll, T.; Keller, M.

    1984-02-01

    This report begins with an overview of the Data Acquisition System (DAS), which supports several of PPPL's experimental devices. Performance measurements which were taken on DAS and the tools used to make them are then described

  11. Analysis of characteristic performance curves in radiodiagnosis by an observer

    International Nuclear Information System (INIS)

    Kossovoj, A.L.

    1988-01-01

    Methods and approaches for constructing performance characteristic curves (PX-curves) in roentgenology, and for their qualitative and quantitative evaluation, are described. An assessment of the applicability of PX-curves to the analysis of scintigraphic and sonographic images is presented

  12. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples and the correction coefficients, if any, for interelemental effects are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
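
    The mathematical-correction idea described above can be sketched as an ordinary least-squares fit: measured intensities of standards are related to certified concentrations through a simple interelement correction model. The model form and all numbers below are assumptions chosen for illustration; they are not the coefficients or standards used in the paper.

    ```python
    # Sketch: estimate interelement correction coefficients by multiple linear regression.
    # Assumed model form (a simplified correction, for illustration only):
    #   C_i = I_i * (b0 + sum_j b_j * I_j)   for analyte i and interfering elements j
    # Standards' intensities and certified concentrations below are fabricated examples.
    import numpy as np

    I_analyte = np.array([0.82, 0.55, 0.31, 0.70, 0.44])       # analyte line intensities
    I_interf  = np.array([[0.10, 0.30], [0.40, 0.20], [0.60, 0.10],
                          [0.20, 0.25], [0.35, 0.40]])          # two interfering elements
    C_cert    = np.array([18.2, 12.9, 8.1, 15.8, 10.5])         # certified wt% of the standards

    # Design matrix for C/I = b0 + b1*I_1 + b2*I_2, solved by ordinary least squares
    X = np.column_stack([np.ones_like(I_analyte), I_interf])
    coeffs, *_ = np.linalg.lstsq(X, C_cert / I_analyte, rcond=None)
    print("correction coefficients b0, b1, b2:", coeffs)

    # Apply to an unknown sample (hypothetical intensities)
    I_u, I_uj = 0.62, np.array([0.28, 0.22])
    C_pred = I_u * (coeffs[0] + coeffs[1] * I_uj[0] + coeffs[2] * I_uj[1])
    print("predicted concentration (wt%):", C_pred)
    ```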

  13. Quantitative assessment of safety barrier performance in the prevention of domino scenarios triggered by fire

    International Nuclear Information System (INIS)

    Landucci, Gabriele; Argenti, Francesca; Tugnoli, Alessandro; Cozzani, Valerio

    2015-01-01

    The evolution of domino scenarios triggered by fire critically depends on the presence and the performance of safety barriers that may have the potential to prevent escalation, delaying or avoiding the heat-up of secondary targets. The aim of the present study is the quantitative assessment of safety barrier performance in preventing the escalation of fired domino scenarios. A LOPA (layer of protection analysis) based methodology, aimed at the definition and quantification of safety barrier performance in the prevention of escalation was developed. Data on the more common types of safety barriers were obtained in order to characterize the effectiveness and probability of failure on demand of relevant safety barriers. The methodology was exemplified with a case study. The results obtained define a procedure for the estimation of safety barrier performance in the prevention of fire escalation in domino scenarios. - Highlights: • We developed a methodology for the quantitative assessment of safety barriers. • We focused on safety barriers aimed at preventing domino effect triggered by fire. • We obtained data on effectiveness and availability of the safety barriers. • The methodology was exemplified with a case study of industrial interest. • The results showed the role of safety barriers in preventing fired domino escalation
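
    The LOPA-style bookkeeping behind such an assessment can be reduced to a one-line calculation: the unmitigated escalation frequency is multiplied by the probability of failure on demand (PFD) of each independent safety barrier. The barrier names and PFD values below are illustrative assumptions, not the data set characterized in the study.

    ```python
    # Minimal LOPA-style sketch: the frequency of a fire-driven domino escalation is reduced
    # by each independent safety barrier according to its probability of failure on demand (PFD).
    # Barrier values below are illustrative, not the data set used in the study.
    from functools import reduce

    f_unmitigated = 1.0e-3   # escalation frequency without barriers [1/yr] (hypothetical)
    barriers = {
        "water deluge system":  0.05,   # PFD (hypothetical)
        "fireproofing coating": 0.10,
        "emergency shutdown":   0.02,
    }

    f_mitigated = reduce(lambda f, pfd: f * pfd, barriers.values(), f_unmitigated)
    print(f"mitigated escalation frequency: {f_mitigated:.2e} per year")
    ```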

  14. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
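
    One plausible way to obtain a doubling-time figure of the kind described is to fit the time-to-positivity of a dilution series against the log2 of the input copy number, so that the slope magnitude is the time added per halving of template. The exact IDT definition used by the authors may differ, and the dilution-series numbers below are fabricated.

    ```python
    # Hedged sketch: derive an isothermal doubling-time figure from a qLAMP dilution series
    # by fitting time-to-positivity against log2 of the input copy number. The exact IDT
    # definition in the cited study may differ; the data points here are fabricated.
    import numpy as np

    copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])      # input template copies (hypothetical)
    t_pos  = np.array([8.1, 11.5, 14.8, 18.3, 21.6])  # time to threshold [min] (hypothetical)

    slope, intercept = np.polyfit(np.log2(copies), t_pos, 1)
    idt = abs(slope)   # minutes added per halving of template ~ time per doubling
    print(f"estimated isothermal doubling time: {idt:.2f} min")
    ```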

  15. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

    Full Text Available Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.

  16. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need for a study assessing the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science programs (e.g., software engineering, computer science, and information technology) at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  17. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    Science.gov (United States)

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in the sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  18. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC

    Directory of Open Access Journals (Sweden)

    Cheng Bai

    2007-01-01

    Full Text Available High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml–1/μMol ml–1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in the sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  19. Single particle transfer for quantitative analysis with total-reflection X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Esaka, Fumitaka; Esaka, Konomi T.; Magara, Masaaki; Sakurai, Satoshi; Usuda, Shigekazu; Watanabe, Kazuo

    2006-01-01

    The technique of single particle transfer was applied to quantitative analysis with total-reflection X-ray fluorescence (TXRF) spectrometry. The technique was evaluated by performing quantitative analysis of individual Cu particles with diameters between 3.9 and 13.2 μm. The direct quantitative analysis of a Cu particle transferred onto a Si carrier gave a discrepancy between measured and calculated Cu amounts due to the absorption effects of incident and fluorescent X-rays within the particle. With correction for the absorption effects, the Cu amounts in individual particles could be determined with a deviation within 10.5%. When the Cu particles were dissolved with HNO3 solution prior to the TXRF analysis, the deviation improved to within 3.8%. In this case, no correction for the absorption effects was needed for quantification.

  20. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissues classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  1. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta2C has been considered because of specific features of the diffraction patterns of its components, namely the overlapping of the most intense reflexes of both phases. The standard binary system method has been used for quantitative analysis. Because of the overlap of the intense reflexes d(101)=2.36 A (Ta2C) and d(110)=2.33 A (Ta), other intense reflexes have been used for the quantitative determination of Ta2C and Ta: d(103)=1.404 A for tantalum subcarbide and d(211)=1.35 A for tantalum. In addition, the Ta and Ta2C phases have been determined quantitatively using another pair of reflexes: d(102)=1.82 A for Ta2C and d(200)=1.65 A for tantalum. The agreement between the results of the quantitative phase analyses is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta2C, it is expedient to carry out the analysis using the two above-mentioned pairs of reflexes located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta2C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the tendency of Ta2C to develop texture during sample preparation

  2. UR10 Performance Analysis

    DEFF Research Database (Denmark)

    Ravn, Ole; Andersen, Nils Axel; Andersen, Thomas Timm

    While working with the UR-10 robot arm, it has become apparent that some commands have undesired behaviour when operating the robot arm through a socket connection, sending one command at a time. This report is a collection of the results obtained when testing the performance of the different...

  3. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses.

  4. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial
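
    The receiver-operating-characteristic analysis used to summarize diagnostic performance can be reproduced in a few lines once per-patient scores and reference-standard labels are available. The sketch below uses scikit-learn on fabricated values (not CE-MARC data) and assumes that lower stress myocardial blood flow indicates disease.

    ```python
    # Minimal sketch: ROC AUC for a continuous diagnostic marker (e.g., stress myocardial
    # blood flow) against a binary reference standard. Scores and labels are fabricated,
    # not CE-MARC data; assumes scikit-learn is available.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    labels = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])   # 1 = significant stenosis (fabricated)
    scores = np.array([2.9, 2.4, 1.1, 1.5, 2.2, 0.9, 2.6, 1.3, 1.8, 2.1])  # stress MBF [ml/g/min]

    # Lower flow suggests disease, so invert the sign before computing the AUC
    auc = roc_auc_score(labels, -scores)
    fpr, tpr, thresholds = roc_curve(labels, -scores)
    print(f"AUC = {auc:.2f}")
    ```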

  5. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  6. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  7. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  8. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    Directory of Open Access Journals (Sweden)

    Marielle Ernst

    Full Text Available We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  9. Quantitative Analysis of Aloins and Aloe-Emodin in Aloe Vera Raw Materials and Finished Products Using High-Performance Liquid Chromatography: Single-Laboratory Validation, First Action 2016.09.

    Science.gov (United States)

    Kline, David; Ritruthai, Vicha; Babajanian, Silva; Gao, Quanyin; Ingle, Prashant; Chang, Peter; Swanson, Gary

    2017-05-01

    A single-laboratory validation study is described for a method of quantitative analysis of aloins (aloins A and B) and aloe-emodin in aloe vera raw materials and finished products. This method used HPLC coupled with UV detection at 380 nm for the aloins and 430 nm for aloe-emodin. The advantage of this test method is that the target analytes are concentrated from the sample matrix (either liquid or solid form) using stepwise liquid-liquid extraction (water-ethyl acetate-methanol), followed by solvent evaporation and reconstitution. This sample preparation process is suitable for different forms of products. The concentrating step for aloins and aloe-emodin has enhanced the method quantitation level to 20 parts per billion (ppb). Reversed-phase chromatography using a 250 × 4.6 mm column under gradient elution conditions was used. Mobile phase A is 0.1% acetic acid in water and mobile phase B is 0.1% acetic acid in acetonitrile. The HPLC run starts with a 20% mobile phase B that reaches 35% at 13 min. From 13 to 30 min, mobile phase B is increased from 35 to 100%. From 30 to 40 min, mobile phase B is changed from 100% back to the initial condition of 20% for re-equilibration. The flow rate is 1 mL/min, with a 100 μL injection volume. Baseline separation (Rs > 2.0) for aloins A and B and aloe-emodin was observed under this chromatographic condition. This test method was validated with raw materials of aloe vera 5× (liquid) and aloe vera 200× (powder) and finished products of aloe concentrate (liquid) and aloe (powder). The linearity of the method was studied from 10 to 500 ppb for aloins A and B and aloe-emodin, with correlation coefficients of 0.999964, 0.999957, and 0.999980, respectively. The test method was proven to be specific, precise, accurate, rugged, and suitable for the intended quantitative analysis of aloins and aloe-emodin in raw materials and finished products. The S/N for aloins A and B and aloe-emodin at 10 ppb level were 12, 10, and 8

  10. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent, relative to the failure reports and work order sheets at the reactor sites together with SKI's ''Safety Related Occurrences''. In general there has been an improvement compared to previous years. (Auth.)

  11. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  12. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  13. QUANTITATIVE (GIS) AND QUALITATIVE (BPE) ASSESSMENTS OF LIBRARY PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Wolfgang F.E. Preiser

    2008-03-01

    Full Text Available This article accounts for the methodological approach used in the creation of the Facilities Master Plan for the Public Library of Cincinnati and Hamilton County in the United States. Libraries are undergoing significant changes with regard to their advancing functions. They have become community centers for learning, whether for children, teens, adults, or seniors. This study uses an approach combining GIS (Geographic Information System) with BPE (Building Performance Evaluation) to score and categorize branch libraries based on their level of performance. A number of recommended strategies are derived to achieve greater cost-effectiveness and improved service.

  14. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In the quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
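
    The similarity evaluation mentioned in this record is commonly computed as a congruence (cosine) coefficient between chromatographic fingerprints sampled on a common retention-time grid. The sketch below is a simplified stand-in for the SFDA-recommended software, using synthetic signals rather than the R. rugosa data.

    ```python
    # Minimal sketch: cosine (congruence) similarity between two chromatographic fingerprints
    # resampled on a common retention-time grid -- a simplified stand-in for the similarity
    # evaluation software referenced in the record. Signals below are synthetic.
    import numpy as np

    t = np.linspace(0, 30, 600)                   # retention time [min]

    def peak(center, height, width=0.3):
        return height * np.exp(-((t - center) / width) ** 2)

    batch_a = peak(5, 1.0) + peak(12, 0.6) + peak(21, 0.4)
    batch_b = peak(5, 0.9) + peak(12, 0.7) + peak(21, 0.35)

    similarity = np.dot(batch_a, batch_b) / (np.linalg.norm(batch_a) * np.linalg.norm(batch_b))
    print(f"fingerprint similarity: {similarity:.4f}")
    ```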

  15. Waste package performance analysis

    International Nuclear Information System (INIS)

    Lester, D.H.; Stula, R.T.; Kirstein, B.E.

    1982-01-01

    A performance assessment model for multiple barrier packages containing unreprocessed spent fuel has been applied to several package designs. The resulting preliminary assessments were intended for use in making decisions about package development programs. A computer model called BARIER estimates the package life and subsequent rate of release of selected nuclides. The model accounts for temperature, pressure (and resulting stresses), bulk and localized corrosion, and nuclide retardation by the backfill after water intrusion into the waste form. The assessment model assumes a post-closure, flooded, geologic repository. Calculations indicated that, within the bounds of model assumptions, packages could last for several hundred years. Intact backfills of appropriate design may be capable of nuclide release delay times on the order of 10^7 yr for uranium, plutonium, and americium. 8 references, 6 figures, 9 tables

  16. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with an inherent invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R(2)) for the training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
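
    A plain textbook computation of Zernike moment magnitudes, which are the rotation-invariant, region-based shape features referred to above, is sketched below. The image is a random stand-in for one channel of a 3D HPLC-DAD spectrum, and the chosen moment orders are arbitrary; this is not the stepwise-regression feature selection of the study.

    ```python
    # Hedged sketch: a few rotation-invariant Zernike moment magnitudes from a 2-D grayscale
    # array (a synthetic stand-in for part of a 3-D HPLC-DAD spectrum). Plain textbook
    # implementation, not the feature-selection pipeline of the study.
    import numpy as np
    from math import factorial

    def zernike_moment(img, n, m):
        """|A_nm| for an image mapped onto the unit disc; requires n >= |m| and n - |m| even."""
        h, w = img.shape
        y, x = np.mgrid[:h, :w]
        xn = (2 * x - (w - 1)) / (w - 1)          # map pixels into the unit disc
        yn = (2 * y - (h - 1)) / (h - 1)
        rho = np.hypot(xn, yn)
        theta = np.arctan2(yn, xn)
        inside = rho <= 1.0

        radial = np.zeros_like(rho)               # radial polynomial R_nm(rho)
        for s in range((n - abs(m)) // 2 + 1):
            c = ((-1) ** s * factorial(n - s) /
                 (factorial(s) * factorial((n + abs(m)) // 2 - s) * factorial((n - abs(m)) // 2 - s)))
            radial += c * rho ** (n - 2 * s)

        kernel = radial * np.exp(-1j * m * theta)
        A = (n + 1) / np.pi * np.sum(img[inside] * kernel[inside])
        return abs(A)

    img = np.random.default_rng(0).random((64, 64))   # synthetic stand-in data
    features = [zernike_moment(img, n, m) for n, m in [(2, 0), (2, 2), (4, 2)]]
    print(features)
    ```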

  17. Quantitative analysis of LISA pathfinder test-mass noise

    International Nuclear Information System (INIS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-01-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3x10^-14 m s^-2/√(Hz) at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise
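
    The Kolmogorov-Smirnov idea for excess-noise detection can be illustrated by comparing the distribution of spectral samples from a candidate data segment against a reference segment. The sketch below uses SciPy on synthetic white noise with artificially inflated variance; it illustrates the concept only and is not the LPF analysis pipeline.

    ```python
    # Minimal sketch: flag excess noise by comparing the spectral-sample distribution of a data
    # segment against a reference segment with a two-sample Kolmogorov-Smirnov test. Signals
    # are synthetic; this is an illustration of the idea, not the LPF analysis pipeline.
    import numpy as np
    from scipy.signal import welch
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(1)
    fs = 10.0                                  # sampling rate [Hz]
    reference = rng.normal(0, 1.0, 20000)      # nominal-noise segment
    candidate = rng.normal(0, 1.3, 20000)      # segment with (injected) excess noise

    _, psd_ref = welch(reference, fs=fs, nperseg=1024)
    _, psd_can = welch(candidate, fs=fs, nperseg=1024)

    stat, p_value = ks_2samp(psd_ref, psd_can)
    print(f"KS statistic = {stat:.3f}, p-value = {p_value:.2e}")
    ```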

  18. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Directory of Open Access Journals (Sweden)

    Nicholas V Olijnyk

    Full Text Available This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology

  19. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers

  20. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, passing through their origins using the least squares method
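
    The core of the ratio-of-slopes idea can be sketched as two least-squares lines constrained through the origin, with the weight fraction taken from the ratio of their slopes. The quantities plotted and the numbers below are invented for illustration and do not reproduce the cited Sheffield procedure in detail.

    ```python
    # Hedged sketch of the ratio-of-slopes idea: fit two lines through the origin by least
    # squares (an "analysis" line from the unknown and a "reference" line from a standard of
    # known content) and take the weight fraction from the ratio of their slopes.
    import numpy as np

    def slope_through_origin(x, y):
        """Least-squares slope of y = b*x (line constrained through the origin)."""
        return np.dot(x, y) / np.dot(x, x)

    x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])                   # independent variable (hypothetical)
    y_analysis  = np.array([0.21, 0.44, 0.63, 0.86, 1.05])    # unknown sample (hypothetical)
    y_reference = np.array([0.42, 0.85, 1.29, 1.70, 2.12])    # pure-phase reference (hypothetical)

    w_fraction = slope_through_origin(x, y_analysis) / slope_through_origin(x, y_reference)
    print(f"estimated weight fraction of the phase: {w_fraction:.2%}")
    ```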

  1. QUANTITATIVE ANALYSIS OF FLUX REGULATION THROUGH HIERARCHICAL REGULATION ANALYSIS

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of V(max) can be dissected into the

  2. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the
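
    In the regulation-analysis literature, the hierarchical regulation coefficient is commonly written as rho_h = d(log Vmax)/d(log J), with the metabolic coefficient rho_m = 1 - rho_h. The sketch below applies that standard definition to invented before/after values; it is not data from the records above.

    ```python
    # Minimal sketch of (hierarchical) regulation analysis: rho_h = d(log Vmax)/d(log J) and
    # rho_m = 1 - rho_h. Flux and Vmax values before/after a perturbation are invented.
    import math

    J_before, J_after       = 4.0, 1.6    # pathway flux through the enzyme [umol/min/mg]
    Vmax_before, Vmax_after = 10.0, 6.0   # enzyme capacity from activity assays

    rho_h = (math.log(Vmax_after) - math.log(Vmax_before)) / (math.log(J_after) - math.log(J_before))
    rho_m = 1.0 - rho_h
    print(f"hierarchical regulation: {rho_h:.2f}, metabolic regulation: {rho_m:.2f}")
    ```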

  3. Performance analysis of switching systems

    NARCIS (Netherlands)

    Berg, van den R.A.

    2008-01-01

    Performance analysis is an important aspect in the design of dynamic (control) systems. Without a proper analysis of the behavior of a system, it is impossible to guarantee that a certain design satisfies the system’s requirements. For linear time-invariant systems, accurate performance analyses are

  4. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to extend the method by introducing quantitative analysis, which attempts to characterize the examined defect in detail and to accommodate a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization places demands on the quality of optical and analysis instruments. Nevertheless, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are inherent to interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the divergence of the illumination and other geometrical factors. Differences between the measurement systems can be attributed to these error factors. (Author)

  5. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore yield a finite depth resolution. The spectral zooming provided by the chirp z-transform gives enhanced frequency resolution, which can further improve the depth resolution and allow the finest subsurface features to be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
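
    The spectral zooming referred to above can be illustrated by evaluating the discrete Fourier transform directly on a fine frequency grid inside a narrow band, which is what the chirp z-transform computes more efficiently. The signal below is synthetic and the band is arbitrary; this is an illustration of the idea, not the paper's processing chain.

    ```python
    # Minimal sketch of "spectral zooming": evaluate the DFT on a fine grid of frequencies
    # inside a narrow band of interest. A chirp z-transform would compute the same zoomed
    # spectrum more efficiently. The two-tone test signal is synthetic.
    import numpy as np
    from scipy.signal import find_peaks

    fs = 50.0                                  # sampling rate [Hz]
    t = np.arange(0, 20, 1 / fs)
    signal = np.sin(2 * np.pi * 1.00 * t) + 0.8 * np.sin(2 * np.pi * 1.07 * t)  # two close tones

    f_zoom = np.linspace(0.9, 1.2, 400)        # fine frequency grid inside the band of interest
    kernel = np.exp(-2j * np.pi * np.outer(f_zoom, t))   # direct DFT at the chosen frequencies
    spectrum = np.abs(kernel @ signal)

    idx, _ = find_peaks(spectrum, height=0.4 * spectrum.max())
    print("components resolved in the zoom band [Hz]:", np.round(f_zoom[idx], 3))
    ```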

  6. A quantitative impact analysis of sensor failures on human operator's decision making in nuclear power plants

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    2004-01-01

    In emergency or accident situations in nuclear power plants, human operators play an important role in generating appropriate control signals to mitigate the accident. In human reliability analysis (HRA) within the framework of probabilistic safety assessment (PSA), the failure probabilities of such appropriate actions are estimated and used for the safety analysis of nuclear power plants. Even though understanding the status of the plant is basically a process of information seeking and processing by human operators, conventional HRA methods such as THERP, HCR, and ASEP pay little attention to the possibility of wrong information being provided to human operators. In this paper, a quantitative impact analysis of providing wrong information to human operators due to instrument faults or sensor failures is performed, based on a quantitative situation assessment model. By comparing the situation in which there are sensor failures with the situation in which there are none, the impact of sensor failures can be evaluated quantitatively. It is concluded that the impact of sensor failures is quite significant at the initial stages, but the impact is gradually reduced as human operators make more and more observations. Even though the impact analysis is highly dependent on the situation assessment model, it is expected that conclusions based on other situation assessment models will be consistent with the conclusion made in this paper. (author)
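
    A minimal Bayes-style situation assessment shows how a failed sensor can mislead the belief update described above: the operator's belief over candidate plant states is updated with each reading, and a failed sensor is modelled as a reduced probability that the reading reflects the true state. The state set and probabilities are illustrative assumptions, not the model of the cited paper.

    ```python
    # Hedged sketch of a Bayes-style situation assessment: the belief over plant states is
    # updated with successive sensor readings; a failed sensor keeps indicating "normal" even
    # though the true state is a LOCA. All numbers are illustrative.
    import numpy as np

    states = ["LOCA", "SGTR", "normal"]          # candidate plant states (hypothetical set)
    belief = np.array([1 / 3, 1 / 3, 1 / 3])     # uniform prior

    p_correct_failed = 0.50                      # P(reading matches true state) for a failed sensor

    def update(belief, indicated_state, p_correct):
        likelihood = np.where(np.arange(len(belief)) == indicated_state,
                              p_correct, (1 - p_correct) / (len(belief) - 1))
        posterior = likelihood * belief
        return posterior / posterior.sum()

    # the failed sensor wrongly keeps indicating "normal" (index 2)
    for _ in range(5):
        belief = update(belief, indicated_state=2, p_correct=p_correct_failed)
    print(dict(zip(states, np.round(belief, 3))))
    ```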

  7. Quantitative determination of multi markers in five varieties of Withania somnifera using ultra-high performance liquid chromatography with hybrid triple quadrupole linear ion trap mass spectrometer combined with multivariate analysis: Application to pharmaceutical dosage forms.

    Science.gov (United States)

    Chandra, Preeti; Kannujia, Rekha; Saxena, Ankita; Srivastava, Mukesh; Bahadur, Lal; Pal, Mahesh; Singh, Bhim Pratap; Kumar Ojha, Sanjeev; Kumar, Brijesh

    2016-09-10

    An ultra-high performance liquid chromatography electrospray ionization tandem mass spectrometry method has been developed and validated for simultaneous quantification of six major bioactive compounds in various plant parts (leaf, stem and root) of five varieties of Withania somnifera. The analysis was accomplished on a Waters ACQUITY UPLC BEH C18 column with linear gradient elution of water/formic acid (0.1%) and acetonitrile at a flow rate of 0.3 mL min(-1). The proposed method was validated with acceptable linearity (r(2), 0.9989-0.9998), precision (RSD, 0.16-2.01%), stability (RSD, 1.04-1.62%) and recovery (RSD ≤2.45%) under optimum conditions. The method was also successfully applied to the simultaneous determination of the six marker compounds in twenty-six marketed formulations. Hierarchical cluster analysis and principal component analysis were applied to discriminate these twenty-six batches based on the characteristics of the bioactive compounds. The results indicate that this method is advanced, rapid, sensitive and suitable for revealing the quality of Withania somnifera, and that it is also capable of quality evaluation of polyherbal formulations having similar markers/raw herbs. Copyright © 2016 Elsevier B.V. All rights reserved.
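
    The multivariate step named in this record (hierarchical cluster analysis plus principal component analysis on marker-compound contents) can be sketched with SciPy and scikit-learn. The 26-by-6 concentration matrix below is random stand-in data, not the measured formulation contents.

    ```python
    # Minimal sketch: discriminate formulation batches from their marker-compound contents with
    # PCA and hierarchical clustering. The concentration matrix is random stand-in data; assumes
    # scikit-learn and SciPy are available.
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    X = rng.random((26, 6))                      # 26 batches x 6 marker concentrations (stand-in)

    scores = PCA(n_components=2).fit_transform(X)                       # 2-D score plot coordinates
    clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
    print("PC scores shape:", scores.shape, "| cluster labels:", clusters)
    ```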

  8. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  9. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows autoradiographic images to be analyzed quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure, the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd), and a digital autoradiographic image is obtained. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, the application of the IP system to the distribution of receptor-binding ARG, the analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)
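
    Quantitation against an isotope scale, as described above, amounts to a linear calibration from photostimulated-luminescence readings to activity. The sketch below uses invented reference values and an invented region-of-interest reading purely to show the arithmetic.

    ```python
    # Minimal sketch: calibrate imaging-plate photostimulated luminescence (PSL) readings
    # against reference sources of known activity, then convert a tissue ROI reading into
    # radioactivity. All numbers are illustrative.
    import numpy as np

    psl_standards      = np.array([12.0, 58.0, 121.0, 240.0])   # PSL/mm^2 of the isotope scale
    activity_standards = np.array([0.5, 2.5, 5.0, 10.0])        # kBq/g, known reference values

    slope, intercept = np.polyfit(psl_standards, activity_standards, 1)
    roi_psl = 87.0                                               # PSL/mm^2 measured in a tissue ROI
    print(f"estimated activity: {slope * roi_psl + intercept:.2f} kBq/g")
    ```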

  10. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  11. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described and the crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Similar deposits to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions
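
    One common formulation of quantitative phase analysis with reference intensity ratios (RIR), under the normalization that all crystalline phases are accounted for, is sketched below; the phases, intensities and RIR values are illustrative and are not results from this thesis.

    ```python
    # Hedged sketch: phase quantification from X-ray powder diffraction using reference
    # intensity ratios (RIR), normalizing over all assumed crystalline phases. The intensities
    # and RIR values below are illustrative only.
    phases = {
        # phase: (measured intensity of the strongest reflection, RIR relative to corundum)
        "calcium oxalate monohydrate": (1250.0, 1.1),
        "apatite":                     (430.0, 1.6),
        "uric acid":                   (610.0, 0.9),
    }

    raw = {name: intensity / rir for name, (intensity, rir) in phases.items()}
    total = sum(raw.values())
    for name, value in raw.items():
        print(f"{name}: {100 * value / total:.1f} wt%")
    ```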

  12. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation that identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. A total of 50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the association of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median = 24.29%) was found to be much higher than that of Ki-67 (median = 13.05%). Both markers were significantly related to grade (p = 0.00; Kruskal-Wallis test). The LI of MCM-2 correlated significantly with the LI of Ki-67 (r = 0.0934; p = 0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they
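
    The statistics named in this record can be reproduced with SciPy once per-case counts are available: labeling indices from positive and total cell counts, a Kruskal-Wallis test against grade, and a Pearson correlation between the two markers. All counts and grades below are fabricated.

    ```python
    # Minimal sketch: labeling indices from cell counts, Kruskal-Wallis test against grade,
    # and Pearson correlation between the two proliferation markers. Counts and grades are
    # fabricated illustrations, not the study's data.
    import numpy as np
    from scipy.stats import kruskal, pearsonr

    positive_mcm2 = np.array([95, 120, 180, 210, 260, 310, 400, 420])
    positive_ki67 = np.array([40, 60, 90, 105, 120, 150, 215, 230])
    total_cells   = np.full(8, 1000)
    grade         = np.array([1, 1, 2, 2, 3, 3, 4, 4])   # Fuhrman grade per case

    li_mcm2 = 100 * positive_mcm2 / total_cells           # labeling index [%]
    li_ki67 = 100 * positive_ki67 / total_cells

    h_stat, p_grade = kruskal(*(li_mcm2[grade == g] for g in np.unique(grade)))
    r, p_corr = pearsonr(li_mcm2, li_ki67)
    print(f"Kruskal-Wallis vs grade: p = {p_grade:.3f}; Pearson r = {r:.2f} (p = {p_corr:.3f})")
    ```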

  13. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and in the world during the 70's and 80's. This strongly motivated the preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities that neighbor the industrial facilities. The present study aims to support and guide the development of Emergency Planning using the data obtained from Quantitative Risk Analysis, prepared according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in Emergency Plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data derived from Quantitative Risk Analysis in developing emergency plans. (author)

  14. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying the Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P < 0.05). The diagnostic performances of the quantitative parameters (AUC: ER 0.89 and global RE 0.80) were similar to that of the qualitative LLC analysis. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  15. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  16. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with quantitative X-ray fractographic investigation of fatigue fractures of samples with sharp notches tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic instability zones that restrain and control crack growth. In fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect, analogous to the zone in which the crack growth rate is controlled by microshifting mechanisms, is singled out. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. The possibility of determining one of the threshold values of cyclic fracture toughness of the material from the results of fatigue testing and fractography of notched sample fractures is shown. A correlation has been found between the size of the hsub(s) crack-effect zone in the notched sample, the material yield limit delta, and the cyclic fracture toughness characteristic Ksub(s). Such a correlation widens the possibilities of quantitative diagnostics of fractures by the methods of X-ray fractography

  17. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-sectional OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it]
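
    A minimal sketch of the kind of cross-sectional cost-function regression described above is given below; the Cobb-Douglas log-log specification, variable names, and data are assumptions chosen for illustration (a scale-economies reading follows from an output elasticity below one), not the paper's actual specification.

```python
# Hypothetical sketch: cross-sectional OLS estimate of a distribution cost
# function in log-log form, ln(cost) = a + b*ln(energy) + c*ln(area) + e.
# Scale economies are suggested when the output elasticity b is below 1.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_zones = 147

energy = rng.lognormal(mean=6.0, sigma=0.8, size=n_zones)   # GWh delivered (invented)
area = rng.lognormal(mean=5.0, sigma=0.6, size=n_zones)     # km^2 served (invented)
cost = 3.0 * energy**0.85 * area**0.10 * rng.lognormal(0, 0.15, n_zones)

X = sm.add_constant(np.column_stack([np.log(energy), np.log(area)]))
model = sm.OLS(np.log(cost), X).fit()

b_energy = model.params[1]
print(model.summary())
print(f"Output elasticity = {b_energy:.2f} "
      f"({'scale economies' if b_energy < 1 else 'no scale economies'})")
```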

  18. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition. These were produced through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths were strictly dependent on the working treatment, thus providing an important analytical element for investigating ancient manufacturing techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different elements of the artifacts also highlighted some distinctive prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  19. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; in particular, emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17, obtained using 2 MeV incident deuterons, is included together with examples of further applications of the (d,alpha) spectra. (author)

  20. An approach for quantitative evaluation of operator performance in emergency conditions

    International Nuclear Information System (INIS)

    Ujita, Hiroshi; Kubota, Ryuji; Kawano, Ryutaro.

    1992-01-01

    To understand expert behavior and define what constitutes good performance, human performance was quantified not only in terms of error but also in terms of various cognitive, psychological, and behavioral characteristics. Quantitative and qualitative indexes of human performance are proposed from both the individual operator and the crew points of view, among which the cognitive and behavioral aspects are the most important. (author)

  1. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  2. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric ceramic with piezoelectric and pyroelectric properties, and it is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put to practical use. The characteristics can be tuned by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium creates metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far have their respective drawbacks, and therefore the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, an instrument that has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate, and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  3. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By applying Fourier analysis, additional criteria can be obtained for evaluating the volume curve as a whole, so that the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal smoothing method because, for the type of function concerned, it converges to the minimum quadratic error. (orig./MG) [de]
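
    The sketch below illustrates, under assumed data, the Fourier resynthesis idea described above: a noisy left-ventricular time-activity curve is decomposed with an FFT and rebuilt from its first few harmonics, which acts as a least-squares-optimal smoother for a periodic curve; the sample curve and the number of harmonics kept are arbitrary choices for illustration.

```python
# Hypothetical sketch: smoothing a periodic ventricular time-activity curve
# by keeping only the first few Fourier harmonics (resynthesis).
import numpy as np

n_frames = 32                       # frames per representative heart cycle
t = np.arange(n_frames) / n_frames

# Invented "volume curve": baseline + systolic dip + counting noise.
rng = np.random.default_rng(42)
volume = 100 - 35 * np.sin(np.pi * t) ** 2 + rng.normal(0, 3, n_frames)

spectrum = np.fft.rfft(volume)

n_harmonics = 4                     # keep the DC term plus the first 4 harmonics
truncated = np.zeros_like(spectrum)
truncated[: n_harmonics + 1] = spectrum[: n_harmonics + 1]

smoothed = np.fft.irfft(truncated, n=n_frames)

# Ejection-fraction-like index from the smoothed curve.
ef = (smoothed.max() - smoothed.min()) / smoothed.max()
print(f"Smoothed EF-like index: {ef:.2f}")
```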

  4. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards.
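
    As a rough illustration of the fundamental-parameters idea sketched above (monochromatic excitation, pure-element standards, correction only for sample absorption of the exciting and fluorescent radiation), the snippet below converts a measured intensity ratio into a weight fraction; the simplified thick-sample relation without secondary enhancement and all numerical values are assumptions for illustration, not the full fundamental-parameters formalism.

```python
# Hypothetical sketch: fundamental-parameters style conversion of a measured
# fluorescent intensity (relative to a pure-element standard) into a weight
# fraction, for monochromatic excitation of a thick sample and neglecting
# secondary enhancement. All numerical values are invented.
import math

def weight_fraction(i_ratio, mu_sample_exc, mu_sample_fluo,
                    mu_pure_exc, mu_pure_fluo,
                    psi_in_deg=45.0, psi_out_deg=45.0):
    """C_i = (I/I_pure) * A_sample / A_pure, with
    A = mu(E_exc)/sin(psi_in) + mu(E_fluo)/sin(psi_out)."""
    s_in = math.sin(math.radians(psi_in_deg))
    s_out = math.sin(math.radians(psi_out_deg))
    a_sample = mu_sample_exc / s_in + mu_sample_fluo / s_out
    a_pure = mu_pure_exc / s_in + mu_pure_fluo / s_out
    return i_ratio * a_sample / a_pure

# Invented mass absorption coefficients (cm^2/g) and measured intensity ratio.
c_fe = weight_fraction(i_ratio=0.18,
                       mu_sample_exc=95.0, mu_sample_fluo=60.0,
                       mu_pure_exc=270.0, mu_pure_fluo=70.0)
print(f"Estimated Fe weight fraction: {c_fe:.3f}")
```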

  5. Quantitative analysis with energy dispersive X-ray fluorescence analyser

    International Nuclear Information System (INIS)

    Kataria, S.K.; Kapoor, S.S.; Lal, M.; Rao, B.V.N.

    1977-01-01

    Quantitative analysis of samples using a radioisotope-excited energy dispersive X-ray fluorescence system is described. The complete set-up is built around a locally made Si(Li) detector X-ray spectrometer with an energy resolution of 220 eV at 5.94 keV. The photopeaks observed in the X-ray fluorescence spectra are fitted with a Gaussian function and the intensities of the characteristic X-ray lines are extracted, which in turn are used for calculating the elemental concentrations. The results for a few typical cases are presented. (author)
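
    To make the peak-fitting step concrete, here is a small, hedged sketch that fits a single Gaussian plus a linear background to a photopeak region and reports the net peak area used as the line intensity; the synthetic spectrum, channel range, and starting values are invented.

```python
# Hypothetical sketch: extracting a characteristic X-ray line intensity by
# fitting a Gaussian photopeak on a linear background.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(x, area, centre, sigma, bkg0, bkg1):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
    return gauss + bkg0 + bkg1 * x

# Invented spectrum channels and counts around a photopeak.
channels = np.arange(200, 260, dtype=float)
rng = np.random.default_rng(7)
true_counts = peak_model(channels, area=5000, centre=230, sigma=4.0, bkg0=40, bkg1=-0.05)
counts = rng.poisson(true_counts).astype(float)

p0 = [counts.sum(), channels[np.argmax(counts)], 3.0, counts.min(), 0.0]
popt, pcov = curve_fit(peak_model, channels, counts, p0=p0)

net_area, net_area_err = popt[0], np.sqrt(pcov[0, 0])
print(f"Net peak area: {net_area:.0f} ± {net_area_err:.0f} counts")
```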

  6. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as the theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
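
    A hedged sketch of the two numerical ingredients named above, moving-average smoothing of the spectrum followed by a least-squares fit of overlapping Lorentzian lines, is given below; the two-line synthetic spectrum and all parameter values are invented, and baseline/phase correction is reduced to a constant offset for brevity.

```python
# Hypothetical sketch: moving-average smoothing plus least-squares fitting of
# two overlapping Lorentzian lines, to quantify a trace component.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, width):
    return amp * width**2 / ((x - x0) ** 2 + width**2)

def two_lines(x, a1, x1, w1, a2, x2, w2, offset):
    return lorentzian(x, a1, x1, w1) + lorentzian(x, a2, x2, w2) + offset

ppm = np.linspace(0, 10, 2000)
rng = np.random.default_rng(3)
raw = two_lines(ppm, 1.0, 4.0, 0.05, 0.03, 4.3, 0.05, 0.01) + rng.normal(0, 0.005, ppm.size)

# Moving average over a small window to suppress random noise.
window = 5
smoothed = np.convolve(raw, np.ones(window) / window, mode="same")

p0 = [1.0, 4.0, 0.05, 0.05, 4.3, 0.05, 0.0]
popt, _ = curve_fit(two_lines, ppm, smoothed, p0=p0)

# Integrated Lorentzian area = pi * amplitude * half-width.
major_area = np.pi * popt[0] * popt[2]
trace_area = np.pi * popt[3] * popt[5]
print(f"Trace/major intensity ratio: {trace_area / major_area:.4f}")
```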

  7. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (a more systematic increase caused by overlap in the spectrum), and stutters (peaks located four basepairs before the true peak). We present filtering techniques for all three technical artifacts based on statistical analysis of data from controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.

  8. Quantitative performance allocation of multi-barrier system for high-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Ahn, Joon-Hong; Ikeda, Takao; Ohe, Toshiaki

    1995-01-01

    Performance assessment of each barrier of the geologic disposal system for high-level radioactive wastes is carried out quantitatively, and key radionuclides and parameters are pointed out. Chemical compositions and solubilities of radionuclides under repository conditions are determined by the PHREEQE code, starting from compositions of granitic groundwater observed in Japan. Glass dissolution analysis based on mass transfer theory and precipitation analysis have been done in order to determine the inner boundary condition for radionuclide diffusion through a bentonite-filled buffer region, where multi-member decay chains and isotopic sharing of solubility at the inner boundary are considered. The natural barrier is treated as homogeneous porous rock, or porous rock with infinite planar fractures. The performance of each barrier is evaluated in terms of a non-dimensionalized hazard defined as the ratio of the annual radioactivity release from each barrier to the annual limit on intake. At the outer edge of the engineered barriers, 239 Pu is the key nuclide for the performance, whereas at the exit of the natural barrier, weakly-sorbing fission product nuclides such as 135 Cs, 129 I and 99 Tc dominate the hazard. (author) 50 refs

  9. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    Science.gov (United States)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V=0.88 and RMSEV=1.47. For qualitative analysis, automatic weighted least squares baseline was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates of the 3 classes of 30% wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
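
    The following sketch shows, on synthetic data, the modelling step described above: a partial least squares regression of wet-gluten content on a subset of NIR wavelength variables (here picked at random to stand in for the GA-selected set), evaluated by validation R2 and RMSE; spectra, reference values and the number of components are all invented.

```python
# Hypothetical sketch: PLS regression of wet gluten content on a subset of
# NIR spectral variables, standing in for GA-selected wavelengths.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
n_samples, n_wavelengths = 54, 200

spectra = rng.normal(0, 1, (n_samples, n_wavelengths)).cumsum(axis=1)  # smooth fake spectra
true_coeffs = np.zeros(n_wavelengths)
true_coeffs[[20, 55, 120, 160]] = [0.4, -0.3, 0.5, 0.2]
gluten = 28 + spectra @ true_coeffs + rng.normal(0, 1.0, n_samples)    # invented reference values

selected = rng.choice(n_wavelengths, size=17, replace=False)           # stand-in for GA selection
X_train, X_val, y_train, y_val = train_test_split(
    spectra[:, selected], gluten, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5).fit(X_train, y_train)
y_pred = pls.predict(X_val).ravel()

rmse = np.sqrt(mean_squared_error(y_val, y_pred))
print(f"R2 (validation)   = {r2_score(y_val, y_pred):.2f}")
print(f"RMSE (validation) = {rmse:.2f}")
```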

  10. Rapid Quantitative Analysis of Naringenin in the Fruit Bodies of Inonotus vaninii by Two-phase Acid Hydrolysis Followed by Reversed Phase-high Performance Liquid Chromatography-ultra Violet.

    Science.gov (United States)

    Guohua, Xia; Pan, Ruirong; Bao, Rui; Ge, Yanru; Zhou, Cunshan; Shen, Yuping

    2017-01-01

    Sanghuang is one of the mystical traditional Chinese medicines, first recorded about 2000 years ago; it comprises various fungi of the Inonotus genus and is well known in modern medicine for its antitumor effect. Inonotus vaninii grows only in the natural forests of Northeastern China and is used commercially as Sanghuang, but until now it has had no quality control specification. This study aimed to establish a rapid method of two-phase acid hydrolysis followed by reversed-phase high performance liquid chromatography with ultraviolet detection (RP-HPLC-UV) to quantify naringenin in the fruit body of I. vaninii . The sample solution was prepared by pretreatment of the raw material with two-phase acid hydrolysis, and the hydrolysis conditions were optimized. After reconstitution, analysis was performed using RP-HPLC-UV. The method was validated and the naringenin contents of the sample and comparison materials were determined. The naringenin was obtained by the two-phase acid hydrolysis method: 10.0 g of raw material was hydrolyzed in 200 mL of 1% sulfuric acid aqueous solution (v/v) and 400 mL of chloroform in an oil bath at 110°C for 2 h. Good linearity (r = 0.9992) was achieved between the concentration of the analyte and the peak area. The relative standard deviation (RSD) of precision was 2.47% and the RSD of naringenin contents for repeatability was 3.13%. The accuracy was supported by recoveries of 96.37%, 97.30%, and 99.31%. The sample solution prepared using the proposed method contained a higher content of naringenin than that from the conventional method and was stable for 8 h. Owing to the high efficiency of sample preparation and the high reliability of the HPLC method, it is feasible to use this method for routine analysis of naringenin in the fungus. A convenient two-phase acid hydrolysis was employed to produce naringenin from the raw material, and an efficient and reliable reversed-phase high performance liquid chromatography-ultraviolet method was then established to monitor naringenin in the fruit bodies of Inonotus vaninii.

  11. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Passaro, Bruno Martins

    2011-01-01

    The linear accelerators represent the most important, practical and versatile source of ionizing radiation in radiotherapy. Their functional characteristics influence the geometric and dosimetric accuracy of the therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures or mechanical breakdowns, or due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on the quality control performed by the institutions to monitor deviations in the beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a radiotherapy quality control program. The average calibration factors of the Clinac 600C and Clinac 6EX accelerators over a period of approximately four years were (0.998 ± 0.012) and (0.996 ± 0.014), respectively. For the Clinac 2100CD at 6 MV and 15 MV they were (1.008 ± 0.009) and (1.006 ± 0.010), respectively, over a period of approximately four years. Statistical analysis of the three linear accelerators showed that the coefficient of variation of the calibration factors had values below 2%, which demonstrates the consistency of the data. By calculating the normal distribution of the calibration factors, we found that for the Clinac 600C and Clinac 2100CD the expected probability is that in more than 90% of cases the values are within the acceptable limits according to TG-142, while for the Clinac 6EX it is around 85%, since this accelerator had several exchanges of components. The values of TPR 20,10 of the three accelerators are practically constant and within the acceptable limits according to TG-142. It can be concluded that a detailed study of the accelerator calibration factor and TPR 20,10 data from a quantitative point of view is extremely useful in a quality assurance program. (author)

  12. Notes on human performance analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Pedersen, O.M.; Rasmussen, J.

    1981-06-01

    This paper contains a framework for the integration of observation and analysis of human performance in nuclear environments - real or simulated. It identifies four main sources of data and describes the characteristic data types and methods of analysis for each source in relation to a common conceptual background. The general conclusion is that it is highly useful to combine the knowledge and experience from different contexts into a coherent picture of how nuclear operators perform under varying circumstances. (author)

  13. Impact of quantitative feedback and benchmark selection on radiation use by cardiologists performing cardiac angiography

    International Nuclear Information System (INIS)

    Smith, I. R.; Cameron, J.; Brighouse, R. D.; Ryan, C. M.; Foster, K. A.; Rivers, J. T.

    2013-01-01

    Audit of and feedback on both group and individual data, provided immediately after the point of care and compared with realistic benchmarks of excellence, have been demonstrated to drive change. This study sought to evaluate the impact of immediate benchmarked quantitative case-based performance feedback on the clinical practice of cardiologists practising at a private hospital in Brisbane, Australia. The participating cardiologists were assigned to one of two groups: Group 1 received patient and procedural details for review and Group 2 received Group 1 data plus detailed radiation data relating to the procedures and comparative benchmarks. In Group 2, Linear-by-Linear Association analysis suggests a link between change in radiation use and initial radiation dose category (p = 0.014), with only those initially 'challenged' by the benchmarks showing improvement. Those not 'challenged' by the benchmarks deteriorated in performance, with those starting well below the benchmarks showing the greatest increase in radiation use. Conversely, those blinded to their radiation use (Group 1) showed general improvement in radiation use throughout the study, with those performing initially close to the benchmarks showing the greatest improvement. This study shows that use of non-challenging benchmarks in case-based radiation risk feedback does not promote a reduction in radiation use; indeed, it may contribute to increased doses. Paradoxically, cardiologists who are aware of performance monitoring but blinded to individual case data appear to maintain, if not reduce, their radiation use. (authors)

  14. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    Science.gov (United States)

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. The areas under the receiver operating characteristic (AUROC) curves were plotted to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis in this study achieved the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.
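
    To illustrate the kind of AUROC comparison reported above, here is a minimal sketch that computes the area under the ROC curve for a continuous quantitative parameter (an SWS-max stand-in) and for an ordinal qualitative score on the same invented labels; the data and score scales are assumptions, and formal AUC comparison tests (e.g. DeLong) are omitted.

```python
# Hypothetical sketch: comparing diagnostic performance of a quantitative
# parameter and an ordinal qualitative score using ROC AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n_benign, n_malignant = 150, 116

labels = np.concatenate([np.zeros(n_benign), np.ones(n_malignant)])

# Invented shear wave speed maxima (m/s): malignant lesions tend to be stiffer.
sws_max = np.concatenate([rng.normal(3.5, 1.0, n_benign),
                          rng.normal(5.5, 1.5, n_malignant)])

# Invented ordinal elasticity scores (1-5) loosely tracking malignancy.
score = np.concatenate([rng.integers(1, 4, n_benign),
                        rng.integers(2, 6, n_malignant)])

print(f"AUROC, SWS-max (quantitative):         {roc_auc_score(labels, sws_max):.3f}")
print(f"AUROC, elasticity score (qualitative): {roc_auc_score(labels, score):.3f}")
```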

  15. Performances of CN-columns for the analysis of γ-oryzanol and its p-coumarate and caffeate derivatives by normal phase HPLC and a validated method of quantitation.

    Science.gov (United States)

    D'Ambrosio, Michele

    2013-06-15

    γ-Oryzanol is an important phytochemical used in pharmaceutical, alimentary and cosmetic preparations. The present article, for the first time, discloses the performances of NP-HPLC in separating γ-oryzanol components and develops a validated method for its routine quantification. The analysis is performed on a cyanopropyl bonded column using the hexane/MTBE gradient elution and UV detection at 325 nm. The method allows: the separation of steryl ferulate, p-coumarate and caffeate esters, the separation of cis- from trans-ferulate isomers, the splitting of steroid moieties into saturated and unsaturated at the side chain. The optimised method provides excellent linear response (R(2)=0.99997), high precision (RSD<1.0%) and satisfactory accuracy (R(∗)=70-86%). In conclusion, the established method presents the details of the procedure and the experimental conditions in order to achieve the required precision and instrumental accuracy. The method is fast and sensitive and it could be a suitable tool for quality assurance and determination of origin. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe 2 O 3 and tapioca flour at various concentrations of Fe 2 O 3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe 2 O 3 . The kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeK α and make them easier to prepare. Pressed samples with 0.150 /cm 2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be conveniently calculated by formula. In an unknown sample, however, χ can be determined by the Emission-Transmission method. It was found that the relationship between corrected FeK α intensity and the Fe 2 O 3 content of these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.

  17. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  18. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is the most convenient to apply to high T c superconductor surfaces because it does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a modification of the relative sensitivity factor approach accounting for matrix and instrumental effects is proposed for this purpose. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi 2 Sr 2 CuO 6 , Bi 2 Sr 2 CaCu 2 O 8 , and YBa 2 Cu 3 O Y . The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. A slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi 2 Sr 2 CaCu 2 O 8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature.

  19. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that can capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  20. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu

    2001-01-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients in whom the disease had been detected in the fetal or infantile period were enrolled in this study. Standardized diuretic renograms were obtained with 99m Tc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99m Tc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. The drainage half-time clearance (T 1/2) of the activity in each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
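
    As a hedged illustration of the drainage half-time index described above, the snippet below fits a mono-exponential washout to the post-diuretic portion of a renogram region-of-interest curve and converts the rate constant into T 1/2; the time points, counts, mono-exponential assumption, and the decision threshold in the final line are all invented for the example.

```python
# Hypothetical sketch: drainage half-time (T1/2) from the post-diuretic part
# of a renogram ROI time-activity curve, assuming mono-exponential washout.
import numpy as np
from scipy.optimize import curve_fit

def washout(t, a0, k):
    return a0 * np.exp(-k * t)

# Invented post-furosemide curve: time (min) and ROI counts.
t_min = np.arange(0, 20, 1.0)
rng = np.random.default_rng(9)
counts = washout(t_min, a0=12000, k=0.09) + rng.normal(0, 150, t_min.size)

popt, _ = curve_fit(washout, t_min, counts, p0=[counts[0], 0.05])
t_half = np.log(2) / popt[1]

# A ~20 min cut-off is often quoted clinically; used here purely as an example.
print(f"Drainage T1/2 ≈ {t_half:.1f} min "
      f"({'suggests obstruction' if t_half > 20 else 'non-obstructive pattern'})")
```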

  1. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  2. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2001-04-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients in whom the disease had been detected in the fetal or infantile period were enrolled in this study. Standardized diuretic renograms were obtained with {sup 99m}Tc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or {sup 99m}Tc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. The drainage half-time clearance (T 1/2) of the activity in each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)

  3. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the 6-valent to the 4-valent state was examined. The experiments showed that the decomposition is faster in alkaline solution: constant absorption is reached after 30 minutes of boiling in alkaline solution and after 40 minutes in acid solution. A method for analyzing samples containing less than 5 ppm of tellurium was studied. The experiments revealed that samples containing a very small amount of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into portions of one gram each, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed good recovery of added tellurium and reproducibility within a relative error of 5%. Comparison of the calibration curve of the standard tellurium(IV) solution reacted with bismuthiol-2 with the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results of the bismuthiol-2 method and the atomic absorption method agreed quite well for the same samples. (Iwakiri, K.)

  4. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor-analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  5. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional area of small pulmonary vessels (CSA) less than 5 mm 2 and 5–10 mm 2 and calculated the percentages of those areas relative to the lung area (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DL CO /VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). When matched for the extent of cystic destruction, DL CO /VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  6. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  7. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  8. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  9. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
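
    One of the morphometric tools named above, the hypsometric integral, is simple enough to sketch directly; the approximation below uses the common elevation-relief ratio HI ≈ (mean - min)/(max - min) over a catchment's elevation values, with an invented elevation sample standing in for a real DEM.

```python
# Hypothetical sketch: hypsometric integral of a catchment from DEM
# elevations, using the elevation-relief ratio approximation
#   HI ≈ (mean(z) - min(z)) / (max(z) - min(z)).
# Higher HI values are usually read as more youthful, recently uplifted relief.
import numpy as np

rng = np.random.default_rng(13)

# Invented catchment elevations (m) standing in for DEM cells inside a basin.
elevations = rng.gamma(shape=2.0, scale=150.0, size=5000) + 200.0

hi = (elevations.mean() - elevations.min()) / (elevations.max() - elevations.min())
print(f"Hypsometric integral ≈ {hi:.2f}")

# A normalised hypsometric curve (relative area above each relative height)
# can be tabulated the same way.
z_rel = np.linspace(0, 1, 21)
z = elevations.min() + z_rel * (elevations.max() - elevations.min())
area_rel = [(elevations >= zi).mean() for zi in z]
for h, a in zip(z_rel[::5], area_rel[::5]):
    print(f"h/H = {h:.2f}  a/A = {a:.2f}")
```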

  10. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes that do not take into account the criticality of some failures used for availability analysis. This criticality should be evaluated based on concepts of reliability which consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan.   Considering that power plant performance can be evaluated not only on the basis of thermodynamic indexes, such as heat rate, Thermal Power Plant Performance Analysis focuses on the presentation of reliability-based tools used to define the performance of complex systems and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including: ·         selection of critical equipment and components, ·         defini...

  11. Visualization and quantitative analysis of the CSF pulsatile flow with cine MR phase imaging

    International Nuclear Information System (INIS)

    Katayama, Shinji; Itoh, Takahiko; Kinugasa, Kazushi; Asari, Shoji; Nishimoto, Akira; Tsuchida, Shohei; Ono, Atsushi; Ikezaki, Yoshikazu; Yoshitome, Eiji.

    1991-01-01

    The visualization and the quantitative analysis of the CSF pulsatile flow were performed on ten healthy volunteers with cine MR phase imaging, a combination of the phase-contrast technique and the cardiac-gating technique. The velocities appropriate for the visualization and the quantitative analysis of the CSF pulsatile flow were from 6.0 cm/sec to 15.0 cm/sec. The applicability of this method for the quantitative analysis was proven with a steady-flow phantom. Phase images clearly demonstrated a to-and-fro motion of the CSF flow in the anterior subarachnoid space and in the posterior subarachnoid space. The flow pattern of CSF in healthy volunteers depends on the cardiac cycle. In the anterior subarachnoid space, the cephalic CSF flow continued until a 70-msec delay after the R-wave of the ECG and then reversed to caudal. At 130-190 msec, the caudal CSF flow reached its maximum velocity; thereafter it reversed again to cephalic. The same to-and-fro pattern appeared again in the following phase, but with decreased amplitude. The cephalic flow peaked at 370-430 msec, while the caudal flow peaked at 490-550 msec. The flow pattern of the CSF flow in the posterior subarachnoid space was almost identical to that in the anterior subarachnoid space. Cine MR phase imaging is thus useful for the visualization and the quantitative analysis of the CSF pulsatile flow. (author)

  12. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
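
    A minimal sketch of the idea behind DE-based wavelength selection: evolve a continuous vector, round it to a binary wavelength mask, and score each candidate mask by cross-validated calibration error. SciPy's differential_evolution, the PLS calibration model, the synthetic spectra and the 0.5 threshold below are illustrative assumptions, not the authors' implementation:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for THz absorbance spectra of binary mixtures:
    # rows = samples, columns = candidate wavelengths; y = analyte fraction.
    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 40, 30
    X = rng.normal(size=(n_samples, n_wavelengths))
    y = X[:, 5] + 0.5 * X[:, 20] + 0.1 * rng.normal(size=n_samples)

    def objective(weights):
        # Round the continuous DE vector to a binary wavelength mask.
        mask = weights > 0.5
        if mask.sum() < 2:
            return 1e3  # penalise near-empty selections
        model = PLSRegression(n_components=2)
        # Negative mean cross-validated R^2 -> minimise prediction error.
        return -cross_val_score(model, X[:, mask], y, cv=5, scoring="r2").mean()

    bounds = [(0.0, 1.0)] * n_wavelengths
    # polish=False: skip gradient polishing, the objective is non-smooth.
    result = differential_evolution(objective, bounds, maxiter=15, popsize=8,
                                    seed=1, polish=False)
    selected = np.flatnonzero(result.x > 0.5)
    print(f"{selected.size} wavelengths selected, CV R^2 = {-result.fun:.3f}")
    ```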

  13. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms, and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components is due mainly to correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
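
    Haralick-style co-occurrence features like those used here can be computed with scikit-image; the function names below assume scikit-image >= 0.19 (older releases spell them greycomatrix/greycoprops), and the random patch is only a stand-in for a delineated thyroid lobe:

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    # Synthetic 8-bit "sonogram" patch standing in for a delineated lobe ROI.
    rng = np.random.default_rng(0)
    roi = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

    # Co-occurrence matrices for a few separation vectors (distance, angle).
    glcm = graycomatrix(roi, distances=[1, 2, 4], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)

    # The four texture features used in the study, one value per vector.
    for prop in ("contrast", "correlation", "energy", "homogeneity"):
        print(prop, graycoprops(glcm, prop).ravel())
    ```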

  14. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than for COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  15. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary Material is available at Bioinformatics online.
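
    The basic ASE test described above, rejecting a 1:1 allelic ratio at a heterozygous site, can be sketched with a simple binomial test; QuASAR's actual model additionally handles genotype uncertainty, base-call error and over-dispersion. The read counts are hypothetical, and scipy.stats.binomtest requires SciPy >= 1.7:

    ```python
    from scipy.stats import binomtest

    # Hypothetical read counts at one heterozygous site.
    ref_reads, alt_reads = 63, 37

    # Test the null hypothesis of a 1:1 allelic ratio (no ASE).
    result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
    print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
          f"p = {result.pvalue:.4f}")
    ```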

  16. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
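
    The recommended semi-quantitative measure reduces to a simple ratio once the regional voxel values are extracted: specific binding ratio = (striatal uptake − reference uptake) / reference uptake, with the reference summarized by the 75th percentile of whole-brain voxels. A sketch with synthetic voxel intensities (the numbers are illustrative only, not SPECT data):

    ```python
    import numpy as np

    def specific_binding_ratio(putamen_voxels, reference_voxels, percentile=75):
        """SBR relative to a whole-brain reference summarised by a percentile.

        The study recommends the 75th percentile of whole-brain voxel
        intensities as the estimate of non-displaceable uptake.
        """
        ref = np.percentile(reference_voxels, percentile)
        return (np.mean(putamen_voxels) - ref) / ref

    # Hypothetical voxel intensities (arbitrary counts)
    rng = np.random.default_rng(42)
    whole_brain = rng.normal(100, 15, size=200_000)   # reference region
    putamen = rng.normal(260, 30, size=800)           # high DAT binding
    print(f"SBR = {specific_binding_ratio(putamen, whole_brain):.2f}")
    ```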

  17. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code designated UNRAC (UNReliability Analysis Code) calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as point estimate and/or in time dependent form; (3) quantitative importance of each component involved; and, (4) error bound on the top event unavailability. UNRAC can analyze fault trees, with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code is benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall it is demonstrated that UNRAC is an efficient easy-to-use code and has the advantage of being able to do a complete fault tree analysis with this single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered
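
    The kind of quantity UNRAC reports can be illustrated on a toy fault tree: given minimal cut sets and component unavailabilities, the top-event unavailability is approximated by summing the cut-set probabilities (rare-event approximation), optionally tightened with a second-order inclusion-exclusion term. The cut sets and numbers below are hypothetical, assuming independent components:

    ```python
    from itertools import combinations

    # Hypothetical component unavailabilities for a small fault tree.
    q = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 2e-4}

    # Minimal cut sets of the top event (each set alone fails the system).
    cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"pump_A", "power"}]

    def cut_set_prob(cs):
        # Probability that every component in the cut set is unavailable.
        p = 1.0
        for comp in cs:
            p *= q[comp]
        return p

    # First-order (rare-event) approximation: sum of cut-set probabilities.
    upper_bound = sum(cut_set_prob(cs) for cs in cut_sets)

    # Second-order correction: subtract pairwise intersections (inclusion-exclusion).
    pairwise = sum(cut_set_prob(a | b) for a, b in combinations(cut_sets, 2))
    print(f"top event unavailability ~ {upper_bound:.3e} "
          f"(2nd-order ~ {upper_bound - pairwise:.3e})")
    ```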

  18. Quantitative analysis and classification of AFM images of human hair.

    Science.gov (United States)

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, consisting of hair samples from (a) untreated and bleached hair samples, and (b) the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.
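
    Partial least squares discriminant analysis of image descriptors, as used here, can be approximated with scikit-learn by regressing a 0/1 class label on the descriptor matrix and thresholding the cross-validated prediction. The descriptor values below are synthetic stand-ins for step height, tilt angle and cuticle density, not the paper's measurements:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical cuticle descriptors per image:
    # columns = step height (nm), tilt angle (deg), cuticle density (1/um).
    rng = np.random.default_rng(3)
    n_per_class = 19
    untreated = rng.normal([500, 4.0, 0.12], [60, 0.6, 0.02], size=(n_per_class, 3))
    bleached = rng.normal([420, 5.5, 0.10], [60, 0.6, 0.02], size=(n_per_class, 3))
    X = np.vstack([untreated, bleached])
    y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = untreated, 1 = bleached

    # PLS-DA approximation: regress the class label, threshold prediction at 0.5.
    pls = PLSRegression(n_components=2)
    scores = cross_val_predict(pls, X, y.astype(float), cv=5).ravel()
    accuracy = np.mean((scores > 0.5) == y)
    print(f"cross-validated classification accuracy: {accuracy:.0%}")
    ```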

  19. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem

    DEFF Research Database (Denmark)

    Huang, Honggang; Larsen, Martin Røssel; Palmisano, Giuseppe

    2014-01-01

    in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160...... phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant...... and rigor mortis development in PM muscle. BIOLOGICAL SIGNIFICANCE: The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed...

  20. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    Directory of Open Access Journals (Sweden)

    Xinsheng Peng

    2014-01-01

    A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystalline nanoparticles. The method is carried out using an isocratic system. The mobile phase was composed of methanol–PBS (pH 6.8)–triethylamine (50:50:0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector set at a detection wavelength of 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x − 2959 (R² = 1.0). The average recovery is 101.7%; RSD = 2.22% (n = 9). This method provides a simple and accurate strategy to determine matrine in liquid crystalline nanoparticles.
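
    The reported calibration line is an ordinary least-squares fit of peak area against standard concentration, followed by back-calculation of unknowns. A sketch of that workflow; the peak areas below are invented for illustration and do not reproduce the paper's regression:

    ```python
    import numpy as np

    # Hypothetical calibration data: matrine standards (ug/mL) vs. peak area.
    conc = np.array([1.6, 5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
    area = np.array([14_200, 50_600, 104_100, 264_800, 532_400, 1_067_700, 2_138_200])

    # Least-squares line: area = slope * conc + intercept
    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]
    print(f"area = {slope:.0f}*conc + {intercept:.0f},  R^2 = {r**2:.4f}")

    # Back-calculate an unknown sample from its measured peak area.
    unknown_area = 430_000
    print(f"estimated concentration: {(unknown_area - intercept) / slope:.1f} ug/mL")
    ```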

  1. TACO: fuel pin performance analysis

    International Nuclear Information System (INIS)

    Stoudt, R.H.; Buchanan, D.T.; Buescher, B.J.; Losh, L.L.; Wilson, H.W.; Henningson, P.J.

    1977-08-01

    The thermal performance of fuel in an LWR during its operational lifetime must be described for LOCA analysis as well as for other safety analyses. The determination of stored energy in the LOCA analysis, for example, requires a conservative fuel pin thermal performance model that is capable of calculating fuel and cladding behavior, including the gap conductance between the fuel and cladding, as a function of burnup. Parameters that affect fuel and cladding performance, such as fuel densification, fission gas release, cladding dimensional changes, fuel relocation, and thermal expansion, should be accounted for in the model. Babcock and Wilcox (B and W) has submitted a topical report, BAW-10087P, December 1975, which describes their thermal performance model TACO. A summary of the elements that comprise the TACO model and an evaluation are presented.

  2. A new quantitative analysis on nitriding kinetics in the oxidized Zry-4 at 900-1200 .deg. C

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sanggi [ACT Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    Two major roles of nitrogen in zirconium-based cladding degradation were identified: mechanical degradation of the cladding, and additional chemical heat release. It has long been known that accelerated oxidation can occur in air due to nitrogen. In addition, significant uptake of nitrogen can also occur. The nitriding of pre-oxidized zirconium-based alloys leads to microporous and less coherent oxide scales. This paper aims to quantitatively investigate the nitriding mechanism and kinetics by proposing a new methodology that couples mass balance analysis with optical microscope image processing analysis. The new quantitative analysis methodology is described in chapter 2 and the investigation of the nitriding kinetics is performed in chapter 3. The experimental details have been reported previously; only qualitative analysis was performed there, so the quantitative analysis is carried out in this paper. Here, the nitriding kinetics and mechanism were quantitatively analyzed by the two newly proposed analysis methods: mass balance analysis and optical microscope image processing analysis. Using these combined methods, the mass gain curves and optical micrographs are analyzed in detail, and the mechanisms of the accelerated, stabilized and saturated nitriding behaviors were clarified. This paper has two distinctive achievements: 1) the development of effective quantitative analysis methods using only the two main results of the oxidation tests, with no detailed analytical sample measurements (e.g. TEM, EPMA) required, which can reduce the cost and effort of post-test investigation; and 2) the first identification of the nitriding behaviors and their accurate quantitative analysis. Based on these quantitative results on the nitriding kinetics, the new findings will contribute significantly to the understanding of air oxidation behaviors and model

  3. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the most widely consumed beverages in the world, and the 'cafezinho' is normally prepared from a blend of the roasted powder of two species, Coffea arabica and Coffea canephora. The two species differ in taste and in chemical composition, especially in caffeine percentage. Several procedures have been proposed in the literature for caffeine determination in different samples such as soft drinks, coffee and medicines, but most of them need a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)

  4. Quantitative image analysis of WE43-T6 cracking behavior

    International Nuclear Information System (INIS)

    Ahmad, A; Yahya, Z

    2013-01-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Yt, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries, predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  5. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, and they have strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients between the geotechnical parameters are calculated, and from these coefficients the variation pattern of the geotechnical parameters is obtained.
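
    The parameter-to-parameter correlation coefficients mentioned above are straightforward to compute once the borehole measurements are tabulated. A sketch with synthetic values for three of the listed parameters; the numbers and relationships are invented for illustration, not survey data:

    ```python
    import numpy as np

    # Hypothetical borehole measurements for three geotechnical parameters
    # (water content [%], void ratio [-], compression modulus [MPa]).
    rng = np.random.default_rng(7)
    water_content = rng.normal(24, 3, size=68)
    void_ratio = 0.02 * water_content + rng.normal(0.2, 0.05, size=68)
    comp_modulus = 12 - 0.15 * water_content + rng.normal(0, 0.8, size=68)

    data = np.column_stack([water_content, void_ratio, comp_modulus])

    # Pearson correlation coefficients between parameters (rows/cols in same order).
    corr = np.corrcoef(data, rowvar=False)
    print(np.round(corr, 2))
    ```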

  6. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  7. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the cooling time and of the irradiation time. The variation of the ratio (144Ce + 144Pr activity)/(137Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141Ce, 144Ce + 144Pr, 103Ru, 106Ru + 106Rh, 137Cs, 95Zr + 95Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of least squares was used for the quantitative analysis of the aforementioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr]
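
    The least-squares analysis of a mixed spectrum can be sketched as follows: each fission product contributes a known spectral shape, and the measured counts are decomposed into non-negative activities. The response matrix, energy windows and counts below are made up for illustration, not taken from the report; non-negative least squares stands in for the plain least-squares fit:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical response matrix: each column is the normalised gamma spectrum
    # of one emitter over four energy windows (rows = windows).
    R = np.array([
        [0.70, 0.05, 0.10],   # window 1
        [0.20, 0.60, 0.15],   # window 2
        [0.08, 0.30, 0.25],   # window 3
        [0.02, 0.05, 0.50],   # window 4
    ])  # columns: 137Cs, 144Ce+144Pr, 106Ru+106Rh

    # Simulated measured counts for a mixed source.
    true_activities = np.array([120.0, 80.0, 40.0])
    measured = R @ true_activities + np.random.default_rng(0).normal(0, 2, size=4)

    # Non-negative least squares keeps the recovered activities physical (>= 0).
    activities, residual = nnls(R, measured)
    print(np.round(activities, 1), f"residual = {residual:.2f}")
    ```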

  8. Quantitative image analysis for investigating cell-matrix interactions

    Science.gov (United States)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.

  9. Quantitative Motion Analysis of Tai Chi Chuan: The Upper Extremity Movement

    Directory of Open Access Journals (Sweden)

    Tsung-Jung Ho

    2018-01-01

    A quantitative and reproducible analysis of the standard body movement in Tai Chi Chuan (TCC) was performed in this study. We aimed to provide a reference of the upper extremities for standardizing TCC practice. Microsoft Kinect was used to record the motion during the practice of TCC. The preparation form and eight essential forms of TCC performed by an instructor and 101 practitioners were analyzed. The instructor performed the entire TCC practice cycle 12 times, while each practitioner performed one entire cycle, and the recordings were used for statistical analysis. The performance of the instructor showed high similarity to the first practice cycle (Pearson correlation coefficient r = 0.71–0.84). Among the 9 forms, the lay form had the highest similarity (r_mean = 0.90) and the push form the lowest (r_mean = 0.52). For the practitioners, the ward off form (r_mean = 0.51) and the roll back form (r_mean = 0.45) had the highest similarity, with moderate correlation. In summary, we used Microsoft Kinect to record the spatial coordinates of the upper extremity joints during the practice of TCC and used the data to perform quantitative and qualitative analysis of the joint positions and the elbow joint angle.
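
    The similarity measure reported above is a Pearson correlation between corresponding joint trajectories. A minimal sketch with synthetic trajectories; the sinusoidal curves merely stand in for Kinect joint coordinates resampled to a common length:

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical wrist-joint vertical coordinates (metres), 300 samples per cycle,
    # for a reference (instructor) cycle and a practitioner's cycle.
    t = np.linspace(0, 2 * np.pi, 300)
    reference = 0.15 * np.sin(t) + 1.10
    practitioner = (0.13 * np.sin(t + 0.2) + 1.12
                    + np.random.default_rng(5).normal(0, 0.01, t.size))

    r, p = pearsonr(reference, practitioner)
    print(f"similarity r = {r:.2f} (p = {p:.1e})")
    ```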

  10. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player involved in the storage and release of energy, but also in the regulation of energy metabolism in other organs via secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  11. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting the insight into the women's perspective. In this mixed-methods study, we include qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean number of 2 children, 82% ≥ some college education, and 56% with previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  12. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
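
    For the fast-exchange case described above, the dissociation constant can be extracted by fitting observed chemical-shift changes against ligand concentration to a 1:1 binding isotherm. A sketch under illustrative assumptions: the protein concentration, titration points and shift values are invented, not from any real dataset:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def shift_change(L_total, Kd, d_max, P_total=0.10):
        """Observed shift change for 1:1 protein-ligand binding in fast exchange,
        at fixed total protein concentration P_total (mM)."""
        b = P_total + L_total + Kd
        bound = (b - np.sqrt(b**2 - 4 * P_total * L_total)) / 2.0  # [PL]
        return d_max * bound / P_total

    # Hypothetical titration: total ligand (mM) vs. observed shift change (ppm).
    L = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
    d_obs = np.array([0.0, 0.021, 0.038, 0.062, 0.089, 0.110, 0.122, 0.128])

    # Fit Kd and the maximum shift change; keep both non-negative.
    (Kd, d_max), _ = curve_fit(shift_change, L, d_obs, p0=[0.5, 0.15],
                               bounds=(0, np.inf))
    print(f"Kd ~ {Kd:.2f} mM, max shift change ~ {d_max:.3f} ppm")
    ```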

  13. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course.

    Science.gov (United States)

    Flanagan, K M; Einarson, J

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre-post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student's math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student's grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide "instructor actions" from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings.

  14. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method that is well established in science and industry. This task is solved by means of vector mass spectrometry, in which a mass spectrum is repeatedly measured with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze underdetermined mass spectra of complex hydrogen-helium isotope mixtures. Experimental investigations are presented which show that different parameters are suitable for the UMS method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3%. In practice, a low sensitivity for small helium concentrations has to be noted. To address this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1%) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10%. Finally, this work deals with the effects of measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both for general mass-spectrometric gas analysis and for the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  15. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the structural lack of a strong chromophore or readily ionized functional group. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism.

  16. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    Science.gov (United States)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the HPLC identification and quantitation of glutathione in plasma. The method, which uses SBDF as a derivatizing agent, requires only 50 microl of sample volume. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve exhibited a correlation coefficient of 0.999. Limit of detection (LOD) and limit of quantitation (LOQ) values were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility of the method. The applicability of the method for the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.

  17. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    Science.gov (United States)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and its characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method has been widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis such as drug analysis, forensic analysis, food analysis, environmental analysis, biological analysis, etc. The hyphenated HPTLC-FTIR technique will be further developed in the future with the aim of taking full advantage of this method.

  18. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with Hotelling T² statistical analysis to compare, qualify, and detect faults in the tested systems.

  19. A framework for the quantitative assessment of performance-based system resilience

    International Nuclear Information System (INIS)

    Tran, Huy T.; Balchanos, Michael; Domerçant, Jean Charles; Mavris, Dimitri N.

    2017-01-01

    Increasing system complexity and threat uncertainty require the consideration of resilience in the design and analysis of engineered systems. While the resilience engineering community has begun to converge on a definition and set of characteristics for resilience, methods for quantifying the concept are still limited in their applicability to system designers. This paper proposes a framework for assessing resilience that focuses on the ability of a system to absorb disruptions, recover from them, and adapt over time. The framework extends current approaches by explicitly considering temporal aspects of system responses to disruptions, volatility in system performance data, and the possibility of multiple disruption events. Notional system performance data is generated using the logistic function, providing an experimental platform for a parametric comparison of the proposed resilience metric with an integration-based metric. An information exchange network model is used to demonstrate the applicability of the framework towards system design tradeoff studies using stochastic simulations. The presented framework is domain-agnostic and flexible, such that it can be applied to a variety of systems and adjusted to focus on specific aspects of resilience. - Highlights: • We propose a quantitative framework and metrics for assessing system resilience. • Metrics focus on absorption, recovery, and adaptation to disruptions. • The framework accepts volatile data and is easily automated for simulation studies. • The framework is applied to a model of adaptive information exchange networks. • Results show benefits of network adaptation against random and targeted threats.
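
    Because the notional performance data are generated with a logistic function, a simple integration-based resilience metric can be sketched directly: the ratio of delivered performance to nominal performance over the analysis window. The disruption and recovery parameters below are arbitrary illustrations, not the paper's values or its proposed metric:

    ```python
    import numpy as np

    def logistic_recovery(t, t_disrupt=20.0, t_recover=60.0, drop=0.5, k=0.3):
        """Notional performance over time: a logistic-shaped drop at t_disrupt
        followed by a logistic-shaped recovery centred on t_recover."""
        loss = drop / (1.0 + np.exp(-k * (t - t_disrupt)))
        regain = drop / (1.0 + np.exp(-k * (t - t_recover)))
        return 1.0 - loss + regain

    t = np.linspace(0, 100, 1001)
    p = logistic_recovery(t)

    # Integration-based metric: delivered vs. nominal (baseline = 1) performance.
    # With uniform sampling this reduces to the mean of the performance curve.
    resilience = p.mean()
    print(f"resilience ~ {resilience:.3f}")
    ```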

  20. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    Science.gov (United States)

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  1. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Villot-Kadri, M.; Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Michel, K.; Bruyère, D.; Laperche, V. [BRGM, Service Métrologie, Monitoring et Analyse, 3 avenue Claude Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux1.fr [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France)

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited for non-linear and multi-variate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input data of the network is particularly detailed and finally, resulting errors of prediction lower than 20% for aluminum, calcium, copper and iron demonstrate the good efficiency of the artificial neural networks for on-site quantitative LIBS of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.
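
    A multivariate ANN calibration of the kind described can be sketched with scikit-learn; the emission-line intensities, the saturation/matrix-effect model and the network size below are synthetic stand-ins, not LIBS data or the authors' network:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for LIBS data: intensities of a few selected emission
    # lines (analyte line plus matrix lines) vs. analyte concentration (ppm).
    rng = np.random.default_rng(0)
    n = 200
    conc = rng.uniform(10, 500, size=n)
    matrix = rng.uniform(0.5, 1.5, size=n)            # matrix-effect factor
    lines = np.column_stack([
        conc * matrix / (1 + 0.002 * conc),           # analyte line (saturating)
        matrix * 100 + rng.normal(0, 2, n),           # matrix line 1
        matrix * 40 + rng.normal(0, 1, n),            # matrix line 2
    ])

    X_train, X_test, y_train, y_test = train_test_split(lines, conc, random_state=0)
    # Small feed-forward network; a real application would tune the architecture.
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                     random_state=0))
    net.fit(X_train, y_train)

    rel_err = np.abs(net.predict(X_test) - y_test) / y_test
    print(f"median relative error of prediction: {100 * np.median(rel_err):.1f}%")
    ```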

  2. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Graphical abstract: -- Highlights: •Quantitative image analysis shows potential to monitor activated sludge systems. •Staining techniques increase the potential for detection of operational problems. •Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  3. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Villot-Kadri, M.; Ismaël, A.; Gallou, G.; Michel, K.; Bruyère, D.; Laperche, V.; Canioni, L.; Bousquet, B.

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited for non-linear and multi-variate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input data of the network is particularly detailed and finally, resulting errors of prediction lower than 20% for aluminum, calcium, copper and iron demonstrate the good efficiency of the artificial neural networks for on-site quantitative LIBS of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced

  4. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the fuel rod behaviour of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications to the code for modelling MOX fuels and thorium based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  5. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of the educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  6. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  7. From POOSL to UPPAAL : transformation and quantitative analysis

    NARCIS (Netherlands)

    Xing, J.; Theelen, B.D.; Langerak, R.; Pol, van de J.C.; Tretmans, J.; Voeten, J.P.M.

    2010-01-01

    POOSL (Parallel Object-Oriented Specification Language) is a powerful general purpose system-level modeling language. In research on design space exploration of motion control systems, POOSL has been used to construct models for performance analysis. The considered motion control algorithms are

  8. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median Methylation Index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age alone, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and that the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
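
    A minimal sketch of the reported quantitation workflow (per-gene Methylation Index as the mean percent methylation across CpG sites, a >15% binary cut-point, and ROC evaluation), shown here on synthetic pyrosequencing values rather than the study data:

        # Hedged sketch: Methylation Index and ROC AUC on synthetic data.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        cases = rng.normal(30, 10, size=(49, 6))     # 49 cancers, 6 CpG sites (synthetic % methylation)
        controls = rng.normal(8, 4, size=(22, 6))    # 22 normal cytology specimens (synthetic)

        pct_meth = np.clip(np.vstack([cases, controls]), 0, 100)
        labels = np.array([1] * 49 + [0] * 22)

        methylation_index = pct_meth.mean(axis=1)    # mean % methylation per sample
        positive_call = methylation_index > 15.0     # binary cut-point at >15%

        print("AUC:", round(roc_auc_score(labels, methylation_index), 3))
        print("sensitivity:", round(positive_call[labels == 1].mean(), 3))
        print("specificity:", round(1 - positive_call[labels == 0].mean(), 3))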

  9. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
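
    For orientation, the ζ-factor method is usually written in terms of the measured characteristic intensities I_i, the electron dose D_e, the mass thickness ρt and the mass fractions C_i (the paper's exact notation and absorption-correction loop may differ):

        \rho t = \frac{\sum_j \zeta_j I_j}{D_e}, \qquad C_i = \frac{\zeta_i I_i}{\sum_j \zeta_j I_j}

    The absorption correction is then applied iteratively, since the correction factors themselves depend on ρt and the current composition estimate.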

  10. 3D vs 2D laparoscopic systems: Development of a performance quantitative validation model.

    Science.gov (United States)

    Ghedi, Andrea; Donarini, Erica; Lamera, Roberta; Sgroi, Giovanni; Turati, Luca; Ercole, Cesare

    2015-01-01

    The new technology ensures 3D laparoscopic vision by adding depth to the traditional two dimensions. This realistic vision gives the surgeon the feeling of operating in real space. The Hospital of Treviglio-Caravaggio is not a university or scientific institution; when a new 3D laparoscopic technology was acquired in 2014, this led to an evaluation of its appropriateness in terms of patient outcome and safety. The project aims at developing a quantitative validation model that would ensure low cost and a reliable measure of the performance of 3D technology versus 2D mode. In addition, it aims at demonstrating how new technologies, such as open source hardware and software and 3D printing, can help research with no significant cost increase. For these reasons, in order to define criteria of appropriateness in the use of 3D technologies, it was decided to perform a study to technically validate, in terms of effectiveness, efficiency and safety, the use of 3D laparoscopic vision versus the traditional 2D mode. 30 surgeons were enrolled to perform an exercise using laparoscopic forceps inside a trainer. The exercise consisted of having surgeons with different levels of seniority, grouped by type of specialization (e.g. surgery, urology, gynecology), perform videolaparoscopy with the two technologies (2D and 3D) on an anthropometric phantom. The target assigned to the surgeon was to pass "needle and thread" without touching the metal part in the shortest time possible. Each ring selected for the exercise had a coefficient of difficulty determined by its depth, diameter, angle of positioning and point of view. The analysis of the data collected from the above exercise mathematically confirmed that the 3D technique ensures a shorter learning curve in novices and greater accuracy in the performance of the task with respect to 2D.

  11. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage, or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the whole-brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild changes, separated by shorter periods of more rapid alterations. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.

  12. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute over a period of five frames of a hepatogram after injecting 99mTc-Sn-colloid (37 MBq). To analyze this information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute over a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. The effective hepatic blood flow was measured as the disappearance rate multiplied by the percentage of hepatic uptake, the latter computed as (liver counts)/(total counts of the field). Our method of analysis automatically produces graphs of the disappearance curve and the uptake curve, based on the heart and the whole liver regions, respectively; the computations are written in BASIC. This method makes it possible to obtain an image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)
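
    Written out, the quantity computed by the program reduces to a product of the blood-disappearance rate coefficient and the hepatic uptake fraction; in generic notation (symbols are illustrative, not the author's):

        \mathrm{EHBF} = k_{\mathrm{dis}} \times \frac{N_{\mathrm{liver}}}{N_{\mathrm{total\ field}}}

    where k_dis is the blood disappearance rate coefficient from the heart curve and the count ratio is the hepatic uptake fraction from the whole-liver region of interest.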

  13. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and development of optimal microstructure design and fabrication. Three dimensional microstructure characterization in particular holds great promise...... for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize...... the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment...

  14. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D. [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    This study comprises three processes. First, peak centering of the X-ray line was performed after a diffraction setting for the Xenon La1 line was established, and the Xe La1 peak was identified in a PWR spent fuel sample. Second, standard intensities for Xe were obtained by interpolation of the La1 intensities from a series of elements on either side of xenon, and Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to perform the matrix correction for the PWR spent fuel sample. Finally, a method and procedure for local quantitative analysis of Xenon were developed in this study.

  15. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    International Nuclear Information System (INIS)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D.

    2012-01-01

    This study comprises three processes. First, peak centering of the X-ray line was performed after a diffraction setting for the Xenon La1 line was established, and the Xe La1 peak was identified in a PWR spent fuel sample. Second, standard intensities for Xe were obtained by interpolation of the La1 intensities from a series of elements on either side of xenon, and Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to perform the matrix correction for the PWR spent fuel sample. Finally, a method and procedure for local quantitative analysis of Xenon were developed in this study

  16. QUANTITATIVE EEG COMPARATIVE ANALYSIS BETWEEN AUTISM SPECTRUM DISORDER (ASD) AND ATTENTION DEFICIT HYPERACTIVITY DISORDER (ADHD)

    Directory of Open Access Journals (Sweden)

    Plamen D. Dimitrov

    2017-01-01

    Full Text Available Background: Autism is a mental developmental disorder manifested in early childhood. Attention deficit hyperactivity disorder is another psychiatric condition of the neurodevelopmental type. Both disorders affect information processing in the nervous system, altering the mechanisms which control how neurons and their synapses are connected and organized. Purpose: To examine whether quantitative EEG assessment is sensitive and simple enough to differentiate autism from attention deficit hyperactivity disorder and from neurologically typical children. Material and methods: Quantitative EEG is a type of electrophysiological assessment that uses computerized mathematical analysis to convert the raw waveform data into different frequency ranges. Each frequency range is averaged across a sample of data and quantified into mean amplitude (voltage in microvolts, μV). We performed quantitative EEG analysis and compared four cohorts of children (aged from 3 to 7 years): with autism (high [n=27] and low [n=52] functioning), with attention deficit hyperactivity disorder [n=34], and with typical behavior [n=75]. Results: Our preliminary results show that there are significant qEEG differences between the groups of patients and the control cohort. The changes affect the potential levels of the delta-, theta-, alpha-, and beta-frequency spectra. Conclusion: The present study shows some significant quantitative EEG findings in autistic patients. This is a step forward in our efforts aimed at defining specific neurophysiologic changes, in order to develop and refine strategies for early diagnosis of autism spectrum disorders, differentiation from other developmental conditions in childhood, detection of specific biomarkers and early initiation of treatment.

  17. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Wu

    Full Text Available ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  18. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for determining the chemical composition of the extreme surface of solid materials and is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth preliminary work was necessary to know precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables quantitative analysis to be performed, on a routine basis, with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the sulphiding degree of molybdenum reliably and reproducibly. The use of this method is illustrated by two examples for which XPS spectroscopy provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
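
    In its simplest homogeneous-sample form, the quantitative approach referred to expresses the atomic fraction of element i through measured intensities I_i and response (sensitivity) factors S_i; the article's full expression additionally folds in the measured spectrometer transmission function:

        x_i = \frac{I_i / S_i}{\sum_j I_j / S_j}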

  19. Quantitative analysis of pulmonary perfusion using time-resolved parallel 3D MRI - initial results

    International Nuclear Information System (INIS)

    Fink, C.; Buhmann, R.; Plathow, C.; Puderbach, M.; Kauczor, H.U.; Risse, F.; Ley, S.; Meyer, F.J.

    2004-01-01

    Purpose: To assess the use of time-resolved parallel 3D MRI for quantitative analysis of pulmonary perfusion in patients with cardiopulmonary disease. Materials and methods: Eight patients with pulmonary embolism or pulmonary hypertension were examined with a time-resolved 3D gradient echo pulse sequence with parallel imaging techniques (FLASH 3D, TE/TR: 0.8/1.9 ms; flip angle: 40°; GRAPPA). A quantitative perfusion analysis based on indicator dilution theory was performed using dedicated software. Results: Patients with pulmonary embolism or chronic thromboembolic pulmonary hypertension revealed characteristic wedge-shaped perfusion defects on perfusion MRI. These were characterized by a decreased pulmonary blood flow (PBF) and pulmonary blood volume (PBV) and an increased mean transit time (MTT). Patients with primary pulmonary hypertension or Eisenmenger syndrome showed a more homogeneous perfusion pattern. The mean MTT of all patients was 3.3 - 4.7 s. The mean PBF and PBV showed a broader interindividual variation (PBF: 104-322 ml/100 ml/min; PBV: 8 - 21 ml/100 ml). Conclusion: Time-resolved parallel 3D MRI allows at least a semi-quantitative assessment of lung perfusion. Future studies will have to assess the clinical value of this quantitative information for the diagnosis and management of cardiopulmonary disease. (orig.)
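
    The indicator-dilution quantities quoted (PBF, PBV, MTT) are conventionally obtained from the tissue concentration curve C_t(t) and an arterial input function C_a(t); a standard formulation, not necessarily the exact one implemented in the cited software, is:

        C_t(t) = \mathrm{PBF}\,(C_a \otimes R)(t), \qquad \mathrm{PBV} = \frac{\int C_t(t)\,dt}{\int C_a(t)\,dt}, \qquad \mathrm{MTT} = \frac{\mathrm{PBV}}{\mathrm{PBF}}

    where R(t) is the tissue residue function recovered by deconvolution and the last relation is the central volume theorem.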

  20. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images. It is thus widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to verify before using the material in a nuclear system. In our previous study, we attempted to use the SEM for homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information in the SEM images.
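
    A toy version of a grayscale-statistics homogeneity check in the spirit of the proposal (tile the SEM image and compare local grayscale distributions); the tile size and the dispersion metric here are arbitrary illustrative choices, not the authors' method:

        # Hedged sketch: relative dispersion of local grayscale means as a homogeneity index.
        import numpy as np

        def homogeneity_index(image, tile=64):
            h, w = image.shape
            means = np.array([image[r:r + tile, c:c + tile].mean()
                              for r in range(0, h - tile + 1, tile)
                              for c in range(0, w - tile + 1, tile)])
            return means.std() / means.mean()   # lower value -> more homogeneous material

        sem = np.random.default_rng(2).integers(0, 256, size=(512, 512)).astype(float)
        print("homogeneity index:", round(homogeneity_index(sem), 4))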

  1. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images. It is thus widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to verify before using the material in a nuclear system. In our previous study, we attempted to use the SEM for homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information in the SEM images

  2. Computer-assisted sequential quantitative analysis of gallium scans in pulmonary sarcoidosis

    International Nuclear Information System (INIS)

    Rohatgi, P.K.; Bates, H.R.; Noss, R.W.

    1985-01-01

    Fifty-one sequential gallium citrate scans were performed in 22 patients with biopsy-proven sarcoidosis. A computer-assisted quantitative analysis of these scans was performed to obtain a gallium score. The changes in gallium score were correlated with changes in serum angiotensin converting enzyme (SACE) activity and objective changes in clinical status. There was a good concordance between changes in gallium score, SACE activity and clinical assessment in patients with sarcoidosis, and changes in gallium index were slightly superior to SACE index in assessing activity of sarcoidosis. (author)

  3. Using Performance Tasks to Improve Quantitative Reasoning in an Introductory Mathematics Course

    Science.gov (United States)

    Kruse, Gerald; Drews, David

    2013-01-01

    A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active learning projects that had previously been used. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant…

  4. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  5. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  6. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  7. Quantitative assessment of early diabetic retinopathy using fractal analysis.

    Science.gov (United States)

    Cheung, Ning; Donaghue, Kim C; Liew, Gerald; Rogers, Sophie L; Wang, Jie Jin; Lim, Shueh-Wen; Jenkins, Alicia J; Hsu, Wynne; Li Lee, Mong; Wong, Tien Y

    2009-01-01

    Fractal analysis can quantify the geometric complexity of the retinal vascular branching pattern and may therefore offer a new method to quantify early diabetic microvascular damage. In this study, we examined the relationship between retinal fractal dimension and retinopathy in young individuals with type 1 diabetes. We conducted a cross-sectional study of 729 patients with type 1 diabetes (aged 12-20 years) who had seven-field stereoscopic retinal photographs taken of both eyes. From these photographs, retinopathy was graded according to the modified Airlie House classification, and fractal dimension was quantified using a computer-based program following a standardized protocol. In this study, 137 patients (18.8%) had diabetic retinopathy signs; of these, 105 had mild retinopathy. Median (interquartile range) retinal fractal dimension was 1.46214 (1.45023-1.47217). After adjustment for age, sex, diabetes duration, A1C, blood pressure, and total cholesterol, increasing retinal vascular fractal dimension was significantly associated with increasing odds of retinopathy (odds ratio 3.92 [95% CI 2.02-7.61] for fourth versus first quartile of fractal dimension). In multivariate analysis, each 0.01 increase in retinal vascular fractal dimension was associated with a nearly 40% increased odds of retinopathy (1.37 [1.21-1.56]). This association remained after additional adjustment for retinal vascular caliber. Greater retinal fractal dimension, representing increased geometric complexity of the retinal vasculature, is independently associated with early diabetic retinopathy signs in type 1 diabetes. Fractal analysis of fundus photographs may allow quantitative measurement of early diabetic microvascular damage.
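
    The fractal dimension of the segmented retinal vascular tree is typically estimated by box counting, i.e. as the slope of log N(ε) against log(1/ε); the computer-based program used in the study may implement a variant of this estimator:

        D_f = \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log(1/\varepsilon)}

    where N(ε) is the number of boxes of side ε needed to cover the vessel skeleton.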

  8. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available Although the term collocation is rather common in English grammar, it is not a well-known or commonly used term in textbooks and scientific papers written in Serbian. Collocating is usually defined as the natural co-occurrence of two (or more) words, which usually appear next to one another even though they can be separated in the text, while collocations are defined as words with natural semantic and/or syntactic relations joined together in a sentence. Collocations are used naturally in all texts written in English, including scientific texts and papers. Using two textbooks for English for Specific Purposes (ESP) intermediate courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between lexical and grammatical collocations in the ESP texts and the reasons for their presence, and gives an overview of the most frequently used subtypes of lexical collocations. Furthermore, applying basic quantitative corpus analysis, the paper presents the number of open, restricted and bound collocations in the ESP texts, drawing conclusions about their frequency and hence about the modes for learning them. There is also a section on the number and usage of scientific collocations, both common scientific and narrowly professional ones. The conclusion is that the number of collocations present in the two selected textbooks calls for further analysis of these lexical connections, as well as for new modes of teaching and presenting them to students learning English.

  9. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    Science.gov (United States)

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. The analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  10. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  11. A quantitative performance evaluation of the EM algorithm applied to radiographic images

    International Nuclear Information System (INIS)

    Brailean, J.C.; Sullivan, B.J.; Giger, M.L.; Chen, C.T.

    1991-01-01

    In this paper, the authors quantitatively evaluate the performance of the Expectation Maximization (EM) algorithm as a restoration technique for radiographic images. The perceived signal-to-noise ratios (SNR) of simple radiographic patterns processed by the EM algorithm are calculated on the basis of a statistical decision theory model that includes both the observer's visual response function and a noise component internal to the eye-brain system. The relative SNR (ratio of the processed SNR to the original SNR) is calculated and used as a metric to quantitatively compare the effects of the EM algorithm to two popular image enhancement techniques: contrast enhancement (windowing) and unsharp mask filtering
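
    For a Poisson imaging model with observed image g and blur kernel h, the EM restoration iteration evaluated here takes the familiar multiplicative (Richardson-Lucy) form; this is the standard statement, and the paper's exact parametrization may differ:

        \hat{f}^{(k+1)}(x) = \hat{f}^{(k)}(x)\,\left[ h(-x) \ast \frac{g(x)}{(h \ast \hat{f}^{(k)})(x)} \right]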

  12. Quantitative analysis of complexes in electron irradiated CZ silicon

    International Nuclear Information System (INIS)

    Inoue, N.; Ohyama, H.; Goto, Y.; Sugiyama, T.

    2007-01-01

    Complexes in helium or electron irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. The carbon concentration (1×10¹⁵-1×10¹⁷ cm⁻³) and the helium dose (5×10¹²-5×10¹³ cm⁻²) or electron dose (1×10¹⁵-1×10¹⁷ cm⁻²) are varied over two orders of magnitude, in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in commercial-grade silicon with low carbon concentration and low electron dose can be detected clearly, and the concentration of these complexes is estimated. It is clarified that the complex configuration and thermal behavior in low carbon and low dose samples are simple and almost confined within the individual complex family, compared to those in high concentration and high dose samples. The well-established complex behavior in the electron-irradiated samples is compared to that in He-irradiated samples obtained by deep level transient spectroscopy (DLTS) or cathodoluminescence (CL), which has a close relation to Si power device performance

  13. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection on the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately a 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence on the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  14. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for the quantitative analysis of lime flower, in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated following the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  15. Quantitative analysis of the renal aging in rats. Stereological study

    OpenAIRE

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins Filho, Eduardo Lopes; Fraga, Rogério de

    2016-01-01

    ABSTRACT PURPOSE: To evaluate renal function and renal histological alterations through stereology and morphometrics in rats subjected to the natural process of aging. METHODS: Seventy-two Wistar rats were divided into six groups. Each group was sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glo...

  16. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
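
    One way to write the additive log-space correction described above, with P(x,y) the phantom image and P̄ a reference phantom signal (e.g. its mean); the exact normalization used in the paper may differ:

        \ln I_{\mathrm{corr}}(x,y) = \ln I_{\mathrm{mammo}}(x,y) + \big[\ln \bar{P} - \ln P(x,y)\big]

    The analytical correction for path obliquity in the breast is then applied to the corrected image.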

  17. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  18. Qualitative and quantitative analysis of plutonium in solid waste drums

    International Nuclear Information System (INIS)

    Anno, Jacques; Escarieux, Emile

    1977-01-01

    An assessment is presented of the results of a study carried out to develop the qualitative and quantitative analysis, by γ spectrometry, of plutonium in solid waste drums. After recalling the standards and their implications for the quantities of plutonium to be measured (applied here to industrial Pu containing 20% ²⁴⁰Pu), the equipment used is described. The measurement station is provided with a mechanical system consisting of a rail and a pulley block to bring in the drums, and a pit and a hydraulic jack with a rotating platform. The detection instrumentation consists of a high volume coaxial Ge(Li) detector with a γ-ray resolution of 2 keV, the associated electronics, and data processing by a 'Plurimat 20' minicomputer. The principles of the identification and the measurements are specified and supported by experimental results. They are the following: determination of the quality of the Pu by measuring the ratio between the γ-ray intensities of the 129 keV line of ²³⁹Pu and the 148 keV line of ²⁴¹Pu; measurement of the ²³⁹Pu mass by estimating the γ-ray counting rate of the 375 keV line from calibration curves given by plutonium samples ranging from 32 mg to 80 g; and correction of the results for the source position within the drum and for the plastic-material filling of the drum. The experimental results obtained on 40 solid waste drums are presented along with the error estimates

  19. FFT transformed quantitative EEG analysis of short term memory load.

    Science.gov (United States)

    Singh, Yogesh; Singh, Jayvardhan; Sharma, Ratna; Talwar, Anjana

    2015-07-01

    The EEG is considered a building block of functional signaling in the brain, and the role of EEG oscillations in human information processing has been intensively investigated. The aim was to study the quantitative EEG correlates of short term memory load as assessed through the Sternberg memory test. The study was conducted on 34 healthy male student volunteers. The intervention consisted of the Sternberg memory test, which runs as a software version of the Sternberg memory scanning paradigm on a computer. Electroencephalography (EEG) was recorded from 19 scalp locations according to the 10-20 international system of electrode placement. EEG signals were analyzed offline. To overcome the problems of a fixed-band system, an individual alpha frequency (IAF) based frequency band selection method was adopted. The outcome measures were the FFT-transformed absolute powers in the six bands at the 19 electrode positions. The Sternberg memory test served as a model of short term memory load. During the memory task, the EEG showed decreased absolute power in the Upper alpha band at nearly all electrode positions, and increased power in the Theta band over the fronto-temporal region and in the Lower 1 alpha band over the fronto-central region. Lower 2 alpha, Beta and Gamma band power remained unchanged. Short term memory load thus has distinct electroencephalographic correlates resembling a mentally stressed state. This is evident from the decreased power in the Upper alpha band (corresponding to the Alpha band of the traditional EEG system), which is the representative band of a relaxed mental state. Fronto-temporal Theta power changes may reflect the encoding and execution of the memory task.
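
    A minimal sketch of computing absolute band power per electrode from an FFT-based spectral estimate; the sampling rate, channel count and band edges (fixed here rather than anchored to the individual alpha frequency, as in the study) are placeholders:

        # Hedged sketch: absolute EEG band power from Welch periodograms (synthetic data).
        import numpy as np
        from scipy.signal import welch

        fs = 256                                                         # Hz, assumed sampling rate
        eeg = np.random.default_rng(3).standard_normal((19, 60 * fs))    # 19 channels, 60 s

        bands = {"theta": (4, 8), "lower_alpha_1": (8, 10), "upper_alpha": (10, 13)}
        freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)                    # psd shape: (19, n_freqs)

        for name, (lo, hi) in bands.items():
            mask = (freqs >= lo) & (freqs < hi)
            band_power = np.trapz(psd[:, mask], freqs[mask], axis=1)     # per-channel absolute power
            print(name, band_power.round(4))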

  20. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, P; Weeke, B; Loewenstein, H [Rigshospitalet, Copenhagen (Denmark)

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A.

  1. Quantitative produced water analysis using mobile 1H NMR

    International Nuclear Information System (INIS)

    Wagner, Lisabeth; Fridjonsson, Einar O; May, Eric F; Stanwix, Paul L; Graham, Brendan F; Carroll, Matthew R J; Johns, Michael L; Kalli, Chris

    2016-01-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile ¹H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided comparable ¹H NMR signal intensities for the oil and the solvent (chloroform) and hence an internal reference ¹H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1–30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography. (paper)
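
    With the chloroform acting as an internal reference of known concentration, the oil content follows from the ratio of integrated ¹H signals scaled by the number of contributing protons; schematically (generic symbols, not the paper's notation):

        C_{\mathrm{oil}} \propto C_{\mathrm{CHCl_3}} \cdot \frac{A_{\mathrm{oil}} / n_{\mathrm{oil}}}{A_{\mathrm{CHCl_3}} / n_{\mathrm{CHCl_3}}}

    where A denotes an integrated peak area and n the number of ¹H nuclei contributing to that peak per molecule; the proportionality absorbs the molar-mass and density factors needed to convert to a ppm oil-in-water concentration.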

  2. Photographers’ Nomenclature Units: A Structural and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Margarita A. Mihailova

    2017-11-01

    Full Text Available Addressing the needs of cross and intercultural communication as well as the methodology of contrastive research, the paper presents the results of the complex analysis conducted to describe semantic and pragmatic parameters of nomenclature units denoting photography equipment in the modern Russian informal discourse of professional photographers. The research is exemplified by 34 original nomenclature units and their 34 Russian equivalents used in 6871 comments posted at “Клуб.Foto.ru” web-site in 2015. The structural and quantitative analyses of photographers’ nomenclature demonstrate the users’ morphological and graphic preferences and indirectly reflect their social and professional values. The corpus-based approach developed by Kast-Aigner (2009: 141 was applied in the study with the aim to identify the nomenclature units denoting photography equipment, validate and elaborate the data of the existing corpus. The research also throws light on the problems of professional language development and derivational processes. The perspective of the study lies in the research of the broader context of professional nomenclature.

  3. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  4. Social media in epilepsy: A quantitative and qualitative analysis.

    Science.gov (United States)

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Quantitative aspects of the clinical performance of transverse tripolar spinal cord stimulation.

    Science.gov (United States)

    Wesselink, W A; Holsheimer, J; King, G W; Torgerson, N A; Boom, H B

    1999-01-01

    A multicenter study was initiated to evaluate the performance of the transverse tripolar system for spinal cord stimulation. Computer modeling had predicted steering of paresthesia with a dual channel stimulator to be the main benefit of the system. The quantitative analysis presented here includes the results of 484 tests in 30 patients. For each test, paresthesia coverage as a function of voltage levels was stored in a computerized database, including a body map which enabled calculation of the degree of paresthesia coverage of separate body areas, as well as the overlap with the painful areas. The results show that with the transverse tripolar system steering of the paresthesia is possible, although optimal steering requires proper placement of the electrode with respect to the spinal cord. Therefore, with this steering ability as well as a larger therapeutic stimulation window as compared to conventional systems, we expect an increase of the long-term efficacy of spinal cord stimulation. Moreover, in view of the stimulation-induced paresthesia patterns, the system allows selective stimulation of the medial dorsal columns.

  6. Performance of refractometry in quantitative estimation of isotopic concentration of heavy water in nuclear reactor

    International Nuclear Information System (INIS)

    Dhole, K.; Roy, M.; Ghosh, S.; Datta, A.; Tripathy, M.K.; Bose, H.

    2013-01-01

    Highlights: ► Rapid analysis of heavy water samples, with precise temperature control. ► Entire composition range covered. ► Both variations in mole and wt.% of D₂O in the heavy water sample studied. ► Standard error of calibration and prediction were estimated. - Abstract: The method of refractometry has been investigated for the quantitative estimation of isotopic concentration of heavy water (D₂O) in a simulated water sample. Feasibility of refractometry as an excellent analytical technique for rapid and non-invasive determination of D₂O concentration in water samples has been amply demonstrated. Temperature of the samples has been precisely controlled to eliminate the effect of temperature fluctuation on refractive index measurement. The method is found to exhibit a reasonable analytical response to its calibration performance over the purity range of 0–100% D₂O. An accuracy of below ±1% in the measurement of isotopic purity of heavy water for the entire range could be achieved
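    As a rough illustration of the calibration workflow this record describes, the sketch below fits a linear calibration line of refractive index against D₂O content and computes standard errors of calibration (SEC) and prediction (SEP). It is a minimal sketch: all refractive indices and concentrations are invented placeholders, not data from the study.

```python
import numpy as np

# Hypothetical calibration set: D2O content (wt.%) vs. measured refractive index
conc_cal = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
ri_cal   = np.array([1.3330, 1.3322, 1.3312, 1.3302, 1.3294, 1.3285])

# Least-squares calibration line: refractive index = slope * concentration + intercept
slope, intercept = np.polyfit(conc_cal, ri_cal, 1)

def predict_conc(ri):
    """Invert the calibration line to estimate D2O content (wt.%) from refractive index."""
    return (ri - intercept) / slope

# Standard error of calibration (RMS residual over the calibration set)
sec = np.sqrt(np.mean((predict_conc(ri_cal) - conc_cal) ** 2))

# Standard error of prediction on an independent validation set (also placeholders)
conc_val = np.array([10.0, 50.0, 90.0])
ri_val   = np.array([1.33252, 1.33080, 1.32890])
sep = np.sqrt(np.mean((predict_conc(ri_val) - conc_val) ** 2))

print(f"SEC = {sec:.2f} wt.%, SEP = {sep:.2f} wt.%")
```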

  7. Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.

    Science.gov (United States)

    Romeo, Elizabeth M

    2010-07-01

    The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena. Copyright 2010, SLACK Incorporated.

  8. Quantitative Determination of Compounds from Akebia quinata by High-Performance Liquid Chromatography

    International Nuclear Information System (INIS)

    Yen, Nguyen; Thu, Nguyen; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun; Lee, Jae Hyun; Kim, Jeong Ah; Son, Jong Keun; Choi, Jae Sui; Woo, Eun Rhan

    2014-01-01

    To provide the scientific corroboration of the traditional uses of Akebia quinata (Thunb.) Decne., a detailed analytical examination of A. quinata stems was carried out using a reversed-phase high performance liquid chromatography (RP-HPLC) method coupled to a photodiode array detector (PDA) for the simultaneous determination of four phenolic substances: cuneataside D, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside, 3-caffeoylquinic acid and calceolarioside B. Particular attention was focused on the main compound, 3-caffeoylquinic acid, which has a range of biological functions. In addition, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside was considered a discernible marker distinguishing A. quinata from easily confused plants. The contents of compounds 2 and 3 ranged from 0.72 to 2.68 mg/g and from 1.66 to 5.64 mg/g, respectively. The validation data indicated that this HPLC/PDA assay was used successfully to quantify the four phenolic compounds in A. quinata from different locations using relatively simple conditions and procedures. The pattern-recognition analysis data from 53 samples classified them into two groups, allowing discrimination between A. quinata and comparable herbs. The results suggest that the established HPLC/PDA method is suitable for quantitation and pattern-recognition analyses for a quality evaluation of this medicinal herb

  9. Quantitative Determination of Compounds from Akebia quinata by High-Performance Liquid Chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Yen, Nguyen; Thu, Nguyen; Zhao, Bing Tian; Woo, Mi Hee; Min, Byung Sun [Catholic Univ. of Daegu, Gyeongsan (Korea, Republic of); Lee, Jae Hyun [Dongguk Univ., Yongin (Korea, Republic of); Kim, Jeong Ah [Kyungpook National Univ., Daegu (Korea, Republic of); Son, Jong Keun [Yeungnam Univ., Gyeongsan (Korea, Republic of); Choi, Jae Sui [Pukyung National Univ., Busan (Korea, Republic of); Woo, Eun Rhan [Chosun Univ., Gwangju (Korea, Republic of)

    2014-07-15

    To provide the scientific corroboration of the traditional uses of Akebia quinata (Thunb.) Decne., a detailed analytical examination of A. quinata stems was carried out using a reversed-phase high performance liquid chromatography (RP-HPLC) method coupled to a photodiode array detector (PDA) for the simultaneous determination of four phenolic substances: cuneataside D, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside, 3-caffeoylquinic acid and calceolarioside B. Particular attention was focused on the main compound, 3-caffeoylquinic acid, which has a range of biological functions. In addition, 2-(3,4-dihydroxyphenyl)ethyl-O-β-D-glucopyranoside was considered a discernible marker distinguishing A. quinata from easily confused plants. The contents of compounds 2 and 3 ranged from 0.72 to 2.68 mg/g and from 1.66 to 5.64 mg/g, respectively. The validation data indicated that this HPLC/PDA assay was used successfully to quantify the four phenolic compounds in A. quinata from different locations using relatively simple conditions and procedures. The pattern-recognition analysis data from 53 samples classified them into two groups, allowing discrimination between A. quinata and comparable herbs. The results suggest that the established HPLC/PDA method is suitable for quantitation and pattern-recognition analyses for a quality evaluation of this medicinal herb.

  10. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem.

    Science.gov (United States)

    Huang, Honggang; Larsen, Martin R; Palmisano, Giuseppe; Dai, Jie; Lametsch, René

    2014-06-25

    Protein phosphorylation can regulate most of the important processes in muscle, such as metabolism and contraction. The postmortem (PM) metabolism and rigor mortis have essential effects on meat quality. In order to identify and characterize the protein phosphorylation events involved in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160 phosphoproteins with 784 phosphorylation sites. Among these, 184 phosphorylation sites on 93 proteins had their phosphorylation levels significantly changed. The proteins involved in glucose metabolism and muscle contraction were the two largest clusters of phosphoproteins with significantly changed phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant changes at the phosphorylation level but were relatively stable at the total protein level, suggesting that protein phosphorylation may have important roles in meat quality development through the regulation of proteins involved in glucose metabolism and muscle contraction, thereby affecting glycolysis and rigor mortis development in PM muscle. The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed the dimethyl labeling combined with the TiSH phosphopeptide enrichment and LC-MS/MS strategy. This was the first high-throughput quantitative phosphoproteomic study in PM muscle of farm animals. In the work, both the proteome

  11. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    International Nuclear Information System (INIS)

    Bai, Ling; Smuts, Jonathan; Walsh, Phillip; Qiu, Changling; McNair, Harold M.; Schug, Kevin A.

    2017-01-01

    The vacuum ultraviolet detector (VUV) is a new non-destructive mass sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal
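    The pseudo-absolute idea, converting measured absorbance directly to an amount of analyte through a pre-recorded absorption cross section with no calibration curve, can be sketched as follows. This is a minimal Beer-Lambert-style illustration only: the cross section, path length, flow rate and peak shape are invented placeholders rather than values from the article, and the real detector applies wavelength-resolved cross sections over 120-240 nm rather than a single number.

```python
import numpy as np

LN10 = np.log(10)
N_A = 6.02214076e23  # Avogadro constant, molecules/mol

def pseudo_absolute_moles(absorbance_trace, dt_s, sigma_cm2, path_cm, flow_cm3_s):
    """Estimate moles of analyte passing the flow cell from an absorbance-vs-time trace,
    a pre-recorded absorption cross section, the optical path length and the cell flow."""
    # Beer-Lambert (base 10): A = sigma * n * l / ln(10)  =>  n = A * ln(10) / (sigma * l)
    number_density = absorbance_trace * LN10 / (sigma_cm2 * path_cm)   # molecules / cm^3
    # Molecules passing the cell = time integral of (number density * volumetric flow)
    molecules = np.sum(number_density * flow_cm3_s) * dt_s
    return molecules / N_A

# Illustrative Gaussian peak; cross section, path length and flow are assumptions
t = np.arange(0.0, 10.0, 0.01)                      # s
peak = 0.5 * np.exp(-((t - 5.0) / 0.5) ** 2)        # absorbance units
amount = pseudo_absolute_moles(peak, dt_s=0.01, sigma_cm2=2e-17, path_cm=10.0, flow_cm3_s=0.2)
print(f"estimated amount through the detector ≈ {amount * 1e9:.2f} nmol")
```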

  12. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Ling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); Smuts, Jonathan; Walsh, Phillip [VUV Analytics, Inc., Cedar Park, TX (United States); Qiu, Changling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); McNair, Harold M. [Department of Chemistry, Virginia Tech, Blacksburg, VA (United States); Schug, Kevin A., E-mail: kschug@uta.edu [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States)

    2017-02-08

    The vacuum ultraviolet detector (VUV) is a new non-destructive mass sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal

  13. Quantitative analysis of elastography images in the detection of breast cancer

    International Nuclear Information System (INIS)

    Landoni, V.; Francione, V.; Marzi, S.; Pasciuti, K.; Ferrante, F.; Saracca, E.; Pedrini, M.; Strigari, L.; Crecco, M.; Di Nallo, A.

    2012-01-01

    Purpose: The aim of this study was to develop a quantitative method for breast cancer diagnosis based on elastosonography images in order to reduce unnecessary biopsies whenever possible. The proposed method was validated by correlating the results of quantitative analysis with the diagnosis assessed by histopathologic exam. Material and methods: 109 images of breast lesions (50 benign and 59 malignant) were acquired with the traditional B-mode technique and with elastographic modality. Images in Digital Imaging and Communications in Medicine (DICOM) format were exported into software, written in Visual Basic, especially developed for this study. The lesion was contoured and the mean grey value and softness inside the region of interest (ROI) were calculated. The correlations between variables were investigated and receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic accuracy of the proposed method. Pathologic results were used as the standard reference. Results: Both the mean grey value and the softness inside the ROI were statistically different on the t test for the two populations of lesions (i.e., benign versus malignant): p < 0.0001. The area under the curve (AUC) was 0.924 (0.834–0.973) and 0.917 (0.826–0.970) for the mean grey value and for the softness, respectively. Conclusions: Quantitative elastosonography is a promising ultrasound technique in the detection of breast cancer but large prospective trials are necessary to determine whether quantitative analysis of images can help to overcome some pitfalls of the method.
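    For the ROC evaluation described in this record, the sketch below shows how an AUC could be computed for a single image feature (the mean grey value inside the contoured ROI). The feature distributions are simulated placeholders and scikit-learn is assumed to be available; the resulting number is unrelated to the AUCs reported in the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Placeholder feature values: mean grey level inside the contoured ROI on the elastogram
benign    = rng.normal(150.0, 20.0, 50)   # 50 benign lesions
malignant = rng.normal(100.0, 20.0, 59)   # 59 malignant lesions (assumed darker/stiffer)

y_true  = np.r_[np.zeros(benign.size), np.ones(malignant.size)]
y_score = np.r_[benign, malignant]

# Malignancy is assumed to go with *lower* grey values, so the score sign is flipped
auc = roc_auc_score(y_true, -y_score)
print(f"AUC for the mean grey value feature ≈ {auc:.3f}")
```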

  14. B1-sensitivity analysis of quantitative magnetization transfer imaging.

    Science.gov (United States)

    Boudreau, Mathieu; Stikov, Nikola; Pike, G Bruce

    2018-01-01

    To evaluate the sensitivity of quantitative magnetization transfer (qMT) fitted parameters to B1 inaccuracies, focusing on the difference between two categories of T1 mapping techniques: B1-independent and B1-dependent. The B1-sensitivity of qMT was investigated and compared using two T1 measurement methods: inversion recovery (IR; B1-independent) and variable flip angle (VFA; B1-dependent). The study was separated into four stages: 1) numerical simulations, 2) sensitivity analysis of the Z-spectra, 3) healthy subjects at 3T, and 4) comparison using three different B1 imaging techniques. For typical B1 variations in the brain at 3T (±30%), the simulations resulted in errors of the pool-size ratio (F) ranging from -3% to 7% for VFA, and -40% to > 100% for IR, agreeing with the Z-spectra sensitivity analysis. In healthy subjects, pooled whole-brain Pearson correlation coefficients for F (comparing measured double angle and nominal flip angle B1 maps) were ρ = 0.97/0.81 for VFA/IR. This work describes the B1-sensitivity characteristics of qMT, demonstrating that they depend substantially on the B1-dependency of the T1 mapping method. In particular, the pool-size ratio is more robust against B1 inaccuracies if VFA T1 mapping is used, so much so that B1 mapping could be omitted without substantially biasing F. Magn Reson Med 79:276-285, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
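    A small numerical sketch of why VFA-based T1 (and hence qMT parameters derived from it) is sensitive to transmit-field errors: the spoiled gradient-echo signal is simulated with flip angles scaled by a B1 factor, then T1 is fitted assuming the nominal angles. TR, T1 and the flip angles below are generic assumed values, not the protocol used in the paper.

```python
import numpy as np

def spgr_signal(alpha_rad, t1, tr, m0=1.0):
    """Spoiled gradient-echo signal used for variable flip angle (VFA) T1 mapping."""
    e1 = np.exp(-tr / t1)
    return m0 * np.sin(alpha_rad) * (1 - e1) / (1 - e1 * np.cos(alpha_rad))

def vfa_t1(signals, alpha_rad, tr):
    """Estimate T1 from the linearised VFA relation S/sin(a) = E1 * S/tan(a) + const."""
    y = signals / np.sin(alpha_rad)
    x = signals / np.tan(alpha_rad)
    e1 = np.polyfit(x, y, 1)[0]
    return -tr / np.log(e1)

TR = 0.015                         # s (assumed)
T1_TRUE = 0.9                      # s (assumed white-matter-like value)
nominal = np.deg2rad([3.0, 20.0])  # nominal flip angles (assumed)

for b1 in (0.7, 1.0, 1.3):         # +/-30% transmit-field error, as considered in the abstract
    s = spgr_signal(b1 * nominal, T1_TRUE, TR)   # actual flip angles are scaled by B1
    t1_apparent = vfa_t1(s, nominal, TR)         # the fit assumes the nominal angles
    print(f"B1 scale {b1:.1f}: apparent T1 = {t1_apparent * 1000:.0f} ms")
```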

  15. Quantitative Gait Analysis in Patients with Huntington’s Disease

    Directory of Open Access Journals (Sweden)

    Seon Jong Pyo

    2017-09-01

    Full Text Available Objective Gait disturbance is the main factor contributing to a negative impact on quality of life in patients with Huntington’s disease (HD). Understanding gait features in patients with HD is essential for planning a successful gait strategy. The aim of this study was to investigate temporospatial gait parameters in patients with HD compared with healthy controls. Methods We investigated 7 patients with HD. Diagnosis was confirmed by genetic analysis, and patients were evaluated with the Unified Huntington’s Disease Rating Scale (UHDRS). Gait features were assessed with a gait analyzer. We compared the results of patients with HD to those of 7 age- and sex-matched normal controls. Results Step length and stride length were decreased and base of support was increased in the HD group compared to the control group. In addition, coefficients of variability for step and stride length were increased in the HD group. The HD group showed slower walking velocity, an increased stance/swing phase in the gait cycle and a decreased proportion of single support time compared to the control group. Cadence did not differ significantly between groups. Among the UHDRS subscores, total motor score and total behavior score were positively correlated with step length, and total behavior score was positively correlated with walking velocity in patients with HD. Conclusion Increased variability in step and stride length, slower walking velocity, increased stance phase, and decreased swing phase and single support time with preserved cadence suggest that HD gait patterns are slow, ataxic and ineffective. This study suggests that quantitative gait analysis is needed to assess gait problems in HD.
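    The temporospatial quantities discussed in this record (step length, its coefficient of variability, cadence, walking velocity) reduce to simple descriptive statistics over detected steps. A minimal sketch with made-up per-step data follows; the numbers are illustrative assumptions, not measurements from the study.

```python
import numpy as np

def gait_summary(step_lengths_m, step_times_s):
    """Temporospatial gait summary: mean step length, its coefficient of variability (CV, %),
    cadence and walking velocity, computed from per-step measurements."""
    step = np.asarray(step_lengths_m, dtype=float)
    times = np.asarray(step_times_s, dtype=float)
    cv = 100.0 * np.std(step, ddof=1) / np.mean(step)
    return {
        "step_length_mean_m": float(np.mean(step)),
        "step_length_cv_pct": float(cv),
        "cadence_steps_per_min": 60.0 * step.size / float(np.sum(times)),
        "velocity_m_s": float(np.sum(step)) / float(np.sum(times)),
    }

# Illustrative per-step data only
print(gait_summary(step_lengths_m=[0.48, 0.41, 0.52, 0.39, 0.45],
                   step_times_s=[0.62, 0.70, 0.58, 0.73, 0.66]))
```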

  16. High-performance liquid chromatographic quantitation of desmosine plus isodesmosine in elastin and whole tissue hydrolysates

    International Nuclear Information System (INIS)

    Soskel, N.T.

    1987-01-01

    Quantitation of desmosine and isodesmosine, the major crosslinks in elastin, has been of interest because of their uniqueness and use as markers of that protein. Accurate measurement of these crosslinks may allow determination of elastin degradation in vivo and elastin content in tissues, obviating lengthy extraction procedures. We have developed a method of quantitating desmosine plus isodesmosine in hydrolysates of tissue and insoluble elastin using high-performance liquid chromatographic separation and absorbance detection that is rapid (21-35 min) and sensitive (accurate linearity from 100 pmol to 5 nmol). This method has been used to quantitate desmosines in elastin from bovine nuchal ligament and lung and in whole aorta from hamsters. The ability to completely separate [³H]lysine from desmosine plus isodesmosine allows the method to be used to study incorporation of lysine into crosslinks in elastin

  17. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    Science.gov (United States)

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood activating biopotency of different batches was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood activating biopotencies. The results of the multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides. Combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  18. Performance measurement in transport sector analysis

    Directory of Open Access Journals (Sweden)

    M. Išoraitė

    2004-06-01

    Full Text Available The article analyses the following issues: 1. Performance measurement in literature. The performance measurement has an important role to play in the efficient and effective management of organizations. Kaplan and Johnson highlighted the failure of the financial measures to reflect changes in the competitive circumstances and strategies of modern organizations. Many authors have focused attention on how organizations can design more appropriate measurement systems. Based on literature, consultancy experience and action research, numerous processes have been developed that organizations can follow in order to design and implement systems. Many frameworks have been proposed that support these processes. The objective of such frameworks is to help organizations define a set of measures that reflect their objectives and assess their performance appropriately. 2. Transport sector performance and its impacts measuring. The purpose of transport measurement is to identify opportunities enhancing transport performance. Successful transport sector management requires a system to analyze its efficiency and effectiveness as well as plan interventions if transport sector performance needs improvement. Transport impacts must be measurable and monitorable so that the person responsible for the project intervention can decide when and how to influence them. Performance indicators provide a means to measure and monitor impacts. These indicators essentially reflect quantitative and qualitative aspects of impacts at given time and places. 3. Transport sector output and input. Transport sector inputs are the resources required to deliver transport sector outputs. Transport sector inputs are typically: human resources, particularly skilled resources (including specialists consulting inputs; technology processes such as equipment and work; and finance, both public and private. 4. Transport sector policy and institutional framework; 5. Cause – effect linkages; 6

  19. Network analysis of quantitative proteomics on asthmatic bronchi: effects of inhaled glucocorticoid treatment

    Directory of Open Access Journals (Sweden)

    Sihlbom Carina

    2011-09-01

    Full Text Available Abstract Background Proteomic studies of respiratory disorders have the potential to identify protein biomarkers for diagnosis and disease monitoring. Utilisation of sensitive quantitative proteomic methods creates opportunities to determine individual patient proteomes. The aim of the current study was to determine if quantitative proteomics of bronchial biopsies from asthmatics can distinguish relevant biological functions and whether inhaled glucocorticoid treatment affects these functions. Methods Endobronchial biopsies were taken from untreated asthmatic patients (n = 12) and healthy controls (n = 3). Asthmatic patients were randomised to double blind treatment with either placebo or budesonide (800 μg daily) for 3 months and new biopsies were obtained. Proteins extracted from the biopsies were digested and analysed using isobaric tags for relative and absolute quantitation combined with a nanoLC-LTQ Orbitrap mass spectrometer. Spectra obtained were used to identify and quantify proteins. Pathways analysis was performed using Ingenuity Pathway Analysis to identify significant biological pathways in asthma and determine how the expression of these pathways was changed by treatment. Results More than 1800 proteins were identified and quantified in the bronchial biopsies of subjects. The pathway analysis revealed acute phase response signalling, cell-to-cell signalling and tissue development associations with proteins expressed in asthmatics compared to controls. The functions and pathways associated with placebo and budesonide treatment showed distinct differences, including the decreased association with acute phase proteins as a result of budesonide treatment compared to placebo. Conclusions Proteomic analysis of bronchial biopsy material can be used to identify and quantify proteins using highly sensitive technologies, without the need for pooling of samples from several patients. Distinct pathophysiological features of asthma can be

  20. Quantitative analysis of exercise 201Tl myocardial emission CT in patients with coronary artery disease

    International Nuclear Information System (INIS)

    Okada, Mitsuhiro; Kawai, Naoki; Yamamoto, Shuhei

    1984-01-01

    The clinical usefulness of quantitative analysis of exercise thallium-201 myocardial emission computed tomography (ECT) was evaluated in coronary artery disease (CAD). The subjects consisted of 20 CAD patients and five normal controls. All CAD patients underwent coronary angiography. Tomographic thallium-201 myocardial imaging was performed with a rotating gamma camera, and long-axial and short-axial myocardial images of the left ventricle were reconstructed. The tomographic images were interpreted quantitatively using circumferential profile analysis. Based on features of regional myocardial thallium-201 kinetics, two types of abnormalities were studied: (1) diminished initial distribution (stress defect) and (2) slow washout of thallium-201, as evidenced by patients' initial thallium-201 uptake and 3-hour washout rate profiles which fell below the normal limits, respectively. Two diagnostic criteria including the stress defect and a combination of the stress defect and slow washout were used to detect coronary artery lesions of significance (>=75 % luminal narrowing). The ischemic volumes were also evaluated by quantitative analysis using thallium-201 ECT. The diagnostic accuracy of the stress defect criterion was 95 % for left anterior descending, 90 % for right, and 70 % for left circumflex coronary artery lesions. The combined criteria of the stress defect and slow washout increased detection sensitivity with a moderate loss of specificity for identifying individual coronary artery lesion. A relatively high diagnostic accuracy was obtained using the stress defect criterion for multiple vessel disease (75 %). Ischemic myocardial volume was significantly larger in triple vessel than in single vessel disease (p < 0.05) using the combined criteria. It was concluded that quantitative analysis of exercise thallium-201 myocardial ECT images proves useful for evaluating coronary artery lesions. (author)
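    A schematic version of the circumferential profile analysis described above: each angular sector of a short-axis slice is reduced to its maximum count, the profile is normalized, and sectors falling below assumed normal limits are flagged as a stress defect or slow washout. The sector count, normal limits and the synthetic image below are illustrative assumptions, not the study's normal database.

```python
import numpy as np

def circumferential_profile(image, center, n_sectors=60):
    """Maximum-count circumferential profile of a short-axis slice: each angular
    sector is reduced to its peak count, then the profile is normalised."""
    ys, xs = np.indices(image.shape)
    ang = (np.degrees(np.arctan2(ys - center[0], xs - center[1])) + 360.0) % 360.0
    width = 360.0 / n_sectors
    prof = np.array([image[(ang >= a) & (ang < a + width)].max()
                     for a in np.arange(0.0, 360.0, width)])
    return prof / prof.max()

def classify(stress_prof, delay_prof, normal_uptake_lo, normal_washout_lo):
    """Flag sectors with a stress defect (low initial uptake) and/or slow washout."""
    washout = (stress_prof - delay_prof) / stress_prof
    return stress_prof < normal_uptake_lo, washout < normal_washout_lo

# Tiny illustration: a uniform ring with one low-count quadrant standing in for a defect
yy, xx = np.mgrid[0:64, 0:64]
ring = ((np.hypot(yy - 32, xx - 32) > 12) & (np.hypot(yy - 32, xx - 32) < 20)).astype(float)
ring[32:, 32:] *= 0.5
prof = circumferential_profile(ring, center=(32, 32))
defect, slow = classify(prof, 0.8 * prof, normal_uptake_lo=0.75, normal_washout_lo=0.3)
print(f"{int(defect.sum())} of {prof.size} sectors flagged as a stress defect")
```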

  1. Quantitative analysis of eyes and other optical systems in linear optics.

    Science.gov (United States)

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogenous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  2. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    Science.gov (United States)

    Boucheron, Laura E

    2013-07-16

    Quantitative object and spatial arrangement-level analysis of tissue are detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprises nuclear material, cytoplasm material, and stromal material. The method further allows a user to markup the image subsequent to the classification to re-classify said materials. The markup is performed via a graphic user interface to edit designated regions in the image.

  3. Effects of SKF-83566 and haloperidol on performance on progressive ratio schedules maintained by sucrose and corn oil reinforcement: quantitative analysis using a new model derived from the Mathematical Principles of Reinforcement (MPR).

    Science.gov (United States)

    Olarte-Sánchez, C M; Valencia-Torres, L; Cassaday, H J; Bradshaw, C M; Szabadi, E

    2013-12-01

    Mathematical models can assist the interpretation of the effects of interventions on schedule-controlled behaviour and help to differentiate between processes that may be confounded in traditional performance measures such as response rate and the breakpoint in progressive ratio (PR) schedules. The effects of a D1-like dopamine receptor antagonist, 8-bromo-2,3,4,5-tetrahydro-3-methyl-5-phenyl-1H-3-benzazepin-7-ol hydrobromide (SKF-83566), and a D2-like receptor antagonist, haloperidol, on rats' performance on PR schedules maintained by sucrose and corn oil reinforcers were assessed using a new model derived from Killeen's (Behav Brain Sci 17:105-172, 1994) Mathematical Principles of Reinforcement. Separate groups of rats were trained under a PR schedule using sucrose or corn oil reinforcers. SKF-83566 (0.015 and 0.03 mg kg(-1)) and haloperidol (0.05 and 0.1 mg kg(-1)) were administered intraperitoneally (five administrations of each treatment). Running and overall response rates in successive ratios were analysed using the new model, and estimates of the model's parameters were compared between treatments. Haloperidol reduced a (the parameter expressing incentive value) in the case of both reinforcers, but did not affect the parameters related to response time and post-reinforcement pausing. SKF-83566 reduced a and k (the parameter expressing sensitivity of post-reinforcement pausing to the prior inter-reinforcement interval) in the case of sucrose, but did not affect any of the parameters in the case of corn oil. The results are consistent with the hypothesis that blockade of both D1-like and D2-like receptors reduces the incentive value of sucrose, whereas the incentive value of corn oil is more sensitive to blockade of D2-like than D1-like receptors.

  4. Quantitative analysis of phases by x-ray diffraction and thermogravimetry in Cuban phosphorite ores

    International Nuclear Information System (INIS)

    Casanova Gomez, Abdel; Martinez Montalvo, Asor; Cilano Campos, Guillermo; Arostegui Aguirre, Miladys; Ferreiro Fernandez, Adalyz; Alonso Perez, Jose A.

    2016-01-01

    Phase analysis is performed by the instrumental techniques of X-ray diffraction and thermal analysis in two groups of samples of Cuban phosphorus-bearing minerals, candidates for reference materials. To this end, structural refinement of the diffraction pattern by profile fitting is applied, using the FullProf program of Juan Rodriguez-Carvajal. This analysis is the first step toward developing the standard specification of these resources and classifying them as phosphate rock and/or phosphorite on the basis of their mass content. The statistical evaluation of the uncertainty of the quantitative analysis (standard deviation) was carried out on ten replicate samples of phosphate rock and eight of phosphate from the Trinidad de Guedes field. The qualitative phase analysis reflected the following phase composition: carbonate fluoroapatite (CFA), calcite, quartz and halloysite (the latter present only in the clayey granular phosphorite ore, FGA). By profile fitting of the powder diffraction pattern, the quantitative phase composition of the FGA sample is reported as: 87(2)% CFA, 4(1)% calcite, 1% quartz, and 8(3)% halloysite. For the granular limestone ore (FGC), the following contents were obtained: 87(3)% calcite, 8(3)% CFA and 5(1)% quartz. The obtained values are corroborated by thermogravimetric analysis (TG) through calculation of the mass content of the thermally active phases (calcite and CFA) in the range 27-1000 °C, confirming the validity of the XRD results. (Author)

  5. Quantitation of bone mineral by dual photon absorptiometry (DPA): Evaluation of instrument performance

    International Nuclear Information System (INIS)

    Dunn, W.L.; O'Duffy, A.; Wahner, H.W.

    1984-01-01

    Quantitation of bone mineral is used with increasing frequency for clinical studies. This paper details the principle of DPA and presents an evaluation of the technique. DPA measurements were performed with a scanning dual photon system constructed at this institution and modeled after the device developed at the University of Wisconsin. The components are a rectilinear scanner frame, 1.5 Ci Gd-153 source, NaI(Tl) detector and a PDP 11/03 computer. Dual discriminator windows are set on the 44 and 100 keV photon energies of Gd-153. Instrument linearity, accuracy and reproducibility were evaluated with ashed bone standards and simulated tissue covering. In these experiments computed and actual bone mineral have a correlation coefficient of 1.0 and a SEE of approximately 1.0% (linear regression analysis). Precision and accuracy of a standard were studied over a period of two years. Mean error between actual and measured bone mineral was 0.28%. In vivo precision in six subjects averaged 2.3% (CV) for lumbar spine measurements. The effect of soft tissue compositional change was studied with ashed bone standards and human cadaver spine specimens. Intraosseous fat changes of 50% produced an average bone mineral measurement error of 1.4%. A 20% change in fat thickness produced a 2.5% error. In situ and in vitro scans of 9 cadaver spines were performed to study the effect of extraosseous fat. The mean percent difference between the two measurements was 0.7% (SEE=3.2%)

  6. Scalable Performance Measurement and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, Todd [Univ. of North Carolina, Chapel Hill, NC (United States)

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
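    A toy version of the wavelet-compression idea described in this dissertation abstract, keeping only the largest 2-D wavelet coefficients of a tasks-by-timesteps load matrix, is sketched below with PyWavelets. The wavelet choice, retention fraction and synthetic load data are assumptions for illustration, not details of the Libra implementation.

```python
import numpy as np
import pywt

def compress_load(load, wavelet="db4", keep=0.05):
    """Keep only the largest `keep` fraction of 2-D wavelet coefficients of a
    tasks-by-timesteps load matrix; return thresholded coefficients and the ratio."""
    coeffs = pywt.wavedec2(load, wavelet)
    arr, slices = pywt.coeffs_to_array(coeffs)
    threshold = np.quantile(np.abs(arr), 1.0 - keep)
    arr_small = np.where(np.abs(arr) >= threshold, arr, 0.0)
    ratio = arr.size / max(np.count_nonzero(arr_small), 1)
    return pywt.array_to_coeffs(arr_small, slices, output_format="wavedec2"), ratio

rng = np.random.default_rng(1)
# Synthetic load-balance data: a smooth per-task trend plus measurement noise
load = np.outer(np.linspace(1.0, 2.0, 128), np.ones(256)) + 0.05 * rng.standard_normal((128, 256))
coeffs, ratio = compress_load(load)
recon = pywt.waverec2(coeffs, "db4")
print(f"kept coefficients give ~{ratio:.0f}x reduction, "
      f"max reconstruction error = {np.abs(recon - load).max():.3f}")
```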

  7. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  8. Quantitative Analysis of Radionuclide for the Used Resin of the Primary Purification System in HANARO

    International Nuclear Information System (INIS)

    Lee, Mun; Kim, Myong Seop; Park, Se Il; Kim, Tae Whan; Kim, Dong Hun; Kim, Young Chil

    2005-01-01

    In HANARO, a 30 MW research reactor, ion exchange resin has been used for the purification of the primary coolant system. The resin in the primary coolant purification system is replaced with a new batch once every 3 months during 30 MW reactor operation. The resin extracted from the primary coolant purification system is temporarily stored in a shielded area of the reactor hall for radiation cooling. After the radiation level of the resin decreases enough for it to be handled for waste disposal, it is put into a waste drum and delivered to the waste facility at KAERI. Recently, quantitative analysis of the radionuclides contained in the resin has been required in this procedure to provide more quantitative data for disposal. Therefore, in this work, a preliminary study was performed to find a sampling method that represents the radionuclide characteristics of the spent resin.

  9. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  10. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative psychological research on personality carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of psychological personality assessment of Chinese nuclear power plant (NPP) operators, based on the MMPI survey, and outlines the main contents of quantitative personality research for NPPs in China. The need to carry out psychological selection and training in the nuclear industry is emphasized.

  11. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  12. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of vegetable sale prices that are constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (low to high daily consumption); (2) vegetable age (how long it lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, in which the leaves are eaten as a vegetable, with ageing (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group, with ageing (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the vegetables last only one day after they are ordered and must then be discarded, whereas a+1 means they last longer than a day, such as beet, white radish, and string beans. The shelf life refers to how long the vegetable stays on a supermarket shelf, in line with the vegetable age. According to the cost-plus pricing method using the full costing approach, production costs, non-production costs, and markup are adjusted differently for each category; a holding cost is added to the sale price of the non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is correlated with the vegetable characteristics.
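    The cost-plus (full-costing) price described above can be written as a one-line formula in which a holding cost accrues only when the shelf life is non-zero. The sketch below uses invented cost figures purely to show the structure of the calculation.

```python
def cost_plus_price(production_cost, non_production_cost, markup_pct,
                    daily_holding_cost=0.0, shelf_life_days=0):
    """Sale price = (full cost + holding cost over the shelf life) * (1 + markup)."""
    full_cost = production_cost + non_production_cost
    holding = daily_holding_cost * shelf_life_days   # zero for the leafy group (t = 0)
    return (full_cost + holding) * (1.0 + markup_pct / 100.0)

# Illustrative cost figures only
leafy_price     = cost_plus_price(2000, 500, markup_pct=20)                      # e.g. spinach
non_leafy_price = cost_plus_price(2500, 500, markup_pct=20,
                                  daily_holding_cost=50, shelf_life_days=3)      # e.g. beet
print(leafy_price, non_leafy_price)
```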

  13. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    Science.gov (United States)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images. This was done using the tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts to find the visibility of latent symbols in color deficiency plates using a cross-correlation technique. In this way a quantitative measure is obtained for each diagnostic plate for three color deficiency carrier types: protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows determination of the CIE xyz color coordinates of pseudoisochromatic plate design elements and statistical analysis of these data to compare the color quality of available color deficiency test books.
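    A minimal sketch of the cross-correlation step: the visibility of a latent symbol is scored as the peak normalized cross-correlation between a simulated retinal activity chart and a template of the hidden figure. The synthetic chart, template and embedding strength are placeholders; the actual study derives the activity charts from measured multispectral data and cone sensitivity functions.

```python
import numpy as np
from scipy.signal import correlate2d

def symbol_visibility(activity_chart, template):
    """Peak of the zero-mean, norm-normalised cross-correlation between a simulated
    retinal activity chart and a template of the latent symbol (near 0 means invisible)."""
    a = activity_chart - activity_chart.mean()
    t = template - template.mean()
    xcorr = correlate2d(a, t, mode="same")
    return xcorr.max() / (np.linalg.norm(a) * np.linalg.norm(t))

rng = np.random.default_rng(0)
chart = rng.normal(size=(64, 64))          # stand-in for a cone-weighted activity chart
digit = np.zeros((16, 16))
digit[4:12, 6:10] = 1.0                    # crude stand-in for a hidden figure
chart[24:40, 24:40] += 0.8 * digit         # embed the figure weakly
print(f"visibility score ≈ {symbol_visibility(chart, digit):.2f}")
```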

  14. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
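    As a small illustration of the standard non-parametric comparisons mentioned in this abstract (Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney), the sketch below tests one protein's log-intensities between two groups after dropping censored values. The intensity values are invented, and this does not reproduce the AFT survival models, for which the authors provide R code.

```python
import numpy as np
from scipy import stats

def compare_protein(group_a, group_b):
    """KS and Mann-Whitney p-values for one protein's observed log2 peak intensities,
    with censored/missing values simply dropped (the standard, non-survival approach)."""
    a = np.log2(np.asarray(group_a, dtype=float))
    b = np.log2(np.asarray(group_b, dtype=float))
    a, b = a[np.isfinite(a)], b[np.isfinite(b)]
    ks_p = stats.ks_2samp(a, b).pvalue
    mw_p = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
    return ks_p, mw_p

# Illustrative intensities; np.nan marks a censored low-abundance feature
print(compare_protein([1.2e6, 9.8e5, np.nan, 1.5e6],
                      [4.0e5, 3.1e5, 5.5e5, np.nan]))
```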

  15. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.

  16. [Evaluation of dental plaque by quantitative digital image analysis system].

    Science.gov (United States)

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images using image analysis software, to verify the maneuverability, practicability and repeatability of this technique, and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. Digital images of the anterior teeth were acquired after plaque staining, according to a standardized photographic protocol. The image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated, with a Spearman correlation coefficient of 0.776; the consistency chart showed only a few spots outside the 95% consistency boundaries. Image analysis with the different plaque stains showed that the difference in the tooth area measurements was not significant, while the difference in the plaque area measurements was significant (P<0.01). This method is easy to operate and control, is highly related to the calculated plaque area percentage and the traditional plaque index, and has good reproducibility. The different plaque staining methods have little effect on the image segmentation results. A sensitive plaque stain is suggested for image analysis.
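    The two quantities compared in this study, the image-derived plaque area percentage and the clinical plaque index, can be related with a Spearman correlation as sketched below. The masks and paired values are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

def plaque_area_percent(tooth_mask, plaque_mask):
    """Stained plaque pixels as a percentage of tooth pixels (both boolean masks)."""
    return 100.0 * np.count_nonzero(plaque_mask & tooth_mask) / np.count_nonzero(tooth_mask)

# Illustrative paired data: image-derived area percentages vs. clinician index scores
area_pct = [5.2, 12.8, 30.1, 8.4, 22.7, 41.5]
qh_index = [1, 2, 4, 1, 3, 5]
rho, p = spearmanr(area_pct, qh_index)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```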

  17. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  18. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Science.gov (United States)

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and
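    A schematic version of the adjusted logistic regression used in this study (odds ratio for moderate-to-severe periodontitis given a high salivary bacterial burden, adjusted for age, sex, diabetes and tooth count) is sketched with statsmodels on synthetic data. The variable names and the simulated effect size are assumptions, so the resulting odds ratio is not comparable to the reported values.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 462
df = pd.DataFrame({
    "high_burden": rng.integers(0, 2, n),      # 1 = combined bacterial level above median
    "age":         rng.normal(63, 9, n),
    "male":        rng.integers(0, 2, n),
    "diabetes":    rng.integers(0, 2, n),
    "teeth":       rng.integers(5, 32, n),
})
# Synthetic outcome with a built-in positive association (illustration only)
logit_p = -1.0 + 0.9 * df["high_burden"] + 0.02 * (df["age"] - 63)
df["perio"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

X = sm.add_constant(df[["high_burden", "age", "male", "diabetes", "teeth"]])
fit = sm.Logit(df["perio"].astype(int), X).fit(disp=0)
or_ci = np.exp(fit.conf_int().loc["high_burden"])
print(f"adjusted OR = {np.exp(fit.params['high_burden']):.2f} "
      f"(95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```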

  19. A QUANTITATIVE STUDY OF MARKET ORIENTATION AND ORGANIZATIONAL PERFORMANCE OF LISTED COMPANIES: EVIDENCE FROM GHANA

    OpenAIRE

    Solomon A. Keelson

    2012-01-01

    The study is part of a larger research project on market orientation, conducted to build on previous research; it particularly examined the association between market orientation and business performance in a larger market context, using a synthesis model approach. Using the survey approach, 24 of 37 listed companies participated in the quantitative study, in which 72 senior officials were surveyed from August 2011 to September 2011 through five-point Likert scale questions. In this...

  20. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    International Nuclear Information System (INIS)

    Tappin, Marcelo R.R.; Pereira, Jislaine F.G.; Lima, Lucilene A.; Siani, Antonio C.; Mazzei, Jose L.; Ramos, Monica F.S.

    2004-01-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar in the case of methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method proved suitable for the classification and quality control of commercial samples of the oils. (author)
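    External-standard quantification of the kind described here reduces to fitting a linear calibration curve from standard injections and back-calculating the concentration of the unknown from its peak area. A minimal sketch, with purely illustrative concentrations and peak areas (not the published data), might look like:

```python
# External-standard calibration sketch; all numbers are placeholders.
import numpy as np

# hypothetical calibration standards: concentration (mg/mL) vs. peak area (arbitrary units)
conc_std = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
area_std = np.array([1.1e4, 2.2e4, 4.3e4, 8.5e4, 17.1e4])

slope, intercept = np.polyfit(conc_std, area_std, 1)   # least-squares calibration line
r = np.corrcoef(conc_std, area_std)[0, 1]               # correlation coefficient
print(f"calibration: area = {slope:.3g} * conc + {intercept:.3g}, r = {r:.4f}")

area_unknown = 6.0e4                                     # peak area measured for the sample
conc_unknown = (area_unknown - intercept) / slope        # back-calculated concentration
print(f"estimated concentration: {conc_unknown:.3f} mg/mL")
```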

  1. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues ... that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy ... consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker. ...

  2. Quantitative and qualitative analysis of semantic verbal fluency in patients with temporal lobe epilepsy.

    Science.gov (United States)

    Jaimes-Bautista, A G; Rodríguez-Camacho, M; Martínez-Juárez, I E; Rodríguez-Agudelo, Y

    2017-08-29

    Patients with temporal lobe epilepsy (TLE) perform poorly on semantic verbal fluency (SVF) tasks. Completing these tasks successfully involves multiple cognitive processes simultaneously. Therefore, quantitative analysis of SVF (number of correct words in one minute), conducted in most studies, has been found to be insufficient to identify the cognitive dysfunction underlying SVF difficulties in TLE. To determine whether a sample of patients with TLE had SVF difficulties compared with a control group (CG), and to identify the cognitive components associated with SVF difficulties using quantitative and qualitative analysis. SVF was evaluated in 25 patients with TLE and 24 healthy controls; the semantic verbal fluency test included 5 semantic categories: animals, fruits, occupations, countries, and verbs. All 5 categories were analysed quantitatively (number of correct words per minute and interval of execution: 0-15, 16-30, 31-45, and 46-60 seconds); the categories animals and fruits were also analysed qualitatively (clusters, cluster size, switches, perseverations, and intrusions). Patients generated fewer words for all categories and intervals and fewer clusters and switches for animals and fruits than the CG (P<.05); no significant differences were found in cluster size or in the number of intrusions and perseverations (P>.05). Our results suggest an association between SVF difficulties in TLE and difficulty activating semantic networks, impaired strategic search, and poor cognitive flexibility. Attention, inhibition, and working memory are preserved in these patients. Copyright © 2017 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  3. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Full Text Available Stereological description of dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, and for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil - Schwartz - Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil - Schwartz - Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies, since the analysis provides three-dimensional imaging of the microstructures examined.

  4. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  5. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the designed equipment and the working environment, taking into account human capabilities and limits. Within the qualitative methods for analysis of the human factor, we consider concepts and structural methods for classifying the information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPP. Methods for quantitative assessment of human reliability are also considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the initiating events occurring at the Kozloduy NPP are presented. (authors)

  6. Quantitative Analysis of Micro-CT Imaging and Histopathological Signatures of Experimental Arthritis in Rats

    Directory of Open Access Journals (Sweden)

    Matthew D. Silva

    2004-10-01

    Full Text Available Micro-computed tomographic (micro-CT) imaging provides a unique opportunity to capture 3-D architectural information in bone samples. In this study of pathological joint changes in a rat model of adjuvant-induced arthritis (AA), quantitative analyses of bone volume and roughness were performed by micro-CT imaging and compared with histopathology methods and paw swelling measurement. Micro-CT imaging of excised rat hind paws (n = 10) stored in formalin consisted of approximately 600 30-μm slices acquired on a 512 × 512 image matrix with isotropic resolution. Following imaging, the joints were scored from H&E stained sections for cartilage/bone erosion, pannus development, inflammation, and synovial hyperplasia. From the micro-CT images, quantitative analysis of absolute bone volumes and bone roughness was performed. Bone erosion in the rat AA model is substantial, leading to a significant decline in tarsal volume (27%). The custom bone roughness measurement indicated a 55% increase in surface roughness. Histological and paw volume analyses also demonstrated severe arthritic disease as compared to controls. Statistical analyses indicate correlations among bone volume, roughness, histology, and paw volume. These data demonstrate that the destructive progression of disease in a rat AA model can be quantified using 3-D micro-CT image analysis, which allows assessment of arthritic disease status and of the efficacy of experimental therapeutic agents.

  7. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography

    International Nuclear Information System (INIS)

    Turmezei, Tom D.; Treece, Graham M.; Gee, Andrew H.; Fotiadou, Anastasia F.; Poole, Kenneth E.S.

    2016-01-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K and L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K and L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. (orig.)

  8. Quantitative determination of insulin entrapment efficiency in triblock copolymeric nanoparticles by high-performance liquid chromatography.

    Science.gov (United States)

    Xu, Xiongliang; Fu, Yao; Hu, Haiyan; Duan, Yourong; Zhang, Zhirong

    2006-04-11

    A rapid and effective isocratic chromatographic procedure is described in this paper for the determination of insulin entrapment efficiency (EE) in triblock copolymeric nanoparticles using reversed-phase high-performance liquid chromatography (RP-HPLC) with an ultraviolet/visible detector at a low flow rate. The method was developed on a Shimadzu Shim-pack VP-ODS column (150 mm x 4.6 mm, 5 microm, Chiyoda-Ku, Tokyo, Japan) using a mixture of 0.2 M anhydrous sodium sulfate solution, adjusted to pH 2.3 with phosphoric acid, and acetonitrile (73:27, v/v) as the mobile phase at a flow rate of 0.8 ml min(-1) with detection at 214 nm. The method was validated in terms of selectivity, linearity, precision, accuracy, solution stability, limit of detection (LOD) and limit of quantification (LOQ). The calibration curve was linear over the concentration range of 2.0-500.0 microg ml(-1), and the limits of detection and quantitation were 8 and 20 ng, respectively. The mean recovery of insulin from spiked samples, in a concentration range of 8-100 microg ml(-1), was 98.96% (R.S.D. = 2.51%, n = 9). The intra- and inter-assay coefficients of variation were less than 2.24%. The proposed method has the advantages of simple pretreatment, rapid isolation, and high specificity and precision, and it can be used for direct analysis of insulin in commercially available raw materials, formulations of nanoparticles, and drug release as well as stability studies.
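    Two of the validation figures reported above, mean recovery from spiked samples and the relative standard deviation, are simple ratios. A brief illustration with hypothetical replicate values (not the study's measurements):

```python
# Illustrative recovery and RSD calculation; values are invented, not published data.
import numpy as np

spiked = np.array([8.0, 20.0, 50.0, 100.0])     # nominal spike levels, µg/mL
measured = np.array([7.9, 19.8, 49.4, 99.1])    # hypothetical assay results

recovery = measured / spiked * 100               # % recovery per level
print(f"mean recovery = {recovery.mean():.2f}%")

replicates = np.array([99.1, 98.3, 100.2, 97.9, 99.5, 98.8])  # hypothetical repeat recoveries, %
rsd = replicates.std(ddof=1) / replicates.mean() * 100         # relative standard deviation
print(f"RSD = {rsd:.2f}% (n = {len(replicates)})")
```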

  9. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  10. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
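    The three multivariate test statistics named in this abstract (Wilks's Lambda, Pillai-Bartlett trace, Hotelling-Lawley trace) are the classical MANOVA statistics. The sketch below, on simulated data, shows how they can be obtained with statsmodels; it is only a simplified stand-in for the authors' multivariate functional linear models, which additionally smooth the genetic variant data over a region.

```python
# Simulated multi-trait association test; variable names and effects are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 300
genotype = rng.integers(0, 3, n)          # hypothetical variant coded 0/1/2
age = rng.normal(50, 10, n)               # covariate

# three correlated quantitative traits with a small simulated genotype effect
traits = rng.normal(size=(n, 3)) + 0.2 * genotype[:, None] + 0.05 * (age[:, None] - 50)
df = pd.DataFrame(traits, columns=["trait1", "trait2", "trait3"])
df["genotype"] = genotype
df["age"] = age

mv = MANOVA.from_formula("trait1 + trait2 + trait3 ~ genotype + age", data=df)
print(mv.mv_test())  # reports Wilks' lambda, Pillai's trace, Hotelling-Lawley trace, Roy's root
```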

  11. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  12. Quantitative analysis of soluble elements in environmental waters by PIXE

    International Nuclear Information System (INIS)

    Niizeki, T.; Kawasaki, K.; Adachi, M.; Tsuji, M.; Hattori, T.

    1999-01-01

    We have started PIXE research for environmental science at the Van de Graaff accelerator facility at the Tokyo Institute of Technology. Quantitative measurements of soluble fractions in river waters have been carried out using the preconcentration method developed at Tohoku University. We show that this PIXE target preparation can also be applied to waste water samples. (author)

  13. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...

  14. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    Nov 16, 2017 ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  15. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    SAM

    2014-05-14

    May 14, 2014 ... African Journal of Biotechnology. Full Length ... quantitative trait loci (QTLs) on chromosomes 1, 6, 7 and 20 in ... Protein and fat percentages in milk are high-priority criteria for economic purposes and selection programs ...

  16. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  17. Quantitative analysis of biological fluids by electron probe and X ray spectrometry

    International Nuclear Information System (INIS)

    Girod, Chantal

    1986-01-01

    In order to understand normal kidney function and to gain insight into cellular transport mechanisms and hormonal regulation at the nephron level, a technique based on the use of an electron probe has been developed for the elemental analysis of micro-volumes of biological fluids. This academic document reports applications of this technique in animals from which such fluids were sampled at different levels of the nephron. As these samples are available in volumes too small to be assayed by conventional methods, they were analysed quantitatively using an electron probe analyser in order to determine the concentrations of all elements with an atomic number greater than that of carbon. After a presentation of the implemented method and hardware, the author describes how an analysis is performed, and reports and discusses an example (analysis conditions, data acquisition, data processing, minimum detectable concentration, reasons for measurement scatter)

  18. Quantitative analysis of thorium-containing materials using an Industrial XRF analyzer

    International Nuclear Information System (INIS)

    Hasikova, J.; Titov, V.; Sokolov, A.

    2014-01-01

    Thorium (Th) as a nuclear fuel is clean and safe and offers significant advantages over uranium. The technology for several types of thorium reactor is proven but must still be developed on a commercial scale. If thorium nuclear reactors are commercialized, thorium raw materials will be in demand. Mining and processing companies producing Th and rare earth elements will then require prompt and reliable methods and instrumentation for quantitative on-line analysis of Th. The potential applicability of the CON-X series X-ray fluorescence conveyor analyzer is discussed for quantitative or semi-quantitative on-line measurement of Th in several types of Th-bearing materials. A laboratory study of several minerals (zircon sands and limestone as unconventional Th resources, monazite concentrate as a Th-associated resource, and uranium ore residues after extraction as a waste product) was performed, and the analyzer was tested for on-line quantitative measurement of Th content along with other major and minor components. The Th concentration range in zircon sand is 50-350 ppm; its detection limit at this level is estimated at 25-50 ppm in 5-minute measurements, depending on the type of material. An on-site test of the CON-X analyzer for continuous analysis of thorium traces along with other elements in zircon sand showed that the accuracy of Th measurements is within 20% relative. When the Th content is higher than 1%, as in the concentrate of monazite ore (5-8% ThO_2), the accuracy of Th determination is within 1% relative. Although a preliminary on-site test is recommended in order to address system feasibility at a large scale, the results show that the industrial conveyor XRF analyzer CON-X series can be effectively used for analytical control of mining and processing streams of Th-bearing materials. (author)

  19. Preliminary results of standard quantitative analysis by ED-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A., E-mail: alellara@hotmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Fisica; Denyak, Valeriy, E-mail: denyak@gmail.com [Instituto de Pesquisa Pele Pequeno Principe (IPPP), Curitiba, PR (Brazil)

    2013-07-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced: two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, we propose that the work be continued using an auxiliary calculation to be developed in the next step.

  20. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced: two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, we propose that the work be continued using an auxiliary calculation to be developed in the next step

  1. Emission tomography with positrons principle, physical performances of a ring detector and quantitative possibilities

    International Nuclear Information System (INIS)

    Soussaline, F.; Plummer, D.; Todd Pokropek, A.E.; Loc'h, C.; Comar, D.

    1979-01-01

    Satisfactory qualitative and quantitative data in positron emission tomography require the use of a well-adapted tomographic system. A number of parameters have been identified which can be considered the critical characteristics for the evaluation and intercomparison of such systems. Using these, the choice of a single-slice ring positron camera could be justified by its physical performance, which is presented and discussed. Series of physical and mathematical simulations provide an appropriate knowledge of such a system, which has been in use for more than a year in a clinical environment. These studies aid in the interpretation of very interesting physiopathological studies. In principle, a positron tomographic system permits measurement of absolute quantitative concentration values, which are essential for precise metabolic studies. The main sources of error, comprising the calibration of the system, tail effects and the precision of the attenuation correction, are analysed. When these errors are taken into account, a precision of the order of 10% should be obtainable [fr

  2. Quantitative analysis of the renal aging in rats. Stereological study.

    Science.gov (United States)

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins, Eduardo Lopes; Fraga, Rogério de

    2016-05-01

    To evaluate renal function and renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. Seventy-two Wistar rats were divided into six groups, and each group was sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glom]) of the renal glomeruli, and average glomerular volume (Vol[glom])); renal function was evaluated by measuring serum creatinine and urea. There was a significant decrease of renal function in the oldest rats. The renal volume presented a gradual increase during the development of the rats, with the largest values registered in the group of animals at 12 months of age and a significant progressive decrease in older animals. Vv[glom] presented a statistically significant gradual reduction between the groups, and Nv[glom] also decreased significantly. Renal function proved to be inferior in senile rats when compared to young rats. The morphometric and stereological analysis evidenced renal atrophy and a gradual reduction of the volume density and numerical density of the renal glomeruli associated with the aging process.

  3. Quantitative analysis of light elements in thick samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Jesus, A.P.; Ribeiro, J.P.

    2004-01-01

    PIGE analysis of thick and intermediate samples is usually performed with the help of standards, but this method gives good results only when the standard is very similar to the sample to be analysed. In this work, we present an alternative method for PIGE analysis of light elements in thick samples. This method is based on a code that integrates the nuclear reaction excitation function along the depth of the sample. For the integration procedure, the sample is divided into sublayers defined by the energy steps that were used to measure the excitation function accurately. This function is used as input. Within each sublayer the stopping power cross-sections may be assumed constant. With these two conditions, calculating the contribution of each sublayer to the total yield becomes an easy task. This work presents results for the analysis of lithium, boron, fluorine and sodium in thick samples. For this purpose, excitation functions of the reactions 7Li(p,p'γ)7Li, 19F(p,p'γ)19F, 10B(p,αγ)7Be and 23Na(p,p'γ)23Na were employed. Calculated γ-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds of the referred elements. The agreement is better than 7.5%. Taking into consideration the experimental uncertainty of the measured yields and the errors related to the stopping power values used, this agreement shows that effects such as beam energy straggling, ignored in the calculation, seem to play a minor role
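    The integration scheme described here, stepping the beam energy down through sublayers in which the stopping power is taken as constant and summing each sublayer's contribution to the thick-target yield, can be sketched in a few lines. All numerical values below (energy grid, cross-sections, stopping power, atomic density) are placeholders, not the measured excitation functions.

```python
# Schematic thick-target yield integration over sublayers; placeholder physics data.
import numpy as np

E = np.array([3.0, 2.8, 2.6, 2.4, 2.2, 2.0])       # MeV, energy grid of the excitation function
sigma = np.array([40., 55., 30., 25., 60., 20.])   # mb, hypothetical gamma-production cross-sections

def stopping_power(e_mev):
    # hypothetical stopping power of the sample matrix, MeV cm^2 / g
    return 120.0 / np.sqrt(e_mev)

n_atoms_per_gram = 1.0e20                           # hypothetical atomic density of the analysed element

yield_per_proton = 0.0
for i in range(len(E) - 1):
    dE = E[i] - E[i + 1]                            # energy lost within this sublayer
    # sublayer contribution: N * sigma(E) * dE / S(E), with sigma converted from mb to cm^2
    yield_per_proton += n_atoms_per_gram * sigma[i] * 1e-27 * dE / stopping_power(E[i])

print(f"relative thick-target yield: {yield_per_proton:.3e} gammas per proton")
```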

  4. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  5. Freight performance measures : approach analysis.

    Science.gov (United States)

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  6. MEASURING ORGANIZATIONAL CULTURE: A QUANTITATIVE-COMPARATIVE ANALYSIS [doi: 10.5329/RECADM.20100902007

    Directory of Open Access Journals (Sweden)

    Valderí de Castro Alcântara

    2010-11-01

    Full Text Available This article analyses organizational culture in enterprises located in two towns with distinct quantitative traits, Rio Paranaíba and Araxá. While the surveyed enterprises in Rio Paranaíba are mostly micro and small enterprises (86%), in Araxá they are mostly medium and large companies (53%). The overall objective is to verify whether there are significant differences in organizational culture among these enterprises and whether they can be explained by organization size. The research was quantitative, and the instruments for data collection were a questionnaire and a scale for measuring organizational culture containing four dimensions: Hierarchical Distance Index (IDH), Individualism Index (INDI), Masculinity Index (MASC) and Uncertainty Control Index (CINC). Tabulation and analysis of data were performed using PASW Statistics 18, applying descriptive and inferential statistical procedures. Using a Reduction Factor (-21), the achieved indexes were classified into 5 intensity categories (from "very low" to "very high"). The Student t test for two means was performed, revealing significant differences in Hierarchical Distance and Individualism between Araxá and Rio Paranaíba enterprises (p < 0.05).   Keywords Organizational Culture; Dimensions of Organizational Culture; Araxá; Rio Paranaíba.

  7. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    International Nuclear Information System (INIS)

    Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing

    2015-01-01

    Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC

  8. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromotography of amino acids by A spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. To date, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods based on the use of relative spot area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity. No complicated equipment or procedures are necessary

  9. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability-a major advancement in solving these two aforementioned problems-to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days-even weeks or months-depending upon the size of the model.

  10. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down
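    The BW-ratio defined in this abstract, the between-group over within-group sum of squares computed at every aligned spectral point, is straightforward to reproduce. The sketch below uses simulated spectra and a simple label-shuffling null rather than the paper's bootstrap, so it is an approximation of the published procedure, not the CluPA implementation itself.

```python
# BW-ratio per spectral point on simulated data; not the published CluPA code.
import numpy as np

rng = np.random.default_rng(2)
n_per_group, n_points = 20, 500
group_a = rng.normal(0.0, 1.0, (n_per_group, n_points))
group_b = rng.normal(0.0, 1.0, (n_per_group, n_points))
group_b[:, 100:110] += 1.5                      # simulated group difference at a few points

def bw_ratio(a, b):
    """Between-group / within-group sum of squares, computed per spectral point."""
    x = np.vstack([a, b])
    grand = x.mean(axis=0)
    between = len(a) * (a.mean(axis=0) - grand) ** 2 + len(b) * (b.mean(axis=0) - grand) ** 2
    within = ((a - a.mean(axis=0)) ** 2).sum(axis=0) + ((b - b.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

observed = bw_ratio(group_a, group_b)

# null distribution by reshuffling group labels (the paper bootstraps instead)
pooled = np.vstack([group_a, group_b])
null_max = []
for _ in range(200):
    idx = rng.permutation(len(pooled))
    null_max.append(bw_ratio(pooled[idx[:n_per_group]], pooled[idx[n_per_group:]]).max())

threshold = np.quantile(null_max, 0.95)
print("points exceeding the 95% null threshold:", np.where(observed > threshold)[0])
```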

  11. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear

  12. Quantitative analysis of Moessbauer backscatter spectra from multilayer films

    International Nuclear Information System (INIS)

    Bainbridge, J.

    1975-01-01

    The quantitative interpretation of Moessbauer backscatter spectra with particular reference to internal conversion electrons has been treated assuming that electron attenuation in a surface film can be satisfactorily described by a simple exponential law. The theory of Krakowski and Miller has been extended to include multi-layer samples, and a relation between the Moessbauer spectrum area and an individual layer thickness derived. As an example, numerical results are obtained for a duplex oxide film grown on pure iron. (Auth.)

  13. Geometrical conditions at the quantitative neutronographic texture analysis

    International Nuclear Information System (INIS)

    Tobisch, J.; Kleinstueck, K.

    1975-10-01

    The beam geometry for measuring quantitative pole figures by a neutronographic texture diffractometer is explained for transmission and reflection arrangement of spherical samples and sheets. For given dimensions of counter aperture the maximum possible cross sections of the incident beam are calculated as a function of sample dimensions and the Bragg angle theta. Methods for the calculation of absorption factors and volume correction are given. Under special conditions advantages result in the transmission case for sample motion into the direction +α. (author)

  14. Quantitative analysis of strategic and tactical purchasing decisions

    OpenAIRE

    Heijboer, G.J.

    2003-01-01

    Purchasing management is a relatively new scientific research field, partly due to the fact that purchasing has only recently been recognized as a factor of strategic importance to an organization. In this thesis, the author focuses on a selection of strategic and tactical purchasing decision problems. New quantitative models are developed for these decision problems using a range of mathematical techniques, thereby contributing to the further development of purchasing theory and its application...

  15. Quantitative analysis of carbon radiation in edge plasmas of LHD

    International Nuclear Information System (INIS)

    Dong, C.F.; Morita, S.; Oishi, T.; Goto, M.; Murakami, I.; Wang, E.R.; Huang, X.L.

    2013-01-01

    It is of interest to compare the carbon radiation loss between LHD and tokamaks. Since the radiation from C3+ is much smaller than that from C5+, it is also interesting to examine the difference in the detached plasma. In addition, it is important to study quantitatively the radiation from each ionization stage of carbon, which is uniquely the dominant impurity in most tokamaks and LHD. (J.P.N.)

  16. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the results of analysing the emergency tasks in the procedures (EOPs; emergency operating procedures) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to operator performance are described. In addition, a prospective application of the analysed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the need for appropriate and sufficient human performance data has recently been indicated, because a wide range of quantitative estimates in previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that can be objectively observed are collected from full-scope simulator experience. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined. The determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analysed by defining task, component, and system taxonomies.

  17. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  18. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    Science.gov (United States)

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  19. Quantitative analysis of trivalent uranium and lanthanides in a molten chloride by absorption spectrophotometry

    International Nuclear Information System (INIS)

    Toshiyuki Fujii; Akihiro Uehara; Hajimu Yamana

    2013-01-01

    As an analytical application for pyrochemical reprocessing using molten salts, quantitative analysis of uranium and lanthanides by UV/Vis/NIR absorption spectrophotometry was performed. Electronic absorption spectra of LiCl-KCl eutectic at 773 K including trivalent uranium and eight rare earth elements (Y, La, Ce, Pr, Nd, Sm, Eu, and Gd as fission product elements) were measured in the wavenumber region of 4,500-33,000 cm-1. The composition of the solutes was simulated for a reductive extraction condition in a pyroreprocessing process for spent nuclear fuels, that is, about 2 wt% U and 0.1-2 wt% rare earth elements. Since U(III) possesses strong absorption bands due to f-d transitions, an optical quartz cell with a short light path length of 1 mm was adopted in the analysis. The quantitative analysis of trivalent U, Nd, Pr, and Sm was possible with their f-f transition intensities in the NIR region. The analytical results agree with the prepared concentrations within 2σ experimental uncertainties. (author)

  20. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of wavelength selection and of the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid binary chromosome of NAEP has three parts: the first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are all used to build the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly and can serve as a practical spectral analysis tool.
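    The abstract describes a hybrid binary chromosome with three segments: network topology, wavelength selection, and mutation parameters. The decoding sketched below is a hypothetical illustration of that layout; the bit widths and decoding rules are assumptions, not the authors' specification.

```python
# Hypothetical decoding of a three-part binary chromosome; layout is an assumption.
import numpy as np

N_WAVELENGTHS = 64        # hypothetical number of candidate wavelengths
TOPOLOGY_BITS = 5         # hypothetical: encodes hidden-layer size 1..32
MUTATION_BITS = 4         # hypothetical: encodes a mutation-rate index

rng = np.random.default_rng(3)
chromosome = rng.integers(0, 2, TOPOLOGY_BITS + N_WAVELENGTHS + MUTATION_BITS)

def decode(chrom):
    topo = chrom[:TOPOLOGY_BITS]
    wl_mask = chrom[TOPOLOGY_BITS:TOPOLOGY_BITS + N_WAVELENGTHS].astype(bool)
    mut = chrom[TOPOLOGY_BITS + N_WAVELENGTHS:]
    hidden_nodes = 1 + int("".join(map(str, topo)), 2)          # part 1: network topology
    selected_wavelengths = np.where(wl_mask)[0]                  # part 2: wavelength subset
    mutation_rate = (1 + int("".join(map(str, mut)), 2)) / 100   # part 3: mutation parameter
    return hidden_nodes, selected_wavelengths, mutation_rate

hidden, wavelengths, p_mut = decode(chromosome)
print(hidden, len(wavelengths), p_mut)
```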

  1. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress that amalgamates numerous stress conditions resulting in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19) under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, our analysis suggests that optimal preconditioning toward drying was predicted to be alkaline and high-pressure stress preconditioning. Taken together, we believe the presented strategy may serve as a prototypic example for the analysis and utility of employing quantitative-mass-spectrometry-based proteomics to study bacterial physiology.

  2. Hand Fatigue Analysis Using Quantitative Evaluation of Variability in Drawing Patterns

    Directory of Open Access Journals (Sweden)

    mohamadali Sanjari

    2015-02-01

    Full Text Available Background & aim: Muscle fatigue is defined as the reduced power generation capacity of a muscle or muscle group after activity, which can lead to a variety of lesions. The purpose of the present study was to characterize fatigue through quantitative analysis of drawing patterns. Methods: The present cross-sectional study was conducted on 37 healthy volunteers (6 men and 31 women aged 18-30 years). Before and immediately after a fatigue protocol, quantitative assessment of hand drawing skill was performed by drawing repeated, overlapping, concentric circles. The test was conducted in three sessions with an interval of 48-72 hours. Drawing was recorded by a digital tablet. Data were statistically analysed using the paired t-test and repeated measures ANOVA. Results: In the analysis of the drawing time series at the 100% fatigue level, the standard deviation along the x axis (SDx), the standard deviations of velocity along the x and y axes (SDVx and SDVy), and the standard deviation of the resultant velocity (SDVR) showed significant differences after fatigue (P<0.05). In the comparison of variables across the three fatigue levels, SDx showed a significant difference (P<0.05). Conclusions: Structurally, full fatigue showed significant differences from the other levels of fatigue and thus contributed to significant variability in the drawing parameters. The method used in the present study also detected fatigue in high-frequency motion.
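    The variability measures named in the results (SDx, SDVx, SDVy, SDVR) are standard deviations of position and of velocity components derived from the tablet time series. A small illustration on a simulated pen trajectory (not study data):

```python
# Variability measures on a simulated circle-drawing trajectory; sampling rate is hypothetical.
import numpy as np

fs = 100.0                                   # hypothetical tablet sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# simulated overlapping-circle drawing with small jitter
x = 5 * np.cos(2 * np.pi * 1.0 * t) + np.random.default_rng(4).normal(0, 0.2, t.size)
y = 5 * np.sin(2 * np.pi * 1.0 * t) + np.random.default_rng(5).normal(0, 0.2, t.size)

vx = np.gradient(x, 1 / fs)                  # velocity along x
vy = np.gradient(y, 1 / fs)                  # velocity along y
vr = np.hypot(vx, vy)                        # resultant velocity magnitude

print("SDx  =", np.std(x, ddof=1))
print("SDVx =", np.std(vx, ddof=1))
print("SDVy =", np.std(vy, ddof=1))
print("SDVR =", np.std(vr, ddof=1))
```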

  3. Quantitative determination of reserpine, ajmaline, and ajmalicine in Rauvolfia serpentina by reversed-phase high-performance liquid chromatography.

    Science.gov (United States)

    Srivastava, A; Tripathi, A K; Pandey, R; Verma, R K; Gupta, M M

    2006-10-01

    A sensitive and reproducible reversed-phase high-performance liquid chromatography (HPLC) method using photodiode array detection is established for the simultaneous quantitation of important root alkaloids of Rauvolfia serpentina, namely, reserpine, ajmaline, and ajmalicine. A Chromolith Performance RP-18e column (100 x 4.6-mm i.d.) and a binary gradient mobile phase composed of 0.01 M (pH 3.5) phosphate buffer (NaH(2)PO(4)) containing 0.5% glacial acetic acid and acetonitrile are used. Analysis is run at a flow rate of 1.0 mL/min with the detector operated at a wavelength of 254 nm. The calibration curves are linear over a concentration range of 1-20 microg/mL (r = 1.000) for all the alkaloids. The various other aspects of analysis (i.e., peak purity, similarity, recovery, and repeatability) are also validated. For the three components, the recoveries are found to be 98.27%, 97.03%, and 98.38%, respectively. The limits of detection are 6, 4, and 8 microg/mL for ajmaline, ajmalicine, and reserpine, respectively, and the limits of quantitation are 19, 12, and 23 microg/mL for ajmaline, ajmalicine, and reserpine, respectively. The developed method is simple, reproducible, and easy to operate. It is useful for the evaluation of R. serpentina.
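
    For readers unfamiliar with how such figures of merit are typically derived, the sketch below fits a linear calibration curve from standard concentrations and peak areas and computes detection and quantitation limits with the common 3.3σ/slope and 10σ/slope conventions; the numbers are placeholders, not the paper's data.

```python
import numpy as np

# Placeholder calibration standards (concentration in µg/mL) and peak areas.
conc = np.array([1, 2, 5, 10, 20], dtype=float)
area = np.array([10.2, 20.5, 50.9, 101.8, 203.1])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)              # SD of the regression residuals

lod = 3.3 * sigma / slope                  # limit of detection (ICH convention)
loq = 10.0 * sigma / slope                 # limit of quantitation

def quantify(peak_area):
    """Back-calculate analyte concentration from a measured peak area."""
    return (peak_area - intercept) / slope

print(f"LOD = {lod:.2f} µg/mL, LOQ = {loq:.2f} µg/mL, sample = {quantify(75.0):.2f} µg/mL")
```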

  4. Global tractography with embedded anatomical priors for quantitative connectivity analysis

    Directory of Open Access Journals (Sweden)

    Alia eLemkaddem

    2014-11-01

    Full Text Available The main assumption of fiber-tracking algorithms is that fiber trajectories are represented by paths of highest diffusion, which is usually accomplished by following the principal diffusion directions estimated in every voxel from the measured diffusion MRI data. The state-of-the-art approaches, known as global tractography, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The tractograms obtained with these algorithms outperform any previous technique but, unfortunately, the price to pay is an increased computational cost which is not suitable in many practical settings, both in terms of time and memory requirements. Furthermore, existing global tractography algorithms suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are used during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the white matter. This not only unnecessarily slows down the estimation procedure and potentially biases any subsequent analysis but also, most importantly, prevents a true quantification of brain connectivity. In this work, we propose a novel approach to global tractography that is specifically designed for connectivity analysis applications by explicitly enforcing anatomical priors on the tracts in the optimization and considering the effective contribution of each of them, i.e. its volume, to the acquired diffusion MRI image. We evaluated our approach on both a realistic diffusion MRI phantom and in-vivo data, and also compared its performance to existing tractography algorithms.

  5. Qualitative and quantitative analyses of flavonoids in Spirodela polyrrhiza by high-performance liquid chromatography coupled with mass spectrometry.

    Science.gov (United States)

    Qiao, Xue; He, Wen-ni; Xiang, Cheng; Han, Jian; Wu, Li-jun; Guo, De-an; Ye, Min

    2011-01-01

    Spirodela polyrrhiza (L.) Schleid. is a traditional Chinese herbal medicine for the treatment of influenza. Despite its wide use in Chinese medicine, no report on quality control of this herb is available so far. To establish qualitative and quantitative analytical methods by high-performance liquid chromatography (HPLC) coupled with mass spectrometry (MS) for the quality control of S. polyrrhiza. The methanol extract of S. polyrrhiza was analysed by HPLC/ESI-MS(n). Flavonoids were identified by comparing with reference standards or according to their MS(n) (n = 2-4) fragmentation behaviours. Based on LC/MS data, a standardised HPLC fingerprint was established by analysing 15 batches of commercial herbal samples. Furthermore, quantitative analysis was conducted by determining five major flavonoids, namely luteolin 8-C-glucoside, apigenin 8-C-glucoside, luteolin 7-O-glucoside, apigenin 7-O-glucoside and luteolin. A total of 18 flavonoids were identified by LC/MS, and 14 of them were reported from this herb for the first time. The HPLC fingerprints contained 10 common peaks, and could differentiate good quality batches from counterfeits. The total contents of five major flavonoids in S. polyrrhiza varied significantly from 4.28 to 19.87 mg/g. Qualitative LC/MS and quantitative HPLC analytical methods were established for the comprehensive quality control of S. polyrrhiza. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Rapid Quantitation of Furanocoumarins and Flavonoids in Grapefruit Juice using Ultra-Performance Liquid Chromatography.

    Science.gov (United States)

    Vandermolen, Karen M; Cech, Nadja B; Paine, Mary F; Oberlies, Nicholas H

    2013-01-01

    Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6',7'-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin and hesperidin) in five grapefruit juice products using ultra-performance liquid chromatography (UPLC). Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analysed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Rapid methods were developed, with linear calibration curves (r > 0.999). Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. Copyright © 2013 John Wiley & Sons, Ltd.

  7. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For the quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and the aluminum sheet. The diffraction patterns of the samples were measured by neutron diffraction and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)
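
    The weight fractions in a multiphase Rietveld refinement are conventionally obtained from the refined scale factors via the Hill-and-Howard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j. The sketch below applies that relation with illustrative numbers for a zirconium/aluminum mixture; the scale factors and cell parameters are assumptions, not the paper's refinement results.

```python
def rietveld_weight_fractions(phases):
    """phases: list of dicts with refined scale factor S, formula units per cell Z,
    formula mass M (g/mol) and unit-cell volume V (Å³) for each phase."""
    zmv = [p["S"] * p["Z"] * p["M"] * p["V"] for p in phases]
    total = sum(zmv)
    return [x / total for x in zmv]

# Illustrative (not measured) inputs for a Zr/Al two-phase sample.
phases = [
    {"name": "Zr (hcp)", "S": 1.2e-4, "Z": 2, "M": 91.22, "V": 46.6},
    {"name": "Al (fcc)", "S": 3.0e-4, "Z": 4, "M": 26.98, "V": 66.4},
]
for p, w in zip(phases, rietveld_weight_fractions(phases)):
    print(f"{p['name']}: {100 * w:.1f} wt%")
```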

  8. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow ²⁹Si NMR is also presented. 10 references, 4 figures, 3 tables

  9. Quantitative analysis of gait in the visually impaired.

    Science.gov (United States)

    Nakamura, T

    1997-05-01

    In this comparative study concerning characteristics of independent walking by visually impaired persons, we used a motion analyser system to perform gait analysis of 15 late blind (age 36-54, mean 44.3 years), 15 congenitally blind (age 39-48, mean 43.8 years) and 15 sighted persons (age 40-50, mean 44.4 years) while walking a 10-m walkway. All subjects were male. Compared to the sighted, late blind and congenitally blind persons had a significantly slower walking speed, shorter stride length and longer time in the stance phase of gait. However, the relationships between gait parameters in the late and congenitally blind groups were maintained, as in the sighted group. In addition, the gait of the late blind showed a tendency to approximate the gait patterns of the congenitally blind as the duration of visual loss progressed. Based on these results we concluded that the gait of visually impaired persons, through its active use of non-visual sensory input, represents an attempt to adapt to various environmental conditions in order to maintain a more stable posture and to effect safe walking.

  10. Quantitative Analysis of Criteria in University Building Maintenance in Malaysia

    Directory of Open Access Journals (Sweden)

    Olanrewaju Ashola Abdul-Lateef

    2010-10-01

    Full Text Available University buildings are a significant part of university assets and considerable resources are committed to their design, construction and maintenance. The core of maintenance management is to optimize productivity and user satisfaction with optimum resources. An important segment of the maintenance management system is the analysis of criteria that influence building maintenance. Therefore, this paper aims to identify, quantify, rank and discuss the criteria that influence maintenance costs, maintenance backlogs, productivity and user satisfaction in Malaysian university buildings. The paper reviews the related literature and presents the outcomes of a questionnaire survey. Questionnaires were administered to 50 university maintenance organizations. Thirty-one criteria were presented to the university maintenance organizations to evaluate the degree to which each criterion influences building maintenance management. With a 66% response rate, it was concluded that consideration of the criteria is critical to the university building maintenance management system. The quality of components and materials, budget constraints and the age of the building were found to be the most influential criteria, while information on user satisfaction with performance, problems associated with the in-house workforce, and shortages of materials and components were the least influential criteria. The paper also notes that maintenance management is a strategic function in university administration.

  11. Quantitative analysis of the individual dynamics of Psychology theses

    Directory of Open Access Journals (Sweden)

    Robles, Jaime R.

    2009-12-01

    Full Text Available Three cohorts of undergraduate psychology theses (n = 57), performed by final-year undergraduate psychology students from Universidad Católica Andrés Bello, were monitored using 5 longitudinal measurements of progression. A Generalized Additive Model predicting the completion time of the theses is tested against two completion outcomes: early and delayed. Effect size measures favor a multiple-dimension model over a global progress model. The trajectory of the indicators through the 5 measurements allows the differentiation between early and delayed completion. The completion probabilities estimated by the dimensional model allow the identification of differential oscillation levels for the distinct completion times. The initial progression indicators allow the prediction of early completion with a 71% success rate, while the final measurement shows a success rate of 89%. The results support the effectiveness of the supervisory system and the analysis of the progression dynamics of the theses from a task-delay model, focused on the relationship between the amount of task completion and the deadlines.

  12. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    International Nuclear Information System (INIS)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H.

    1998-01-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information about a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior and lateral planar and supine SPECT imaging was performed on 75 female patients (mean age 43.4 yr) with breast masses (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. Forty-five malignant lesions (invasive ductal carcinoma (36), invasive lobular carcinoma (5), invasive ductal plus lobular carcinoma (1), invasive tubular carcinoma (3)) and 30 benign lesions (fibroadenoma (13), fibrocystic change (12), fat necrosis (3), papilloma (1), paraffinoma (1)) were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast and right chest wall. Lesion-to-normal (L/N) and lesion-to-chest wall (L/CW) ratios were calculated for each patient on both planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among the four semiquantitative indices and an average scintimammographic index (SMM(mean)) from their arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N and SPECT L/CW ratios provided comparably better diagnostic accuracy for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of the 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 0.78, 0.77, 0.84, 0.72 and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.

  13. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H. [Gachon Medical College, Gil Medical Center, Inchon (Korea, Republic of)

    1998-07-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information about a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior and lateral planar and supine SPECT imaging was performed on 75 female patients (mean age 43.4 yr) with breast masses (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. Forty-five malignant lesions (invasive ductal carcinoma (36), invasive lobular carcinoma (5), invasive ductal plus lobular carcinoma (1), invasive tubular carcinoma (3)) and 30 benign lesions (fibroadenoma (13), fibrocystic change (12), fat necrosis (3), papilloma (1), paraffinoma (1)) were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast and right chest wall. Lesion-to-normal (L/N) and lesion-to-chest wall (L/CW) ratios were calculated for each patient on both planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among the four semiquantitative indices and an average scintimammographic index (SMM(mean)) from their arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N and SPECT L/CW ratios provided comparably better diagnostic accuracy for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of the 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 0.78, 0.77, 0.84, 0.72 and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.

  14. Quantitative analysis of food and feed samples with droplet digital PCR.

    Directory of Open Access Journals (Sweden)

    Dany Morisset

    Full Text Available In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis of food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using a ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.
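
    Absolute ddPCR quantification rests on Poisson statistics: if p is the fraction of positive droplets, the mean number of target copies per droplet is λ = −ln(1 − p). The sketch below applies this to a duplex transgene/reference assay such as the MON810/hmg pair described above; the droplet counts and droplet volume are placeholders, not the study's values.

```python
import math

DROPLET_VOLUME_NL = 0.85   # assumed droplet volume in nanolitres

def copies_per_microlitre(positive, total, droplet_volume_nl=DROPLET_VOLUME_NL):
    """Poisson-corrected absolute concentration from droplet counts."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per µL of reaction

# Placeholder duplex counts: transgene (MON810) and reference gene (hmg).
transgene = copies_per_microlitre(positive=312, total=15000)
reference = copies_per_microlitre(positive=2950, total=15000)
gmo_content_percent = 100.0 * transgene / reference
print(f"{transgene:.1f} and {reference:.1f} copies/µL -> {gmo_content_percent:.2f}% GMO")
```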

  15. Analytical performance of refractometry in quantitative estimation of isotopic concentration of heavy water in nuclear reactor

    International Nuclear Information System (INIS)

    Dhole, K.; Ghosh, S.; Datta, A.; Tripathy, M.K.; Bose, H.; Roy, M.; Tyagi, A.K.

    2011-01-01

    The method of refractometry has been investigated for the quantitative estimation of isotopic concentration of D₂O (heavy water) in a simulated water sample. Viability of refractometry as an excellent analytical technique for rapid and non-invasive determination of D₂O concentration in water samples has been demonstrated. Temperature of the samples was precisely controlled to eliminate effect of temperature fluctuation on refractive index measurement. Calibration performance by this technique exhibited reasonable analytical response over a wide range (1-100%) of D₂O concentration. (author)

  16. Isolation and quantitation of metallothionein isoforms using reversed-phase high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Richards, M.P.; Darcey, S.E.; Steele, N.C.

    1986-01-01

    Reversed-phase HPLC (RP-HPLC) was used to isolate and quantify metallothionein (MT) isoforms from a variety of animal species and tissues. Separations were performed on C18 radially compressed cartridge columns, eluted with a 2-step linear gradient of acetonitrile in 10 mM sodium phosphate, pH 7.0. Isoforms were detected by UV absorbance (214 nm) and by on-line interfacing with an atomic absorption spectrophotometer (HPLC-AA) to determine bound Zn, Cd and Cu. Rabbit liver and horse kidney MTs exhibited 7 distinct peaks on RP-HPLC, 2 of which were predominant (MT1 and 2). Pig liver and kidney MT2 yielded 2 subspecies on RP-HPLC, while MT1 yielded a single peak. Avian liver MT differed from mammalian MTs in that MT2 was about tenfold more abundant than MT1. RP-HPLC and HPLC-AA were used to isolate and quantitate MT isoforms and their Zn content directly from cytosol. Quantitation was achieved by peak area integration and extrapolation from a standard curve of purified avian liver MT2. Both RP-HPLC and HPLC-AA had a lower detection limit of 1 μg of peptide and 0.1 μg of Zn. Recoveries (92-98%) were determined with ³⁵S-labeled MT and MT of known Zn content. Cytoplasmic MT-Zn in avian embryo hepatocytes cultured with added Zn was quantitated using HPLC-AA. In conclusion, both RP-HPLC and HPLC-AA are rapid and powerful separation techniques for the isolation, quantitation and characterization of the isoproteins comprising the MT gene family.

  17. Quantitative vegetation reconstruction from pollen analysis and historical inventory data around a Danish small forest hollow

    DEFF Research Database (Denmark)

    Overballe-Petersen, Mette V; Nielsen, Anne Birgitte; Bradshaw, Richard H.W.

    2013-01-01

    Questions: Can the model performance of the landscape reconstruction algorithm (LRA) for small forest hollows be validated through comparison to inventory-based vegetation reconstructions from the last 150 yrs? Does the application of LRA and the comparison to historical data enhance interpretation of the pollen record? Location: Denmark; the Gribskov-Ostrup small forest hollow (56°N, 12°20' E, 44 m a.s.l.) in the forest of Gribskov, eastern Denmark. Methods: Pollen analysis was carried out on a small forest hollow, and LRA was used to derive pollen-based quantitative estimates of past vegetation. Historical forest inventory data and maps were used to reconstruct the vegetation within three different circles around the hollow (20, 50 and 200 m ring widths) for five time periods during the last 150 yrs. The results of the two approaches were compared in order to evaluate model performance, and the LRA...

  18. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  19. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  20. Using quantitative image analysis to classify axillary lymph nodes on breast MRI: A new application for the Z 0011 Era

    Energy Technology Data Exchange (ETDEWEB)

    Schacht, David V., E-mail: dschacht@radiology.bsd.uchicago.edu; Drukker, Karen, E-mail: kdrukker@uchicago.edu; Pak, Iris, E-mail: irisgpak@gmail.com; Abe, Hiroyuki, E-mail: habe@radiology.bsd.uchicago.edu; Giger, Maryellen L., E-mail: m-giger@uchicago.edu

    2015-03-15

    Highlights: •Quantitative image analysis showed promise in evaluating axillary lymph nodes. •13 of 28 features performed better than guessing at metastatic status. •When all features were used together, a considerably higher AUC was obtained. -- Abstract: Purpose: To assess the performance of computer-extracted feature analysis of dynamic contrast enhanced (DCE) magnetic resonance images (MRI) of axillary lymph nodes, and to determine which quantitative features best predict nodal metastasis. Methods: This institutional board-approved, HIPAA-compliant study, in which informed patient consent was waived, collected enhanced T1 images of the axilla from patients with breast cancer. Lesion segmentation and feature analysis were performed on 192 nodes using a laboratory-developed quantitative image analysis (QIA) workstation. The importance of 28 features was assessed. Classification used the features as input to a neural net classifier in a leave-one-case-out cross-validation and was evaluated with receiver operating characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) values for features in the task of distinguishing between positive and negative nodes ranged from just over 0.50 to 0.70. Five features yielded AUCs greater than 0.65: two morphological and three textural features. In cross-validation, the neural net classifier obtained an AUC of 0.88 (SE 0.03) for the task of distinguishing between positive and negative nodes. Conclusion: QIA of DCE MRI demonstrated promising performance in discriminating between positive and negative axillary nodes.
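
    A hedged sketch of the evaluation strategy described above: extracted node features feed a small neural-network classifier, performance is estimated with leave-one-out cross-validation, and discrimination is summarized by the ROC AUC. The feature matrix and labels are synthetic stand-ins, and scikit-learn's MLPClassifier stands in for the laboratory's own classifier.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(192, 28))    # 192 nodes x 28 extracted features (synthetic)
y = rng.integers(0, 2, size=192)  # metastatic (1) vs benign (0) labels (synthetic)

scores = np.empty(len(y))
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
    )
    clf.fit(X[train_idx], y[train_idx])
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

print("Leave-one-out AUC:", roc_auc_score(y, scores))
```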

  1. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    International Nuclear Information System (INIS)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi; Somerville, Robert A.; Kitamoto, Tetsuyuki; Mohri, Shirou

    2013-01-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products

  2. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan); Somerville, Robert A. [The Roslin Institute and Royal (Dick) School of Veterinary Studies, Roslin, Midlothian, EH25 9PS (United Kingdom); Kitamoto, Tetsuyuki [Division of CJD Science and Technology, Department of Prion Research, Center for Translational and Advanced Animal Research on Human Diseases, Tohoku University Graduate School of Medicine, 2-1 Seiryo, Aoba, Sendai 980-8575 (Japan); Mohri, Shirou, E-mail: shirou@affrc.go.jp [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan)

    2013-03-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products.

  3. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
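
    The core of statistical model checking is embarrassingly parallel: independent Monte Carlo simulations of the probabilistic model are generated on many workers and their results are pooled into an estimate with a confidence interval. The sketch below illustrates that idea for a toy property; it is not PVeStA's algorithm or API, and the model and failure probability are invented for the example.

```python
import math
from concurrent.futures import ProcessPoolExecutor
from random import Random

def simulate_batch(args):
    """Run n independent simulations of a toy probabilistic model and count
    how many satisfy the property 'a failure occurs within 10 steps'."""
    seed, n = args
    rng = Random(seed)
    satisfied = 0
    for _ in range(n):
        failed = False
        for _ in range(10):
            if rng.random() < 0.05:   # per-step failure probability (toy model)
                failed = True
                break
        satisfied += failed
    return satisfied

if __name__ == "__main__":
    workers, per_worker = 8, 50_000
    with ProcessPoolExecutor(max_workers=workers) as pool:
        counts = pool.map(simulate_batch, [(seed, per_worker) for seed in range(workers)])
    total = workers * per_worker
    p_hat = sum(counts) / total
    half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / total)   # 95% CI half-width
    print(f"P(property) ~ {p_hat:.4f} +/- {half_width:.4f}")
```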

  4. Quantitative analysis of abused drugs in physiological fluids by gas chromatography/chemical ionization mass spectrometry

    International Nuclear Information System (INIS)

    Foltz, R.L.

    1978-01-01

    Methods have been developed for quantitative analysis of commonly abused drugs in physiological fluids using gas chromatography/chemical ionization mass spectrometry. The methods are being evaluated in volunteer analytical and toxicological laboratories, and analytical manuals describing the methods are being prepared. The specific drug and metabolites included in this program are: Δ⁹-tetrahydrocannabinol, methadone, phencyclidine, methaqualone, morphine, amphetamine, methamphetamine, mescaline, 2,5-dimethoxy-4-methyl amphetamine, cocaine, benzoylecgonine, diazepam, and N-desmethyldiazepam. The current analytical methods utilize relatively conventional instrumentation and procedures, and are capable of measuring drug concentrations as low as 1 ng/ml. Various newer techniques such as sample clean-up by high performance liquid chromatography, separation by glass capillary chromatography, and ionization by negative ion chemical ionization are being investigated with respect to their potential for achieving higher sensitivity and specificity, as well as their ability to facilitate simultaneous analysis of more than one drug and metabolite. (Auth.)

  5. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    problem was the basis for an inverse problem, i.e. revealing the depth of a body, its location in space, and its physical properties. At the same time, this method has not received wide practical application, given the complexity of real geological media. Careful analysis of piezo- and seismoelectric anomalies shows that advanced methodologies developed in magnetic prospecting for complex physical-geological conditions can be applied to the quantitative analysis of these effects (Eppelbaum et al., 2000, 2001, 2010; Eppelbaum, 2010, 2011, 2015). Employment of these methodologies (improved modifications of the tangent, characteristic point and areal methods) for obtaining quantitative characteristics of ore bodies, environmental features and archaeological targets (models of a horizontal circular cylinder, sphere, thin bed, thick bed and thin horizontal plate were utilized) has demonstrated their effectiveness. Case study at the archaeological site Tel Kara Hadid: field piezoelectric observations were conducted at the ancient archaeological site Tel Kara Hadid, with gold-quartz mineralization, in southern Israel within the Precambrian terrain at the northern extension of the Arabian-Nubian Shield (Neishtadt et al., 2006). The area of the archaeological site is located eight kilometers north of the town of Eilat, in an area of strong industrial noise. Ancient alluvial river terraces (extremely heterogeneous at a local scale, varying from boulders to silt) cover the quartz veins and complicate their identification. Piezoelectric measurements conducted over a quartz vein covered by surface sediments (approximately 0.4 m thick) produced a sharp (500 μV) piezoelectric anomaly. Values recorded over the host rocks (clays and shales of basic composition) were close to zero. The observed piezoelectric anomaly was successfully interpreted using methodologies developed in magnetic prospecting. For effective integration of piezo- and

  6. Quantitative analysis of heavy water by NMR spectroscopy

    International Nuclear Information System (INIS)

    Gomez Gil, V.

    1975-01-01

    Nuclear Magnetic Resonance has been applied to a wide variety of quantitative problems. A typical example has been the determination of isotopic composition. In this paper two different analytical methods for the determination of water in deuterium oxide are described. The first one employs acetonitrile as an internal standard compound, and in the second one a calibration curve of signal integral versus amount of D₂O is constructed. Both methods give results comparable to those of mass spectrometry or IR spectroscopy. (Author) 5 refs

  7. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s⁻¹ air velocity. The maximum power is 3.4 W, and the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
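
    The reported conversion factor follows directly from the kinetic power flux through the rotor, P_wind = ½ρAv³. A quick check with the model's numbers is sketched below, under the assumptions that the 3.4 W maximum occurs at the top wind speed and that the air density takes a standard value.

```python
import math

rho = 1.2          # air density in kg/m^3 (assumed standard value)
diameter = 0.12    # rotor diameter in m
v = 15.0           # air velocity in m/s
p_electric = 3.4   # measured electrical output in W (from the abstract)

area = math.pi * (diameter / 2) ** 2
p_wind = 0.5 * rho * area * v ** 3          # kinetic power through the rotor disc
c_p = p_electric / p_wind
print(f"P_wind = {p_wind:.1f} W, c_p = {c_p:.2f}")   # roughly 0.15, matching the paper
```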

  8. Risk management and analysis: risk assessment (qualitative and quantitative)

    OpenAIRE

    Valentin Mazareanu

    2007-01-01

    Risk is usually defined as the possibility of suffering a loss. Starting from this, risk management is defined as a business process whose purpose is to ensure that the organization is protected against risks and their effects. In order to prioritize the identified risks, to develop a response plan, and after that to monitor them, we need to assess them. But at this point a question arises: should one choose a qualitative approach or a quantitative one? This paper gives a short overview of the risk eva...

  9. Correlation of quantitative histopathological morphology and quantitative radiological analysis during aseptic loosening of hip endoprostheses.

    Science.gov (United States)

    Bertz, S; Kriegsmann, J; Eckardt, A; Delank, K-S; Drees, P; Hansen, T; Otto, M

    2006-01-01

    Aseptic hip prosthesis loosening is the most important long-term complication in total hip arthroplasty. Polyethylene (PE) wear is the dominant etiologic factor in aseptic loosening, which together with other factors induces mechanisms resulting in bone loss, and finally in implant loosening. The single-shot radiograph analysis (EBRA, abbreviation for the German term "Einzel-Bild-Röntgenanalyse") is a computerized method for early radiological prediction of aseptic loosening. In this study, EBRA parameters were correlated with histomorphological parameters of the periprosthetic membrane. Periprosthetic membranes obtained from 19 patients during revision surgery of loosened ABG I-type total hip prostheses were analyzed histologically and morphometrically. The pre-existing EBRA parameters, the thickness of the PE debris layer and the dimension of inclination and anteversion, were compared with the density of macrophages and giant cells. Additionally, the semiquantitatively determined density of lymphocytes, plasma cells, giant cells and the size of the necrotic areas were correlated with the EBRA results. All periprosthetic membranes were classified as debris-induced type membranes. We found a positive correlation between the number of giant cells and the thickness of the PE debris layer. There was no significant correlation between the number of macrophages or all semiquantitative parameters and EBRA parameters. The number of giant cells decreased with implant duration. The morphometrically measured number of foreign body giant cells more closely reflects the results of the EBRA. The semiquantitative estimation of giant cell density could not substitute for the morphometrical analysis. The density of macrophages, lymphocytes, plasma cells and the size of necrotic areas did not correlate with the EBRA parameters, indicating that there is no correlation with aseptic loosening.

  10. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.

  11. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behavior in radiation environments. However, little improvement has been seen in the quantitative analysis of damage microstructure in recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure that could improve the accuracy and efficiency of data sampling and processing and could provide new information about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  12. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    Application of the Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  13. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  14. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  15. A quantitative measure for degree of automation and its relation to system performance and mental load.

    Science.gov (United States)

    Wei, Z G; Macwan, A P; Wieringa, P A

    1998-06-01

    In this paper we quantitatively model degree of automation (DofA) in supervisory control as a function of the number and nature of tasks to be performed by the operator and automation. This model uses a task weighting scheme in which weighting factors are obtained from task demand load, task mental load, and task effect on system performance. The computation of DofA is demonstrated using an experimental system. Based on controlled experiments using operators, analyses of the task effect on system performance, the prediction and assessment of task demand load, and the prediction of mental load were performed. Each experiment had a different DofA. The effect of a change in DofA on system performance and mental load was investigated. It was found that system performance became less sensitive to changes in DofA at higher levels of DofA. The experimental data showed that when the operator controlled a partly automated system, perceived mental load could be predicted from the task mental load for each task component, as calculated by analyzing a situation in which all tasks were manually controlled. Actual or potential applications of this research include a methodology to balance and optimize the automation of complex industrial systems.
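
    A hedged sketch of a task-weighted degree-of-automation measure in the spirit of the abstract: each task receives a weight derived from its demand load, mental load, and effect on system performance, and DofA is the weighted share of tasks allocated to automation. The weighting formula and example values are illustrative assumptions, not the authors' exact model.

```python
def degree_of_automation(tasks):
    """tasks: list of dicts with keys 'automated' (bool) and the weighting
    ingredients 'demand_load', 'mental_load', 'performance_effect' (each 0..1)."""
    def weight(t):
        # Illustrative combination of the three weighting factors.
        return (t["demand_load"] + t["mental_load"] + t["performance_effect"]) / 3.0

    total = sum(weight(t) for t in tasks)
    automated = sum(weight(t) for t in tasks if t["automated"])
    return automated / total

tasks = [
    {"automated": True,  "demand_load": 0.8, "mental_load": 0.6, "performance_effect": 0.9},
    {"automated": False, "demand_load": 0.4, "mental_load": 0.7, "performance_effect": 0.5},
    {"automated": False, "demand_load": 0.3, "mental_load": 0.2, "performance_effect": 0.4},
]
print(f"DofA = {degree_of_automation(tasks):.2f}")
```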

  16. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    Science.gov (United States)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  17. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    O' Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    2017-05-01

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.

  18. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance.

    Science.gov (United States)

    O'Daniel, Jennifer C; Yin, Fang-Fang

    2017-05-01

    To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number. Copyright © 2017 Elsevier Inc. All rights reserved.
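
    A minimal sketch of the ranking idea described above: each QA test gets an occurrence rate (from QA records), a modeled severity, and a detectability-style term that depends on how often the test is performed; their product gives a risk priority number that can be compared across candidate test frequencies. The scoring scales, threshold, and example values are assumptions for illustration, not the paper's data.

```python
FREQUENCIES = {"daily": 1, "weekly": 5, "biweekly": 10}   # days a failure may go undetected
RISK_THRESHOLD = 40.0                                     # illustrative acceptance level

def risk_priority(occurrence_per_year, severity, undetected_days):
    """Occurrence x severity x detectability-style term (days before the test catches it)."""
    return occurrence_per_year * severity * undetected_days

# Illustrative QA tests: (failures observed per year, modeled dosimetric severity 1-10).
tests = {
    "output constancy":               (6.0, 4),
    "lasers":                         (5.0, 3),
    "imaging vs treatment isocenter": (1.0, 8),
    "optical distance indicator":     (0.5, 2),
}

for name, (occ, sev) in tests.items():
    # Choose the least frequent schedule whose risk priority number stays acceptable.
    acceptable = [f for f, days in FREQUENCIES.items()
                  if risk_priority(occ, sev, days) <= RISK_THRESHOLD]
    choice = acceptable[-1] if acceptable else "daily"
    print(f"{name:32s} -> test {choice}")
```

    With these invented numbers the sketch reproduces the qualitative pattern reported in the abstract: high-occurrence tests (output, lasers) stay daily, while lower-risk tests can safely move to weekly or biweekly schedules.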

  19. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    Science.gov (United States)

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also reviewed the chromatographic theory which we utilized in order to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.

  20. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and need to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft oriented one were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without a human interference.
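
    As an illustration of the AHP step mentioned above, the sketch below derives priority weights for framework criteria from a pairwise-comparison matrix via its principal eigenvector and checks Saaty's consistency ratio; the criteria and judgment values are invented for the example, not taken from the paper.

```python
import numpy as np

# Pairwise comparisons (Saaty 1-9 scale) for three illustrative framework criteria:
# integration of distributed resources, user interface, extensibility.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()                   # normalized priority weights

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI / RI (RI = 0.58 for n = 3).
n = A.shape[0]
lambda_max = eigenvalues.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```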