WorldWideScience

Sample records for quantitative analysis method

  1. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is made by a camera. The most modern types include a frame-grabber that converts the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits; eight bits are summarised in one byte. Therefore, grey values can take a value between 0 and 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel by pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value ...
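
    As a rough illustration of two of the steps described above, shading correction and contrast stretching by histogram expansion, the following sketch uses plain NumPy; the array names, the synthetic data and the 8-bit assumption are illustrative and not taken from the original paper.

```python
import numpy as np

def shading_correction(image, background):
    """Subtract a background (white) image pixel by pixel to even out illumination."""
    corrected = image.astype(float) - background.astype(float)
    return corrected - corrected.min()  # shift back to non-negative grey values

def stretch_histogram(image, n_levels=256):
    """Expand the brightness scale so the grey-value histogram spans the full range."""
    lo, hi = image.min(), image.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(image, dtype=np.uint8)
    stretched = (image - lo) / (hi - lo) * (n_levels - 1)
    return stretched.astype(np.uint8)

# Illustrative 8-bit image with uneven illumination (synthetic data, not from the paper)
rng = np.random.default_rng(0)
img = rng.integers(80, 120, size=(64, 64)).astype(float)
bg = np.linspace(0, 30, 64)[None, :] * np.ones((64, 1))   # slowly varying background

flat = shading_correction(img, bg)
enhanced = stretch_histogram(flat)
print(enhanced.min(), enhanced.max())   # 0 255: the histogram now spans the full scale
```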

  2. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gear box installation, which demonstrates the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  3. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and the normalized elastic recovery factor was defined in terms of the measured deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method

  4. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
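
    To make the quantitative machinery named in this abstract concrete, here is a hedged sketch, not the authors' code, of fitting a logistic regression for the likelihood of risk occurrence and computing the Hosmer-Lemeshow goodness-of-fit chi-square; the data and variable names are synthetic assumptions.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Synthetic analyst ratings (predictors) and observed risk occurrence (0/1), illustrative only
X = rng.normal(size=(100, 3))
y = (X @ np.array([0.8, -0.5, 0.3]) + rng.normal(scale=1.0, size=100) > 0).astype(int)

model = LogisticRegression().fit(X, y)
p_hat = model.predict_proba(X)[:, 1]

def hosmer_lemeshow(y_true, p_pred, groups=10):
    """Hosmer-Lemeshow test: group cases by predicted risk and compare observed vs expected."""
    order = np.argsort(p_pred)
    bins = np.array_split(order, groups)
    hl = 0.0
    for idx in bins:
        obs = y_true[idx].sum()
        exp = p_pred[idx].sum()
        n = len(idx)
        hl += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    p_value = chi2.sf(hl, groups - 2)
    return hl, p_value

hl_stat, p = hosmer_lemeshow(y, p_hat)
print(f"HL chi-square = {hl_stat:.3f}, p = {p:.3f}")  # a non-significant p indicates a good fit
```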

  5. General method of quantitative spectrographic analysis

    International Nuclear Information System (INIS)

    Capdevila, C.; Roca, M.

    1966-01-01

    A spectrographic method was developed to determine 23 elements over a wide range of concentrations; the method can be applied to metallic or refractory samples. A preliminary fusion with lithium tetraborate and germanium oxide is performed in order to avoid the influence of matrix composition and crystalline structure. Germanium oxide is also employed as the internal standard. The resulting beads are mixed with graphite powder (1:1) and excited in a 10 ampere direct-current arc. (Author) 12 refs

  6. Analysis of methods for quantitative renography

    International Nuclear Information System (INIS)

    Archambaud, F.; Maksud, P.; Prigent, A.; Perrin-Fayolle, O.

    1995-01-01

    This article reviews the main methods using renography to estimate renal perfusion indices and to quantify differential and global renal function. The review addresses the pathophysiological significance of the estimated parameters according to the underlying models and the choice of the radiopharmaceutical. The dependence of these parameters on the characteristics of the region of interest and on the methods of background and attenuation correction is surveyed. Some current recommendations are proposed. (authors). 66 refs., 8 figs

  7. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions...

  9. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO sub 3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, both passing through the origin and fitted by the least squares method
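
    A hedged sketch of the ratio-of-slopes idea described above, with illustrative data and variable names that are not from the paper: two straight lines constrained through the origin are fitted by least squares, and the weight fraction follows from the ratio of their slopes.

```python
import numpy as np

def slope_through_origin(x, y):
    """Least-squares slope of a line constrained to pass through the origin."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return (x @ y) / (x @ x)

# Illustrative calibration data for the "analysis" and "reference" lines (assumed values)
x_analysis = np.array([0.1, 0.2, 0.3, 0.4, 0.5])      # known weight fractions in calibration mixes
y_analysis = np.array([0.21, 0.39, 0.62, 0.81, 1.02])  # intensity ratio of analyte to internal standard
x_reference = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
y_reference = np.array([0.52, 1.01, 1.49, 2.03, 2.51])

m_analysis = slope_through_origin(x_analysis, y_analysis)
m_reference = slope_through_origin(x_reference, y_reference)

# In this scheme the required weight fraction is taken from the ratio of the two slopes
# (any proportionality constant from the reference mixture is omitted in this sketch).
print(f"slope ratio = {m_analysis / m_reference:.3f}")
```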

  10. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. [Figure caption: the main window of the program during dynamic analysis of the foot thermal image.] © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterise the examined defect in detail. This becomes a design feature for the range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach which takes into account the factor of divergent illumination and other geometrical factors. The difference between the measurement systems can be attributed to these error factors. (Author)

  13. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric (strong dielectric) ceramic with piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), which is a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put into practical use. The characteristics can be set by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far have respective demerits, therefore the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectro-analysis apparatus, which has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  14. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO 2 + ZrO 2 ) mix with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element, uranium-based matrix. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and the choice of acquisition parameters of the images and analyses. The corium sample studied consisted of two zones displaying ...

  15. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  16. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a ''standard-free method'' for quantitative analysis made it possible to analyze infinitesimal samples, powdered samples and untreated biological samples, which could not be analyzed well quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made the process of target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  17. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science proteomic analysis is inseparable from other fields of systems biology. Possessing huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modifications of proteins and metabolic and enzymatic methods of isotope labeling.

  18. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  19. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    A quantitative method of X-ray diffraction phase analysis of building materials, using an internal standard, has been presented. The errors committed in determining the content of particular phases are also given. (author)

  20. Synchrotron radiation microprobe quantitative analysis method for biomedical specimens

    International Nuclear Information System (INIS)

    Xu Qing; Shao Hanru

    1994-01-01

    Relative changes of trace elemental content in biomedical specimens are obtained easily by means of synchrotron radiation X-ray fluorescence microprobe analysis (SXRFM). However, the accurate assignment of concentration on a μg/g basis is difficult, because it is necessary to know both the trace elemental content and the specimen mass in the irradiated volume simultaneously; the specimen mass is a function of the spatial position and cannot be weighed. It is possible to measure the specimen mass indirectly by measuring the intensity of the Compton scattered peak in normal XRF analysis using an X-ray tube with Mo anode, if the matrix consists of light elements and the specimen is a thin sample. The Compton peak is not present in the fluorescence spectrum for white-light SXRFM analysis; the continuous background in the spectrum results from Compton scattering with a linearly polarized X-ray source. Biomedical specimens for SXRFM analysis, for example biological sections and human hair, are always thin samples for high-energy X-rays, and they consist of H, C, N, O and other light elements, which implies a linear relationship between the specimen mass and the Compton scattering background in the high-energy region of the spectrum. In this way, it is possible to carry out measurements of concentration for SXRFM analysis
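
    A hedged arithmetic sketch of the normalization idea in this record, with illustrative numbers and calibration constants that are not from the paper: since the scatter background scales linearly with the irradiated specimen mass, dividing the fluorescence counts by the background counts cancels the unknown mass.

```python
import numpy as np

# Illustrative calibration: a known standard gives counts per (ug/g) per unit of scatter background
k_Fe = 2.5e-3   # hypothetical sensitivity factor for Fe K-alpha (assumption, not from the paper)

# Measured at one probe position of a thin biological section (synthetic numbers)
I_Fe = 1840.0        # net Fe K-alpha counts
I_scatter = 9.2e4    # integrated scatter background in the high-energy region (proxy for local mass)

# Concentration on a mass basis, with the unknown local specimen mass cancelled by the background
c_Fe = I_Fe / (k_Fe * I_scatter)
print(f"Fe concentration ~ {c_Fe:.0f} ug/g (illustrative)")
```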

  1. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  2. A quantitative method for Failure Mode and Effects Analysis

    NARCIS (Netherlands)

    Braaksma, Anne Johannes Jan; Meesters, A.J.; Klingenberg, W.; Hicks, C.

    2012-01-01

    Failure Mode and Effects Analysis (FMEA) is commonly used for designing maintenance routines by analysing potential failures, predicting their effect and facilitating preventive action. It is used to make decisions on operational and capital expenditure. The literature has reported that despite its

  3. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate—in a first phase—an instrument for quantitative measuring and to understand—in a second phase—clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  4. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out and the accuracy of the phase analysis performed by both methods is compared

  5. Quantitative Analysis of Range Image Patches by NEB Method

    Directory of Open Access Journals (Sweden)

    Wang Wen

    2017-01-01

    Full Text Available In this paper we analyze sampled high-dimensional data from a range image database with the NEB method. We select a large random sample of log-valued, high-contrast, normalized, 8×8 range image patches from the Brown database. We construct a density estimator and establish 1-dimensional cell complexes from the range image patch data. We find topological properties of 8×8 range image patches and prove that there exist two types of subsets of 8×8 range image patches modelled as a circle.

  6. Methods of quantitative analysis of nuclear energy hazards

    International Nuclear Information System (INIS)

    Papp, R.; Caldarola, L.; Helm, F.; Jansen, P.; McGrath, P.; Weber, G.

    1975-03-01

    Risk can be defined as the sum of all possible damage types weighted with their associated cumulative probability distributions of occurrence. Risk defined in this manner is not very suitable for comparison purposes. In order to be able to express the risk synthetically by means of a single parameter, two problems must be solved: 1) For each damage type an average value must be calculated which accounts not only for the occurrence probability distribution but also for the degree and importance of the damage to human society. 2) The total average value (the risk) must be calculated by weighting each average damage type with a corresponding second importance function which represents the importance and acceptability of the particular damage type to human society. Here it must be pointed out that the above-mentioned problems are directly connected to the problem of 'risk acceptance', which will be discussed, as will the risk associated with the entire nuclear fuel cycle. Finally, recommendations for further research work are given in section V; such work is thought to be needed in order to render these methods more generally applicable and accepted in the near future than they are today. (orig./RW) [de

  7. Quantitative phase analysis of uranium carbide from x-ray diffraction data using the Rietveld method

    International Nuclear Information System (INIS)

    Singh Mudher, K.D.; Krishnan, K.

    2003-01-01

    Quantitative phase analysis of a uranium carbide sample was carried out from the x-ray diffraction data by the Rietveld profile fitting method. The method does not require the addition of any reference material. The percentages of the UC, UC 2 and UO 2 phases in the sample were determined. (author)

  8. Quantitative analysis of iodine in thyroidin. I. Methods of ''dry'' and ''wet'' mineralization

    International Nuclear Information System (INIS)

    Listov, S.A.; Arzamastsev, A.P.

    1986-01-01

    Comparative investigations of the quantitative determination of iodine in thyroidin using different modifications of ''dry'' and ''wet'' mineralization show that, when these methods are used, the difficulties due to the characteristic features of the object of investigation itself and of the mineralization method as a whole must be taken into account. The studies show that the most applicable method for the analysis of thyroidin is ''dry'' mineralization with potassium carbonate. A procedure is proposed for the quantitative determination of iodine in thyroidin

  9. A Simple Linear Regression Method for Quantitative Trait Loci Linkage Analysis With Censored Observations

    OpenAIRE

    Anderson, Carl A.; McRae, Allan F.; Visscher, Peter M.

    2006-01-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using...

  10. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta 2 C has been considered because of specific features of the diffraction patterns of the components, namely, overlapping of the most intensive reflexes of both phases. The method of the standard binary system has been used for quantitative analysis. Because of the overlapping of the intensive reflexes dsub(101)=2.36 A (Ta 2 C) and dsub(110)=2.33 A (Ta), other, most intensive, reflexes have been used for the quantitative determination of Ta 2 C and Ta: dsub(103)=1.404 A for tantalum subcarbide and dsub(211)=1.35 A for tantalum. Besides, the Ta and Ta 2 C phases have been determined quantitatively with the use of another pair of reflexes: dsub(102)=1.82 A for Ta 2 C and dsub(200)=1.65 A for tantalum. The agreement between the results obtained while performing the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta 2 C, it is expedient to carry out the analysis with the use of the two above-mentioned pairs of reflexes located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta 2 C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the ability of Ta 2 C to texture in the process of preparation

  11. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    NARCIS (Netherlands)

    Bosschaart, Nienke; van Leeuwen, Ton; Aalders, Maurice C.G.; Faber, Dirk

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of

  12. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning in quantitative analysis of multi-components by a single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established between the reference, chlorogenic acid, and 11 other active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were positioned by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of needing many reference substances for quality control. The results showed that, within the linear ranges, no significant differences were found in the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and the external standard method (ESM) or standard curve method (SCM). This method is simpler and quicker than literature methods, and the results were accurate and reliable, with good reproducibility. Positioning of chromatographic peaks by the linear regression method was more accurate than using the relative retention times reported in the literature. The slope and quantitative factor correction methods are feasible and accurate for controlling the quality of traditional Chinese medicine.
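
    As a hedged note on the underlying arithmetic (one common convention for QAMS, not necessarily the exact formulation of this paper): a relative correction factor links the response of the single marker s to each analyte k, and the analyte content is then recovered from the marker's calibration alone.

```latex
% One common QAMS convention (assumption; A = peak area, C = concentration, s = marker, k = analyte)
f_{k/s} \;=\; \frac{A_{s}/C_{s}}{A_{k}/C_{k}}
\qquad\Longrightarrow\qquad
C_{k} \;=\; f_{k/s}\,\frac{A_{k}\,C_{s}}{A_{s}}
```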

  13. Advantages of a Dynamic RGGG Method in Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2009-01-01

    Various studies have been conducted in order to analyze dynamic interactions among components and process variables in nuclear power plants which cannot be handled by static reliability analysis methods such as conventional fault tree and event tree techniques. A dynamic reliability graph with general gates (RGGG) method was proposed for intuitive modeling of dynamic systems, and it enables one to easily analyze huge and complex systems. In this paper, the advantages of the dynamic RGGG method are assessed in two stages: system modeling and quantitative analysis. A software tool for the dynamic RGGG method is then introduced and an application to a real dynamic system is presented

  14. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  15. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying the Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curve (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without. The diagnostic performance of the qualitative and of the quantitative (ER 0.89 and global RE 0.80) analyses was also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  16. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO 2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  17. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Scanning electron microscopy (SEM) is a method to inspect the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images. It is widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using it in a nuclear system. In our previous study, SEM was applied tentatively to the homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method of SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information of the SEM images.
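
    One plausible reading of such a grayscale-based homogeneity check, sketched with illustrative tile statistics; this is an assumption about the general approach, not the authors' algorithm.

```python
import numpy as np

def homogeneity_index(image, tile=32):
    """Split a grayscale SEM image into tiles and compare tile-mean spread to overall variation.

    A small index means the tile means scatter little relative to the pixel-level spread,
    i.e. the imaged compound looks homogeneous at the chosen tile scale.
    """
    h, w = image.shape
    means = [
        image[i:i + tile, j:j + tile].mean()
        for i in range(0, h - tile + 1, tile)
        for j in range(0, w - tile + 1, tile)
    ]
    return np.std(means) / (np.std(image) + 1e-12)

# Synthetic example: a uniform field versus one with a bright inclusion
rng = np.random.default_rng(2)
uniform = rng.normal(128, 10, size=(256, 256))
segregated = uniform.copy()
segregated[64:128, 64:128] += 60   # simulated inhomogeneity

print(round(homogeneity_index(uniform), 3), round(homogeneity_index(segregated), 3))
```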

  18. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    Scanning electron microscopy (SEM) is a method to inspect the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images. It is widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using it in a nuclear system. In our previous study, SEM was applied tentatively to the homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method of SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information of the SEM images

  19. Method of quantitative analysis of superconducting metal-conducting composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    A technique for the quantitative analysis of superconducting metal-containing composite materials, in particular SnO 2 -InSn, WO 3 -InW and ZnO-InZn, has been developed. The method of determining the metal content in a composite is based on the dependence of the superconducting transition temperature on the alloy composition. Sensitivity of the temperature determination: 0.02 K; error of analysis for the InSn system: 0.5%

  20. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of the diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff value was 0.11 for SDx (AUC: 0.982). The proposed approach provides visual and quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular and irregular respiration.
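
    A hedged sketch of two of the quantitative indices named above, computed from a synthetic respiratory amplitude trace; the signal, sampling rate, and derivative choice are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

fs = 25.0                                    # sampling rate in Hz (assumption)
t = np.arange(0, 120, 1 / fs)                # two minutes of synthetic respiration
x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.default_rng(3).normal(size=t.size)

# Phase-space (Poincare-map) spread: plot x(t) against its rate of change v(t) and measure scatter
v = np.gradient(x, 1 / fs)
sd_x, sd_v = np.std(x), np.std(v)

# Height of the fundamental peak (A1) in the Fourier amplitude spectrum
spectrum = np.abs(np.fft.rfft(x - x.mean())) / x.size
freqs = np.fft.rfftfreq(x.size, 1 / fs)
a1 = spectrum[1:].max()                      # dominant respiratory component
f1 = freqs[1:][spectrum[1:].argmax()]

print(f"SDx={sd_x:.3f}, SDv={sd_v:.3f}, A1={a1:.3f} at {f1:.2f} Hz")
```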

  1. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and in the development of optimal microstructure design and fabrication. Three-dimensional microstructure characterization in particular holds great promise...... for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize...... the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment...

  2. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
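
    For orientation, the core ζ-factor relations in their simplest thin-film form, without the absorption correction the paper emphasises; this is a textbook-style statement of the method, not a reproduction of the authors' derivation.

```latex
% Thin-film zeta-factor relations for a two-element specimen A-B (D_e = electron dose)
C_A = \frac{\zeta_A I_A}{\zeta_A I_A + \zeta_B I_B}, \qquad
C_B = \frac{\zeta_B I_B}{\zeta_A I_A + \zeta_B I_B}, \qquad
\rho t = \frac{\zeta_A I_A + \zeta_B I_B}{D_e}
```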

  3. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the designed equipment and with the working environment, taking into account human capabilities and limits. Within the group of qualitative methods for human factor analysis, we consider concepts and structural methods for classifying information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPP. Methods for quantitative assessment of human reliability are then considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  4. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, or for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and the size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for the quantitative assessment of ductile iron microstructure, i.e. the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies since the analysis provides three-dimensional imaging of the examined microstructures.

  5. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  6. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study the methods of quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, stripping of overlapping peaks, external radiation, and spectral aberrations. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research

  7. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    Science.gov (United States)

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  8. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  9. Quantitative methods of data analysis for the physical sciences and engineering

    CERN Document Server

    Martinson, Douglas G

    2018-01-01

    This book provides thorough and comprehensive coverage of most of the new and important quantitative methods of data analysis for graduate students and practitioners. In recent years, data analysis methods have exploded alongside advanced computing power, and it is critical to understand such methods to get the most out of data and to extract signal from noise. The book excels in explaining difficult concepts through simple explanations and detailed explanatory illustrations. Most unique is the focus on confidence limits for power spectra and their proper interpretation, something rare or completely missing in other books. Likewise, there is a thorough discussion of how to assess uncertainty via the use of expectancy, and of the easy-to-apply and easy-to-understand bootstrap method. The book is written so that the descriptions of each method are as self-contained as possible. Many examples are presented to clarify interpretations, as are user tips in highlighted boxes.
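
    As a hedged illustration of the bootstrap idea mentioned above (a generic percentile bootstrap on synthetic data; not an excerpt from the book):

```python
import numpy as np

rng = np.random.default_rng(4)
sample = rng.normal(loc=5.0, scale=2.0, size=50)   # synthetic measurements

def bootstrap_ci(data, stat=np.mean, n_boot=10_000, alpha=0.05, rng=rng):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    boots = np.array([
        stat(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_boot)
    ])
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

low, high = bootstrap_ci(sample)
print(f"mean = {sample.mean():.2f}, 95% bootstrap CI = [{low:.2f}, {high:.2f}]")
```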

  10. A quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    International Nuclear Information System (INIS)

    Miyanaga, Hajime

    1982-01-01

    A quantitative analysis of thallium-201 myocardial perfusion images (Tl-MPI) was attempted using the circumferential profile method (CPM), and the first purpose of this study is to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). The defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the change in perfusion defects in Tl-MPI produced by exercise. The results therefore indicate that CPM is a quantitative and objective method for analyzing Tl-MPI. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic changes in ECG are still under discussion as criteria. So the second purpose of this study is to evaluate these ischemic ECG changes against exercise Tl-MPI analyzed quantitatively. ST depression (ischemic 1 mm and junctional 2 mm or more), ST elevation (1 mm or more), and coronary T wave reversion in exercise ECG were considered ischemic changes. (J.P.N.)

  11. Quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    Energy Technology Data Exchange (ETDEWEB)

    Miyanaga, Hajime [Kyoto Prefectural Univ. of Medicine (Japan)

    1982-08-01

    A quantitative analysis of thallium-201 myocardial perfusion images (MPI) was attempted by using the circumferential profile method (CPM), and the first purpose of this study is to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). The defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the changes in perfusion defects in Tl-MPI produced by exercise. The results therefore indicate that CPM is a good method for analyzing Tl-MPI both quantitatively and objectively. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic changes in ECG are still under discussion as diagnostic criteria. The second purpose of this study is therefore to evaluate these ischemic ECG changes against exercise Tl-MPI analyzed quantitatively. ST depression (ischemic 1 mm and junctional 2 mm or more), ST elevation (1 mm or more), and coronary T wave reversion in the exercise ECG were considered to be ischemic changes.

  12. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  13. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical property of a liquid. In this study, we propose a solution identification and concentration quantitation method based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations compose a data library of this kind of solution. A two-stage linear discriminant analysis is applied to analyze different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of the test solutions, the Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, based on which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen to be the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and also improves the precision of concentration quantitation.
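    A hedged sketch of this pipeline is given below: discriminate the solution type with linear discriminant analysis, compress the eight COT characteristic values to one representative parameter with PCA, and map that parameter to concentration with a cubic spline. All arrays are synthetic placeholders, and the paper's two-stage LDA and Spearman-based feature filtering are simplified to a single LDA and a plain PCA.

```python
# Hedged, simplified sketch of the classification + quantitation pipeline; synthetic data only.
import numpy as np
from scipy.interpolate import CubicSpline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Training library: 3 solution types x 10 concentrations, 8 COT characteristic values each
n_types, n_conc, n_feat = 3, 10, 8
concentrations = np.linspace(5, 50, n_conc)                     # hypothetical % (v/v)
type_offset = np.repeat(np.arange(n_types), n_conc)[:, None] * 3.0
conc_trend = np.tile(concentrations, n_types)[:, None] * np.linspace(0.02, 0.06, n_feat)
X_train = type_offset + conc_trend + rng.normal(scale=0.05, size=(n_types * n_conc, n_feat))
y_type = np.repeat(np.arange(n_types), n_conc)

# Stage 1: discriminate the solution type of an unknown COT
lda = LinearDiscriminantAnalysis().fit(X_train, y_type)
x_test = X_train[14] + rng.normal(scale=0.02, size=n_feat)       # a "test" COT
solution_type = int(lda.predict(x_test[None, :])[0])

# Stage 2: within that type, reduce the 8 values to one parameter (PCA) and
# map the parameter to concentration with a cubic spline
X_type = X_train[y_type == solution_type]
pca = PCA(n_components=1).fit(X_type)
param = pca.transform(X_type).ravel()
order = np.argsort(param)
spline = CubicSpline(param[order], concentrations[order])

estimated = float(spline(pca.transform(x_test[None, :])[0, 0]))
print(f"solution type: {solution_type}, estimated concentration: {estimated:.1f}")
```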

  14. Full quantitative phase analysis of hydrated lime using the Rietveld method

    International Nuclear Information System (INIS)

    Lassinantti Gualtieri, Magdalena; Romagnoli, Marcello; Miselli, Paola; Cannio, Maria; Gualtieri, Alessandro F.

    2012-01-01

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2–15 wt.%.

  15. Full quantitative phase analysis of hydrated lime using the Rietveld method

    Energy Technology Data Exchange (ETDEWEB)

    Lassinantti Gualtieri, Magdalena, E-mail: magdalena.gualtieri@unimore.it [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Romagnoli, Marcello; Miselli, Paola; Cannio, Maria [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Gualtieri, Alessandro F. [Dipartimento di Scienze della Terra, Universita Degli Studi di Modena e Reggio Emilia, I-41100 Modena (Italy)

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.

  16. Study of the quantitative analysis approach of maintenance by the Monte Carlo simulation method

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    2007-01-01

    This study examines the quantitative evaluation of maintenance activities at a nuclear power plant by the Monte Carlo simulation method. First, the concept of the quantitative evaluation of maintenance, whose examination has been advanced in the Japan Society of Maintenology and the International Institute of Universality (IUU), is summarized. A basic examination of the quantitative evaluation of maintenance was then carried out for a simple feed water system by the Monte Carlo simulation method. (author)
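    As a hedged illustration of this kind of Monte Carlo examination (not the model used in the paper), the sketch below estimates the mission unreliability of a simplified feed water train, a valve in series with two redundant pumps, assuming exponential times to failure; the failure rates and mission time are hypothetical placeholders.

```python
# Hedged illustration: Monte Carlo estimate of mission unreliability for a
# simplified feed water train (valve in series with two redundant pumps).
# Failure rates and mission time are hypothetical placeholders.
import random

LAMBDA_PUMP = 1.0e-4    # pump failures per hour, hypothetical
LAMBDA_VALVE = 2.0e-5   # valve failures per hour, hypothetical
MISSION_HOURS = 8760.0  # one year
N_TRIALS = 200_000

def system_survives() -> bool:
    t_pump_a = random.expovariate(LAMBDA_PUMP)
    t_pump_b = random.expovariate(LAMBDA_PUMP)
    t_valve = random.expovariate(LAMBDA_VALVE)
    # Success: the valve survives and at least one pump survives the mission
    return t_valve > MISSION_HOURS and max(t_pump_a, t_pump_b) > MISSION_HOURS

failures = sum(not system_survives() for _ in range(N_TRIALS))
print(f"estimated mission unreliability: {failures / N_TRIALS:.4f}")
```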

  17. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    Science.gov (United States)

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  18. Assessment of a synchrotron X-ray method for quantitative analysis of calcium hydroxide

    International Nuclear Information System (INIS)

    Williams, P. Jason; Biernacki, Joseph J.; Bai Jianming; Rawn, Claudia J.

    2003-01-01

    Thermogravimetric analysis (TGA) and quantitative X-ray diffraction (QXRD) are widely used to determine the calcium hydroxide (CH) content in cementitious systems containing blends of Portland cement, fly ash, blast furnace slag, silica fume and other pozzolanic and hydraulic materials. These techniques, however, are destructive to cement samples and subject to various forms of error. While precise weight losses can be measured by TGA, extracting information from samples with multiple overlapping thermal events is difficult. Conversely, while QXRD can offer easier deconvolution, the accuracy for components below about 5 wt.% is typically poor when a laboratory X-ray source is used. Furthermore, the destructive nature of both techniques prevents using them to study the in situ hydration of a single contiguous sample for kinetic analysis. In an attempt to overcome these problems, the present research evaluated the use of synchrotron X-rays for quantitative analysis of CH. A synchrotron X-ray source was used to develop calibration data for quantification of the amount of CH in mixtures with fly ash. These data were compared to conventional laboratory XRD data for like samples. While both methods were found to offer good quantification, synchrotron XRD (SXRD) provided a broader range of detectability and higher accuracy than laboratory diffraction and removed the subjectivity associated with TGA analysis. Further, the sealed glass capillaries used with the synchrotron source provided a nondestructive, closed, in situ environment for tracking hydrating specimens from zero to any desired age.

  19. A method for the quantitative analysis of heavy elements by X-ray fluorescence

    International Nuclear Information System (INIS)

    Souza Caillaux, Z. de

    1981-01-01

    A study of a quantitative analysis methodology by X-ray fluorescence is presented. With no loss of precision, it makes possible the analysis of heavy elements in samples in the form and texture in which they present themselves. Some binary alloys were examined, such as FeCo, CuNi, CuZn, AgCd, AgPd, AuPt and PtIr. The applicability of this method rests on a compromise between the wavelengths and intensities of the homologous emission lines and absorption edges of the constituents, the quantum efficiency of the detector, the dispersion and wavelength resolution of the crystal analyser, and the uniformity of the excitation intensity. (Author) [pt

  20. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentration was <1%.
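    The core of such a calibration can be sketched as a classical least-squares model in which mixture spectra are regressed on known concentrations augmented with a column of ones, so that a nonzero intercept (baseline) is estimated at every wavelength. The sketch below is a simplified, unweighted version with synthetic Gaussian bands; it does not reproduce the weighting or pathlength handling of the paper.

```python
# Hedged sketch of a CLS-style calibration with per-wavelength intercepts; synthetic spectra only.
import numpy as np

rng = np.random.default_rng(1)
wavenumbers = np.linspace(600, 1800, 400)

def band(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

pure = np.vstack([band(800, 30), band(1100, 40), band(1500, 25)])      # three "pure" spectra
C_cal = rng.uniform(0.1, 1.0, size=(10, 3))                            # calibration concentrations
A_cal = C_cal @ pure + 0.02 + rng.normal(scale=1e-3, size=(10, 400))   # baseline + noise

# Calibration: estimate pure-component spectra and the baseline from mixture standards
C_aug = np.hstack([C_cal, np.ones((10, 1))])
K, *_ = np.linalg.lstsq(C_aug, A_cal, rcond=None)   # rows 0-2: components, row 3: baseline

# Prediction: estimate concentrations of an unknown mixture from its spectrum
c_true = np.array([0.3, 0.5, 0.2])
a_unknown = c_true @ pure + 0.02 + rng.normal(scale=1e-3, size=400)
c_est, *_ = np.linalg.lstsq(K.T[:, :3], a_unknown - K[3], rcond=None)
print(np.round(c_est, 3))   # should be close to c_true
```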

  1. Quantitative analysis of patients with celiac disease by video capsule endoscopy: A deep learning method.

    Science.gov (United States)

    Zhou, Teng; Han, Guoqiang; Li, Bing Nan; Lin, Zhizhe; Ciaccio, Edward J; Green, Peter H; Qin, Jing

    2017-06-01

    Celiac disease is one of the most common diseases in the world. Capsule endoscopy is an alternative way to visualize the entire small intestine without invasiveness to the patient. It is useful to characterize celiac disease, but hours are needed to manually analyze the retrospective data of a single patient. Computer-aided quantitative analysis by a deep learning method helps in alleviating the workload during analysis of the retrospective videos. Capsule endoscopy clips from 6 celiac disease patients and 5 controls were preprocessed for training. The frames with a large field of opaque extraluminal fluid or air bubbles were removed automatically by using a pre-selection algorithm. Then the frames were cropped and the intensity was corrected prior to frame rotation in the proposed new method. GoogLeNet was then trained with these frames. The capsule endoscopy clips from 5 additional celiac disease patients and 5 additional control patients were used for testing. The trained GoogLeNet was able to distinguish the frames from capsule endoscopy clips of celiac disease patients vs controls. A quantitative measurement with an evaluation confidence was developed to assess the severity level of pathology in the subjects. Relying on the evaluation confidence, the GoogLeNet achieved 100% sensitivity and specificity for the testing set. The t-test confirmed that the evaluation confidence significantly distinguishes celiac disease patients from controls. Furthermore, it was found that the evaluation confidence may also relate to the severity level of small bowel mucosal lesions. A deep convolutional neural network was established for quantitative measurement of the existence and degree of pathology throughout the small intestine, which may improve computer-aided clinical techniques to assess mucosal atrophy and other etiologies in real-time with videocapsule endoscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
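    The general shape of such a frame classifier can be sketched with PyTorch and torchvision; this is not the authors' training code. The data loading is replaced by a random tensor, and the hyperparameters, labels and confidence definition are placeholders. In practice one would start from ImageNet-pretrained weights and feed the preprocessed capsule-endoscopy frames.

```python
# Hedged sketch: training step and "evaluation confidence" for a two-class GoogLeNet.
import torch
import torch.nn as nn
import torchvision

model = torchvision.models.googlenet(num_classes=2, aux_logits=False, init_weights=True)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# One illustrative training step on a fake mini-batch of 8 RGB frames (224x224)
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))          # 0 = control, 1 = celiac disease (placeholder labels)

model.train()
optimizer.zero_grad()
logits = model(frames)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

# Per-frame confidence taken here as the softmax probability of the celiac class
model.eval()
with torch.no_grad():
    confidence = torch.softmax(model(frames), dim=1)[:, 1]
print(loss.item(), confidence.mean().item())
```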

  2. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    Science.gov (United States)

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. Use of this method can overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  3. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method.

    Directory of Open Access Journals (Sweden)

    Ganglong Yang

    Full Text Available The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer.

  4. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  5. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high energy fast neutrons from a radioisotope and slowing-down by the light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume, and is not affected by temperature, pressure, pH value and color. The most common choices for a radioisotope neutron source are ²⁵²Cf and ²⁴¹Am-Be. In this study, ²⁵²Cf with a neutron flux of 6.3×10⁶ n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained using an in-house built radioisotopic neutron spectrometric system equipped with a ³He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ∼0.947 g/cc and area of 40 cm×25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of a powder process to determine the content of any type of molecule containing hydrogen nuclei.
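    A minimal sketch of the multivariate calibration step, assuming scikit-learn, is shown below. The "pulse-height spectra" are synthetic stand-ins for the detector channels and the thicknesses are hypothetical; the point is only the shape of a PLSR calibration, not the paper's data.

```python
# Hedged sketch: PLS regression calibration of a spectral response against a known quantity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n_samples, n_channels = 12, 256
thickness = np.linspace(1, 12, n_samples)                    # cm, hypothetical calibration levels
response = np.exp(-np.linspace(0, 4, n_channels))            # nominal spectral shape
X = np.outer(thickness, response) + rng.normal(scale=0.02, size=(n_samples, n_channels))
y = thickness                                                 # quantity to calibrate against

pls = PLSRegression(n_components=3)
pls.fit(X, y)
y_hat = pls.predict(X).ravel()
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
print(f"calibration RMSE: {rmse:.3f} cm")
```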

  6. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration (outline and experimental examples), chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  7. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  8. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    International Nuclear Information System (INIS)

    Ryan, C.G.; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-01-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
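    The "admixture of end-member terms" idea can be illustrated, outside of GeoPIXE, as a per-pixel non-negative unmixing against end-member spectra. The sketch below uses NNLS on synthetic data; it is not the DA-matrix implementation described in the paper.

```python
# Hedged illustration: each pixel spectrum expressed as a non-negative mixture of end-members.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_channels, n_members, n_pixels = 128, 3, 500

end_members = np.abs(rng.normal(size=(n_members, n_channels)))     # end-member spectra (synthetic)
true_fractions = rng.dirichlet(np.ones(n_members), size=n_pixels)  # per-pixel phase fractions
pixel_spectra = true_fractions @ end_members + rng.normal(scale=0.01, size=(n_pixels, n_channels))

# Solve one small NNLS problem per pixel: spectrum ~ sum of end-member contributions
fractions = np.array([nnls(end_members.T, s)[0] for s in pixel_spectra])
print(np.abs(fractions - true_fractions).mean())   # small residual -> admixture recovered
```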

  9. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  10. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
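    One way to sketch DE-based wavelength selection, under the assumption of SciPy's differential evolution optimizer and a PLS quantification model, is to encode each candidate as a vector of per-wavelength weights in [0, 1], mark weights above 0.5 as selected, and minimize the cross-validated prediction error on the selected channels. The spectra below are synthetic stand-ins, not THz data from the paper.

```python
# Hedged sketch of wavelength selection by differential evolution; synthetic spectra only.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 30, 30
concentration = rng.uniform(0, 1, n_samples)
X = rng.normal(scale=0.5, size=(n_samples, n_wavelengths))
X[:, 10:16] += np.outer(concentration, np.ones(6))     # only channels 10-15 carry the analyte signal

def objective(weights):
    selected = weights > 0.5
    if selected.sum() < 2:
        return 1e3                                      # penalise degenerate selections
    pls = PLSRegression(n_components=2)
    y_cv = cross_val_predict(pls, X[:, selected], concentration, cv=5).ravel()
    return float(np.sqrt(np.mean((concentration - y_cv) ** 2)))

result = differential_evolution(objective, bounds=[(0, 1)] * n_wavelengths,
                                maxiter=8, popsize=6, seed=4, polish=False)
print("selected wavelength indices:", np.where(result.x > 0.5)[0])
```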

  11. Method of quantitative analysis of fluorine in environmental samples using a pure-Ge detector

    International Nuclear Information System (INIS)

    Sera, K.; Terasaki, K.; Saitoh, Y.; Itoh, J.; Futatsugawa, S.; Murao, S.; Sakurai, S.

    2004-01-01

    We recently developed and reported a three-detector measuring system making use of a pure-Ge detector combined with two Si(Li) detectors. The efficiency curve of the pure-Ge detector was determined as relative efficiencies to those of the existing Si(Li) detectors, and its accuracy was confirmed by analyzing a few samples whose elemental concentrations were known. It was found that the detection of fluorine becomes possible by analyzing prompt γ-rays, and the detection limit was found to be less than 0.1 ppm for water samples. In this work, a method of quantitative analysis of fluorine has been established in order to investigate environmental contamination by fluorine. This method is based on the fact that both characteristic x-rays from many elements and 110 keV prompt γ-rays from fluorine can be detected in the same spectrum. The present method is applied to analyses of a few environmental samples such as tealeaves, feed for domestic animals and human bone. The results are consistent with those obtained by other methods, and it is found that the present method is quite useful and convenient for investigations of regional pollution by fluorine. (author)

  12. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, which is based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models utilizing the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, which showed prediction performance comparable with GA and SPA.

  13. A method for the quantitative metallographic analysis of nuclear fuels (Programme QMA)

    International Nuclear Information System (INIS)

    Moreno, A.; Sari, C.

    1978-01-01

    A method is described for the quantitative analysis of features such as voids, cracks, phases, inclusions and grains distributed on random plane sections of fuel materials. An electronic image analyzer, Quantimet, attached to a MM6 Leitz microscope was used to measure size, area, perimeter and shape of features dispersed in a matrix. The apparatus is driven by a computer which calculates the size, area and perimeter distribution, form factors and orientation of the features as well as the inclusion content of the matrix expressed in weight per cent. A computer programme, QMA, executes the spatial correction of the measured two-dimensional sections and delivers the true distribution of feature sizes in a three-dimensional system

  14. A Sensitive Gold Nanoplasmonic SERS Quantitative Analysis Method for Sulfate in Serum Using Fullerene as Catalyst

    Directory of Open Access Journals (Sweden)

    Chongning Li

    2018-04-01

    Full Text Available Fullerene exhibited strong catalysis of the redox reaction between HAuCl4 and trisodium citrate to form a gold nanoplasmon with a strong surface-enhanced Raman scattering (SERS) effect at 1615 cm−1 in the presence of Victoria blue B molecular probes. As the fullerene concentration increased, the SERS peak intensity increased linearly owing to the formation of more AuNPs as substrate. Upon addition of Ba²⁺, the Ba²⁺ ions adsorbed on the fullerene surface and inhibited the catalysis of fullerene, causing the SERS peak to decrease. The analyte SO₄²⁻ combined with Ba²⁺ to form a stable BaSO4 precipitate, releasing free fullerene so that the catalysis recovered and the SERS intensity increased linearly. Thus, a new SERS quantitative analysis method was established for the detection of sulfate in serum samples, with a linear range of 0.03–3.4 μM.

  15. A standardless method of quantitative ceramic analysis using X-ray powder diffraction

    International Nuclear Information System (INIS)

    Mazumdar, S.

    1999-01-01

    A new procedure using X-ray powder diffraction data for quantitative estimation of the crystalline as well as the amorphous phase in ceramics is described. Classification of the crystalline and amorphous X-ray scattering was achieved by comparison of the slopes at two successive points of the powder pattern at scattering angles at which the crystalline and amorphous phases superimpose. If the second slope exceeds the first by a stipulated value, the intensity is taken as crystalline; otherwise the scattering is considered as amorphous. Crystalline phase analysis is obtained by linear programming techniques using the concept that each observed X-ray diffraction peak has contributions from n component phases, the proportionate analysis of which is required. The method does not require the measurement of calibration data for use as an internal standard, but knowledge of the approximate crystal structure of each phase of interest in the mixture is necessary. The technique is also helpful in qualitative analysis because each suspected phase is characterized by the probability that it will be present when a reflection zone is considered in which the suspected crystalline phase could contribute. The amorphous phases are determined prior to the crystalline ones. The method is applied to ceramic materials and some results are presented. (orig.)
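    The slope-comparison rule stated above can be sketched directly: at each point the slopes of the pattern over two successive intervals are compared, and the intensity is flagged as crystalline when the second slope exceeds the first by a stipulated value. The sketch below applies that rule literally to a synthetic pattern (broad amorphous hump plus sharp peaks); the threshold is a placeholder and the full linear-programming phase analysis of the paper is not reproduced.

```python
# Hedged sketch of the crystalline/amorphous slope rule on a synthetic diffraction pattern.
import numpy as np

two_theta = np.linspace(10, 70, 3000)
amorphous = 40 * np.exp(-0.5 * ((two_theta - 25) / 8) ** 2)                      # broad hump
peaks = sum(300 * np.exp(-0.5 * ((two_theta - c) / 0.08) ** 2) for c in (28.4, 33.1, 47.5))
intensity = amorphous + peaks + np.random.default_rng(5).normal(scale=1.0, size=two_theta.size)

slope = np.diff(intensity)                   # slope between successive points
threshold = 5.0                              # the "stipulated value", hypothetical
crystalline_mask = np.zeros_like(intensity, dtype=bool)
crystalline_mask[2:] = (slope[1:] - slope[:-1]) > threshold   # second slope exceeds the first

crystalline_fraction = intensity[crystalline_mask].sum() / intensity.sum()
print(f"apparent crystalline fraction of total scattering: {crystalline_fraction:.2f}")
```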

  16. A relative quantitative Methylation-Sensitive Amplified Polymorphism (MSAP) method for the analysis of abiotic stress.

    Science.gov (United States)

    Bednarek, Piotr T; Orłowska, Renata; Niedziela, Agnieszka

    2017-04-21

    We present a new methylation-sensitive amplified polymorphism (MSAP) approach for the evaluation of relative quantitative characteristics such as demethylation, de novo methylation, and preservation of methylation status of CCGG sequences, which are recognized by the isoschizomers HpaII and MspI. We applied the technique to analyze aluminum (Al)-tolerant and non-tolerant control and Al-stressed inbred triticale lines. The approach is based on detailed analysis of events affecting HpaII and MspI restriction sites in control and stressed samples, and takes advantage of molecular marker profiles generated by EcoRI/HpaII and EcoRI/MspI MSAP platforms. Five Al-tolerant and five non-tolerant triticale lines were exposed to aluminum stress using a physiological test. Total genomic DNA was isolated from root tips of all tolerant and non-tolerant lines before and after Al stress following metAFLP and MSAP approaches. Based on codes reflecting events affecting cytosines within a given restriction site recognized by HpaII and MspI in control and stressed samples, demethylation (DM), de novo methylation (DNM), preservation of methylated sites (MSP), and preservation of non-methylated sites (NMSP) were evaluated. MSAP profiles were used for agglomerative hierarchical clustering (AHC) based on squared Euclidean distance and Ward's agglomeration method, whereas MSAP characteristics were used for ANOVA. Relative quantitative MSAP analysis revealed that both Al-tolerant and non-tolerant triticale lines subjected to Al stress underwent demethylation, with demethylation of CG predominating over CHG. The rate of de novo methylation in the CG context was ~3-fold lower than demethylation, whereas de novo methylation of CHG was observed only in Al-tolerant lines. Our relative quantitative MSAP approach, based on methylation events affecting cytosines within HpaII-MspI recognition sequences, was capable of quantifying de novo methylation, demethylation, methylation, and non-methylated status in control and stressed samples.

  17. Three-way methods for the analysis of qualitative and quantitative two-way data.

    NARCIS (Netherlands)

    Kiers, Hendrik Albert Lambertus

    1989-01-01

    A problem often occurring in exploratory data analysis is how to summarize large numbers of variables in terms of a smaller number of dimensions. When the variables are quantitative, one may resort to Principal Components Analysis (PCA). When qualitative (categorical) variables are involved, one may

  18. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A., E-mail: clayton.pereira.silva@usp.b [Instituto de Pesquisas Energeticas e Nucleares (CQMA/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente

    2011-07-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements related to interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements like S, Cl, Zn, Zr, Mo and others overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr, Mo and others in uranium compounds. The detection limits were below 50 μg g⁻¹ and the uncertainty was below 10% for the determined elements. (author)
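    The deconvolution step can be illustrated generically (this is not the authors' code) by fitting a sum of Gaussian profiles plus a linear background to an overlapped region of the spectrum; the energies, widths and amplitudes below are hypothetical placeholders.

```python
# Hedged illustration: resolving an analyte line overlapped by a nearby line via peak fitting.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(e, a1, mu1, s1, a2, mu2, s2, b0, b1):
    g1 = a1 * np.exp(-0.5 * ((e - mu1) / s1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((e - mu2) / s2) ** 2)
    return g1 + g2 + b0 + b1 * e

energy = np.linspace(8.0, 9.5, 300)                                     # keV, hypothetical window
rng = np.random.default_rng(6)
measured = two_gaussians(energy, 120, 8.63, 0.05, 400, 8.72, 0.06, 5, 0.5) \
           + rng.normal(scale=2.0, size=energy.size)

p0 = [100, 8.6, 0.05, 300, 8.7, 0.06, 0, 0]                             # initial guesses
popt, _ = curve_fit(two_gaussians, energy, measured, p0=p0)
print("analyte peak area ~", popt[0] * popt[2] * np.sqrt(2 * np.pi))    # Gaussian area a*s*sqrt(2*pi)
```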

  19. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    International Nuclear Information System (INIS)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A.

    2011-01-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements related to interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements like S, Cl, Zn, Zr, Mo and others overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr, Mo and others in uranium compounds. The detection limits were below 50 μg g⁻¹ and the uncertainty was below 10% for the determined elements. (author)

  20. Preparation of Biological Samples Containing Metoprolol and Bisoprolol for Applying Methods for Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Corina Mahu Ştefania

    2015-12-01

    Full Text Available Arterial hypertension is a complex disease with many serious complications, representing a leading cause of mortality. Selective beta-blockers such as metoprolol and bisoprolol are frequently used in the management of hypertension. Numerous analytical methods have been developed for the determination of these substances in biological fluids, such as liquid chromatography coupled with mass spectrometry, gas chromatography coupled with mass spectrometry, and high performance liquid chromatography. Due to the complex composition of biological fluids, a sample pre-treatment prior to quantitative determination is required in order to remove proteins and potential interferences. The most commonly used methods for processing biological samples containing metoprolol and bisoprolol were identified through a thorough literature search using the PubMed, ScienceDirect, and Wiley Journals databases. Articles published between 2005 and 2015 were reviewed. Protein precipitation, liquid-liquid extraction and solid phase extraction are the main techniques for the extraction of these drugs from plasma, serum, whole blood and urine samples. In addition, numerous other techniques have been developed for the preparation of biological samples, such as dispersive liquid-liquid microextraction, carrier-mediated liquid phase microextraction, hollow fiber-protected liquid phase microextraction, and on-line molecularly imprinted solid phase extraction. The analysis of metoprolol and bisoprolol in human plasma, urine and other biological fluids provides important information in clinical and toxicological trials, thus requiring the application of appropriate extraction techniques for the detection of these antihypertensive substances at nanogram and picogram levels.

  1. An improved method for quantitative magneto-optical analysis of superconductors

    International Nuclear Information System (INIS)

    Laviano, F; Botta, D; Chiodoni, A; Gerbaldo, R; Ghigo, G; Gozzelino, L; Zannella, S; Mezzetti, E

    2003-01-01

    We report on the analysis method to extract quantitative local electrodynamics in superconductors by means of the magneto-optical technique. First of all, we discuss the calibration procedure to convert the local light intensity values into magnetic induction field distribution and start focusing on the role played by the generally disregarded magnetic induction components parallel to the indicator film plane (in-plane field effect). To account for the reliability of the whole technique, the method used to reconstruct the electrical current density distribution is reported, together with a numerical test example. The methodology is applied to measure local magnetic field and current distributions on a typical YBa₂Cu₃O₇₋ₓ good quality film. We show how the in-plane field influences the MO measurements, after which we present an algorithm to account for the in-plane field components. The meaningful impact of the correction on the experimental results is shown. Afterwards, we discuss some aspects about the electrodynamics of the superconducting sample.

  2. Spin echo SPI methods for quantitative analysis of fluids in porous media.

    Science.gov (United States)

    Li, Linqing; Han, Hui; Balcom, Bruce J

    2009-06-01

    Fluid density imaging is highly desirable in a wide variety of porous media measurements. The SPRITE class of MRI methods has proven to be robust and general in its ability to generate density images in porous media; however, the short encoding times required, with correspondingly high magnetic field gradient strengths and filter widths, and low flip angle RF pulses, yield sub-optimal S/N images, especially at low static field strength. This paper explores two implementations of pure phase encode spin echo 1D imaging, with application to a proposed new petroleum reservoir core analysis measurement. In the first implementation of the pulse sequence, we modify the spin echo single point imaging (SE-SPI) technique to acquire the k-space origin data point, with a near zero evolution time, from the free induction decay (FID) following a 90° excitation pulse. Subsequent k-space data points are acquired by separately phase encoding individual echoes in a multi-echo acquisition. T2 attenuation of the echo train yields an image convolution which causes blurring. The T2 blur effect is moderate for porous media with T2 lifetime distributions longer than 5 ms. As a robust, high S/N, and fast 1D imaging method, this method will be highly complementary to SPRITE techniques for the quantitative analysis of fluid content in porous media. In the second implementation of the SE-SPI pulse sequence, modification of the basic measurement permits fast determination of spatially resolved T2 distributions in porous media through separately phase encoding each echo in a multi-echo CPMG pulse train. An individual T2 weighted image may be acquired from each echo. The echo time (TE) of each T2 weighted image may be reduced to 500 μs or less. These profiles can be fit to extract a T2 distribution from each pixel employing a variety of standard inverse Laplace transform methods. Fluid content 1D images are produced as an essential by-product of determining the spatially resolved T2 distributions.
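    One standard way to extract a T2 distribution from a CPMG echo train (the abstract mentions "a variety of standard inverse Laplace transform methods"; the regularised NNLS approach below is only one of them, not necessarily the one used in the paper) is to build a kernel of exponential decays on a T2 grid and solve a Tikhonov-regularised non-negative least-squares problem. The echo spacing, T2 grid and synthetic decay are placeholders.

```python
# Hedged sketch: T2 distribution from a synthetic CPMG decay via regularised NNLS.
import numpy as np
from scipy.optimize import nnls

echo_times = 0.0005 * np.arange(1, 257)                 # s, 0.5 ms echo spacing (placeholder)
t2_grid = np.logspace(-3.5, 0, 64)                      # s, candidate T2 values
kernel = np.exp(-echo_times[:, None] / t2_grid[None, :])

# Synthetic bi-exponential decay (two pore environments) with noise
signal = 0.7 * np.exp(-echo_times / 0.005) + 0.3 * np.exp(-echo_times / 0.08)
signal += np.random.default_rng(7).normal(scale=0.005, size=signal.size)

# Tikhonov-regularised non-negative inversion: minimise ||K f - s||^2 + alpha^2 ||f||^2, f >= 0
alpha = 0.1
A = np.vstack([kernel, alpha * np.eye(t2_grid.size)])
b = np.concatenate([signal, np.zeros(t2_grid.size)])
f, _ = nnls(A, b)
print("T2 at distribution peaks (s):", t2_grid[f > 0.5 * f.max()])
```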

  3. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights

  4. Validation of quantitative 1H NMR method for the analysis of pharmaceutical formulations

    International Nuclear Information System (INIS)

    Santos, Maiara da S.

    2013-01-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)
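    The quantitation underlying such a validated qHNMR method follows the standard internal-standard relation, in which the analyte content is obtained from the ratio of normalised integrals of analyte and standard signals. The numbers in the sketch below, including the choice of maleic acid as the internal standard, are hypothetical placeholders, not values from the study.

```python
# Hedged sketch of the standard internal-standard qHNMR relation.
def qhnmr_purity(I_a, N_a, M_a, m_a, I_s, N_s, M_s, m_s, P_s):
    """Analyte purity (fraction) from signal integrals I, proton counts N,
    molar masses M (g/mol), weighed masses m (mg) and standard purity P_s."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

# Hypothetical example: paracetamol (M = 151.16) against maleic acid (M = 116.07)
purity = qhnmr_purity(I_a=2.05, N_a=1, M_a=151.16, m_a=13.7,
                      I_s=2.00, N_s=2, M_s=116.07, m_s=5.1, P_s=0.999)
print(f"estimated purity: {100 * purity:.1f}%")
```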

  5. Quantitative analysis of multiple high-resolution mass spectrometry images using chemometric methods: quantitation of chlordecone in mouse liver.

    Science.gov (United States)

    Mohammadi, Saeedeh; Parastar, Hadi

    2018-05-15

    In this work, a chemometrics-based strategy is developed for quantitative mass spectrometry imaging (MSI). In this regard, quantification of chlordecone, a carcinogenic organochlorinated pesticide (C10Cl10O), in mouse liver using the matrix-assisted laser desorption ionization MSI (MALDI-MSI) method is used as a case study. The MSI datasets corresponded to 1, 5 and 10 days of mouse exposure to the standard chlordecone in the quantity range of 0 to 450 μg g⁻¹. The binning approach in the m/z direction is used to group high resolution m/z values and to reduce the data size. To consider the effect of bin size on the quality of results, three different bin sizes of 0.25, 0.5 and 1.0 were chosen. Afterwards, three-way MSI data arrays (two spatial and one m/z dimensions) for seven standards and four unknown samples were column-wise augmented with m/z values as the common mode. Then, these datasets were analyzed using multivariate curve resolution-alternating least squares (MCR-ALS) using proper constraints. The resolved mass spectra were used for identification of chlordecone in the presence of a complex background and interference. Additionally, the augmented spatial profiles were post-processed and 2D images for each component were obtained in calibration and unknown samples. The sum of these profiles was utilized to construct the calibration curve and to obtain the analytical figures of merit (AFOMs). Inspection of the results showed that the lower bin size (i.e., 0.25) provides more accurate results. Finally, the results obtained by MCR for the three datasets were compared with those of gas chromatography-mass spectrometry (GC-MS) and MALDI-MSI. The results showed that the MCR-assisted method gives a higher amount of chlordecone than MALDI-MSI and a lower amount than GC-MS. It is concluded that a combination of chemometric methods with MSI can be considered as an alternative way for MSI quantification.
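    The core MCR-ALS step can be sketched, under non-negativity constraints only, as an alternating non-negative least-squares factorisation of the augmented data matrix D (pixels × m/z bins) into contribution maps C and component spectra S. This is a minimal illustration, not the authors' implementation, and the data are synthetic.

```python
# Hedged, minimal MCR-ALS sketch: D ~ C S^T with non-negativity, alternating NNLS updates.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(8)
n_pixels, n_mz, n_comp = 400, 80, 2
C_true = np.abs(rng.normal(size=(n_pixels, n_comp)))
S_true = np.abs(rng.normal(size=(n_mz, n_comp)))
D = C_true @ S_true.T + rng.normal(scale=0.01, size=(n_pixels, n_mz))

def nnls_rows(A, B):
    """Solve min ||A x - b||, x >= 0, for every column b of B; stack the solutions as rows."""
    return np.array([nnls(A, b)[0] for b in B.T])

# Random non-negative initial spectra, then alternate C- and S-updates
S = np.abs(rng.normal(size=(n_mz, n_comp)))
for _ in range(30):
    C = nnls_rows(S, D.T)   # rows of C: per-pixel contributions of each component
    S = nnls_rows(C, D)     # rows of S: per-m/z intensities of each component

lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative lack of fit: {lack_of_fit:.3f}")
```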

  6. A method for quantitative analysis of clump thickness in cervical cytology slides.

    Science.gov (United States)

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal, but difficult to scan Pap stained, Thin-prep, cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 μm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 μm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 μm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 μm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 μm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Effect of data quality on quantitative phase analysis (QPA) using the Rietveld method

    International Nuclear Information System (INIS)

    Scarlett, N.; Madsen, I.; Lwin, T.

    1999-01-01

    Full text: Quantitative phase analysis using the Rietveld method has become a valuable tool in modern X-ray diffraction. XRD is a recognised research tool and has been successfully employed in the developmental stages of many industrial processes. It is now becoming increasingly important as a means of process control either (i) in site quality control laboratories or (ii) even on-line. In on-line applications, the optimisation of data collection regimes is of critical importance if rapid turn-around, and hence timely process control, is to be achieved. This paper examines the effect of data quality on the quantification of phases in well characterised suites of minerals. A range of data collection regimes has been systematically investigated with a view to determining the minimum data required for acceptable quantitative phase analyses. Data has been collected with variations in the following process factors: (i) step width, ranging from 0.01 to 0.3 deg 2θ; (ii) counting time, ranging from 0.0125 to 4 sec/step; (iii) upper limit of the scan range, varying from 40 to 148 deg 2θ. The data has been analysed using whole-pattern (Rietveld) based methods using two distinctly different analytical approaches: (i) refinement of only pattern background and individual scale factors for each phase; (ii) refinement of unit cell dimensions, overall thermal parameters, peak width and shape in addition to the background and scale factors. The experimental design for this work included a ternary design of the three component phases (fluorite, CaF₂; zincite, ZnO; corundum, Al₂O₃) to form seven mixtures of major and minor phases of different scattering powers and the combination of the three process factors (variables) to form a factorial plan. The final data generation plan is a combination/crossing of the three process variable factorial plan with the three component mixture plan. It allows a detailed data analysis to provide information on the effect of the process

  8. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    Science.gov (United States)

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Quantitative Analysis Method of Output Loss due to Restriction for Grid-connected PV Systems

    Science.gov (United States)

    Ueda, Yuzuru; Oozeki, Takashi; Kurokawa, Kosuke; Itou, Takamitsu; Kitamura, Kiyoyuki; Miyamoto, Yusuke; Yokota, Masaharu; Sugihara, Hiroyuki

    The voltage of a power distribution line is increased by reverse power flow from grid-connected PV systems. In the case of high-density grid connection, the voltage rise is larger than for a stand-alone grid-connected system. To prevent overvoltage of the distribution line, a PV system's output is restricted when the line voltage approaches the upper limit of the control range. Because of this interaction, the output loss is larger in the high-density case. This research developed a quantitative analysis method for PV system output and losses to clarify the behavior of grid-connected PV systems. All measured data are classified into loss factors using 1-minute averages of 1-second data instead of the typical 1-hour averages. The operating point on the I-V curve is estimated, using module temperature, array output voltage, array output current and solar irradiance, to quantify the loss due to output restriction. As a result, the loss due to output restriction is successfully quantified and the behavior of output restriction is clarified.
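    A much-simplified illustration of quantifying the restriction loss per 1-minute interval is sketched below: the potential output is estimated from irradiance and module temperature with a basic rated-power/temperature-coefficient model, and the loss is the shortfall of the measured output during intervals when restriction was active. This is not the authors' I-V-curve procedure, and all coefficients and readings are placeholders.

```python
# Hedged illustration of per-minute restriction-loss accounting; placeholder model and data.
P_RATED_KW = 4.0          # array rating at standard test conditions, hypothetical
TEMP_COEFF = -0.004       # relative output change per K above 25 degC, hypothetical

def potential_output_kw(irradiance_w_m2, module_temp_c):
    return P_RATED_KW * (irradiance_w_m2 / 1000.0) * (1 + TEMP_COEFF * (module_temp_c - 25.0))

# (irradiance W/m2, module temperature degC, measured output kW, restriction active?)
minutes = [
    (950.0, 42.0, 3.10, True),
    (940.0, 41.5, 3.45, False),
    (900.0, 40.0, 2.80, True),
]

restriction_loss_kwh = sum(
    max(potential_output_kw(g, t) - p, 0.0) / 60.0   # kWh lost in one 1-minute interval
    for g, t, p, restricted in minutes if restricted
)
print(f"loss due to output restriction: {restriction_loss_kwh:.3f} kWh")
```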

  10. A method of non-destructive quantitative analysis of the ancient ceramics with curved surface

    International Nuclear Information System (INIS)

    He Wenquan; Xiong Yingfei

    2002-01-01

    Generally, the surface of the sample should be smooth and flat in XRF analysis, but ancient ceramics can hardly match this condition. Two simple methods, based on the fundamental method and the empirical correction method of XRF analysis, are put forward, so that the analysis of small samples or samples with curved surfaces can be easily completed.

  11. Knee Kinematic Improvement After Total Knee Replacement Using a Simplified Quantitative Gait Analysis Method

    Directory of Open Access Journals (Sweden)

    Hassan Sarailoo

    2013-10-01

    Full Text Available Objectives: The aim of this study was to extract suitable spatiotemporal and kinematic parameters to determine how Total Knee Replacement (TKR) alters patients' knee kinematics during gait, using a rapid and simplified quantitative two-dimensional gait analysis procedure. Methods: Two-dimensional kinematic gait patterns of 10 participants were collected before and after TKR surgery, using a 60 Hz camcorder in the sagittal plane. Then, the kinematic parameters were extracted from the gait data. A Student's t-test was used to compare the group averages of spatiotemporal and peak kinematic characteristics in the sagittal plane. The knee condition was also evaluated using the Oxford Knee Score (OKS) Questionnaire to ensure that each subject was placed in the right group. Results: The results showed a significant improvement in knee flexion during stance and swing phases after TKR surgery. The walking speed was increased as a result of stride length and cadence improvement, but this increment was not statistically significant. Both post-TKR and control groups showed an increment in spatiotemporal and peak kinematic characteristics between comfortable and fast walking speeds. Discussion: The objective kinematic parameters extracted from 2D gait data were able to show significant improvements of the knee joint after TKR surgery. The patients with TKR surgery were also able to improve their knee kinematics during fast walking to a level equal to that of the control group. These results provide a good insight into the capabilities of the presented method to evaluate knee functionality before and after TKR surgery and to define a more effective rehabilitation program.

  12. A relative quantitative Methylation-Sensitive Amplified Polymorphism (MSAP) method for the analysis of abiotic stress

    OpenAIRE

    Bednarek, Piotr T.; Orłowska, Renata; Niedziela, Agnieszka

    2017-01-01

    Background We present a new methylation-sensitive amplified polymorphism (MSAP) approach for the evaluation of relative quantitative characteristics such as demethylation, de novo methylation, and preservation of methylation status of CCGG sequences, which are recognized by the isoschizomers HpaII and MspI. We applied the technique to analyze aluminum (Al)-tolerant and non-tolerant control and Al-stressed inbred triticale lines. The approach is based on detailed analysis of events affecting H...

  13. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses
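
    The sketch below illustrates the general idea of fitting a mixture spectrum as a linear combination of pure-component spectra plus a fitted linear baseline by least squares; the synthetic "xylene-like" bands and the simple baseline model are assumptions and do not reproduce the paper's weighting schemes.

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(0, 1, 400)

# Three synthetic "pure component" bands standing in for the xylene isomer spectra
pure = np.stack([np.exp(-((wavenumbers - c) / 0.02) ** 2) for c in (0.3, 0.5, 0.7)])
true_conc = np.array([0.2, 0.5, 0.3])

# Mixture spectrum = linear combination of pure spectra + sloping baseline + noise
mixture = (true_conc @ pure + 0.05 + 0.02 * wavenumbers
           + 0.002 * rng.normal(size=wavenumbers.size))

# Design matrix: pure spectra plus constant and sloping baseline terms,
# so the baseline is fitted rather than assumed to be zero
A = np.vstack([pure, np.ones_like(wavenumbers), wavenumbers]).T
coef, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print("estimated concentrations:", np.round(coef[:3], 3))
```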

  14. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    Science.gov (United States)

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

    Most analytical methods for butyltins are based on high-resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using High Performance Liquid Chromatography (HPLC) with UV detection. The developed method was applied to determine tributyltin (TBT), dibutyltin (DBT) and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed against standard GC/MS techniques and verified by a statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) of TBT and DBT were 0.70 and 0.50 microg/mL, respectively. The optimised extraction method for butyltins in water and sediment samples involved using hexane containing 0.05-0.5% tropolone and 0.2% sodium chloride in water at pH 1.7. Quantitative extraction of butyltin compounds in a certified reference material (BCR-646) and naturally contaminated samples was achieved, with recoveries ranging from 95 to 108% and RSD of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the contamination level of butyltins in environmental samples collected from the Forth and Clyde canal, Scotland, UK. The values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high-resolution methods are utilised extensively for this type of research, the developed method is cheaper in terms of both equipment and running costs, faster in analysis time and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.

  15. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    International Nuclear Information System (INIS)

    Chen, Q G; Xu, Y; Zhu, H H; Chen, H; Lin, B

    2015-01-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565–750 nm. The spectral parameter, defined as the ratio of wavebands at 565–750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66–1.06, 1.06–1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems. (paper)
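
    A minimal sketch of the grading step, assuming an RGB fluorescence image is available as a NumPy array; the thresholds are those quoted in the abstract, while the synthetic image and the function name classify_caries are illustrative only.

```python
import numpy as np

# Cut-off values quoted in the abstract; the image and function name are illustrative.
GRADES = ["sound", "early decay", "established decay", "severe decay"]
BOUNDS = [0.66, 1.06, 1.62]

def classify_caries(rgb):
    """Grade each pixel of an RGB fluorescence image by its R/(G + B) component ratio."""
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    ratio = r / np.clip(g + b, 1e-6, None)   # avoid division by zero
    grade = np.digitize(ratio, BOUNDS)       # 0..3 following the published cut-offs
    return ratio, grade

# Stand-in for a captured fluorescence image
img = np.random.randint(0, 256, size=(64, 64, 3))
ratio, grade = classify_caries(img)
for idx, name in enumerate(GRADES):
    print(f"{name}: {np.mean(grade == idx):.0%} of pixels")
```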

  16. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    Science.gov (United States)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.

  17. Validation of quantitative analysis method for triamcinolone in ternary complexes by UV-Vis spectrophotometry

    Directory of Open Access Journals (Sweden)

    GEORGE DARLOS A. AQUINO

    2011-06-01

    Full Text Available Triamcinolone (TRI), a drug widely used in the treatment of ocular inflammatory diseases, is practically insoluble in water, which limits its use in eye drops. Cyclodextrins (CDs) have been used to increase the solubility or dissolution rate of drugs. The purpose of the present study was to validate a UV-Vis spectrophotometric method for quantitative analysis of TRI in inclusion complexes with beta-cyclodextrin (B-CD) associated with triethanolamine (TEA) (ternary complex). The proposed analytical method was validated with respect to the parameters established by the Brazilian regulatory National Agency of Sanitary Monitoring (ANVISA). The analytical measurements of absorbance were made at 242 nm, at room temperature, in a 1-cm path-length cuvette. The precision and accuracy studies were performed at five concentration levels (4, 8, 12, 18 and 20 μg.mL-1). B-CD associated with TEA did not provoke any alteration in the photochemical behavior of TRI. The results for the measured analytical parameters showed the success of the method. The standard curve was linear (r2 > 0.999) in the concentration range from 2 to 24 μg.mL-1. The method achieved good precision in the inter-day (relative standard deviation, RSD < 3.4%) and reproducibility (RSD < 3.8%) tests. The accuracy was about 80% and the pH changes introduced in the robustness study did not reveal any relevant interference at any of the studied concentrations. The experimental results demonstrate a simple, rapid and affordable UV-Vis spectrophotometric method that could be applied to the quantitation of TRI in this ternary complex. Keywords: Validation. Triamcinolone. Beta-cyclodextrin. UV-Vis spectrophotometry. Ternary complexes. ABSTRACT (in Portuguese): Validation of a quantitative analysis method for triamcinolone from a ternary complex by UV-Vis spectrophotometry. Triamcinolone (TRI) is a drug widely used in the treatment of inflammatory diseases of the eyeball and

  18. Quantitative Evaluation of gamma-Spectrum Analysis Methods using IAEA Test Spectra

    DEFF Research Database (Denmark)

    Nielsen, Sven Poul

    1982-01-01

    A description is given of a γ-spectrum analysis method based on nonlinear least-squares fitting. The quality of the method is investigated by using statistical tests on the results from analyses of IAEA test spectra. By applying an empirical correction factor of 0.75 to the calculated peak-area u...

  19. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

    Full Text Available Introduction. This article is the result of the authors' research in the field of developing approaches to the validation of quantitative determination methods for purposes of forensic and toxicological analysis, and is devoted to the problem of forming acceptability criteria for the validation parameter «linearity/calibration model». The aim of research. The purpose of this paper is to analyse the present approaches to estimating the acceptability of the calibration model chosen for method description according to the requirements of the international guidelines, and to form our own approaches to estimating the acceptability of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. A UV-spectrophotometric method for the quantitative determination of doxylamine in blood. Results. The approaches to estimating the acceptability of calibration models in the validation of bioanalytical methods stated in international papers, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (EMA, 2011), have been analysed. It has been suggested to be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to validation of methods in the variant of the calibration curve method when forming the acceptability criteria for the obtained linear dependences in the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is

  20. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. As widely observed, baseline drift is generated by the fluctuation of laser energy, inhomogeneity of sample surfaces and background noise, and it has aroused the interest of many researchers. Most of the prevalent algorithms need to preset some key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, namely the sparsity of spectral peaks and the low-pass filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technique utilizes a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure the convergence of the algorithm. To validate the proposed method, the concentration analysis of Chromium (Cr), Manganese (Mn) and Nickel (Ni) contained in 23 certified high alloy steel samples is assessed using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because no prior knowledge of sample composition or mathematical hypothesis is required, the method proposed in this paper has better accuracy in quantitative analysis compared with other methods, and fully reflects its adaptive ability.
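
    The paper's exact convex-optimization formulation is not reproduced here; as a hedged stand-in, the sketch below uses the widely cited asymmetric least-squares (AsLS) baseline estimate, which likewise penalizes points above the baseline asymmetrically. The parameter values and the synthetic spectrum are assumptions.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least-squares baseline estimate (Eilers & Boelens style)."""
    n = len(y)
    # Second-difference operator enforcing smoothness of the baseline
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))
    w = np.ones(n)
    z = y.copy()
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
        # Points above the current baseline (likely peaks) get a small weight
        w = p * (y > z) + (1 - p) * (y <= z)
    return z

# Synthetic LIBS-like spectrum: drifting baseline + two emission lines + noise
x = np.linspace(0, 1, 500)
spectrum = 2 + x + np.exp(-((x - 0.3) / 0.01) ** 2) + np.exp(-((x - 0.7) / 0.01) ** 2)
spectrum += 0.01 * np.random.randn(x.size)
corrected = spectrum - asls_baseline(spectrum)
print("baseline removed; corrected spectrum minimum ~", round(corrected.min(), 3))
```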

  1. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
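
    A hedged illustration of why an m-sequence helps identification: its nearly ideal autocorrelation lets the impulse response be recovered by cross-correlation. The toy "Earth" response, the noise level and the use of scipy.signal.max_len_seq are assumptions, not the authors' simulation.

```python
import numpy as np
from scipy.signal import max_len_seq

# Transmit code: an m-sequence mapped to +/-1 current polarity
nbits = 8
m_seq, _ = max_len_seq(nbits)            # 2**8 - 1 chips of 0/1
code = 2.0 * m_seq - 1.0

# Assumed (toy) Earth impulse response and noisy received voltage,
# modelled here as a circular convolution plus additive noise
h_true = np.exp(-np.arange(code.size) / 5.0)
received = np.real(np.fft.ifft(np.fft.fft(code) * np.fft.fft(h_true)))
received += 0.5 * np.random.randn(code.size)

# Circular cross-correlation with the code recovers the impulse response because the
# m-sequence autocorrelation is almost ideal (a sharp peak over a flat -1 floor)
h_est = np.real(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))) / code.size
print("first taps:", np.round(h_est[:5], 3), "vs true:", np.round(h_true[:5], 3))
```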

  2. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  3. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  4. Implantation of the method of quantitative analysis by proton induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects for the implementation of the method of quantitative analysis by proton induced X-ray spectroscopy are discussed. The calibration of the system was made by determining a response coefficient for selected elements, both by irradiating known amounts of these elements as well as by the use of theoretical and experimental parameters. The results obtained by these two methods agree within 5% for the analysed elements. A computer based technique of spectrum decomposition was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (Author) [pt

  5. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  6. A novel method for rapid comparative quantitative analysis of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Eastham, Sebastian D.; Coates, David J.; Parks, Geoffrey T.

    2012-01-01

    Highlights: ► Metric framework determined to compare nuclear fuel cycles. ► Fast and thermal reactors simulated using MATLAB models, including thorium. ► Modelling uses deterministic methods instead of Monte–Carlo for speed. ► Method rapidly identifies relative cycle strengths and weaknesses. ► Significant scope for use in project planning and cycle optimisation. - Abstract: One of the greatest obstacles facing the nuclear industry is that of sustainability, both in terms of the finite reserves of uranium ore and the production of highly radiotoxic spent fuel which presents proliferation and environmental hazards. Alternative nuclear technologies have been suggested as a means of delivering enhanced sustainability with proposals including fast reactors, the use of thorium fuel and tiered fuel cycles. The debate as to which is the most appropriate technology continues, with each fuel system and reactor type delivering specific advantages and disadvantages which can be difficult to compare fairly. This paper demonstrates a framework of performance metrics which, coupled with a first-order lumped reactor model to determine nuclide population balances, can be used to quantify the aforementioned pros and cons for a range of different fuel and reactor combinations. The framework includes metrics such as fuel efficiency, spent fuel toxicity and proliferation resistance, and relative cycle performance is analysed through parallel coordinate plots, yielding a quantitative comparison of disparate cycles.

  7. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    International Nuclear Information System (INIS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-01-01

    A new LIBS quantitative analysis method based on analytical line adaptive selection and Relevance Vector Machine (RVM) regression model is proposed. First, a scheme of adaptively selecting analytical line is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines which will be used as input variables of regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentration analysis results will be given with a form of confidence interval of probabilistic distribution, which is helpful for evaluating the uncertainness contained in the measured spectra. Chromium concentration analysis experiments of 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experiment results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with the methods based on partial least squares regression, artificial neural network and standard support vector machine. - Highlights: • Both training and testing samples are considered for analytical lines selection. • The analytical lines are auto-selected based on the built-in characteristics of spectral lines. • The new method can achieve better prediction accuracy and modeling robustness. • Model predictions are given with confidence interval of probabilistic distribution
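
    scikit-learn does not ship a relevance vector machine, so the sketch below substitutes ARD regression, a related sparse Bayesian model that also returns predictive uncertainty; the simulated line intensities, the sample counts and the substitution itself are assumptions rather than the published method.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n_samples, n_lines = 23, 12          # e.g. 23 standards, 12 candidate analytical lines

# Simulated line intensities; only the first three lines carry concentration information
X = rng.lognormal(mean=2.0, sigma=0.3, size=(n_samples, n_lines))
true_w = np.zeros(n_lines)
true_w[:3] = [0.5, 0.3, 0.2]
y = X @ true_w + rng.normal(scale=0.1, size=n_samples)   # toy Cr concentrations

# Sparse Bayesian regression: irrelevant lines are pruned and predictions carry uncertainty
model = ARDRegression().fit(X, y)
pred, std = model.predict(X, return_std=True)
print(f"average relative error: {np.mean(np.abs(pred - y) / y):.2%}")
```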

  8. Development of quantitative methods for spill response planning: a trajectory analysis planner

    International Nuclear Information System (INIS)

    Galt, J.A.; Payton, D.L.

    1999-01-01

    In planning for response to oil spills, a great deal of information must be assimilated. Typically, geophysical flow patterns, ocean turbulence, complex chemical processes, ecological setting, fisheries activities, economics of land use, and engineering constraints on response equipment all need to be considered. This presents a formidable analysis problem. It can be shown, however, that if an appropriate set of evaluation data is available, an objective function and appropriate constraints can be formulated. From these equations, the response problem can be cast in terms of game theory or decision analysis, and an optimal solution can be obtained using common scarce-resource allocation methods. The optimal solution obtained by this procedure maximises the expected return over all possible implementations of a given set of response options. While considering the development of an optimal spill response, it is useful to consider whether, in the absence of complete data, implementing some subset of these methods could still provide relevant and useful information for the spill planning process, even though it may fall short of a statistically optimal solution. In this work we introduce a trajectory analysis planning (TAP) methodology that can provide a cohesive framework for integrating physical transport processes, environmental sensitivity of regional sites, and potential response options. This trajectory analysis planning methodology can be shown to implement a significant part of the game theory analysis and provide 'minimum regret' strategy advice, without actually carrying out the optimisation procedures. (Author)

  9. A quality quantitative method of silicon direct bonding based on wavelet image analysis

    Science.gov (United States)

    Tan, Xiao; Tao, Zhi; Li, Haiwang; Xu, Tiantong; Yu, Mingxing

    2018-04-01

    The rapid development of MEMS (micro-electro-mechanical systems) has received significant attention from researchers in various fields and subjects. In particular, the MEMS fabrication process is elaborate and, as such, has been the focus of extensive research inquiries. However, in MEMS fabrication, component bonding is difficult to achieve and requires a complex approach. Thus, improvements in bonding quality are relatively important objectives. A higher quality bond can only be achieved with improved measurement and testing capabilities. In particular, the traditional testing methods mainly include infrared testing, tensile testing, and strength testing, despite the fact that using these methods to measure bond quality often results in low efficiency or destructive analysis. Therefore, this paper focuses on the development of a precise, nondestructive visual testing method based on wavelet image analysis that is shown to be highly effective in practice. The process of wavelet image analysis includes wavelet image denoising, wavelet image enhancement, and contrast enhancement, and as an end result, can display an image with low background noise. In addition, because the wavelet analysis software was developed with MATLAB, it can reveal the bonding boundaries and bonding rates to precisely indicate the bond quality at all locations on the wafer. This work also presents a set of orthogonal experiments that consist of three prebonding factors, the prebonding temperature, the positive pressure value and the prebonding time, which are used to analyze the prebonding quality. This method was used to quantify the quality of silicon-to-silicon wafer bonding, yielding standard treatment quantities that could be practical for large-scale use.
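
    A minimal sketch of wavelet-based denoising of a bonded-wafer image using PyWavelets; the db4 wavelet, the universal threshold, the toy image and the crude "bonding rate" estimate are assumptions and only stand in for the MATLAB processing chain described above.

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Universal threshold estimated from the finest diagonal detail band
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(image.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

# Toy bonded-wafer image: a bright bonded region corrupted by additive noise
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0
noisy = img + 0.2 * np.random.randn(*img.shape)
clean = wavelet_denoise(noisy)
print(f"estimated bonded fraction: {np.mean(clean > 0.5):.2f}")
```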

  10. Validation of the method of quantitative phase analysis by X-ray diffraction in API: case of Tibolone

    International Nuclear Information System (INIS)

    Silva, R P; Ambrósio, M F S; Epprecht, E K; Avillez, R R; Achete, C A; Kuznetsov, A; Visentin, L C

    2016-01-01

    In this study, different structural and microstructural models applied to X-ray analysis of powder diffraction data of polymorphic mixtures of known concentrations of Tibolone were investigated. The X-ray data obtained in different diffraction instruments were analysed via Rietveld method using the same analytical models. The results of quantitative phase analysis show that regardless of the instrument used, the values of the calculated concentrations follow the same systematics with respect to the final errors. The strategy to select a specific analytical model that leads to lower measurement errors is here presented. (paper)

  11. Improvement in precision and trueness of quantitative XRF analysis with glass-bead method. 1

    International Nuclear Information System (INIS)

    Yamamoto, Yasuyuki; Ogasawara, Noriko; Yuhara, Yoshitaroh; Yokoyama, Yuichi

    1995-01-01

    The factors which lower the precision of a simultaneous X-ray fluorescence (XRF) spectrometer were investigated. Especially in quantitative analyses of oxide powders with the glass-bead method, the X-ray optical characteristics of the equipment affect the precision of the X-ray intensities. In focused (curved) crystal spectrometers, the precision depends on the deviation of the actual size and position of the crystals from the theoretical designs; thus the precision differs for each crystal and each element. When the deviation is large, the dispersion of the measured X-ray intensities is larger than the statistical dispersion, even though the intensity itself remains unchanged. Moreover, waviness of the glass-bead surface shifts the height of the analyzed surface from the designed one. This difference changes the amount of X-rays incident on the analyzing crystal and increases the dispersion of the X-ray intensity. Considering these factors, the level of waviness must be regulated to improve the precision on existing XRF equipment. In this study, the measurement precisions of 4 simultaneous XRF spectrometers were evaluated, and the element lead (Pb-Lβ1) was found to have the lowest precision. The relative standard deviation (RSD) of measurements of 10 glass-beads for the same powder sample was 0.3% without regulation of the waviness of the analytical surface. With mechanical flattening of the glass-bead surface, the level of waviness, which is the maximum difference of heights within a glass-bead, was regulated to under 30 μm, and the RSD was 0.038%, which is almost comparable to the statistical RSD of 0.033%. (author)
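
    A small sketch of the comparison made above between the observed relative standard deviation of repeated intensity measurements and the counting-statistics limit; the count values are invented for illustration.

```python
import numpy as np

# Invented repeated net counts for the same glass bead (10 measurements)
counts = np.array([9.02e6, 9.01e6, 9.05e6, 8.99e6, 9.03e6,
                   9.00e6, 9.04e6, 9.02e6, 9.01e6, 9.03e6])

observed_rsd = counts.std(ddof=1) / counts.mean()   # spread actually measured
statistical_rsd = 1.0 / np.sqrt(counts.mean())      # counting-statistics (Poisson) limit
print(f"observed RSD {observed_rsd:.3%} vs statistical RSD {statistical_rsd:.3%}")
```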

  12. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-01-01

    Through visual assessment by three-dimensional (3D) brain image analysis methods using a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Through quantitative local abnormality assessment using this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). Using twenty-five cases with DAT (mean age, 68.9 years old), all of whom were diagnosed as probable Alzheimer's disease based on the National Institute of Neurological and Communicative Disorders and Stroke-Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map created with the 3D-stereotactic surface projections (SSP) program was compared with the data of 20 cases in a control group, age-matched to the subject cases. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. We assessed the extent of the abnormal region in each segment (the proportion of coordinates with a Z-value exceeding the threshold value, among all coordinates within the segment) and its severity (the average Z-value of the coordinates with a Z-value exceeding the threshold value). This method clarified the orientation and extent of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure. The method was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution. (author)
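
    A hedged sketch of the extent/severity computation per anatomical segment, assuming a Z-score map and a 24-segment label array are already available; the threshold of 2.0 and the random data are illustrative only.

```python
import numpy as np

def extent_and_severity(z_map, labels, z_threshold=2.0):
    """Per-segment extent (fraction of coordinates above threshold) and severity (their mean Z)."""
    results = {}
    for seg in np.unique(labels):
        z = z_map[labels == seg]
        above = z > z_threshold
        extent = above.mean()
        severity = z[above].mean() if above.any() else 0.0
        results[int(seg)] = (extent, severity)
    return results

# Toy data: a Z-score map over surface coordinates and a 24-segment anatomical labelling
rng = np.random.default_rng(1)
z_map = rng.normal(size=10000)
labels = rng.integers(1, 25, size=10000)
for seg, (extent, severity) in sorted(extent_and_severity(z_map, labels).items())[:3]:
    print(f"segment {seg}: extent {extent:.1%}, severity {severity:.2f}")
```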

  13. An easy and inexpensive method for quantitative analysis of endothelial damage by using vital dye staining and Adobe Photoshop software.

    Science.gov (United States)

    Saad, Hisham A; Terry, Mark A; Shamie, Neda; Chen, Edwin S; Friend, Daniel F; Holiman, Jeffrey D; Stoeger, Christopher

    2008-08-01

    We developed a simple, practical, and inexpensive technique to analyze areas of endothelial cell loss and/or damage over the entire corneal area after vital dye staining by using a readily available, off-the-shelf, consumer software program, Adobe Photoshop. The purpose of this article is to convey a method of quantifying areas of cell loss and/or damage. Descemet-stripping automated endothelial keratoplasty corneal transplant surgery was performed by using 5 precut corneas on a human cadaver eye. Corneas were removed and stained with trypan blue and alizarin red S and subsequently photographed. Quantitative assessment of endothelial damage was performed by using Adobe Photoshop 7.0 software. The average difference for cell area damage for analyses performed by 1 observer twice was 1.41%. For analyses performed by 2 observers, the average difference was 1.71%. Three masked observers were 100% successful in matching the randomized stained corneas to their randomized processed Adobe images. Vital dye staining of corneal endothelial cells can be combined with Adobe Photoshop software to yield a quantitative assessment of areas of acute endothelial cell loss and/or damage. This described technique holds promise for a more consistent and accurate method to evaluate the surgical trauma to the endothelial cell layer in laboratory models. This method of quantitative analysis can probably be generalized to any area of research that involves areas that are differentiated by color or contrast.

  14. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method of quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of teaching efficiency within one group of students and comparison of teaching efficiency across two or more groups. As basic characteristics of teaching efficiency, heterogeneity, stability and total variability indices are used, both for a single group and for comparing different groups. The method is easy to use and permits ranking of the results of teaching review which...

  15. A novel quantitative analysis method of three-dimensional fluorescence spectra for vegetable oils contents in edible blend oil

    Science.gov (United States)

    Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei

    2015-04-01

    Edible blend oil is a mixture of vegetable oils. A qualifying blend oil can meet the daily need of the two essential fatty acids for humans to achieve balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in an edible blend oil determine its nutritional components. A high-precision quantitative analysis method to detect the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. The three-dimensional fluorescence technique offers high selectivity, high sensitivity, and high efficiency. Efficient extraction and full use of the information in three-dimensional fluorescence spectra will improve the accuracy of the measurement. A novel quantitative analysis is proposed based on a Quasi-Monte-Carlo integral to improve the measurement sensitivity and reduce the random error. The partial least squares method is used to solve the nonlinear equations and to avoid the effect of multicollinearity. The recovery rates of blend oil mixed from peanut oil, soybean oil and sunflower oil are calculated to verify the accuracy of the method; they are improved compared with the linear method commonly used for component concentration measurement.
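
    A minimal sketch of the regression step, assuming spectral features have already been extracted from the three-dimensional fluorescence data; it uses scikit-learn's PLS regression on simulated mixtures and omits the Quasi-Monte-Carlo integration described above.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n_mixtures, n_features = 30, 50

# Simulated spectral signatures of three component oils and random blend fractions
true_loadings = rng.normal(size=(3, n_features))
fractions = rng.dirichlet(np.ones(3), size=n_mixtures)   # peanut / soybean / sunflower
X = fractions @ true_loadings + 0.01 * rng.normal(size=(n_mixtures, n_features))

# PLS handles the collinear spectral features when regressing fractions on spectra
pls = PLSRegression(n_components=3).fit(X, fractions)
recovery = pls.predict(X) / fractions                     # recovery rate per oil per mixture
print("mean recovery per oil:", np.round(recovery.mean(axis=0), 3))
```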

  16. Fundamental and clinical studies on simultaneous, quantitative analysis of hepatobiliary and gastrointestinal scintigrams using double isotope method

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Y; Kakihara, M; Sasaki, M; Tabuse, Y; Takei, N [Wakayama Medical Coll. (Japan)

    1981-04-01

    The double isotope method was applied to carry out simultaneous quantitative analysis of hepatobiliary and gastrointestinal scintigrams. A scintillation camera with a parallel collimator for medium energy was connected to a computer to distinguish the two isotopes at the same time. 4 mCi of 99mTc-(Sn)-pyridoxylideneisoleucine (Tc-PI) and 200 μCi of 111In-diethylenetriaminepentaacetic acid (In-DTPA) were administered by i.v. injection and orally, respectively. Three normal subjects (two women and a man) and 16 patients after surgery for gastric cancer (10 reconstructed by the Roux-en-Y method after total gastrectomy, and 6 reconstructed by interposing the jejunum between the esophagus and duodenum) were investigated. The process of bile secretion and its mixing with food were followed quantitatively by scanning. The analysis of the time-activity variation at each organ indicated that the interposition operation gave more physiological recovery than the Roux-en-Y method. This method is noninvasive to patients and is promising for following the process and activity of digestion in any digestive organ after surgery.

  17. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    Science.gov (United States)

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research method and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  18. Interlaboratory validation of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Takabatake, Reona; Koiwa, Tomohiro; Kasahara, Masaki; Takashima, Kaori; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Oguchi, Taichi; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    To reduce the cost and time required to routinely perform the genetically modified organism (GMO) test, we developed a duplex quantitative real-time PCR method for a screening analysis simultaneously targeting an event-specific segment for GA21 and Cauliflower Mosaic Virus 35S promoter (P35S) segment [Oguchi et al., J. Food Hyg. Soc. Japan, 50, 117-125 (2009)]. To confirm the validity of the method, an interlaboratory collaborative study was conducted. In the collaborative study, conversion factors (Cfs), which are required to calculate the GMO amount (%), were first determined for two real-time PCR instruments, the ABI PRISM 7900HT and the ABI PRISM 7500. A blind test was then conducted. The limit of quantitation for both GA21 and P35S was estimated to be 0.5% or less. The trueness and precision were evaluated as the bias and reproducibility of the relative standard deviation (RSD(R)). The determined bias and RSD(R) were each less than 25%. We believe the developed method would be useful for the practical screening analysis of GM maize.
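
    A hedged sketch of how a GMO amount is commonly computed from duplex qPCR results using a conversion factor (Cf); the function name and the numeric values are hypothetical and are not taken from the collaborative study.

```python
def gmo_amount_percent(event_copies, endogenous_copies, cf):
    """GMO amount (%) from measured copy numbers and a conversion factor (Cf).

    Sketch of the commonly used calculation: the copy-number ratio of the event-specific
    target to the endogenous reference gene, divided by Cf and expressed in percent.
    """
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical copy numbers and Cf, for illustration only
print(f"GA21: {gmo_amount_percent(1.2e3, 4.0e4, cf=0.32):.2f} %")
```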

  19. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    Science.gov (United States)

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  20. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  1. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  2. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    Science.gov (United States)

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
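
    A small sketch of the pseudo-first-order fit mentioned above, recovering a rate constant from ln(C/C0) versus time; the time points and concentrations are synthetic, generated from the quoted rate constant purely for illustration.

```python
import numpy as np

# Synthetic concentration-time data generated from the quoted rate constant (9.7e-7 s^-1)
t_hours = np.array([0, 24, 72, 168, 336, 504], dtype=float)
conc = 1.0e-2 * np.exp(-9.7e-7 * t_hours * 3600)

# Pseudo-first-order kinetics: ln(C/C0) = -k t, so the slope of a linear fit gives -k
slope, _ = np.polyfit(t_hours * 3600, np.log(conc / conc[0]), 1)
print(f"fitted rate constant: {-slope:.2e} s^-1")
```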

  3. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the developments of the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  4. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for 1-2 year old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the quantitative value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P development.
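
    A hedged sketch of one common way to estimate a fractal dimension from a segmented image, the box-counting method; the paper's actual algorithm is not specified here, and the random binary mask is only a stand-in for a processed CT slice.

```python
import numpy as np

def box_counting_dimension(binary_image):
    """Estimate the fractal (box-counting) dimension of a square 2-D binary image."""
    n = binary_image.shape[0]
    sizes, counts = [], []
    size = n // 2
    while size >= 1:
        occupied = 0
        for i in range(0, n, size):
            for j in range(0, n, size):
                if binary_image[i:i + size, j:j + size].any():
                    occupied += 1
        sizes.append(size)
        counts.append(occupied)
        size //= 2
    # Dimension = slope of log(count) versus log(1/box size)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Random binary mask standing in for a segmented CT slice
img = np.random.rand(256, 256) > 0.5
print(f"box-counting dimension ~ {box_counting_dimension(img):.3f}")
```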

  5. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of); Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Yoon-Sik, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Ho-Young, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to its high sensitivity and multiplex capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has a disadvantage of extremely long measurement time for on-chip immunoassay adopting a large chip area of approximately 1-mm scale and confocal beam point of ca. 1-μm size. Alternative methods such as sampled spot scan with high confocality and large-area scan method with enlarged field of view and low confocality have been utilized in order to minimize the measurement time practically. In this study, we analyzed the two methods in respect of signal-to-noise ratio and sampling-led signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we proposed a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopted a raster scan covering a full area of 100 μm × 100 μm region as a proof-of-concept experiment while accumulating signals in the CCD detector for single spectrum per frame. One single scan with 10 s over 100 μm × 100 μm area yielded much higher sensitivity compared to sampled spot scanning measurements and no signal fluctuations attributed to sampled spot scan. This readout method is able to serve as one of key technologies that will bring quantitative multiplexed detection and analysis into practice.

  6. SAFER, an Analysis Method of Quantitative Proteomic Data, Reveals New Interactors of the C. elegans Autophagic Protein LGG-1.

    Science.gov (United States)

    Yi, Zhou; Manil-Ségalen, Marion; Sago, Laila; Glatigny, Annie; Redeker, Virginie; Legouis, Renaud; Mucchielli-Giorgi, Marie-Hélène

    2016-05-06

    Affinity purifications followed by mass spectrometric analysis are used to identify protein-protein interactions. Because quantitative proteomic data are noisy, it is necessary to develop statistical methods to eliminate false-positives and identify true partners. We present here a novel approach for filtering false interactors, named "SAFER" for mass Spectrometry data Analysis by Filtering of Experimental Replicates, which is based on the reproducibility of the replicates and the fold-change of the protein intensities between bait and control. To identify regulators or targets of autophagy, we characterized the interactors of LGG1, a ubiquitin-like protein involved in autophagosome formation in C. elegans. LGG-1 partners were purified by affinity, analyzed by nanoLC-MS/MS mass spectrometry, and quantified by a label-free proteomic approach based on the mass spectrometric signal intensity of peptide precursor ions. Because the selection of confident interactions depends on the method used for statistical analysis, we compared SAFER with several statistical tests and different scoring algorithms on this set of data. We show that SAFER recovers high-confidence interactors that have been ignored by the other methods and identified new candidates involved in the autophagy process. We further validated our method on a public data set and conclude that SAFER notably improves the identification of protein interactors.
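
    As a hedged illustration of the kind of filtering described, the sketch below keeps proteins that are consistently enriched over the control and reproducible across replicates; the cut-offs, the coefficient-of-variation criterion and the toy intensities are assumptions that simplify the published SAFER procedure.

```python
import numpy as np

def filter_interactors(bait, control, min_fold=2.0, max_cv=0.5):
    """Keep proteins enriched over the control in every replicate and reproducible across
    bait replicates (coefficient of variation below max_cv). Simplified stand-in only."""
    bait = np.asarray(bait, float)
    control = np.asarray(control, float)
    fold = bait / np.clip(control, 1e-9, None)
    reproducible = bait.std(axis=1) / bait.mean(axis=1) < max_cv
    enriched = (fold > min_fold).all(axis=1)
    return np.where(reproducible & enriched)[0]

# Rows = proteins, columns = three replicates of label-free intensities (toy numbers)
bait = np.array([[9.0e6, 8.0e6, 1.0e7],
                 [2.0e5, 8.0e5, 1.0e4],
                 [5.0e5, 5.5e5, 6.0e5]])
control = np.array([[1.0e6, 1.2e6, 9.0e5],
                    [1.0e5, 2.0e5, 5.0e3],
                    [4.9e5, 5.0e5, 6.2e5]])
print("kept protein indices:", filter_interactors(bait, control))
```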

  7. Method for the routine quantitative gas chromatographic analysis of major free fatty acids in butter and cream.

    Science.gov (United States)

    Woo, A H; Lindsay, R C

    1980-07-01

    A rapid quantitative method was developed for routine analysis of the major, even carbon-numbered free fatty acids in butter and cream. Free fatty acids were isolated directly from intact samples by a modified silicic acid-potassium hydroxide arrestant column and were separated by gas chromatography with a 1.8 m x 2 mm inner diameter glass column packed with 10% neopentyl glycol adipate on 80/100 Chromosorb W. Purified, formic acid-saturated carrier gas was required for minimal peak tailing and extended column life. The accuracy and reproducibility of the method were established through quantitative recovery studies of free fatty acid mixtures, free fatty acids added to butter, and replicate analysis of butter and cream samples.

  8. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    .... Today's threat is nondoctrinal and asymmetric. The means used to counter this threat are based on standardized equipment and capabilities, and they rely on template-based analysis and network diagrams...

  9. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 by John Wiley & Sons, Inc.

  10. General method of quantitative spectrographic analysis; Estudio de un metodo general de analisis espectrografico cuantitativo

    Energy Technology Data Exchange (ETDEWEB)

    Capdevila, C; Roca, M

    1966-07-01

    A spectrographic method was developed to determine 23 elements over a wide range of concentrations; the method can be applied to metallic or refractory samples. Prior fusion with lithium tetraborate and germanium oxide is performed in order to avoid the influence of matrix composition and crystalline structure. Germanium oxide is also employed as the internal standard. The resulting beads are mixed with graphite powder (1:1) and excited in a 10 A direct-current arc. (Author) 12 refs.

  11. Methods of quantitative and qualitative analysis of bird migration with a tracking radar

    Science.gov (United States)

    Bruderer, B.; Steidinger, P.

    1972-01-01

    Methods of analyzing bird migration by using tracking radar are discussed. The procedure for assessing the rate of bird passage is described. Three topics are presented concerning the grouping of nocturnal migrants, the velocity of migratory flight, and identification of species by radar echoes. The height and volume of migration under different weather conditions are examined. The methods for studying the directions of migration and the correlation between winds and the height and direction of migrating birds are presented.

  12. Artificial intelligence methods applied for quantitative analysis of natural radioactive sources

    International Nuclear Information System (INIS)

    Medhat, M.E.

    2012-01-01

    Highlights: ► Basic description of artificial neural networks. ► Natural gamma ray sources and problem of detections. ► Application of neural network for peak detection and activity determination. - Abstract: Artificial neural network (ANN) represents one of artificial intelligence methods in the field of modeling and uncertainty in different applications. The objective of the proposed work was focused to apply ANN to identify isotopes and to predict uncertainties of their activities of some natural radioactive sources. The method was tested for analyzing gamma-ray spectra emitted from natural radionuclides in soil samples detected by a high-resolution gamma-ray spectrometry based on HPGe (high purity germanium). The principle of the suggested method is described, including, relevant input parameters definition, input data scaling and networks training. It is clear that there is satisfactory agreement between obtained and predicted results using neural network.

  13. Quantitative analysis of scaling error compensation methods in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Müller, P.; Hiller, Jochen; Dai, Y.

    2015-01-01

    X-ray Computed Tomography (CT) has become an important technology for quality control of industrial components. As with other technologies, e.g., tactile coordinate measurements or optical measurements, CT is influenced by numerous quantities which may have negative impact on the accuracy...... errors of the manipulator system (magnification axis). This article also introduces a new compensation method for scaling errors using a database of reference scaling factors and discusses its advantages and disadvantages. In total, three methods for the correction of scaling errors – using the CT ball...

  14. Radioimmunoassay (RIA), a highly specific, extremely sensitive quantitative method of analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Hachmann, H; Seidel, L [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1979-02-01

    Radioimmunoassay is an analytical method combining the sensitivity of radioactivity measurements and the specificity of the antigen-antibody-reaction. Thus, substances can be measured in concentrations as low as picograms per milliliter serum besides a millionfold excess of otherwise disturbing material (for example in serum). The method is simple to perform and is at present mainly used in the field of endocrinology. Further areas of possible application are in the diagnosis of infectious disease, drug research, environmental protection, forensic medicine as well as general analytics. Quantities of radioactivity, exclusively used in vitro, are in the nano-Curie range. Therefore the radiation dose is negligible.

  15. Development of a quantitative method for the analysis of cocaine analogue impregnated into textiles by Raman spectroscopy.

    Science.gov (United States)

    Xiao, Linda; Alder, Rhiannon; Mehta, Megha; Krayem, Nadine; Cavasinni, Bianca; Laracy, Sean; Cameron, Shane; Fu, Shanlin

    2018-04-01

    Cocaine trafficking in the form of textile impregnation is routinely encountered as a concealment method. Raman spectroscopy has been a popular and successful testing method used for in situ screening of cocaine in textiles and other matrices. Quantitative analysis of cocaine in these matrices using Raman spectroscopy has not been reported to date. This study aimed to develop a simple Raman method for quantifying cocaine using atropine as the model analogue in various types of textiles. Textiles were impregnated with solutions of atropine in methanol. The impregnated atropine was extracted using less hazardous acidified water with the addition of potassium thiocyanate (KSCN) as an internal standard for Raman analysis. Despite the presence of background matrix signals arising from the textiles, the cocaine analogue could easily be identified by its characteristic Raman bands. The successful use of KSCN normalised the analyte signal response against the different textile matrix background interferences and thus removed the need for a matrix-matched calibration. The method was linear over a concentration range of 6.25-37.5 mg/cm², with a coefficient of determination (R²) of 0.975 and acceptable precision and accuracy. A simple and accurate Raman spectroscopy method for the analysis and quantification of a cocaine analogue impregnated in textiles has been developed and validated for the first time. This proof-of-concept study has demonstrated that atropine can act as an ideal model compound to study the problem of cocaine impregnation in textiles. The method has the potential to be further developed and implemented in real-world forensic cases. Copyright © 2017 John Wiley & Sons, Ltd.
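
    As a rough illustration of the internal-standard normalisation described above, the analyte band intensity can be divided by the KSCN band intensity before building a linear calibration. The sketch below is hypothetical Python/NumPy code with made-up band intensities; only the 6.25-37.5 mg/cm² loading range is taken from the abstract.

```python
# Sketch of internal-standard normalisation for a Raman calibration:
# analyte band / KSCN band, then a linear fit. All intensities are hypothetical.
import numpy as np

loadings = np.array([6.25, 12.5, 18.75, 25.0, 31.25, 37.5])   # mg/cm^2
analyte_band = np.array([0.8, 1.7, 2.4, 3.3, 4.0, 4.9])        # arbitrary units
kscn_band = np.array([1.1, 1.0, 1.05, 0.95, 1.0, 1.02])        # internal standard

ratio = analyte_band / kscn_band                                # normalised response
slope, intercept = np.polyfit(loadings, ratio, 1)
r2 = np.corrcoef(loadings, ratio)[0, 1] ** 2
print(f"calibration: ratio = {slope:.4f} * loading + {intercept:.4f}, R^2 = {r2:.3f}")

# Quantify an unknown from its normalised band ratio
unknown_ratio = 2.9 / 1.01
print("estimated loading (mg/cm^2):", (unknown_ratio - intercept) / slope)
```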

  16. Implementation of quantitative methods of analysis of the image quality in mammography

    International Nuclear Information System (INIS)

    Ruiz Rodriguez, Ana Mariela

    2012-01-01

    A proper diagnosis from a mammogram requires optimal contrast and maximum resolution, which make it possible to visualize structures with low contrast and diameters of the order of millimeters. Evaluation of image quality makes it possible to characterize the physical properties of the image acquisition system; such assessment is one of the cornerstones of a quality control program. The image quality of the ACR phantom was evaluated with the IAEA TECDOC 1517 protocol. Analytical methods were used to determine quality parameters of the TOR MAS phantom image. A method was developed to analyze the image quality of both mammographic phantoms by means of computational techniques with the ImageJ program; in doing so, more information was obtained to objectively characterize the image quality of the phantoms. Furthermore, these parameters will serve to compare the images obtained at different mammographic centers of the Caja Costarricense de Seguro Social (CCSS) and, eventually, the time evolution in a particular installation. (author) [es

  17. Comparison of reverse transcription-quantitative polymerase chain reaction methods and platforms for single cell gene expression analysis.

    Science.gov (United States)

    Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A

    2012-08-15

    Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Quantitative analysis of concrete using portable x-ray fluorescence: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Washington, Aaron L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Narrows, William [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Christian, Jonathan H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Msgwood, Leroy [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-07-27

    During Decommissioning and Demolition (D&D) activities at SRS, it is important that buildings be screened for radionuclides and heavy metals to ensure that the proper safety and disposal metrics are in place. A major source of contamination at DOE facilities is the accumulation of mercury from nuclear material processing and the Liquid Waste System (LWS). This buildup of mercury could harm the demolition crew or the environment should the material be released. The current standard method is to take core samples at various places in the facility and use X-ray fluorescence (XRF) to detect the contamination. This standard method carries a high financial cost due to the security levels of these facilities with unknown contamination levels. Herein we propose the use of portable XRF units to detect this contamination on-site. To validate this method, the instrument has to be calibrated to detect the heavy metal contamination, has to be both precise with respect to the known elemental concentrations and consistent in its results for sample concrete and a pristine contaminant, and has to be able to detect changes in the sample concrete's composition. After receiving the various concrete samples with their compositions determined by a wavelength-dispersive XRF method, the linear regressions of the calibration factors were adjusted to give the baseline concentration of the concrete with no contamination. Samples of both concrete and concrete/flyash were evaluated; their standard deviations revealed that the measurements were consistent with the known compositions. Finally, the samples were contaminated with different concentrations of sodium tungstate dihydrate, allowed to air dry, and measured. When the contaminated samples were analyzed, the heavy metal contamination was visible within the spectrum of the instrument, but there was no quantifiable trend based on the concentration of the solution.

  19. Preparation of Biological Samples Containing Metoprolol and Bisoprolol for Applying Methods for Quantitative Analysis

    OpenAIRE

    Corina Mahu Ştefania; Monica Hăncianu; Luminiţa Agoroaei; Anda Cristina Coman Băbuşanu; Elena Butnaru

    2015-01-01

    Arterial hypertension is a complex disease with many serious complications, representing a leading cause of mortality. Selective beta-blockers such as metoprolol and bisoprolol are frequently used in the management of hypertension. Numerous analytical methods have been developed for the determination of these substances in biological fluids, such as liquid chromatography coupled with mass spectrometry, gas chromatography coupled with mass spectrometry, high performance liquid chromatography. ...

  20. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    A description of the most frequently used approaches for human reliability assessment is given. The relation between different human factor causes for human-induced events in Kozloduy NPP during the period 2000 - 2003 is discussed. A comparison between the contributions of the causal factors to event occurrences in Kozloduy NPP and a Japanese NPP is presented. It can be concluded that for both NPPs the most important causal factors are: 1) written procedures and documents; 2) man-machine interface; 3) environmental working conditions; 4) working practice; 5) training and qualification; 6) supervising methods

  1. Quantitative Analysis of the Waterline Method for Topographical Mapping of Tidal Flats: A Case Study in the Dongsha Sandbank, China

    Directory of Open Access Journals (Sweden)

    Yongxue Liu

    2013-11-01

    Full Text Available Although the topography of tidal flats is important for understanding their evolution, the spatial and temporal sampling frequency of such data remains limited. The waterline method has the potential to retrieve past tidal flat topography by utilizing large archives of satellite images. This study performs a quantitative analysis of the relationship between the accuracy of tidal flat digital elevation models (DEMs) that are based on the waterline method and the factors that influence the DEMs. The three major conclusions of the study are as follows: (1) the coverage rate of the waterline points and the number of satellite images used to create the DEM are highly linearly correlated with the error of the resultant DEMs, and the former is more significant in indicating the accuracy of the resultant DEMs than the latter; (2) both the area and the slope of the tidal flats are linearly correlated with the error of the resultant DEMs; and (3) the availability analysis of the archived satellite images indicates that the waterline method can retrieve tidal flat terrains from the past forty years. The upper limit of the temporal resolution of the tidal flat DEM can be refined to within one year since 1993, to half a year since 2004 and to three months since 2009.

  2. Fundamental parameters method for quantitative energy dispersive x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Demirel, H.; Zararsiz, A.

    1986-01-01

    In this study, the requirement for a standard material in photon-excited energy dispersive X-ray fluorescence analysis has been removed. The interaction of X-rays with matter has been taken into account. A computer program has been developed using the fundamental parameters of the X-ray fluorescence technique, and the spectral intensity 'K' of pure elements at saturation thickness has been obtained. For experimental purposes, a convenient source-target-detector geometry has been designed. In order to excite the samples, a Cd-109 radioisotope source has been used. The peak intensities have been obtained in a vacuum chamber by counting the emitted X-rays. The concentrations have been calculated for two-component mixed samples, correcting for the effects of absorption and enhancement. The results were in conformity with their certified values. (author)

  3. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used were demonstrated in the form of frequency distribution curves, box-plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference was observed between OSEM and UHD reconstruction for the SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL and their normalized values were, on average, up to 60 % higher after UHD reconstruction as compared to OSEM reconstruction. OSEM and UHD reconstructions produced significantly different SUV and SUL values, and the differences remained consistently large after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)
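
    The comparison described above, liver-normalised lesion uptake from two reconstructions assessed with a Bland-Altman analysis, can be sketched in a few lines of NumPy. The values below are synthetic placeholders, not the study's patient data.

```python
# Sketch: liver-normalised lesion uptake for two reconstructions and a
# Bland-Altman comparison. All values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 40
suv_osem = rng.uniform(3.0, 12.0, n)               # lesion SUV, OSEM reconstruction
suv_uhd = suv_osem * rng.uniform(1.2, 1.6, n)      # UHD tends to read higher
liver_osem = rng.uniform(1.8, 2.6, n)
liver_uhd = liver_osem * rng.uniform(0.95, 1.05, n)

ratio_osem = suv_osem / liver_osem                 # liver-normalised uptake
ratio_uhd = suv_uhd / liver_uhd

diff = ratio_uhd - ratio_osem
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                      # limits of agreement
print(f"Bland-Altman bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")
```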

  4. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
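
    The paper's exact estimator is not reproduced in the abstract; as a loose illustration of the general idea, a locus-level linkage statistic can be shrunk towards a precision-weighted summary of statistics from other scans, with a between-study heterogeneity term inflating the prior variance. The Python sketch below uses hypothetical numbers and a deliberately simplified update rule.

```python
# Illustrative sketch only: a generic precision-weighted (empirical-Bayes-style)
# update of a local linkage statistic using other scans as a prior.
# The actual estimator of the paper is more elaborate; all values are hypothetical.
import numpy as np

z_own = 2.8                             # linkage statistic at a locus from our scan
var_own = 1.0                           # its sampling variance
z_prior = np.array([1.9, 2.4, 1.2])     # statistics at the same region from other scans
var_prior = np.array([1.3, 1.1, 1.5])

prior_mean = np.average(z_prior, weights=1 / var_prior)
# crude prior variance: pooled sampling variance plus between-study heterogeneity
prior_var = 1 / np.sum(1 / var_prior) + np.var(z_prior, ddof=1)

w = (1 / var_own) / (1 / var_own + 1 / prior_var)
z_updated = w * z_own + (1 - w) * prior_mean
print(f"updated linkage statistic: {z_updated:.2f} (weight on own scan {w:.2f})")
```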

  6. A quantitative validated method using liquid chromatography and chemometric analysis for evaluation of raw material of Maytenus ilicifolia (Schrad.) Planch., Celastraceae

    OpenAIRE

    Beltrame, Flávio Luís; Mainardes, Rubiana Mara; Khalil, Najeh Maissar; Prestes, Rosilene Aparecida; Nogueira, Alessandro; Demiate, Ivo Mottin; Cass, Quezia Bezerra

    2012-01-01

    The hydroalcoholic extracts prepared from standard leaves of Maytenus ilicifolia and commercial samples of espinheira-santa were evaluated qualitatively (fingerprinting) and quantitatively. In this paper, fingerprinting chromatogram coupled with Principal Component Analysis (PCA) is described for the metabolomic analysis of standard and commercial espinheira-santa samples. The epicatechin standard was used as an external standard for the development and validation of a quantitative method for...

  7. Qualitative and quantitative chemical investigation of orthopedic alloys by combining wet digestion, spectro analytical methods and direct solid analysis

    Energy Technology Data Exchange (ETDEWEB)

    Figueiredo, Caio M.; Castro, Jeyne P.; Sperança, Marco A.; Fialho, Lucimar L.; Nóbrega, Joaquim A.; Pereira-Filho, Edenir R., E-mail: erpf@ufscar.br [Universidade Federal de São Carlos (GAIA/UFSCar), SP (Brazil). Grupo de Análise Instrumental Aplicada

    2018-05-01

    In this study, two laser-based techniques, laser-induced breakdown spectroscopy (LIBS) and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) were used for analytical signal evaluation of Ti, Al, and V and investigation of possible harmful elements eventually present as minor elements in Ti alloys. Due to the lack of certified reference materials, samples were also analyzed by wavelength dispersive X-ray fluorescence (WDXRF) and inductively coupled plasma optical emission spectrometry (ICP OES) after microwave-assisted digestion. To maximize the efficiency of LIBS and LA-ICP-MS, operational conditions were adjusted aiming to find optimal analytical performance. LIBS showed several Ti emission lines and few signals for Al and V. LA-ICP-MS was able to detect all three major constituents. For quantitative analysis, the correlation of intensity signals from LIBS analysis with reference values obtained by ICP OES was not successful, showing that there are still difficulties for quantification using solid samples. Measurements using ICP OES showed that additionally to major constituents, only Fe was present in concentrations around 0.2%. Analysis by WDXRF confirmed the presence of Fe. Results using both methods, i.e., ICP OES and WDXRF, were in good agreement. (author)

  8. Handling large numbers of observation units in three-way methods for the analysis of qualitative and quantitative two-way data

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Marchetti, G.M.

    1994-01-01

    Recently, a number of methods have been proposed for the exploratory analysis of mixtures of qualitative and quantitative variables. In these methods for each variable an object by object similarity matrix is constructed, and these are consequently analyzed by means of three-way methods like

  9. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    Science.gov (United States)

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy giving them hands on experience using quantitative methods of…

  10. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  11. A quantitative validated method using liquid chromatography and chemometric analysis for evaluation of raw material of Maytenus ilicifolia (Schrad.) Planch., Celastraceae

    Directory of Open Access Journals (Sweden)

    Flávio Luís Beltrame

    2012-01-01

    Full Text Available The hydroalcoholic extracts prepared from standard leaves of Maytenus ilicifolia and commercial samples of espinheira-santa were evaluated qualitatively (fingerprinting) and quantitatively. In this paper, fingerprinting chromatogram coupled with Principal Component Analysis (PCA) is described for the metabolomic analysis of standard and commercial espinheira-santa samples. The epicatechin standard was used as an external standard for the development and validation of a quantitative method for the analysis in herbal medicines using a photo diode array detector. This method has been applied for quantification of epicatechin in commercialized herbal medicines sold as espinheira-santa in Brazil and in the standard sample of M. ilicifolia.

  12. High-resolution gas chromatography/mass spectrometry method for characterization and quantitative analysis of ginkgolic acids in Ginkgo biloba plants, extracts, and dietary supplements

    Science.gov (United States)

    A high resolution GC/MS with Selected Ion Monitor (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...

  13. Quantitative phase analysis using the whole-powder-pattern decomposition method. Pt. 1. Solution from knowledge of chemical compositions

    International Nuclear Information System (INIS)

    Toraya, H.; Tusaka, S.

    1995-01-01

    A new procedure for quantitative phase analysis using the whole-powder-pattern decomposition method is proposed. The procedure consists of two steps. In the first, the whole powder patterns of single-component materials are decomposed separately. The refined parameters of integrated intensity, unit cell and profile shape for the respective phases are stored in computer data files. In the second step, the whole powder pattern of a mixture sample is fitted, where the parameters refined in the previous step are used to calculate the profile intensity. The integrated intensity parameters are, however, not varied during the least-squares fitting, while the scale factors for the profile intensities of the individual phases are adjusted instead. Weight fractions are obtained by solving simultaneous equations, coefficients of which include the scale factors and the mass-absorption coefficients calculated from the chemical formulas of the respective phases. The procedure can be applied to all mixture samples, including those containing an amorphous material, if single-component samples with known chemical compositions and their approximate unit-cell parameters are provided. The procedure has been tested by using two- to five-component samples, giving average deviations of 1 to 1.5%. Optimum refinement conditions are discussed in connection with the accuracy of the procedure. (orig.)
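
    One common way such simultaneous equations reduce to weight fractions is to make each phase's fraction proportional to its refined scale factor times its mass-absorption coefficient and then normalise; whether this matches the paper's exact algebra cannot be verified from the abstract alone. The NumPy sketch below uses hypothetical scale factors and coefficients purely for illustration.

```python
# Hedged sketch: turning refined scale factors into weight fractions, assuming
# w_i is proportional to scale_i * (mass-absorption coefficient)_i.
# The numbers below are hypothetical, not taken from the paper.
import numpy as np

scale = np.array([1.00, 0.42, 0.18])        # refined scale factors per phase
mu_mass = np.array([45.3, 77.8, 30.1])      # mass-absorption coefficients (cm^2/g)

w = scale * mu_mass
w /= w.sum()                                 # normalise so the fractions sum to 1
for i, frac in enumerate(w, start=1):
    print(f"phase {i}: {100 * frac:.1f} wt%")
```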

  14. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. Optimization of an Optical Inspection System Based on the Taguchi Method for Quantitative Analysis of Point-of-Care Testing

    Directory of Open Access Journals (Sweden)

    Chia-Hsien Yeh

    2014-09-01

    Full Text Available This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model covering qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors by the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (increase the signal-to-noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control line were caused by the gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line was darkened by increasing the hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the hCG concentration detection increased from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters to decrease the detection limit and to increase the linearity determined by the Taguchi method, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP).
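
    The Taguchi optimisation reported above maximises a signal-to-noise ratio, but the abstract does not state which S/N characteristic was used. The sketch below computes the two most common forms ("larger-the-better" and "nominal-the-best") for a set of hypothetical repeated grayscale readings.

```python
# Sketch of the Taguchi S/N ratios such an optimisation typically maximises.
# The grayscale readings below are hypothetical.
import numpy as np

readings = np.array([182.0, 185.0, 181.0, 184.0])   # repeated grayscale measurements

# "Larger-the-better" S/N (dB)
sn_larger = -10 * np.log10(np.mean(1.0 / readings ** 2))

# "Nominal-the-best" S/N (dB): rewards small variance relative to the mean
sn_nominal = 10 * np.log10(readings.mean() ** 2 / readings.var(ddof=1))

print(f"larger-the-better S/N: {sn_larger:.2f} dB")
print(f"nominal-the-best S/N:  {sn_nominal:.2f} dB")
```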

  16. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
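
    Two sample-normalisation strategies frequently discussed in this context, total-intensity scaling and probabilistic quotient normalisation (PQN), can be sketched as follows. The intensity matrix is synthetic; the review itself covers a broader range of methods.

```python
# Sketch of two common sample-normalisation strategies for metabolomics data.
# The intensity matrix is synthetic: 5 samples x 100 metabolites.
import numpy as np

rng = np.random.default_rng(2)
data = rng.lognormal(mean=2.0, sigma=0.5, size=(5, 100))
data[2] *= 3.0                                             # one sample has 3x total amount

# Total-intensity normalisation
total_norm = data / data.sum(axis=1, keepdims=True)

# Probabilistic quotient normalisation (PQN): divide by the median fold change
# of each sample relative to a reference profile
reference = np.median(data, axis=0)
quotients = data / reference
pqn = data / np.median(quotients, axis=1, keepdims=True)

print("per-sample totals after total-intensity scaling:", total_norm.sum(axis=1))
print("PQN dilution factors:", np.median(quotients, axis=1))
```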

  17. Stability indicating HPLC-DAD method for analysis of Ketorolac binary and ternary mixtures in eye drops: Quantitative analysis in rabbit aqueous humor.

    Science.gov (United States)

    El Yazbi, Fawzy A; Hassan, Ekram M; Khamis, Essam F; Ragab, Marwa A A; Hamdy, Mohamed M A

    2017-11-15

    Ketorolac tromethamine (KTC) with phenylephrine hydrochloride (PHE) binary mixture (mixture 1) and their ternary mixture with chlorpheniramine maleate (CPM) (mixture 2) were analyzed using a validated HPLC-DAD method. The developed method was suitable for in vitro analysis as well as for quantitative analysis of the targeted mixtures in rabbit aqueous humor. The analysis in the dosage form (eye drops) was stability-indicating, with the drugs separated from possible degradation products arising from different stress conditions (in vitro analysis). For analysis in aqueous humor, Guaifenesin (GUF) was used as internal standard and the method was validated according to FDA regulations for analysis in biological fluids. An Agilent 5 HC-C18(2) 150×4.6 mm column was used as the stationary phase with a gradient eluent of 20 mM phosphate buffer (pH 4.6) containing 0.2% triethylamine, and acetonitrile. The drugs were resolved with retention times of 2.41, 5.26, 7.92 and 9.64 min for PHE, GUF, KTC and CPM, respectively. The method was sensitive and selective enough to analyze the three drugs simultaneously in the presence of possible forced degradation products and dosage form excipients (in vitro analysis), and also, together with the internal standard, in the presence of aqueous humor interferences (analysis in biological fluid), at a single wavelength (261 nm). No extraction procedure was required for analysis in aqueous humor. The simplicity of the method emphasizes its capability to analyze the drugs in vivo (in rabbit aqueous humor) and in vitro (in pharmaceutical formulations). Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  19. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  20. Quantitative Concept Analysis

    NARCIS (Netherlands)

    Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas

    2012-01-01

    Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all

  1. Quantitative analysis of surface deformation and ductile flow in complex analogue geodynamic models based on PIV method.

    Science.gov (United States)

    Krýza, Ondřej; Lexa, Ondrej; Závada, Prokop; Schulmann, Karel; Gapais, Denis; Cosgrove, John

    2017-04-01

    PIV (particle image velocimetry) is an optical analysis method widely used in many technical fields where visualization and quantification of material flow are important. Typical examples are studies of liquid flow through complex channel systems, gas spreading, or combustion. In our current research we used this method to investigate two types of complex analogue geodynamic and tectonic experiments. The first class of experiments aims to model large-scale oroclinal buckling as an analogue of the late Paleozoic to early Mesozoic evolution of the Central Asian Orogenic Belt (CAOB), resulting from northward drift of the North China craton towards the Siberian craton. Here we studied the relationship between lower crustal and lithospheric mantle flow and upper crustal deformation. The second class of experiments focuses on a more general study of lower crustal flow in indentation systems, which represent a major component of some large hot orogens (e.g. the Bohemian Massif). Most of the simulations in both cases show that the shape of brittle structures in the upper crust depends strongly on the folding style of the middle and lower ductile layers, which is in turn influenced by the rheological, geometrical and thermal conditions of different parts of the shortened domain. The purpose of the PIV application is to quantify material redistribution in critical domains of the model. The derivation of flow directions and the calculation of strain-rate and total displacement fields in analogue experiments are generally difficult and time-consuming, and are often performed only on the basis of visual evaluation. The PIV method operates on a set of images in which small tracer particles are seeded within the modelled domain and are assumed to faithfully follow the material flow. From the estimated pixel coordinates, the material displacement field, velocity field, strain rate, vorticity, tortuosity etc. are calculated. In our experiments we used velocity field divergence to
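
    The quantification step mentioned above, computing the divergence of the PIV-derived velocity field, can be illustrated with NumPy as below. The velocity field is synthetic; in practice the displacement fields would come from cross-correlation of successive model images.

```python
# Sketch: divergence of a 2D velocity field, as used to quantify material
# redistribution. The field here is synthetic, not PIV output.
import numpy as np

ny, nx, dx = 100, 100, 1.0                      # grid size and spacing (arbitrary units)
y, x = np.mgrid[0:ny, 0:nx] * dx

# Synthetic velocity field with a local source (positive divergence) in the centre
u = 0.01 * (x - nx / 2)                          # x-component
v = 0.01 * (y - ny / 2)                          # y-component

du_dx = np.gradient(u, dx, axis=1)
dv_dy = np.gradient(v, dx, axis=0)
divergence = du_dx + dv_dy

print("mean divergence:", divergence.mean())     # ~0.02 for this synthetic field
```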

  2. Characteristic Chromatogram: A Method of Discriminate and Quantitative Analysis for Quality Evaluation of Uncaria Stem with Hooks.

    Science.gov (United States)

    Hou, Jinjun; Feng, Ruihong; Zhang, Yibei; Pan, Huiqin; Yao, Shuai; Han, Sumei; Feng, Zijin; Cai, Luying; Wu, Wanying; Guo, De-An

    2018-04-01

    It remains a challenge to establish new monographs for herbal drugs derived from multiple botanical sources. Specifically, the difficulty involves discriminating and quantifying these herbs with components whose levels vary markedly among different samples. Using Uncaria stem with hooks as an example, a characteristic chromatogram was proposed to discriminate its five botanical origins and to quantify its characteristic components in the chromatogram. The characteristic chromatogram with respect to the components of Uncaria stem with hooks with the five botanical origins was established using 0.02% diethylamine and acetonitrile as the mobile phase. The total analysis time was 50 min and the detection wavelength was 245 nm. Using the same chromatogram parameters, the single standard to determine multicomponents method was validated to simultaneously quantify nine indole alkaloids, including vincosamide, 3α-dihydrocadambine, isocorynoxeine, corynoxeine, isorhynchophylline, rhynchophylline, hirsuteine, hirsutine, and geissoschizine methyl ether. The results showed that only the Uncaria stem with hooks from Uncaria rhynchophylla, the most widely used in the herbal market, showed the presence of these nine alkaloids. The conversion factors were 1.27, 2.32, 0.98, 1.04, 1.00, 1.02, 1.26, 1.33, and 1.25, respectively. The limits of quantitation were lower than 700 ng/mL. The total contents of 31 batches of Uncaria stem with hooks were in the range of 0.1-0.6%, except for Uncaria hirsuta Havil and Uncaria sinensis (Oliv.) Havil. The results also showed that the total content of indole alkaloids tended to decrease with an increase in the hook diameter. This showed that the characteristic chromatogram is practical for controlling the quality of traditional Chinese medicines with multiple botanical origins. Georg Thieme Verlag KG Stuttgart · New York.

  3. LC-MS/MS method development for quantitative analysis of acetaminophen uptake by the aquatic fungus Mucor hiemalis.

    Science.gov (United States)

    Esterhuizen-Londt, Maranda; Schwartz, Katrin; Balsano, Evelyn; Kühn, Sandra; Pflugmacher, Stephan

    2016-06-01

    Acetaminophen is a pharmaceutical frequently found in surface water as a contaminant. Bioremediation, in particular mycoremediation, of acetaminophen is a method to remove this compound from waters. Owing to the lack of a quantitative analytical method for acetaminophen in aquatic organisms, the present study aimed to develop a method for the determination of acetaminophen using LC-MS/MS in the aquatic fungus Mucor hiemalis. The method was then applied to evaluate the uptake of acetaminophen by M. hiemalis, cultured in pellet morphology. The method was robust, sensitive and reproducible, with a lower limit of quantification of 5 pg acetaminophen on column. It was found that M. hiemalis internalizes the pharmaceutical and bioaccumulates it with time. Therefore, M. hiemalis was deemed a suitable candidate for further studies to elucidate its pharmaceutical tolerance and longevity in mycoremediation applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. THE DEVELOPMENT OF METHOD FOR MINT AND TURMERIC ESSENTIAL OILS IDENTIFICATION AND QUANTITATIVE ANALYSIS IN COMPLEX DRUG

    Directory of Open Access Journals (Sweden)

    O. G. Smalyuh

    2015-04-01

    Full Text Available The aim of our study was to develop a method for the identification and assay of essential oils of mint and turmeric in a complex medicinal product in capsule form. Materials and methods. The paper used samples of turmeric and mint essential oils and of the complex drug, in the form of capsules containing oil of peppermint, oil of Curcuma longa, and a mixture of extracts of sandy everlasting (Helichrysum arenarium (L.) Moench), marigold (Calendula officinalis L.), wild carrot (Daucus carota) and turmeric (Curcuma longa). Results and discussion. The complex drug contains dry extract of sandy everlasting flowers, extracts of wild carrot and of marigold flowers and fruits, dry extract of Curcuma longa, and essential oils of peppermint and turmeric. Based on studies of different samples of peppermint oil, and given the need for its identification and quantification in the finished medicinal product, menthol was chosen as the analytical marker. In order to establish the identity of the complex drug, its main components - ar-turmerone and α- and β-turmerone - and their total content must meet the quantitative indicator for turmerone content in the specifications for turmeric oil. Studies of the sample preparation conditions led us to propose 96% ethanol to extract the oil components from the sample, with ultrasonication and centrifugation to improve recovery from the capsule mass. Chromatographic characteristics of the substances were obtained on an Agilent HP-Innowax column. It was established that the other active pharmaceutical ingredients of the capsule (placebo) did not affect the quantification of the components of the essential oils of mint and turmeric. Conclusions. 1. Chromatographic conditions for the identification and assay of essential oils of mint and turmeric in a complex drug, and optimal conditions for sample preparation and analysis by gas chromatography, have been studied. 2. Methods for the identification and assay of menthol and α-, β- and ar-turmerone in the complex drug based on

  5. Improving Student Understanding of Qualitative and Quantitative Analysis via GC/MS Using a Rapid SPME-Based Method for Determination of Trihalomethanes in Drinking Water

    Science.gov (United States)

    Huang, Shu Rong; Palmer, Peter T.

    2017-01-01

    This paper describes a method for determination of trihalomethanes (THMs) in drinking water via solid-phase microextraction (SPME) GC/MS as a means to develop and improve student understanding of the use of GC/MS for qualitative and quantitative analysis. In the classroom, students are introduced to SPME, GC/MS instrumentation, and the use of MS…

  6. Development and interlaboratory validation of quantitative polymerase chain reaction method for screening analysis of genetically modified soybeans.

    Science.gov (United States)

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2013-01-01

    A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We then designed a screening method comprised of the combination of the quantification of P35S and the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMO.
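
    As a hedged illustration of how a conversion factor (Cf) is typically applied in qPCR-based GMO screening (the paper's exact formula is not given in the abstract), the copy-number ratio of the GM target to an endogenous reference gene is divided by the Cf determined on pure GM material to obtain a weight ratio:

```python
# Hedged sketch of applying a conversion factor (Cf) in qPCR-based GMO screening.
# All copy numbers and the Cf value below are hypothetical.
gm_target_copies = 1.2e3        # e.g. P35S or event-specific target
reference_gene_copies = 4.0e4   # endogenous soybean reference gene
cf = 0.45                       # copy-number ratio measured for pure GM seed

copy_ratio = gm_target_copies / reference_gene_copies
gmo_weight_percent = copy_ratio / cf * 100
print(f"estimated GMO content: {gmo_weight_percent:.2f} % (w/w)")
```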

  7. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    Science.gov (United States)

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal referring substances to investigate the influences of chemical structure, concentrations of the quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (linear regression method), which was shown to decrease the deviations of the QAMS method from the external standard method from 1.20%±0.02% - 23.29%±3.23% to 0.10%±0.09% - 8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples. The differences were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K (the differences for these 4 constituents are larger because their contents are lower) in all 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. The present study demonstrated the practicability of the application of the QAMS method for the quantitative analysis of multi
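
    The linear-regression route to relative correction factors can be sketched as follows: each analyte's calibration slope is estimated by regression, the correction factor is the ratio of the internal marker's slope to the analyte's slope, and the analyte is then quantified from the marker's calibration alone. The calibration data in this Python sketch are hypothetical, not the paper's values.

```python
# Sketch of QAMS with relative correction factors obtained by linear regression.
# Calibration data are hypothetical; Rg1 is used here only as a notional marker.
import numpy as np

conc = np.array([10., 25., 50., 100., 200.])              # µg/mL
area_marker = np.array([105., 262., 521., 1043., 2088.])   # internal marker (e.g. Rg1)
area_analyte = np.array([80., 198., 402., 801., 1597.])    # another saponin

slope_marker = np.polyfit(conc, area_marker, 1)[0]
slope_analyte = np.polyfit(conc, area_analyte, 1)[0]
f_rel = slope_marker / slope_analyte                        # relative correction factor

# Quantify the analyte in a sample using only the marker's calibration
area_analyte_sample = 650.0
conc_analyte = f_rel * area_analyte_sample / slope_marker
print(f"relative correction factor: {f_rel:.3f}; estimated analyte conc: {conc_analyte:.1f} µg/mL")
```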

  8. Reported prevalence and quantitative LC-MS methods for the analysis of veterinary drug residues in honey: a review.

    Science.gov (United States)

    Venable, Ryan; Haynes, Carion; Cook, Jo Marie

    2014-04-01

    Insect pollination increases the value and productivity of three-quarters of crop species grown for food. Declining beehive health in commercial apiaries has resulted in numerous reports from government laboratories worldwide of contamination with antimicrobial chemicals in honey. This review includes pertinent discussion of legislation and events leading to increased government oversight in the commercial honey market. A detailed summary of the variety and prevalence of veterinary drug residues being found in honey as well as a selection of robust quantitative and confirmatory LC-MS methods with an emphasis on those adopted by government testing laboratories are presented.

  9. Validation of high-performance liquid chromatography (HPLC) method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    Full Text Available A high-performance liquid chromatography method is described for the quantitative determination and validation of histamine in fish and fishery product samples. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid, and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validated parameters are in good agreement with the method requirements, and it is a useful tool for determining histamine in fish and fishery products.
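
    The calibration-based figures of merit mentioned above (linearity, limit of detection, limit of quantification) are commonly estimated from the regression as 3.3·σ/slope and 10·σ/slope. The sketch below applies these textbook definitions to hypothetical peak areas; it is not the validation data of the paper.

```python
# Sketch: linear calibration with LOD/LOQ from the regression residuals.
# Concentrations and peak areas are hypothetical.
import numpy as np

conc = np.array([10., 25., 50., 100., 200.])             # histamine standards (mg/kg)
area = np.array([15.2, 37.8, 76.1, 151.9, 303.5])         # HPLC peak areas

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                             # std error of the regression

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
r2 = np.corrcoef(conc, area)[0, 1] ** 2
print(f"slope {slope:.3f}, R^2 {r2:.4f}, LOD {lod:.2f} mg/kg, LOQ {loq:.2f} mg/kg")
```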

  10. Texture analysis of pulmonary parenchymateous changes related to pulmonary thromboembolism in dogs - a novel approach using quantitative methods

    DEFF Research Database (Denmark)

    Marschner, Clara Büchner; Kokla, Marietta; Amigo Rubio, Jose Manuel

    2017-01-01

    include dual energy computed tomography (DECT) as well as computer assisted diagnosis (CAD) techniques. The purpose of this study was to investigate the performance of quantitative texture analysis for detecting dogs with PTE using grey-level co-occurrence matrices (GLCM) and multivariate statistical...... classification analyses. CT images from healthy (n = 6) and diseased (n = 29) dogs with and without PTE confirmed on CTPA were segmented so that only tissue with CT numbers between −1024 and −250 Houndsfield Units (HU) was preserved. GLCM analysis and subsequent multivariate classification analyses were...... using GLCM is an effective tool for distinguishing healthy from abnormal lung. Furthermore the texture of pulmonary parenchyma in dogs with PTE is altered, when compared to the texture of pulmonary parenchyma of healthy dogs. The models’ poorer performance in classifying dogs within the diseased group...
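
    A minimal illustration of the grey-level co-occurrence matrix (GLCM) texture features underlying such an analysis is given below, computed with plain NumPy on a synthetic patch; a real workflow would first rescale the segmented CT numbers to a small number of grey levels.

```python
# Sketch of a grey-level co-occurrence matrix (GLCM) and two Haralick-style
# texture features. The patch is synthetic, not lung CT data.
import numpy as np

def glcm(patch, levels, dx=1, dy=0):
    """Symmetric, normalised GLCM for one pixel offset."""
    m = np.zeros((levels, levels))
    h, w = patch.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = patch[y, x], patch[y + dy, x + dx]
            m[i, j] += 1
            m[j, i] += 1
    return m / m.sum()

rng = np.random.default_rng(3)
patch = rng.integers(0, 8, size=(32, 32))       # 8 grey levels

p = glcm(patch, levels=8)
i, j = np.indices(p.shape)
contrast = np.sum(p * (i - j) ** 2)
homogeneity = np.sum(p / (1 + np.abs(i - j)))
print(f"contrast {contrast:.3f}, homogeneity {homogeneity:.3f}")
```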

  11. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  12. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied for the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained demonstrate the usefulness of these methods, but their accuracy still needs to be improved. (Author) [pt

  13. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    However, no reliable method for quantitative analysis of clay minerals has been established so far. The mineralogical constitution of soil is rather complex. In this study, an attempt was made to propose an optimization method for the quantitative calculation of clay minerals in soil, using K2O, MgO, and TFe as variables for the calculation.

  14. Quantitative analysis and efficiency study of PSD methods for a LaBr3:Ce detector

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ming; Cang, Jirong [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Zeng, Zhi, E-mail: zengzhi@tsinghua.edu.cn [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2016-03-21

    The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its optimal energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr3:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient potential method to suppress the alpha background from 227Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr3:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using the equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application in low-activity high-energy γ measurement by suppressing the alpha background.
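
    The charge comparison method analysed in the paper reduces each pulse to a tail-to-total charge ratio. The sketch below builds toy two-component pulses and computes that ratio; the decay constants, tail window and slow-component fractions are illustrative placeholders, not measured LaBr3:Ce values.

```python
# Sketch of the charge comparison method (CCM): tail charge / total charge.
# Pulse shapes and parameters are toy placeholders.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0, 300)                                   # sample index (e.g. ns)

def pulse(decay_fast, decay_slow, slow_fraction):
    """Toy two-component scintillation pulse plus noise."""
    p = (1 - slow_fraction) * np.exp(-t / decay_fast) + slow_fraction * np.exp(-t / decay_slow)
    return p + rng.normal(0, 0.002, t.size)

gamma_like = pulse(16.0, 60.0, 0.05)
alpha_like = pulse(16.0, 60.0, 0.15)                    # relatively more slow component

def ccm_ratio(p, tail_start=40):
    return p[tail_start:].sum() / p.sum()               # tail charge / total charge

print("gamma-like CCM ratio:", round(ccm_ratio(gamma_like), 3))
print("alpha-like CCM ratio:", round(ccm_ratio(alpha_like), 3))
```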

  15. Quantitative fluorescence kinetic analysis of NADH and FAD in human plasma using three- and four-way calibration methods capable of providing the second-order advantage

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chao [School of Chemistry and Chemical Engineering, Guizhou University, Guiyang 550025 (China); Wu, Hai-Long, E-mail: hlwu@hnu.edu.cn [State Key Laboratory of Chemo/Biosensing and Chemometrics, College of Chemistry and Chemical Engineering, Hunan University, Changsha 410082 (China); Zhou, Chang; Xiang, Shou-Xia; Zhang, Xiao-Hua; Yu, Yong-Jie; Yu, Ru-Qin [State Key Laboratory of Chemo/Biosensing and Chemometrics, College of Chemistry and Chemical Engineering, Hunan University, Changsha 410082 (China)

    2016-03-03

    The metabolic coenzymes reduced nicotinamide adenine dinucleotide (NADH) and flavin adenine dinucleotide (FAD) are the primary electron donor and acceptor, respectively, and participate in almost all biological metabolic pathways. This study develops a novel method for the quantitative kinetic analysis of the degradation reaction of NADH and the formation reaction of FAD in human plasma containing an uncalibrated interferent, by using three-way calibration based on a multi-way fluorescence technique. In the three-way analysis, by using the calibration set in a static manner, we directly predicted the concentrations of both analytes in the mixture at any time after the start of their reactions, even in the presence of an uncalibrated spectral interferent and a varying background interferent. The satisfactory quantitative results indicate that the proposed method allows one to directly monitor the concentration of each analyte in the mixture as a function of time in real-time and nondestructively, instead of determining the concentration after analytical separation. Thereafter, we fitted the first-order rate law to their concentration data throughout their reactions. Additionally, a four-way calibration procedure is developed as an alternative for highly collinear systems. The results of the four-way analysis confirmed the results of the three-way analysis and revealed that both the degradation reaction of NADH and the formation reaction of FAD in human plasma fit the first-order rate law. The proposed methods could be expected to provide promising tools for simultaneous kinetic analysis of multiple reactions in complex systems in real-time and nondestructively. - Highlights: • A novel three-way calibration method for the quantitative kinetic analysis of NADH and FAD in human plasma is proposed. • The method can directly monitor the concentration of each analyte in the reaction in real-time and nondestructively. • The method has the second-order advantage. • A
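
    Once the chemometric calibration yields concentration-time data, fitting the first-order rate law C(t) = C0·exp(-kt) reduces to a linear fit of ln C against t. The sketch below does this for synthetic data; the rate constants reported in the paper are not reproduced here.

```python
# Sketch: fitting the first-order rate law to concentration-time data.
# The data below are synthetic, not the study's predicted concentrations.
import numpy as np

t = np.array([0., 5., 10., 20., 30., 45.])               # minutes
c = np.array([100., 82., 66., 44., 30., 16.])             # predicted concentration (a.u.)

slope, intercept = np.polyfit(t, np.log(c), 1)            # ln C = ln C0 - k t
k, c0 = -slope, np.exp(intercept)
half_life = np.log(2) / k
print(f"rate constant k = {k:.4f} 1/min, C0 = {c0:.1f}, half-life = {half_life:.1f} min")
```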

  16. Quantitative fluorescence kinetic analysis of NADH and FAD in human plasma using three- and four-way calibration methods capable of providing the second-order advantage

    International Nuclear Information System (INIS)

    Kang, Chao; Wu, Hai-Long; Zhou, Chang; Xiang, Shou-Xia; Zhang, Xiao-Hua; Yu, Yong-Jie; Yu, Ru-Qin

    2016-01-01

    The metabolic coenzymes reduced nicotinamide adenine dinucleotide (NADH) and flavin adenine dinucleotide (FAD) are the primary electron donor and acceptor, respectively, and participate in almost all biological metabolic pathways. This study develops a novel method for the quantitative kinetic analysis of the degradation reaction of NADH and the formation reaction of FAD in human plasma containing an uncalibrated interferent, by using three-way calibration based on a multi-way fluorescence technique. In the three-way analysis, by using the calibration set in a static manner, we directly predicted the concentrations of both analytes in the mixture at any time after the start of their reactions, even in the presence of an uncalibrated spectral interferent and a varying background interferent. The satisfactory quantitative results indicate that the proposed method allows one to directly monitor the concentration of each analyte in the mixture as a function of time in real-time and nondestructively, instead of determining the concentration after analytical separation. Thereafter, we fitted the first-order rate law to their concentration data throughout their reactions. Additionally, a four-way calibration procedure is developed as an alternative for highly collinear systems. The results of the four-way analysis confirmed the results of the three-way analysis and revealed that both the degradation reaction of NADH and the formation reaction of FAD in human plasma fit the first-order rate law. The proposed methods could be expected to provide promising tools for simultaneous kinetic analysis of multiple reactions in complex systems in real-time and nondestructively. - Highlights: • A novel three-way calibration method for the quantitative kinetic analysis of NADH and FAD in human plasma is proposed. • The method can directly monitor the concentration of each analyte in the reaction in real-time and nondestructively. • The method has the second-order advantage. • A

  17. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other forms of approaching knowledge. Aim To discuss the complementarities of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  18. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes

  19. The analysis of quantitative methods for renewable fuel processes and lubricant of materials derived from plastic waste

    Science.gov (United States)

    Rajagukguk, J. R.

    2018-01-01

    Plastic has become an important component of modern life today. It has replaced wood and metal in many roles, given advantages such as being light and strong, corrosion resistant, transparent, easy to color, and a good insulator. The research uses quantitative and engineering research methods. The research objective is to convert plastic waste into something more economical and to preserve the surrounding environment. The renewable fuel and lubricant variables simultaneously have a significant influence on the sustainable environment: Fh > Ft (62.101 > 4.737) with a significance of 0.000, and the correlation coefficient of 0.941 (94.1%) indicates a very strong relationship between the renewable fuel and lubricant variables and the sustainable environment. Plastic waste processed by the pyrolysis method produces liquid hydrocarbons containing compounds similar to crude oil, and the renewable fuels obtained from the calculations are CO2 + H2O + C1-C4 + residual substances. The plastic waste can then be processed by an isomerization process with a catalyst into lubricating oil, and the result of the chemical calculation obtained is CO2, H2O, C18H21 and the rest.

  20. Qualitative and quantitative analysis of specific polysaccharides in Dendrobium huoshanense by using saccharide mapping and chromatographic methods.

    Science.gov (United States)

    Deng, Yong; Chen, Ling-Xiao; Han, Bang-Xing; Wu, Ding-Tao; Cheong, Kit-Leong; Chen, Nai-Fu; Zhao, Jing; Li, Shao-Ping

    2016-09-10

    Qualitative and quantitative analysis of specific polysaccharides from ten batches of Dendrobium huoshanense was performed using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detection (HPSEC-MALLS-RID), gas chromatography-mass spectrometry (GC-MS), nuclear magnetic resonance (NMR) and saccharide mapping based on polysaccharide analysis by carbohydrate gel electrophoresis (PACE) and high performance thin layer chromatography (HPTLC). Results showed that the molecular weights, radii of gyration, and contents of specific polysaccharides in D. huoshanense ranged from 1.16×10^5 to 2.17×10^5 Da, 38.8 to 52.1 nm, and 9.9% to 19.9%, respectively. Furthermore, the main monosaccharide compositions were Man and Glc. Indeed, the main glycosidic linkages were β-1,4-Manp and β-1,4-Glcp, substituted with acetyl groups at O-2 and O-3 of the 1,4-linked Manp. Moreover, results showed that the PACE and HPTLC fingerprints of partial acidic and enzymatic hydrolysates of the specific polysaccharides were similar, which is helpful for better understanding the specific polysaccharides in D. huoshanense and beneficial for improving their quality control. These approaches could also be routinely used for quality control of polysaccharides in other medicinal plants. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In the last years the autoradiography has been developed to a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of irradiation quality, of backscattering in sample and detector materials, and of sensitivity and fading of the detectors are considered. Furthermore, questions of quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  2. A simultaneous screening and quantitative method for the multiresidue analysis of pesticides in spices using ultra-high performance liquid chromatography-high resolution (Orbitrap) mass spectrometry.

    Science.gov (United States)

    Goon, Arnab; Khan, Zareen; Oulkar, Dasharath; Shinde, Raviraj; Gaikwad, Suresh; Banerjee, Kaushik

    2018-01-12

    A novel screening and quantitation method is reported for non-target multiresidue analysis of pesticides using ultra-HPLC-quadrupole-Orbitrap mass spectrometry in spice matrices, including black pepper, cardamom, chili, coriander, cumin, and turmeric. The method involved sequential full-scan (resolution = 70,000), and variable data independent acquisition (vDIA) with nine consecutive fragmentation events (resolution = 17,500). Samples were extracted by the QuEChERS method. The introduction of an SPE-based clean-up step through hydrophilic-lipophilic-balance (HLB) cartridges proved advantageous in minimizing the false negatives. For coriander, cumin, chili, and cardamom, the screening detection limit was largely at 2 ng/g, while it was 5 ng/g for black pepper, and turmeric. When the method was quantitatively validated for 199 pesticides, the limit of quantification (LOQ) was mostly at 10 ng/g (excluding black pepper, and turmeric with LOQ = 20 ng/g) with recoveries within 70-120%, and precision-RSDs <20%. Furthermore, the method allowed the identification of suspected non-target analytes through retrospective search of the accurate mass of the compound-specific precursor and product ions. Compared to LC-MS/MS, the quantitative performance of this Orbitrap-MS method had agreements in residue values between 78-100%. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically through quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  4. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  5. Comparing Online with Brick and Mortar Course Learning Outcomes: An Analysis of Quantitative Methods Curriculum in Public Administration

    Science.gov (United States)

    Harris, Ronald A.; Nikitenko, Gleb O.

    2014-01-01

    Teaching graduate students in an intensive adult-learning format presents a special challenge for quantitative analytical competencies. Students often lack necessary background, skills and motivation to deal with quantitative-skill-based course work. This study compares learning outcomes for graduate students enrolled in three course sections…

  6. Texture analysis of pulmonary parenchymateous changes related to pulmonary thromboembolism in dogs - a novel approach using quantitative methods.

    Science.gov (United States)

    Marschner, C B; Kokla, M; Amigo, J M; Rozanski, E A; Wiinberg, B; McEvoy, F J

    2017-07-11

    Diagnosis of pulmonary thromboembolism (PTE) in dogs relies on computed tomography pulmonary angiography (CTPA), but detailed interpretation of CTPA images is demanding for the radiologist and only large vessels may be evaluated. New approaches for better detection of smaller thrombi include dual energy computed tomography (DECT) as well as computer assisted diagnosis (CAD) techniques. The purpose of this study was to investigate the performance of quantitative texture analysis for detecting dogs with PTE, using grey-level co-occurrence matrices (GLCM) and multivariate statistical classification analyses. CT images from healthy (n = 6) and diseased (n = 29) dogs, with and without PTE confirmed on CTPA, were segmented so that only tissue with CT numbers between -1024 and -250 Hounsfield units (HU) was preserved. GLCM analysis and subsequent multivariate classification analyses were performed on texture parameters extracted from these images. Leave-one-dog-out cross validation and receiver operating characteristic (ROC) analysis showed that the models generated from the texture analysis were able to predict healthy dogs with optimal levels of performance. Partial Least Squares Discriminant Analysis (PLS-DA) obtained a sensitivity of 94% and a specificity of 96%, while Support Vector Machines (SVM) yielded a sensitivity of 99% and a specificity of 100%. The models, however, performed worse in classifying the type of disease in the diseased dog group: in diseased dogs with PTE, sensitivities were 30% (PLS-DA) and 38% (SVM), and specificities were 80% (PLS-DA) and 89% (SVM). In diseased dogs without PTE, the sensitivities of the models were 59% (PLS-DA) and 79% (SVM) and specificities were 79% (PLS-DA) and 82% (SVM). The results indicate that texture analysis of CTPA images using GLCM is an effective tool for distinguishing healthy from abnormal lung. Furthermore, the texture of pulmonary parenchyma in dogs with PTE is altered, when compared to the texture of pulmonary parenchyma
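
    The following sketch illustrates, under stated assumptions, the general GLCM-plus-classifier workflow named above (it is not the authors' code): grey-level co-occurrence features are extracted with a recent scikit-image and fed to a support vector machine evaluated by leave-one-out cross-validation; the images, labels and parameter choices are placeholders.

```python
# Illustrative sketch of GLCM texture features feeding an SVM classifier,
# in the spirit of the study above. Images and labels are random placeholders.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

def glcm_features(img_uint8):
    """Extract a few common GLCM texture descriptors from an 8-bit image."""
    glcm = graycomatrix(img_uint8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# hypothetical data: segmented lung-CT slices and per-dog labels (0 healthy, 1 diseased)
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(10)]
labels = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

X = np.vstack([glcm_features(im) for im in images])
scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```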

  7. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  8. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Full Text Available Biological samples present a range of complexities from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well-suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  9. Quantitative analysis of active compounds in pharmaceutical preparations by use of attenuated total-reflection Fourier transform mid-infrared spectrophotometry and the internal standard method.

    Science.gov (United States)

    Sastre Toraño, J; van Hattum, S H

    2001-10-01

    A new method is presented for the quantitative analysis of compounds in pharmaceutical preparations by Fourier transform (FT) mid-infrared (MIR) spectroscopy with an attenuated total reflection (ATR) module. Reduction of the number of overlapping absorption bands, by interaction of the compound of interest with an appropriate solvent, and the employment of an internal standard (IS) make MIR suitable for quantitative analysis. Vigabatrin, as the active compound in vigabatrin 100-mg capsules, was used as a model compound for the development of the method. Vigabatrin was extracted from the capsule content with water after addition of a sodium thiosulfate IS solution. The extract was concentrated by volume reduction and applied to the FTMIR-ATR module. Concentrations of unknown samples were calculated from the ratio of the vigabatrin band area (1321-1610 cm⁻¹) and the IS band area (883-1215 cm⁻¹) using a calibration standard. The ratio of the area of the vigabatrin peak to that of the IS was linear with the concentration in the range of interest (90-110 mg, in twofold; n=2). The accuracy of the method in this range was 99.7-100.5% (n=5) with a variability of 0.4-1.3% (n=5). The comparison of the presented method with an HPLC assay showed similar results; the analysis of five vigabatrin 100-mg capsules resulted in a mean concentration of 102 mg with a variation of 2% with both methods.
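
    The internal-standard calculation underlying the method can be written as a single ratio: the analyte/IS band-area ratio of the sample is compared with that of one calibration standard of known content. A minimal sketch, with hypothetical band areas, follows.

```python
# Single-point internal-standard quantification from band-area ratios.
# All numerical values below are hypothetical placeholders.
def conc_from_band_ratio(area_analyte, area_is, area_analyte_std, area_is_std,
                         conc_std):
    """Concentration of the unknown from analyte/IS band-area ratios."""
    ratio_sample = area_analyte / area_is
    ratio_std = area_analyte_std / area_is_std
    return conc_std * ratio_sample / ratio_std

# e.g. vigabatrin band area 1321-1610 cm^-1, IS band area 883-1215 cm^-1
print(conc_from_band_ratio(12.4, 8.1, 12.0, 8.0, conc_std=100.0))  # ~103 mg
```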

  10. Quantitative trace analysis of polyfluorinated alkyl substances (PFAS) in ambient air samples from Mace Head (Ireland): A method intercomparison

    Science.gov (United States)

    Jahnke, Annika; Barber, Jonathan L.; Jones, Kevin C.; Temme, Christian

    A method intercomparison study of analytical methods for the determination of neutral, volatile polyfluorinated alkyl substances (PFAS) was carried out in March, 2006. Environmental air samples were collected in triplicate at the European background site Mace Head on the west coast of Ireland, a site dominated by 'clean' westerly winds coming across the Atlantic. Extraction and analysis were performed at two laboratories active in PFAS research using their in-house methods. Airborne polyfluorinated telomer alcohols (FTOHs), fluorooctane sulfonamides and sulfonamidoethanols (FOSAs/FOSEs) as well as additional polyfluorinated compounds were investigated. Different native and isotope-labelled internal standards (IS) were applied at various steps in the analytical procedure to evaluate the different quantification strategies. Field blanks revealed no major blank problems. European background concentrations observed at Mace Head were found to be in a similar range to Arctic data reported in the literature. Due to trace-levels at the remote site, only FTOH data sets were complete and could therefore be compared between the laboratories. Additionally, FOSEs could partly be included. Data comparison revealed that despite the challenges inherent in analysis of airborne PFAS and the low concentrations, all methods applied in this study obtained similar results. However, application of isotope-labelled IS early in the analytical procedure leads to more precise results and is therefore recommended.

  11. A Rapid, Simple, and Validated RP-HPLC Method for Quantitative Analysis of Levofloxacin in Human Plasma

    Directory of Open Access Journals (Sweden)

    Dion Notario

    2017-04-01

    Full Text Available To conduct a bioequivalence study for a copy product of levofloxacin (LEV), a simple and validated analytical method was needed, but the previously developed methods were still too complicated. For this reason, a simple and rapid high performance liquid chromatography method was developed and validated for LEV quantification in human plasma. Chromatographic separation was performed under isocratic elution on a Luna Phenomenex® C18 (150 × 4.6 mm, 5 µm) column. The mobile phase was composed of acetonitrile, methanol, and 25 mM phosphate buffer adjusted to pH 3.0 (13:7:80 v/v/v), and was pumped at a flow rate of 1.5 mL/min. Detection was performed with a UV detector at a wavelength of 280 nm. Samples were prepared by adding acetonitrile followed by centrifugation to precipitate plasma protein, and then by successive evaporation and reconstitution steps. The optimized method meets the requirements of the validation parameters, which included linearity (r = 0.995), sensitivity (LLOQ and LOD of 1.77 and 0.57 µg/mL, respectively), accuracy (%error ≤ 12% above the LLOQ and ≤ 20% at the LLOQ), precision (RSD ≤ 9%), and robustness, in the range of 1.77-28.83 µg/mL. Therefore, the method can be used for routine analysis of LEV in human plasma as well as in bioequivalence studies of LEV.

  12. Multianalytical Method Validation for Qualitative and Quantitative Analysis of Solvents of Abuse in Oral Fluid by HS-GC/MS

    Directory of Open Access Journals (Sweden)

    Bruna Claudia Coppe

    2016-01-01

    Full Text Available The use of oral fluid as a biological matrix to monitor the use of drugs of abuse is a global trend because it presents several advantages and good correlation with blood levels. Thus, the present work aimed to develop and validate an analytical method for the quantification and detection of solvents used as inhalants of abuse in oral fluid (OF), using Quantisal™ as the collection device, by headspace gas chromatography coupled with a mass detector (HS-GC/MS). Chromatographic separation was performed with a ZB-BAC1 column and the total time of analysis was 11.8 min. The method showed good linearity (correlation coefficient higher than 0.99) for all solvents. The limits of detection ranged from 0.05 to 5 mg/L, while the lower limits of quantification ranged from 2.5 to 12.5 mg/L. Accuracy, precision, matrix effect, and residual effect presented satisfactory results, meeting the criteria accepted for the validation of bioanalytical methods. The method showed good selectivity considering that, for solvents coeluting at the same retention time, resolution was achieved by the mass detector. The method developed proved to be adequate when applied to OF samples from drug users and may be used to monitor the abuse of inhalants in routine forensic analyses.

  13. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of the recent developments in quantification of X-ray photoelectron spectroscopy or ESCA is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and the precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  14. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Full Text Available Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for the identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  15. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled “hard sciences”. Here, we examine the question of whether psychology is ready to enter the “hard science club” like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  16. Study on the method for determination of the nuclear fuel elements burnout by the quantitative analysis of 148Nd

    International Nuclear Information System (INIS)

    Enoshita, Margarida

    1978-01-01

    The scope of the present work is to study the precision and accuracy of a method for the separation and determination of the stable isotope 148Nd. Extraction chromatography and ion exchange techniques were used for the separation of fission products and uranium. Kieselguhr and di(2-ethyl-hexyl)phosphoric acid were used as inert support and stationary phase, respectively, in the steps of the separation procedure where the extraction chromatography technique was applied, and anionic resin mixed with PbO2 was used for the ion exchange operation. The behaviour of each element in the separation procedure was verified by means of radioactive tracers, namely 147Nd, 141Ce and 137Cs. Thermal neutron activation analysis was used to determine the percentage of 148Nd recovered after the separation procedure was run. On the other hand, the radioactive isotope 147Nd was used to find the chemical yield achieved for the separation procedure. Student's t test was applied to verify the accuracy of the method, whose acceptability was verified using the criterion recommended by McFarren and also taking into account the suggestion proposed by Eckschlager. The precision of the method was verified by means of the standard deviation of the several determinations. (author)

  17. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, the quantitative value of the zinc concentration was derived; then the concentrations of the other elements were obtained by regarding zinc as an internal standard. As a result, the values of sulphur concentration for 40 samples agree well with the average value for a typical Japanese subject and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent concentration values for many elements were obtained by the standard-free method

  18. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    Science.gov (United States)

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

    Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e. > 10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
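
    A small sketch of the agreement statistics reported above (Pearson association, mean absolute difference and mean relative difference between the two measurement methods), computed on hypothetical paired excursion values:

```python
# Convergent-validity statistics between two methods, on made-up paired data.
import numpy as np

us    = np.array([2.1, 4.0, 6.2, 8.1, 9.8, 11.5])   # ultrasound-derived excursion (mm), assumed
mocap = np.array([2.0, 4.2, 6.5, 8.6, 10.4, 12.3])  # motion-analysis excursion (mm), assumed

r = np.corrcoef(us, mocap)[0, 1]                     # association
mean_abs_diff = np.mean(np.abs(us - mocap))          # agreement, absolute (mm)
mean_rel_diff = np.mean(np.abs(us - mocap) / mocap) * 100  # agreement, relative (%)

print(f"r = {r:.3f}, mean abs diff = {mean_abs_diff:.2f} mm, "
      f"mean rel diff = {mean_rel_diff:.1f} %")
```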

  19. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  20. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    Energy Technology Data Exchange (ETDEWEB)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-03-15

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define dynamically the system structure. We present also an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  1. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Patriarca, Riccardo; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-01-01

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define dynamically the system structure. We present also an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
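
    The sketch below is a generic illustration, not the paper's actual model, of how Monte Carlo sampling can turn qualitative output-variability ratings of FRAM functions into semi-quantitative criticality scores; the function names, probabilities and scoring rule are invented for the example.

```python
# Generic Monte Carlo scoring of FRAM-style output variability (assumed model).
import numpy as np

rng = np.random.default_rng(42)

# probability that each function's output is "on time"(0), "too early/late"(1)
# or "omitted"(2) -- hypothetical ratings for hypothetical sinter-plant functions
variability = {
    "prepare_raw_mix":    [0.80, 0.15, 0.05],
    "control_emissions":  [0.70, 0.20, 0.10],
    "monitor_stack":      [0.90, 0.08, 0.02],
}
weights = np.array([0, 1, 3])   # penalty per variability class (assumed)

def simulate(n_trials=10_000):
    scores = {}
    for fn, probs in variability.items():
        draws = rng.choice(3, size=n_trials, p=probs)
        scores[fn] = weights[draws].mean()   # mean penalty as a criticality proxy
    return scores

for fn, s in simulate().items():
    print(f"{fn:20s} criticality score = {s:.2f}")
```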

  2. A simple method for the quantitative analysis of tyrosol by HPLC in liquid Czapek cultures from endophytic fungi

    International Nuclear Information System (INIS)

    Guimaraes, Denise O.; Pupo, Monica T.; Borges, Keyller B.; Bonato, Pierina S.

    2009-01-01

    Tyrosol is a possible quorum sensing molecule in endophytic fungi. High-performance liquid chromatography (HPLC) coupled with a diode array detector (DAD) was used for the analysis of tyrosol in liquid Czapek fungal cultures. The optimized conditions were a gradient mobile phase, in linear mode, consisting initially of acetonitrile/water (1:9 v/v) and increasing up to acetonitrile (100%) in 30 minutes at a flow rate of 1 mL min⁻¹. The column used was a Zorbax ODS (250 x 4.6 mm, 5 μm) at 25 deg C. Liquid-liquid extraction of 0.5 mL of medium (pH 7.0) with ethyl acetate and injection of 20 μL after solvent evaporation under air flow gave good results. Some of the validation parameters obtained were: linearity over 0.0125-5.0 μg mL⁻¹ of medium (r = 0.9967), quantification limit of 0.0125 μg mL⁻¹ of medium, %CV (precision) and %E (accuracy) below 15%, and recovery around 80%. Therefore, the developed method presented satisfactory validation parameters and was efficient for the analysis of tyrosol in Czapek medium. (author)

  3. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    Kojima, Shigeo; Onoue, Akira; Kawai, Katsunori

    1998-01-01

    This study intends to develop a more sophisticated tool that will advance the current event tree method used in all PSA, and to focus on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. It is therefore anticipated that the analysis workload should be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To implement scenario generation as a technical tool, a simulation model associated with AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time dependent factors and their quantification in scenario modeling, was added utilizing the human scenario generator concept. Then the feasibility of an improved scenario generator was tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)

  4. The evaluation of usefulness of quantitative analysis method with 99mTc-tetrofosmin on effective decision of reperfusion therapy to acute myocardial infarction

    Energy Technology Data Exchange (ETDEWEB)

    Takagi, Hitoshi; Sone, Takahito [Ogaki Municipal Hospital, Gifu (Japan)

    1998-01-01

    99mTc-tetrofosmin SPECT images from the acute and chronic periods of 46 patients with acute myocardial infarction were analyzed to evaluate the usefulness of a quantitative analysis method based on the unfolded image obtained by Bull's eye analysis for reperfusion therapy in acute myocardial infarction, and we obtained the following results. A 60% blackout area in the region of interest delineated the damaged myocardium best in the unfolded-image analysis. Significant improvement from the acute to the chronic phase in the damaged myocardial area and in the mean uptake ratio of the damaged area was confirmed by this method. The relation between myocardial salvage and factors of myocardial damage, for example reperfusion time, TIMI grade, Rentrop grade and the reperfusion phenomenon, could be analyzed by this method. The method was suspected to underestimate the damaged myocardial area in cases with an apical defect, and analysis errors were suspected in cases of tetrofosmin accumulation outside the myocardium. These problems were minimised by some techniques mentioned in this paper. (author)

  5. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    Science.gov (United States)

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with a least-squares matching method issued from the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second derivative spectra of the mAbs allowed us to adapt the analytical conditions according to the therapeutic range of the mAbs. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in the ICH guidelines. Very satisfactory recovery was achieved and the RSDs (%) of the intermediate precision were less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. The results were shown to be concentration and mAb dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of the target, belonging to the calibration range. The successful use of the combination of second derivative spectroscopy and a partial least squares matching method demonstrated the interest of FIA for the ultra-fast QC of mAbs after compounding using a matching method. Copyright © 2018 Elsevier B.V. All rights reserved.
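
    A toy sketch of the least-squares spectral-matching idea described above: an unknown second-derivative UV spectrum is assigned to whichever reference mAb spectrum minimizes the least-squares residual. The spectra are synthetic placeholders, and the matching rule is a simplified stand-in for the analyzer software's algorithm.

```python
# Least-squares matching of a synthetic "second-derivative" UV spectrum
# against synthetic reference spectra of three mAbs (illustration only).
import numpy as np

wavelengths = np.linspace(240, 320, 161)          # nm

def synth(peak_nm):
    # crude synthetic negative band standing in for a second-derivative spectrum
    return -np.exp(-((wavelengths - peak_nm) ** 2) / 40.0)

references = {
    "bevacizumab": synth(278.0),
    "infliximab":  synth(280.5),
    "rituximab":   synth(282.0),
}

unknown = synth(280.4) + np.random.default_rng(1).normal(0, 0.01, wavelengths.size)

def ssq(a, b):
    # sum of squared residuals between unit-normalized spectra
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return np.sum((a - b) ** 2)

best = min(references, key=lambda name: ssq(unknown, references[name]))
print("identified as:", best)
```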

  6. PETROGRAPHY AND APPLICATION OF THE RIETVELD METHOD TO THE QUANTITATIVE ANALYSIS OF PHASES OF NATURAL CLINKER GENERATED BY COAL SPONTANEOUS COMBUSTION

    Directory of Open Access Journals (Sweden)

    Pinilla A. Jesús Andelfo

    2010-06-01

    Full Text Available

    Fine-grained, mainly reddish, compact and slightly brecciated and vesicular pyrometamorphic rocks (natural clinker) are associated with the spontaneous combustion of coal seams of the Cerrejón Formation exploited by Carbones del Cerrejón Limited in the La Guajira Peninsula (Caribbean Region of Colombia). These rocks constitute the remaining inorganic materials derived from claystones, mudstones and sandstones originally associated with the coal and are essentially a complex mixture of various amorphous and crystalline inorganic constituents. In this paper, a petrographic characterization of the natural clinker, as well as the application of the X-ray diffraction (Rietveld) method to the quantitative analysis of its mineral phases, was carried out. The RIQAS program was used for the refinement of the X-ray powder diffraction profiles, analyzing the importance of using the correct isostructural models for each of the existing phases, which were obtained from the Inorganic Crystal Structure Database (ICSD). The results obtained in this investigation show that the Rietveld method can be used as a powerful tool in the quantitative analysis of phases in polycrystalline samples, which has been a traditional problem in geology.

  7. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
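
    The computation described in this record reduces to dividing each measured voltage difference by the known antenna separation. A minimal sketch with hypothetical antenna positions and potentials:

```python
# Field estimate from voltage differences between antenna pairs (assumed data).
import numpy as np

positions_m = np.array([0.00, 0.05, 0.10, 0.20])   # antenna positions along one axis (m)
voltages_v  = np.array([0.00, 0.62, 1.21, 2.45])   # measured potentials (V)

# local field between consecutive antennas: E ~ dV / dx  (V/m)
e_field = np.diff(voltages_v) / np.diff(positions_m)
midpoints = 0.5 * (positions_m[:-1] + positions_m[1:])

for x, e in zip(midpoints, e_field):
    print(f"x = {x:.3f} m : E ≈ {e:.1f} V/m")
```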

  8. Quantitative phase analysis of alumina/calcium-hexaluminate composites using neutron diffraction data and the Rietveld method

    International Nuclear Information System (INIS)

    Asmi, D.; Low, I.M.; O'Connor, B.H.; Kennedy, S.J.

    2000-01-01

    Full text: The Al2O3-CaO system is the basis of an important class of high-temperature refractories in the steel industry. It contains a number of stable intermediate compounds which include C3A, C12A7, CA, CA2, and CA6. These calcium aluminates are also important constituents of high alumina cement and have been used to produce high-strength and high-toughness ceramic-polymer composite materials. More recently, alumina composites containing 30 wt% CA6 platelets have been developed by An et al. which show characteristics of self-reinforcement and enhanced toughening through crack-bridging. In this paper, we describe the use of high-temperature neutron diffraction to monitor the in-situ phase formation and abundances of calcium aluminates (CA, CA2, and CA6) in alumina composites containing 5-50 wt% CA6 at temperatures in the range 1000-1600 deg C. These composites were produced using reaction sintering of alumina and calcium oxide. For comparison purposes, control samples of pure α-alumina and CA6 were also produced. Determination of relative phase abundances in these materials has been performed using the standardless Rietveld refinement method. Results show that the relative phase abundance of calcium aluminates in the composites increased with temperature and in proportion with the amount of calcium oxide present. The formation temperatures of CA, CA2, and CA6 have been observed to occur at 1000 deg, 1200 deg, and ∼1350 deg C respectively, which agree well with results obtained from x-ray diffraction, synchrotron radiation diffraction and differential thermal analysis

  9. Quantitative analysis of unconjugated and total bisphenol A in human urine using solid-phase extraction and UPLC-MS/MS: method implementation, method qualification and troubleshooting.

    Science.gov (United States)

    Buscher, Brigitte; van de Lagemaat, Dick; Gries, Wolfgang; Beyer, Dieter; Markham, Dan A; Budinsky, Robert A; Dimond, Stephen S; Nath, Rajesh V; Snyder, Stephanie A; Hentges, Steven G

    2015-11-15

    The aim of the presented investigation was to document challenges encountered during implementation and qualification of a method for bisphenol A (BPA) analysis and to develop and discuss precautions taken to avoid and to monitor contamination with BPA during sample handling and analysis. Previously developed and published HPLC-MS/MS methods for the determination of unconjugated BPA (Markham et al., Journal of Analytical Toxicology, 34 (2010) 293-303) [17] and total BPA (Markham et al., Journal of Analytical Toxicology, 38 (2014) 194-203) [20] in human urine were combined and transferred into another laboratory. The initial method for unconjugated BPA was developed and evaluated in two independent laboratories simultaneously. The second method for total BPA was developed and evaluated in one of these laboratories to conserve resources. Accurate analysis of BPA at sub-ppb levels is a challenging task as BPA is a widely used material and is ubiquitous in the environment at trace concentrations. Propensity for contamination of biological samples with BPA is reported in the literature during sample collection, storage, and/or analysis. Contamination by trace levels of BPA is so pervasive that even with extraordinary care, it is difficult to completely exclude the introduction of BPA into biological samples and, consequently, contamination might have an impact on BPA biomonitoring data. The applied UPLC-MS/MS method was calibrated from 0.05 to 25 ng/mL. The limit of quantification was 0.1 ng/mL for unconjugated BPA and 0.2 ng/mL for total BPA, respectively, in human urine. Finally, the method was applied to urine samples derived from 20 volunteers. Overall, BPA can be analyzed in human urine with acceptable recovery and repeatability if sufficient measures are taken to avoid contamination throughout the procedure from sample collection until UPLC-MS/MS analysis. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Rapid quantitative analysis of individual anthocyanin content based on high-performance liquid chromatography with diode array detection with the pH differential method.

    Science.gov (United States)

    Wang, Huayin

    2014-09-01

    A new quantitative technique for the simultaneous quantification of individual anthocyanins, based on the pH differential method and high-performance liquid chromatography with diode array detection, is proposed in this paper. The six individual anthocyanins (cyanidin 3-glucoside, cyanidin 3-rutinoside, petunidin 3-glucoside, petunidin 3-rutinoside, and malvidin 3-rutinoside) from mulberry (Morus rubra) and Liriope platyphylla were used for demonstration and validation. The elution of anthocyanins was performed using a C18 column with stepwise gradient elution, and individual anthocyanins were identified by high-performance liquid chromatography with tandem mass spectrometry. Based on the pH differential method, the high-performance liquid chromatography peak areas at the maximum and reference absorption wavelengths of the anthocyanin extracts were used to quantify individual anthocyanins. The calibration curves for these anthocyanins were linear within the range of 10-5500 mg/L. The correlation coefficients (r²) all exceeded 0.9972, and the limits of detection were in the range of 1-4 mg/L at a signal-to-noise ratio ≥5 for these anthocyanins. The proposed quantitative analysis was reproducible, with good accuracy for all individual anthocyanins ranging from 96.3 to 104.2% and relative recoveries in the range 98.4-103.2%. The proposed technique is performed without anthocyanin standards and is a simple, rapid, accurate, and economical method to determine individual anthocyanin contents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
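
    For reference, the classical spectrophotometric form of the pH differential calculation on which the HPLC-DAD variant builds can be sketched as below; the constants shown (cyanidin-3-glucoside equivalents, MW of about 449.2 g/mol, molar absorptivity of about 26900 L/mol/cm) and the absorbance values are common textbook assumptions, not figures from this paper.

```python
# Classical pH differential calculation (reference sketch, hypothetical inputs).
def anthocyanin_mg_per_l(a_max_ph1, a700_ph1, a_max_ph45, a700_ph45,
                         dilution_factor, mw=449.2, eps=26900.0, path_cm=1.0):
    # corrected absorbance difference between pH 1.0 and pH 4.5 buffers
    a = (a_max_ph1 - a700_ph1) - (a_max_ph45 - a700_ph45)
    # total monomeric anthocyanin, expressed as cyanidin-3-glucoside equivalents
    return a * mw * dilution_factor * 1000.0 / (eps * path_cm)

# hypothetical absorbances at the visible maximum and at 700 nm
print(anthocyanin_mg_per_l(0.652, 0.010, 0.118, 0.008, dilution_factor=10))
```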

  11. The flexibility of a generic LC-MS/MS method for the quantitative analysis of therapeutic proteins based on human immunoglobulin G and related constructs in animal studies.

    Science.gov (United States)

    Lanshoeft, Christian; Wolf, Thierry; Walles, Markus; Barteau, Samuel; Picard, Franck; Kretz, Olivier; Cianférani, Sarah; Heudi, Olivier

    2016-11-30

    An increasing demand for new analytical methods is associated with the growing number of biotherapeutic programs being prosecuted in the pharmaceutical industry. Whilst immunoassay has been the standard method for decades, great interest in assays based on liquid chromatography tandem mass spectrometry (LC-MS/MS) is evolving. In the present work, the development of a generic method for the quantitative analysis of therapeutic proteins based on human immunoglobulin G (hIgG) in rat serum is reported. The method is based on four generic peptides, GPSVFPLAPSSK (GPS), TTPPVLDSDGSFFLYSK (TTP), VVSVLTVLHQDWLNGK (VVS) and FNWYVDGVEVHNAK (FNW), originating from different parts of the fraction crystallizable (Fc) region of a reference hIgG1 (hIgG1A). A tryptic pellet digestion of rat serum spiked with hIgG1A and a stable isotope labeled protein (hIgG1B) used as internal standard (ISTD) was applied prior to LC-MS/MS analysis. The upper limit of quantification was 1000 μg/mL. The lower limit of quantification was 1.00 μg/mL for GPS, TTP and VVS, and 5.00 μg/mL for FNW. Accuracy and precision data met acceptance criteria over three days. The presented method was further successfully applied to the quantitative analysis of other hIgG1s (hIgG1C and hIgG1D) and hIgG4-based therapeutic proteins on spiked quality control (QC) samples in monkey and rat serum, using calibration standards (Cs) prepared with hIgG1A in rat serum. In order to extend the applicability of our generic approach, a bispecific-bivalent hIgG1 (bb-hIgG1) and two lysine-conjugated antibody-drug conjugates (ADC1 and ADC2) were incorporated as well. The observed values on spiked QC samples in monkey serum were satisfactory with GPS for the determination of bb-hIgG1, whereas the FNW and TTP peptides were suitable for the ADCs. Moreover, comparable mean concentration-time profiles were obtained from monkeys previously dosed intravenously with ADC2 measured against Cs samples prepared either with hIgG1A in rat serum

  12. Quality Assessment of Kumu Injection, a Traditional Chinese Medicine Preparation, Using HPLC Combined with Chemometric Methods and Qualitative and Quantitative Analysis of Multiple Alkaloids by Single Marker.

    Science.gov (United States)

    Wang, Ning; Li, Zhi-Yong; Zheng, Xiao-Li; Li, Qiao; Yang, Xin; Xu, Hui

    2018-04-09

    Kumu injection (KMI) is a commonly used traditional Chinese medicine (TCM) preparation made from Picrasma quassioides (D. Don) Benn., which is rich in alkaloids. An innovative technique for the quality assessment of KMI was developed using high performance liquid chromatography (HPLC) combined with chemometric methods and qualitative and quantitative analysis of multi-components by single marker (QAMS). Nigakinone (PQ-6, 5-hydroxy-4-methoxycanthin-6-one), one of the most abundant alkaloids responsible for the major pharmacological activities of Kumu, was used as the reference substance. Six alkaloids in KMI were quantified, including 6-hydroxy-β-carboline-1-carboxylic acid (PQ-1), 4,5-dimethoxycanthin-6-one (PQ-2), β-carboline-1-carboxylic acid (PQ-3), β-carboline-1-propanoic acid (PQ-4), 3-methylcanthin-5,6-dione (PQ-5), and PQ-6. Based on the outcomes for twenty batches of KMI samples, the contents of the six alkaloids were used for further chemometric analysis. By hierarchical cluster analysis (HCA), radar plots, and principal component analysis (PCA), all the KMI samples could be categorized into three groups, which were closely related to production date and indicated the crucial influence of the herbal raw material on the end products of KMI. QAMS combined with chemometric analysis could accurately measure and clearly distinguish KMI samples of different quality. Hence, QAMS is a feasible and promising method for the quality control of KMI.

  13. A method for volume determination of the orbit and its contents by high resolution axial tomography and quantitative digital image analysis.

    Science.gov (United States)

    Cooper, W C

    1985-01-01

    The various congenital and acquired conditions which alter orbital volume are reviewed. Previous investigative work to determine orbital capacity is summarized. Since these studies were confined to postmortem evaluations, the need for a technique to measure orbital volume in the living state is presented. A method for volume determination of the orbit and its contents by high-resolution axial tomography and quantitative digital image analysis is reported. This procedure has proven to be accurate (the discrepancy between direct and computed measurements ranged from 0.2% to 4%) and reproducible (greater than 98%). The application of this method to representative clinical problems is presented and discussed. The establishment of a diagnostic system versatile enough to expand the usefulness of computerized axial tomography and polytomography should add a new dimension to ophthalmic investigation and treatment.
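
    The volume computation implied by this record amounts to summing segmented orbital pixels slice by slice and scaling by the voxel dimensions. A minimal sketch with a hypothetical segmentation mask and geometry:

```python
# Orbital volume from a stack of binary segmentation masks (assumed data).
import numpy as np

rng = np.random.default_rng(3)
# binary orbit masks for 30 axial slices of 128x128 pixels (placeholder segmentation)
masks = rng.random((30, 128, 128)) > 0.97

pixel_area_mm2 = 0.5 * 0.5          # in-plane resolution (mm x mm), assumed
slice_thickness_mm = 1.5            # slice spacing (mm), assumed

volume_mm3 = masks.sum() * pixel_area_mm2 * slice_thickness_mm
print(f"orbital volume ≈ {volume_mm3 / 1000.0:.1f} cm^3")
```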

  14. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  15. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories containing ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for the design of rapid methods for the quality control of suppositories with different components.

  16. Quantitative study of Portland cement hydration by X-ray diffraction/Rietveld analysis and independent methods

    International Nuclear Information System (INIS)

    Scrivener, K.L.; Fuellmann, T.; Gallucci, E.; Walenta, G.; Bermejo, E.

    2004-01-01

    X-ray diffraction (XRD) is a powerful technique for the study of crystalline materials. The technique of Rietveld refinement now enables the amounts of different phases in anhydrous cementitious materials to be determined to a good degree of precision. This paper describes the extension of this technique to a pilot study of the hydration of a typical Portland cement. To validate this XRD-Rietveld analysis technique, its results were compared with independent measures of the same materials by the analysis of backscattered electron images (BSE/IA) and thermogravimetric analysis (TGA). In addition, the internal consistency of the measurements was studied by comparing the XRD estimates of the amounts of hydrates formed with the amounts expected to form from the XRD estimates of the amounts of anhydrous materials reacted

  17. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA, cDNA), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
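    The abstract does not spell out the computation, so the sketch below only illustrates the general idea behind an efficiency-corrected, input-normalized quantity: expression per input unit (for example per cell), corrected for the amplification efficiency and scaled to a universal reference cDNA sample run with every batch. All numbers and names are hypothetical.

    ```python
    # Hedged sketch of an efficiency-corrected, input-normalized qPCR quantity.
    def relative_quantity(ct, efficiency):
        """Relative template amount for one reaction: (1 + E)^(-Ct)."""
        return (1.0 + efficiency) ** (-ct)

    def input_normalized_expression(ct_sample, ct_reference, efficiency,
                                    n_cells_sample, n_cells_reference):
        # Expression per cell, relative to the universal reference sample.
        q_sample = relative_quantity(ct_sample, efficiency) / n_cells_sample
        q_reference = relative_quantity(ct_reference, efficiency) / n_cells_reference
        return q_sample / q_reference

    print(input_normalized_expression(ct_sample=24.1, ct_reference=26.0,
                                      efficiency=0.95,
                                      n_cells_sample=1.0e4, n_cells_reference=1.0e4))
    ```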

  18. Quantitative analysis of charge trapping and classification of sub-gap states in MoS2 TFT by pulse I-V method

    Science.gov (United States)

    Park, Junghak; Hur, Ji-Hyun; Jeon, Sanghun

    2018-04-01

    The threshold voltage instabilities and large hysteresis of MoS2 thin-film transistors (TFTs) have raised concerns about their practical applicability in next-generation switching devices. These behaviors are associated with charge trapping, which stems from tunneling to adjacent trap sites, interfacial redox reactions, and interface and/or bulk trap states. In this report, we present a quantitative analysis of the electron charge trapping mechanism of MoS2 TFTs by the fast pulse I-V method and the space-charge-limited current (SCLC) measurement. By adopting the fast pulse I-V method, we were able to obtain the effective mobility. In addition, the origin of the trap states was identified by disassembling the sub-gap states into interface trap and bulk trap states by a simple extraction analysis. These measurement methods and analyses enable not only the quantitative extraction of various traps but also an understanding of the charge transport mechanism in MoS2 TFTs. The fast I-V data and SCLC data obtained under various measurement temperatures and ambient conditions show that electron transport to neighboring trap sites by tunneling is the main charge trapping mechanism in thin-MoS2 TFTs. This implies that interfacial traps account for most of the total sub-gap states while the bulk trap contribution is negligible, at approximately 0.40% and 0.26% in air and vacuum ambient, respectively. Thus, control of the interface trap states is crucial to further improve the performance of devices with thin channels.

  19. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?. We hope that our guide to good practices for conducting and presenting bias analyses will encourage
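    As a concrete illustration of the kind of calculation discussed above, the sketch below adjusts a 2x2 table for non-differential exposure misclassification under assumed sensitivity and specificity of exposure classification, one of the simplest forms of quantitative bias analysis. The counts and bias parameters are hypothetical.

    ```python
    # Simple-bias-analysis sketch: misclassification-adjusted odds ratio.
    def corrected_counts(exposed, unexposed, sensitivity, specificity):
        total = exposed + unexposed
        # observed exposed = true_exposed*Se + (total - true_exposed)*(1 - Sp)
        true_exposed = (exposed - total * (1.0 - specificity)) / (sensitivity + specificity - 1.0)
        return true_exposed, total - true_exposed

    def odds_ratio(a, b, c, d):
        return (a * d) / (b * c)

    # Observed table: cases 200 exposed / 800 unexposed; controls 100 / 900.
    a, b = corrected_counts(200, 800, sensitivity=0.85, specificity=0.95)
    c, d = corrected_counts(100, 900, sensitivity=0.85, specificity=0.95)
    print("conventional OR :", round(odds_ratio(200, 800, 100, 900), 2))
    print("bias-adjusted OR:", round(odds_ratio(a, b, c, d), 2))
    ```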

  20. Quantitative method for analysis of six anticoagulant rodenticides in faeces, applied in a case with repeated samples from a dog.

    Science.gov (United States)

    Seljetun, Kristin Opdal; Eliassen, Elin; Karinen, Ritva; Moe, Lars; Vindenes, Vigdis

    2018-01-17

    Accidental poisoning with anticoagulant rodenticides is not uncommon in dogs, but few reports of the elimination kinetics and half-lives in this species have been published. Our objectives were to develop and validate a new method for the quantification of anticoagulant rodenticides in canine blood and faeces using reversed-phase ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) and to apply the method to a case of anticoagulant rodenticide intoxication. Sample preparation was by liquid-liquid extraction. Six anticoagulant rodenticides were separated using a UPLC® BEH C18 column with a mobile phase consisting of 5 mM ammonium formate buffer pH 10.2 and methanol. MS/MS detection was performed with positive electrospray ionization and two multiple reaction monitoring transitions. The limits of quantification were set at the levels of the lowest calibrator (1.5-2.7 ng/mL or ng/g). The method was successfully applied to a case of a dog accidentally poisoned with anticoagulant rodenticide. Coumatetralyl and brodifacoum concentrations were determined from serial blood and faecal samples. A terminal half-life of at least 81 days for coumatetralyl in blood was estimated, which is longer than previously reported in other species. A slow elimination of brodifacoum from the faeces was found, with traces still detectable in the faeces at day 513. This study offers a new method for the detection and quantification of six frequently used anticoagulant rodenticides in canine faeces. Such drugs might cause serious health effects, and it is important to be able to detect them in order to initiate proper treatment. The very long elimination half-lives detected in our study are important to be aware of when assessing the anticoagulant rodenticide burden on the environment.
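    For orientation only, a terminal half-life such as the one reported above is typically obtained by log-linear regression of the terminal concentration-time points. The sketch below shows that standard calculation on hypothetical data, not the case data from the study.

    ```python
    # Hedged sketch: terminal elimination half-life from serial blood samples.
    import numpy as np

    days = np.array([30.0, 60.0, 120.0, 180.0, 240.0])   # sampling days (hypothetical)
    conc = np.array([45.0, 35.0, 21.0, 12.5, 7.6])        # ng/mL (hypothetical)

    slope, intercept = np.polyfit(days, np.log(conc), 1)  # ln C = ln C0 - k*t
    half_life = np.log(2.0) / abs(slope)
    print(f"terminal half-life = {half_life:.0f} days")
    ```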

  1. Stability indicating RP-LC-PDA method for the quantitative analysis of saxagliptin in pharmaceutical dosage form

    Directory of Open Access Journals (Sweden)

    Laís Engroff Scheeren

    2015-06-01

    Full Text Available Saxagliptin is a potent and selective inhibitor of the enzyme dipeptidyl peptidase 4. It is effective in the treatment of type 2 diabetes mellitus because it stimulates the pancreas to produce insulin. In the present study, a liquid chromatography method was developed and validated to quantify the drug in tablets. This method was based on the isocratic elution of saxagliptin, using a mobile phase consisting of 0.1% phosphoric acid at pH 3.0 - methanol (70:30, v/v) at a flow rate of 1 mL.min-1 with UV detection at 225 nm. The chromatographic separation was achieved in 8 minutes on a Waters XBridge C18 column (250 mm x 4.6 mm, 5 µm) maintained at ambient temperature. The proposed method proved to be specific and robust for the quality control of saxagliptin in pharmaceutical dosage forms, showing good linearity in the range of 15.0 - 100.0 µg.mL-1 (r > 0.999), precision (RSD

  2. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods of qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to consider evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  3. A quantitative discussion on the assessment of power supply technologies: DEA (data envelopment analysis) and SAW (simple additive weighting) as complementary methods for the “Grammar”

    International Nuclear Information System (INIS)

    Shakouri, Hamed G.; Nabaee, Mahdis; Aliakbarisani, Sajad

    2014-01-01

    The growing concern about the negative effects of fossil fuels on the environment, and their limited resources, has forced more intensive use of other energy sources. In the absence of sufficient economically feasible renewable energy, nuclear power may play an essential role in this field. Recently, the advantages and disadvantages of nuclear power and fossil fuels regarding their efficiencies have attracted researchers' interest. This paper discusses the findings from “A Grammar for assessing the performance of power supply systems: comparing nuclear energy to fossil energy” (Diaz-Maurin F, Giampietro M. 2013). Although the “Grammar” is a very valuable approach, it can be complemented by helpful quantitative methods. In this discussion, we apply quantitative decision-making approaches to compare the same fossil fuel (coal) power plants with nuclear power plants. Economic variables are also taken into consideration. DEA (data envelopment analysis) and SAW (simple additive weighting) are the methods applied. Results confirm the results of the reference paper in most cases and show that the fossil fuel power plants with CCS (carbon capture and storage) are slightly more efficient than nuclear power plants. However, the selection of input and output variables is disputable. Assuming job creation as a desired output can change the ranking results. - Highlights: • A numeric decision-making approach is proposed to facilitate assessment of technologies introduced as a “Grammar”. • Two different methods are chosen (SAW and DEA) since the results obtained for ranking may differ with different methods. • We propose to use the original fractional objective function of DEA with equal weightings applied for the attributes. • Proper combinations of both input and output attributes including economic variables are discussed. • The attributes for assessment of the technologies differ from different viewpoints. Labor force and costs are simple
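    Of the two methods named above, SAW is the simpler: alternatives are scored by a weighted sum of normalized attribute values. The sketch below shows a generic SAW ranking with hypothetical attribute values and weights; it is not the data or weighting of the discussed study.

    ```python
    # Minimal SAW (simple additive weighting) sketch with hypothetical numbers.
    def saw_scores(matrix, weights, benefit):
        """matrix[alt][attr]; benefit[attr] is True when larger values are better."""
        cols = list(zip(*matrix))
        scores = []
        for row in matrix:
            norm = [(row[j] / max(cols[j])) if benefit[j] else (min(cols[j]) / row[j])
                    for j in range(len(weights))]
            scores.append(sum(w * v for w, v in zip(weights, norm)))
        return scores

    #                 efficiency  cost    jobs
    alternatives = [[0.38,        60.0,   500],   # coal power plant with CCS
                    [0.33,        90.0,   700]]   # nuclear power plant
    weights = [0.5, 0.3, 0.2]
    benefit = [True, False, True]
    print(saw_scores(alternatives, weights, benefit))
    ```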

  4. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements

  5. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, the vitality index, and the redistribution factor. The washout factor is the ratio of the counts at a certain period of time after exercise to the counts immediately after exercise. This value is necessary for the evaluation of redistribution to ischemic areas in serial imaging, to correct for the Tl-201 washout from the myocardium under the assumption that the washout is constant over the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the counts in the region of interest in serial images after exercise to those immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
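    The three indices are simple count ratios, so they can be written down directly. The sketch below computes them from hypothetical region-of-interest (ROI) counts taken immediately after exercise and at a later redistribution time point; the washout-corrected value at the end is only an illustration of how the washout factor is used.

    ```python
    # Sketch of the three indices, with hypothetical Tl-201 ROI counts.
    def washout_factor(counts_late, counts_early):
        """Whole-myocardium counts at a later time over counts right after exercise."""
        return counts_late / counts_early

    def vitality_index(counts_roi, counts_max):
        """ROI uptake relative to the maximum myocardial uptake."""
        return counts_roi / counts_max

    def redistribution_factor(counts_roi_late, counts_roi_early):
        """Late-to-early count ratio in the region of interest."""
        return counts_roi_late / counts_roi_early

    wf = washout_factor(counts_late=7200.0, counts_early=9000.0)
    vi = vitality_index(counts_roi=4200.0, counts_max=6000.0)
    rf = redistribution_factor(counts_roi_late=3900.0, counts_roi_early=4200.0)
    print(wf, vi, rf, rf / wf)   # rf / wf: redistribution corrected for global washout
    ```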

  6. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The mission of supervising credit institutions by state authorities is mostly assimilated with systemic risk prevention. At present, the mission is oriented towards analyzing the risk profile of the credit institutions and the mechanisms and existing systems used as management tools that provide the proper instruments to avoid and control specific bank risks. Rating systems are sophisticated measurement instruments capable of ensuring the above objectives, such as success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is – generally speaking – based on qualitative appreciations, which can induce subjectivity and heterogeneity in the rating. The problem can be solved by the complementary use of quantitative techniques such as DEA (Data Envelopment Analysis).

  7. Quantitative analysis of burden of infectious diarrhea associated with floods in northwest of anhui province, china: a mixed method evaluation.

    Science.gov (United States)

    Ding, Guoyong; Zhang, Ying; Gao, Lu; Ma, Wei; Li, Xiujun; Liu, Jing; Liu, Qiyong; Jiang, Baofa

    2013-01-01

    Persistent and heavy rainfall in the upper and middle Huaihe River of China brought about severe floods at the end of June and in July 2007. However, there has been no assessment of the association between the floods and infectious diarrhea. This study aimed to quantify the impact of the 2007 floods on the burden of disease due to infectious diarrhea in the northwest of Anhui Province. A time-stratified case-crossover analysis was first conducted to examine the relationship between daily cases of infectious diarrhea and the 2007 floods in Fuyang and Bozhou of Anhui Province. Odds ratios (ORs) of the flood risk were quantified by conditional logistic regression. The years lived with disability (YLDs) of infectious diarrhea attributable to floods were then estimated based on the WHO framework for calculating the potential impact fraction in the Burden of Disease study. A total of 197 cases of infectious diarrhea were notified during the exposure and control periods in the two study areas. The strongest effect was shown with a 2-day lag in Fuyang and a 5-day lag in Bozhou. Multivariable analysis showed that floods were significantly associated with an increased number of cases of infectious diarrhea (OR = 3.175, 95%CI: 1.126-8.954 in Fuyang; OR = 6.754, 95%CI: 1.954-23.344 in Bozhou). The attributable YLD per 1000 for infectious diarrhea resulting from the floods was 0.0081 in Fuyang and 0.0209 in Bozhou. Our findings confirm that floods significantly increased the risks of infectious diarrhea in the study areas. In addition, a prolonged moderate flood may cause a greater burden of infectious diarrhea than a severe flood of shorter duration. More attention should be paid to particularly vulnerable groups, including young children and the elderly, in developing public health preparation and intervention programs. The findings have significant implications for developing strategies to prevent and reduce the health impact of floods.
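    The burden calculation follows the general logic of a potential impact fraction: an attributable fraction is derived from the risk estimate and applied to the YLD of the exposed period. The sketch below uses the common approximation AF = (RR - 1)/RR with the OR standing in for the RR; the case numbers, disability weight, duration, and population are hypothetical, not the study data.

    ```python
    # Hedged sketch of a flood-attributable YLD-per-1000 calculation.
    def attributable_fraction(relative_risk):
        return (relative_risk - 1.0) / relative_risk

    def attributable_yld_per_1000(cases, disability_weight, duration_years,
                                  population, relative_risk):
        yld = cases * disability_weight * duration_years
        return attributable_fraction(relative_risk) * yld / population * 1000.0

    print(attributable_yld_per_1000(cases=120, disability_weight=0.105,
                                    duration_years=7.0 / 365.0,
                                    population=1.5e6, relative_risk=3.175))
    ```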

  8. A sensitive and selective liquid chromatography/tandem mass spectrometry method for quantitative analysis of efavirenz in human plasma.

    Directory of Open Access Journals (Sweden)

    Praveen Srivastava

    Full Text Available A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated based on high performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high performance liquid chromatography and detected with tandem mass spectrometry in negative ionization mode with multiple reaction monitoring. Efavirenz and ¹³C₆-efavirenz (internal standard), respectively, were detected via the following MRM transitions: m/z 314.20 → 243.90 and m/z 320.20 → 249.90. A gradient program was used to elute the analytes using 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents, at a flow rate of 0.3 mL/min. The total run time was 5 min and the retention times for the internal standard (¹³C₆-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r > 0.99) over the concentration range of 1.0-2,500 ng/mL. The intraday precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for LLOQ and 100-111% for QC samples. The inter-day precision was 12.3% and 3.03-9.18% for LLOQ and QC samples, and the accuracy was 108% and 95.2-108% for LLOQ and QC samples. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied for therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.

  9. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to its complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or samples. Neutron diffraction has strong capability especially for oxides, due to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. By doing this study, we hope not only to perform one of the instrument performance tests on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with others from advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  10. Application of bias correction methods to improve U3Si2 sample preparation for quantitative analysis by WDXRF

    International Nuclear Information System (INIS)

    Scapin, Marcos A.; Guilhen, Sabine N.; Azevedo, Luciana C. de; Cotrim, Marycel E.B.; Pires, Maria Ap. F.

    2017-01-01

    The determination of silicon (Si), total uranium (U) and impurities in uranium-silicide (U3Si2) samples by the wavelength dispersive X-ray fluorescence technique (WDXRF) has already been validated and is currently implemented at IPEN's X-Ray Fluorescence Laboratory (IPEN-CNEN/SP) in São Paulo, Brazil. Sample preparation requires the use of approximately 3 g of H3BO3 as sample holder and 1.8 g of U3Si2. However, because boron is a neutron absorber, this procedure precludes recovery of the U3Si2 sample, which, in time, considering routine analysis, may account for significant unusable uranium waste. An estimated average of 15 samples per month are expected to be analyzed by WDXRF, resulting in approx. 320 g of U3Si2 that would not return to the nuclear fuel cycle. This not only results in production losses, but also generates another problem: radioactive waste management. The purpose of this paper is to present the mathematical models that may be applied for the correction of systematic errors when the H3BO3 sample holder is substituted by cellulose acetate {[C6H7O2(OH)3-m(OOCCH3)m], m = 0∼3}, thus enabling recovery of the U3Si2 sample. The results demonstrate that the adopted mathematical model is statistically satisfactory, allowing the optimization of the procedure. (author)

  11. Influence of diopside: feldspar ratio in ceramic reactions assessed by quantitative phase analysis (X-ray diffraction - Rietveld method)

    International Nuclear Information System (INIS)

    Kuzmickas, L.; Andrade, F.R.D.; Szabo, G.A.J.; Motta, J.F.M.; Cabral Junior, M.

    2013-01-01

    White ceramics were produced from raw mixtures prepared with varying proportions of diopside-rich rock (0 to 20 wt.%) and potassic feldspar (40 to 20 wt.%), and fixed proportions of kaolinite (40 wt.%) and quartz (20 wt.%), fired in a temperature range from 1170 to 1210 °C. The phases identified in the experimental ceramics were quartz, anorthite, mullite and glass, and their relative mass proportions were determined by X-ray diffraction (Rietveld method). The addition of diopside as a partial substitute for potassic feldspar causes the formation of a calcium silicate, analogous to natural anorthite (CaSi2Al2O8), in the ceramics, with a proportional reduction in their glass and mullite contents. Water absorption and porosity of the ceramic bodies clearly decrease with increasing firing temperature, while the effect of the raw mixture composition on the physical and mechanical properties of the ceramics is less evident. The diopside-rich rock has a low iron content (1.5 wt.% Fe2O3) and, therefore, promotes white burning. (author)

  12. Influence of diopside: feldspar ratio in ceramic reactions assessed by quantitative phase analysis (X-ray diffraction - Rietveld method)

    Energy Technology Data Exchange (ETDEWEB)

    Kuzmickas, L.; Andrade, F.R.D.; Szabo, G.A.J. [Universidade de Sao Paulo (IGc/USP), SP (Brazil). Inst. de Geociencias. Dept. de Mineralogia e Geotecnia; Motta, J.F.M.; Cabral Junior, M., E-mail: lukuzmickas@gmail.com, E-mail: dias@usp.br, E-mail: gajszabo@usp.b, E-mail: motta.jf@gmail.com, E-mail: marsis@ipt.br [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil). Secao de Recursos Minerais e Tecnologia Ceramica

    2013-04-15

    White ceramics were produced from raw mixtures prepared with varying proportions of diopside-rich rock (0 to 20 wt.%) and potassic feldspar (40 to 20 wt.%), and fixed proportions of kaolinite (40 wt.%) and quartz (20 wt.%), fired in a temperature range from 1170 to 1210 °C. The phases identified in the experimental ceramics were quartz, anorthite, mullite and glass, and their relative mass proportions were determined by X-ray diffraction (Rietveld method). The addition of diopside as a partial substitute for potassic feldspar causes the formation of a calcium silicate, analogous to natural anorthite (CaSi2Al2O8), in the ceramics, with a proportional reduction in their glass and mullite contents. Water absorption and porosity of the ceramic bodies clearly decrease with increasing firing temperature, while the effect of the raw mixture composition on the physical and mechanical properties of the ceramics is less evident. The diopside-rich rock has a low iron content (1.5 wt.% Fe2O3) and, therefore, promotes white burning. (author)

  13. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng/L range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng/L in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng/L for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program.

  14. Freeze-substitution methods for Ni localization and quantitative analysis in Berkheya coddii leaves by means of PIXE

    International Nuclear Information System (INIS)

    Budka, D.; Mesjasz-Przybyłowicz, J.; Tylko, G.; Przybyłowicz, W.J.

    2005-01-01

    Leaves of Ni hyperaccumulator Berkheya coddii were chosen as a model to investigate the influence of eight freeze-substitution protocols on the Ni content and distribution. Freeze-substitution of leaf samples cryofixed by high-pressure freezing was carried out in dry acetone, methanol, diethyl ether and tetrahydrofuran. The same substitution media were also used with dimethylglyoxime added as a precipitation reagent. The samples were infiltrated and embedded in Spurr's resin. Micro-PIXE analysis of Ni concentration and localization, complemented by proton backscattering for matrix assessment, was performed using the nuclear microprobe at Materials Research Group, iThemba LABS, South Africa. True elemental maps and concentrations were obtained using GeoPIXE-II software. The results were compared with the control results obtained for the parallel air-dried samples, corrected for the water content. The highest Ni content was found in the leaf samples substituted in diethyl ether. This concentration was statistically different from the results obtained for other media. In case of diethyl ether medium Ni was mainly localized in the mesophyll tissue, and the distribution map of this element was in accordance with previous results obtained for freeze-dried and frozen-hydrated leaves of this species. The same distribution pattern was observed for specimens embedded in dry acetone, but Ni concentration was significantly lower. Tetrahydrofuran medium preserved Ni preferentially in the epidermis and vascular tissue, and the elemental map for samples embedded in this medium was distorted. Ni was almost completely washed out from samples substituted in methanol and it was thus impossible to obtain a picture of its distribution. Dimethylglyoxime did not improve the preservation of this element. These results show that diethyl ether is a suitable substitution medium for assessment of Ni concentration and distribution in leaves of B. coddii

  15. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools, and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with offset values greater than 1 mm were identified. In addition, when the method developed here was compared with the one previously studied, it was observed that the data obtained are very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)

  16. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  17. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    International Nuclear Information System (INIS)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio; Fukusaki, Eiichiro

    2009-01-01

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R² values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  18. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: fukusaki@bio.eng.osaka-u.ac.jp [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-08-26

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R² values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  19. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

    Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  20. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    ""Quantitative Methods for Management and Economics"" is specially prepared for the MBA students in India and all over the world. It starts from the basics, such that even a beginner with out much mathematical sophistication can grasp the ideas and then comes forward to more complex and professional problems. Thus, both the ordinary students as well as ""above average: i.e., ""bright and sincere"" students would be benefited equally through this book.Since, most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  1. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which can reflect the effects of network structure, traffic demand, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the method proposed in this paper can portray the real operating situation of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of the demands and the network structure in the transportation network.
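    The abstract does not reproduce the paper's formula, so the sketch below only illustrates one widely used measure of this kind, the Nagurney-Qiang network efficiency, E = (1/n_W) * sum over O-D pairs w of d_w / c_w, where d_w is the travel demand and c_w the equilibrium travel cost of pair w; component importance is then the relative efficiency drop when the component is removed. The demands and costs shown are hypothetical.

    ```python
    # Hedged sketch of a demand/cost based network efficiency measure.
    def network_efficiency(demands, costs):
        pairs = list(zip(demands, costs))
        return sum(d / c for d, c in pairs) / len(pairs)

    demands = [1200.0, 800.0, 450.0]   # trips per hour for three O-D pairs (hypothetical)
    costs = [14.0, 22.5, 9.0]          # equilibrium travel times in minutes (hypothetical)

    e_full = network_efficiency(demands, costs)
    e_without_link = network_efficiency(demands, [16.5, 22.5, 9.0])  # removal re-routes pair 1
    importance = (e_full - e_without_link) / e_full
    print(round(e_full, 1), round(importance, 3))
    ```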

  2. A quantitative method to the analysis of MLC leaf position and speed based on EPID and EBT3 film for dynamic IMRT treatment with different types of MLC.

    Science.gov (United States)

    Li, Yinghui; Chen, Lixin; Zhu, Jinhan; Wang, Bin; Liu, Xiaowei

    2017-07-01

    A quantitative method based on the electronic portal imaging device (EPID) and film was developed for MLC position and speed testing; this method was used for three MLC types (Millennium, MLCi, and Agility MLC). To determine the leaf position, a picket fence pattern delivered in the dynamic MLC (DMLC) mode was used. The full-width at half-maximum (FWHM) values of each gap measured by EPID and EBT3 film were converted to the gap width using the FWHM versus nominal gap width relationship. The algorithm developed for the picket fence analysis was able to quantify the gap width, the distance between gaps, and each individual leaf position. To determine the leaf speed, a 0.5 × 20 cm² MLC-defined sliding gap was applied across a 14 × 20 cm² symmetric field. The linacs ran at a fixed dose rate. The use of different monitor units (MUs) for this test led to different leaf speeds. The effect of leaf transmission was considered in the speed accuracy analysis. The difference between the EPID and film results for the MLC position is less than 0.1 mm. For the three MLC types, twice the standard deviation (2 SD) is provided: 0.2, 0.4, and 0.4 mm for the gap widths of the three MLC types, and 0.1, 0.2, and 0.2 mm for the distances between gaps. The individual leaf positions deviate from the preset positions within 0.1 mm. The variations in the speed profiles for the EPID and EBT3 results are consistent, but the EPID results are slightly better than the film results. Different speeds were measured for each MLC type. For all three MLC types, speed errors increase with increasing speed. The analysis speeds deviate from the preset speeds within approximately 0.01 cm/s. This quantitative analysis of MLC position and speed provides an intuitive evaluation for MLC quality assurance (QA). © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
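    The position part of the test hinges on inverting the FWHM versus nominal gap width calibration. The sketch below assumes a linear calibration (FWHM = a * gap + b), which is an assumption made here for illustration; the calibration points, measured FWHM values, and the 0.5 cm preset gap are hypothetical.

    ```python
    # Hedged sketch: converting measured picket FWHMs to gap widths.
    import numpy as np

    nominal_gaps = np.array([0.5, 1.0, 2.0, 5.0, 10.0])        # cm (calibration)
    measured_fwhm = np.array([0.68, 1.17, 2.15, 5.12, 10.10])  # cm (calibration)

    a, b = np.polyfit(nominal_gaps, measured_fwhm, 1)           # FWHM = a*gap + b

    def fwhm_to_gap(fwhm):
        return (fwhm - b) / a

    picket_fwhm = np.array([0.69, 0.70, 0.67, 0.68])            # measured pickets
    gaps = fwhm_to_gap(picket_fwhm)
    print(np.round(gaps, 3), np.round(gaps - 0.5, 3))           # widths and deviations
    ```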

  3. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on the practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N2O4 as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be at optimal conditions to eliminate errors, particularly when computation is made with a machine. A higher sweep velocity is preferable in view of the S/N ratio, but it may be limited to 30 Hz/s. The relative error in the measurement of an area is generally 1%, but for samples of dilute concentration, when integrated, the error becomes smaller by one digit. If impurities are treated carefully, the water content of N2O4 can be determined with an accuracy of about 0.002%. The comparison method between peak heights is as accurate as that between areas when the uniformity of the magnetic field and T2 are not questionable. When the chemical shift varies with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T1 with 90° pulses.

  4. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  5. Hydrophilic interaction liquid chromatography-tandem mass spectrometry quantitative method for the cellular analysis of varying structures of gemini surfactants designed as nanomaterial drug carriers.

    Science.gov (United States)

    Donkuru, McDonald; Michel, Deborah; Awad, Hanan; Katselis, George; El-Aneed, Anas

    2016-05-13

    Diquaternary gemini surfactants have successfully been used to form lipid-based nanoparticles that are able to compact, protect, and deliver genetic materials into cells. However, what happens to the gemini surfactants after they have released their therapeutic cargo is unknown. Such knowledge is critical to assess the quality, safety, and efficacy of gemini surfactant nanoparticles. We have developed a simple and rapid liquid chromatography electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method for the quantitative determination of various structures of gemini surfactants in cells. Hydrophilic interaction liquid chromatography (HILIC) was employed, allowing for a short, simple isocratic run of only 4 min. The lower limit of detection (LLOD) was 3 ng/mL. The method was applicable to 18 structures of gemini surfactants belonging to two different structural families. A full method validation was performed for two lead compounds according to USFDA guidelines. The HILIC-MS/MS method was compatible with the physicochemical properties of gemini surfactants, which bear a permanent positive charge with both hydrophilic and hydrophobic elements within their molecular structure. In addition, an effective liquid-liquid extraction method (98% recovery) was employed, surpassing previously used extraction methods. The analysis of nanoparticle-treated cells showed an initial rise in the intracellular analyte concentration, followed by a maximum and a somewhat more gradual decrease of the intracellular concentration. The observed intracellular depletion of the gemini surfactants may be attributable to their biotransformation into metabolites and exocytosis from the host cells. The cellular data obtained showed a pattern that warrants additional investigations evaluating metabolite formation and assessing the subcellular distribution of the tested compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  7. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of the coupling-cavity chain equivalent circuit model, the author deduces the equations for the coupler frequency deviation Δf and the coupling coefficient β, instead of only giving the adjusting direction in the process of coupler matching. According to these equations, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  8. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful in quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily, using a relatively simple diagram to deduce prognosis.

  9. Quantitative Methods for Software Selection and Evaluation

    National Research Council Canada - National Science Library

    Bandor, Michael S

    2006-01-01

    ... (the ability of the product to meet the need) and the cost. The method used for the analysis and selection activities can range from the use of basic intuition to counting the number of requirements fulfilled, or something...

  10. Simultaneous determination of linagliptin and metformin by reverse phase-high performance liquid chromatography method: An application in quantitative analysis of pharmaceutical dosage forms

    Directory of Open Access Journals (Sweden)

    Prathyusha Vemula

    2015-01-01

    Full Text Available To enhance patient compliance toward treatment in diseases like diabetes, a combination of drugs is usually prescribed. Therefore, an anti-diabetic fixed-dose combination of 2.5 mg of linagliptin and 500 mg of metformin was taken for the simultaneous estimation of both drugs by a reverse phase-high performance liquid chromatography (RP-HPLC) method. The present study aimed to develop a simple and sensitive RP-HPLC method for the simultaneous determination of linagliptin and metformin in pharmaceutical dosage forms. The chromatographic separation was designed and evaluated by using linagliptin and metformin working standard and sample solutions in the linearity range. Chromatographic separation was performed on a C18 column using a mobile phase of a 70:30 (v/v) mixture of methanol and 0.05 M potassium dihydrogen orthophosphate (pH adjusted to 4.6 with orthophosphoric acid) delivered at a flow rate of 0.6 mL/min and UV detection at 267 nm. Linagliptin and metformin showed linearity in the ranges of 2-12 μg/mL and 400-2400 μg/mL, respectively, with correlation coefficients of 0.9996 and 0.9989. The resultant findings were analyzed for standard deviation (SD) and relative standard deviation to validate the developed method. The retention times of linagliptin and metformin were found to be 6.3 and 4.6 min, and separation was complete in <10 min. The method was validated for linearity, accuracy, and precision, which were found to be acceptable over the linearity range of linagliptin and metformin. The method was found suitable for the routine quantitative analysis of linagliptin and metformin in pharmaceutical dosage forms.

  11. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    Science.gov (United States)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operation and a scenario optimization, are evaluated for the flood risk and hydropower profit analysis. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks come from the forecast lead-time. It is therefore valuable, for reservoir operation purposes, to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias small.
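    The lead-time part of the risk reduces to a scenario-counting exercise, sketched below with hypothetical ensemble water-level traces and a hypothetical critical level; the unpredicted-time stage (routing the design floods from the end-of-horizon water level) is omitted.

    ```python
    # Hedged sketch of the forecast lead-time risk: fraction of ensemble scenarios
    # whose simulated reservoir water level exceeds the critical value.
    def lead_time_risk(scenario_levels, critical_level):
        """scenario_levels: one water-level trace (list of levels) per ensemble member."""
        failures = sum(1 for trace in scenario_levels if max(trace) > critical_level)
        return failures / len(scenario_levels)

    ensemble = [
        [158.2, 160.4, 162.1],
        [157.9, 159.0, 160.2],
        [159.5, 163.8, 166.9],   # this member exceeds the critical level
        [158.0, 158.7, 159.3],
    ]
    print(lead_time_risk(ensemble, critical_level=165.0))   # -> 0.25
    ```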

  12. [A new method of processing quantitative PCR data].

    Science.gov (United States)

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Today, standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous kinetic studies, the PE company found that there is a linear relation between the initial template number and the cycle time at which the accumulating fluorescent product becomes detectable. Therefore, they developed a quantitative PCR technique to be used in the PE7700 and PE5700. But the error of this technique is too great to satisfy the needs of biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model submitted here combines achievements of related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. This model describes the functional relation between product quantity or fluorescence intensity and the initial template number and other reaction conditions, and can accurately reflect the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be made using this functional relation, and the accumulated PCR product quantity can be obtained from the initial template number. Using this model for quantitative PCR analysis, the resulting error is related only to the accuracy of the fluorescence intensity, i.e. the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number is between 100 and 1,000,000, the accuracy of the quantitative result will be more than 99%. The difference in result error is distinct when using the same conditions and the same instrument but different analysis methods. Moreover, if this quantitative PCR analysis system is used to process the data, the result will be about 80 times more accurate than with the CT method.
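    The abstract does not reproduce the proposed model itself. For orientation, the sketch below only shows the conventional amplification relation that any quantitative PCR model refines, N_n = N_0 * (1 + E)^n, and how an initial template number is recovered from it; the efficiency and quantities are illustrative.

    ```python
    # Conventional qPCR amplification relation (background, not the cited model).
    def product_quantity(n0, efficiency, cycles):
        return n0 * (1.0 + efficiency) ** cycles

    def initial_template(nn, efficiency, cycles):
        return nn / (1.0 + efficiency) ** cycles

    n_30 = product_quantity(n0=1000, efficiency=0.9, cycles=30)
    print(f"product after 30 cycles: {n_30:.3e}")
    print(f"recovered N0: {initial_template(n_30, 0.9, 30):.1f}")
    ```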

  13. Quantitative mineral salt evaluation in the calcaneous bone using computed tomography, 125I-photon absorption and chemical analysis to compare the value of the individual methods

    International Nuclear Information System (INIS)

    Hitzler, H.J.

    1983-01-01

    It was the aim of the study described here to verify the accuracy of two different methods for the quantitative evaluation of mineral salts, which were the 125I-photon absorption technique on the one hand and whole-body CT on the other. For this purpose, post-mortem examinations of 31 calcaneous bones were carried out to evaluate their individual mineral salt contents in vitro using either of the above-mentioned methods. The results obtained were subsequently contrasted with calcium concentrations determined by chemical analysis. A comparison of the individual mineral salt evaluations with the results from the calcium analyses pointed to a highly significant correlation (p=0.001) for both methods under investigation. The same held for the correlation of findings from CT and the 125I-hydroxylapatite technique, where the level of significance was also p=0.001. The above statements must, however, be modified in as much as the mineral salt values measured by CT were consistently lower than those obtained on the basis of 125I-photon absorption. These deviations are chiefly attributable to the fact that the values provided by CT are more susceptible to influences from the fat contained in the bones. In 125I-photon absorption a special formula may be derived to allow for the bias occurring here, provided that the composition of the bone is known. To summarise, the relative advantages and drawbacks of CT and 125I-photon absorption are carefully balanced. Mineral salt evaluations by CT permit incipient losses to be ascertained even in the trunk. The 125I-photon absorption technique would appear to be the obvious method for any kind of follow-up examination in the peripheral skeleton, as it is easily reproducible and radiation exposure can be kept to a minimum. (TRV) [de

  14. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    analysis our results show similar diagnostic accuracy comparing anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.

  15. The investigations on the use of proton magnetic resonance in the quantitative analysis of organic compounds. Part 1. Review of analytical methods

    International Nuclear Information System (INIS)

    Ciercierska-Stoklosa, D.

    1976-01-01

    A review of papers on the application of proton magnetic resonance to the quantitative analysis of organic compounds in multicomponent mixtures is presented. The applied techniques are described and a unified way of presenting the formulas used in the calculations is proposed. Information about the precision, accuracy and detectability of the method and the possibilities of improving the determinations is given. (author)

  16. Investigation of a Quantitative Method for the Analysis of Chiral Monoterpenes in White Wine by HS-SPME-MDGC-MS of Different Wine Matrices

    Directory of Open Access Journals (Sweden)

    Mei Song

    2015-04-01

    Full Text Available A valid quantitative method for the analysis of chiral monoterpenes in white wine using head-space solid phase micro-extraction-MDGC-MS (HS-SPME-MDGC-MS) with stable isotope dilution analysis was established. Fifteen compounds: (S)-(−)-limonene, (R)-(+)-limonene, (+)-(2R,4S)-cis-rose oxide, (−)-(2S,4R)-cis-rose oxide, (−)-(2R,4R)-trans-rose oxide, (+)-(2S,4S)-cis-rose oxide, furanoid (+)-trans-linalool oxide, furanoid (−)-cis-linalool oxide, furanoid (−)-trans-linalool oxide, furanoid (+)-cis-linalool oxide, (−)-linalool, (+)-linalool, (−)-α-terpineol, (+)-α-terpineol and (R)-(+)-β-citronellol were quantified. Two calibration curves were plotted for different wine bases with varying residual sugar content, and three calibration curves for each wine base were investigated during a single fiber's lifetime. This was needed as both sugar content and fiber life impacted the quantification of the chiral terpenes. The chiral monoterpene content of six Pinot Gris wines and six Riesling wines was then analyzed using the verified method. ANOVA with Tukey multiple comparisons showed significant differences for each of the detected chiral compounds in all 12 wines. PCA score plots showed a clear separation between the Riesling and Pinot Gris wines. Riesling wines had a greater number of chiral terpenes in comparison to Pinot Gris wines. Beyond total terpene content, it is possible that differences in chiral terpene content may be driving the aromatic differences in white wines.

  17. Structural Analysis of PTM Hotspots (SAPH-ire)--A Quantitative Informatics Method Enabling the Discovery of Novel Regulatory Elements in Protein Families.

    Science.gov (United States)

    Dewhurst, Henry M; Choudhury, Shilpa; Torres, Matthew P

    2015-08-01

    Predicting the biological function potential of post-translational modifications (PTMs) is becoming increasingly important in light of the exponential increase in available PTM data from high-throughput proteomics. We developed structural analysis of PTM hotspots (SAPH-ire)--a quantitative PTM ranking method that integrates experimental PTM observations, sequence conservation, protein structure, and interaction data to allow rank order comparisons within or between protein families. Here, we applied SAPH-ire to the study of PTMs in diverse G protein families, a conserved and ubiquitous class of proteins essential for maintenance of intracellular structure (tubulins) and signal transduction (large and small Ras-like G proteins). A total of 1728 experimentally verified PTMs from eight unique G protein families were clustered into 451 unique hotspots, 51 of which have a known and cited biological function or response. Using customized software, the hotspots were analyzed in the context of 598 unique protein structures. By comparing distributions of hotspots with known versus unknown function, we show that SAPH-ire analysis is predictive for PTM biological function. Notably, SAPH-ire revealed high-ranking hotspots for which a functional impact has not yet been determined, including phosphorylation hotspots in the N-terminal tails of G protein gamma subunits--conserved protein structures never before reported as regulators of G protein coupled receptor signaling. To validate this prediction we used the yeast model system for G protein coupled receptor signaling, revealing that gamma subunit-N-terminal tail phosphorylation is activated in response to G protein coupled receptor stimulation and regulates protein stability in vivo. These results demonstrate the utility of integrating protein structural and sequence features into PTM prioritization schemes that can improve the analysis and functional power of modification-specific proteomics data. © 2015 by The American

  18. Structural Analysis of PTM Hotspots (SAPH-ire) – A Quantitative Informatics Method Enabling the Discovery of Novel Regulatory Elements in Protein Families*

    Science.gov (United States)

    Dewhurst, Henry M.; Choudhury, Shilpa; Torres, Matthew P.

    2015-01-01

    Predicting the biological function potential of post-translational modifications (PTMs) is becoming increasingly important in light of the exponential increase in available PTM data from high-throughput proteomics. We developed structural analysis of PTM hotspots (SAPH-ire)—a quantitative PTM ranking method that integrates experimental PTM observations, sequence conservation, protein structure, and interaction data to allow rank order comparisons within or between protein families. Here, we applied SAPH-ire to the study of PTMs in diverse G protein families, a conserved and ubiquitous class of proteins essential for maintenance of intracellular structure (tubulins) and signal transduction (large and small Ras-like G proteins). A total of 1728 experimentally verified PTMs from eight unique G protein families were clustered into 451 unique hotspots, 51 of which have a known and cited biological function or response. Using customized software, the hotspots were analyzed in the context of 598 unique protein structures. By comparing distributions of hotspots with known versus unknown function, we show that SAPH-ire analysis is predictive for PTM biological function. Notably, SAPH-ire revealed high-ranking hotspots for which a functional impact has not yet been determined, including phosphorylation hotspots in the N-terminal tails of G protein gamma subunits—conserved protein structures never before reported as regulators of G protein coupled receptor signaling. To validate this prediction we used the yeast model system for G protein coupled receptor signaling, revealing that gamma subunit–N-terminal tail phosphorylation is activated in response to G protein coupled receptor stimulation and regulates protein stability in vivo. These results demonstrate the utility of integrating protein structural and sequence features into PTM prioritization schemes that can improve the analysis and functional power of modification-specific proteomics data. PMID:26070665

  19. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores by chemical analysis techniques is a lengthy process. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  20. Industrial ecology: Quantitative methods for exploring a lower carbon future

    Science.gov (United States)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
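    As a worked illustration of the levelized cost of energy mentioned above, LCOE is simply discounted lifetime cost divided by discounted lifetime energy output. The sketch below uses hypothetical plant figures, not data from this record:

```python
def lcoe(capital, annual_cost, annual_energy_kwh, lifetime_years, discount_rate):
    """Levelized cost of energy: discounted costs divided by discounted energy."""
    disc_costs = capital + sum(
        annual_cost / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
    disc_energy = sum(
        annual_energy_kwh / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
    return disc_costs / disc_energy

# Hypothetical plant: $1.5M capital, $30k/yr O&M, 4 GWh/yr output, 25 years, 6% rate.
print(lcoe(1.5e6, 3.0e4, 4.0e6, 25, 0.06))  # $/kWh
```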

  1. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ ) and the rotating frame transverse relaxation rate constant (R2ρ ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ . The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches related to the issue of airport location are presented in this article, i.e. the question of optimizing such a choice and the issue of selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task, while the latter involves ranking the possible variants. Because the two approaches have different methodological backgrounds, the authors present the advantages and disadvantages of both and point to the one which currently has practical application. Results: Based on real-life examples, the authors present a multi-stage procedure which makes it possible to solve the problem of airport location. Conclusions: Based on the overview of the literature on the subject, the authors point to three types of approach to the issue of airport location which could enable further development of the currently applied methods.

  3. Box-Counting Method of 2D Neuronal Image: Method Modification and Quantitative Analysis Demonstrated on Images from the Monkey and Human Brain

    Directory of Open Access Journals (Sweden)

    Nemanja Rajković

    2017-01-01

    Full Text Available This study calls attention to the difference between the traditional box-counting method and its modification. The appropriate scaling factor, the influence of image size and resolution, and of image rotation, as well as different image presentations, are shown on a sample of asymmetrical neurons from the monkey dentate nucleus. The standard BC method and its modification were evaluated on a sample of 2D neuronal images from the human neostriatum. In addition, three box dimensions (which estimate the space-filling property, the shape, complexity, and the irregularity of the dendritic tree) were used to evaluate differences in the morphology of type III aspiny neurons between two parts of the neostriatum.

  4. Box-Counting Method of 2D Neuronal Image: Method Modification and Quantitative Analysis Demonstrated on Images from the Monkey and Human Brain.

    Science.gov (United States)

    Rajković, Nemanja; Krstonošić, Bojana; Milošević, Nebojša

    2017-01-01

    This study calls attention to the difference between the traditional box-counting method and its modification. The appropriate scaling factor, the influence of image size and resolution, and of image rotation, as well as different image presentations, are shown on a sample of asymmetrical neurons from the monkey dentate nucleus. The standard BC method and its modification were evaluated on a sample of 2D neuronal images from the human neostriatum. In addition, three box dimensions (which estimate the space-filling property, the shape, complexity, and the irregularity of the dendritic tree) were used to evaluate differences in the morphology of type III aspiny neurons between two parts of the neostriatum.
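    For orientation, the conventional box-counting procedure that both records above build on estimates a box dimension from the slope of log N(s) against log(1/s), where N(s) is the number of boxes of side s that contain part of the object. The Python sketch below is a generic illustration on a toy binary image, not the authors' modified method:

```python
import numpy as np

def box_count(image, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h = (image.shape[0] // size) * size
    w = (image.shape[1] // size) * size
    blocks = image[:h, :w].reshape(h // size, size, w // size, size)
    return int((blocks.sum(axis=(1, 3)) > 0).sum())

def box_dimension(image, sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(image, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy example: a one-pixel-wide diagonal line should give a dimension close to 1.
img = np.zeros((256, 256), dtype=bool)
np.fill_diagonal(img, True)
print(box_dimension(img))
```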

  5. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
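    To make the idea concrete, the risk-adjusted net present value of such an investment can be sketched as the discounted expected reduction in annual fire losses minus the investment and maintenance costs. The figures below are purely illustrative and are not taken from the case study:

```python
def risk_adjusted_npv(investment, annual_maintenance,
                      expected_loss_without, expected_loss_with,
                      lifetime_years, discount_rate):
    """NPV of a safety investment whose benefit is a reduced expected annual loss."""
    annual_benefit = (expected_loss_without - expected_loss_with) - annual_maintenance
    npv = -investment
    for t in range(1, lifetime_years + 1):
        npv += annual_benefit / (1 + discount_rate) ** t
    return npv

# Hypothetical sprinkler system: 200k cost, 5k/yr upkeep, expected annual fire loss
# reduced from 40k to 10k, 20-year horizon, 5% discount rate.
print(risk_adjusted_npv(200e3, 5e3, 40e3, 10e3, 20, 0.05))
```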

  6. Factor Analysis, AMMI Stability Value (ASV) Parameter and GGE Bi-Plot Graphical Method of Quantitative and Qualitative Traits in Potato Genotypes

    Directory of Open Access Journals (Sweden)

    Davood Hassanpanah

    2016-10-01

    Full Text Available Quantitative and qualitative traits and the stability of marketable tuber yield of 14 promising potato clones, along with three commercial cultivars (Agria, Marfona and Savalan) as checks, were evaluated at the Ardabil Agricultural and Natural Resources Research Station during 2013 and 2014. The experiment was based on a randomized complete block design with four replications. During the growing period and after harvest, traits like main stem number per plant, plant height, tuber number and weight per plant, total and marketable tuber yield, dry matter percentage, baking type, hollow heart, tuber inner ring and discoloration of raw tuber flesh after 24 hours were measured. Combined ANOVA for the quantitative traits showed that there were significant differences among the promising clones in total and marketable tuber yield, tuber number and weight per plant, plant height, tuber mean weight, main stem number per plant and dry matter percentage, and in their interactions with year in total and marketable tuber yield. Clone 9 (397078-3), with the lowest marketable tuber yield, differed significantly from clones 4 (397045-13), 1 (397031-16), 3 (397031-11), 6 (397009-8) and 12 (397067-6) in 2013 and from clone 4 (397045-13) and the Agria cultivar in 2014. Clones 4 (397045-13), 1 (397031-16) and 12 (397067-6) had uniform tubers, yellow to dark-yellow skin and light-yellow to yellow flesh color, oval-round and round tuber shapes, shallow to mid-shallow eyes, no tuber inner ring, hollow heart or tuber inner crack, and mid-late maturity. They were selected for home consumption as chips, french fries and frying. Based on the results of factor analysis, "tuber yield", "number of tubers" and "plant structure and quality" were named as the first, second and third quality-determining factors, respectively. In this experiment, the GGE Bi-plot model and the AMMI Stability Value (ASV) parameter were acceptable methods for the selection of marketable tuber yield stability, which found to

  7. Quantitative method of measuring cancer cell urokinase and metastatic potential

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  8. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard ( Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  9. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed which support the fast development of biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook of proteome quantification methods.

  10. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  11. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  12. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  13. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  14. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  15. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
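    As a rough sketch of intensity-based quantitation of the kind described (not the authors' algorithm), area fractions of tissue components can be estimated by thresholding the autofluorescence intensities recorded at two wavelengths and counting pixels; the channels, thresholds and class labels below are hypothetical:

```python
import numpy as np

def area_fractions(ch1, ch2, t1, t2):
    """Classify pixels from two autofluorescence channels by simple thresholds
    and return the area fraction of each class (illustrative scheme only)."""
    bright1, bright2 = ch1 > t1, ch2 > t2
    classes = {
        "component_a": bright1 & ~bright2,
        "component_b": ~bright1 & bright2,
        "component_c": bright1 & bright2,
        "background": ~bright1 & ~bright2,
    }
    n = ch1.size
    return {name: float(mask.sum()) / n for name, mask in classes.items()}

# Synthetic stand-in for a two-wavelength acquisition of an unstained section.
rng = np.random.default_rng(0)
ch1, ch2 = rng.random((512, 512)), rng.random((512, 512))
print(area_fractions(ch1, ch2, 0.6, 0.7))
```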

  16. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA...

  17. A Novel HPLC Method for the Concurrent Analysis and Quantitation of Seven Water-Soluble Vitamins in Biological Fluids (Plasma and Urine: A Validation Study and Application

    Directory of Open Access Journals (Sweden)

    Margherita Grotzkyj Giorgi

    2012-01-01

    Full Text Available An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B1, B2, B5, B6, B9, B12) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  18. A novel HPLC method for the concurrent analysis and quantitation of seven water-soluble vitamins in biological fluids (plasma and urine): a validation study and application.

    Science.gov (United States)

    Giorgi, Margherita Grotzkyj; Howland, Kevin; Martin, Colin; Bonner, Adrian B

    2012-01-01

    An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B(1), B(2), B(5), B(6), B(9), B(12)) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  19. Comparison of 3D quantitative structure-activity relationship methods: Analysis of the in vitro antimalarial activity of 154 artemisinin analogues by hypothetical active-site lattice and comparative molecular field analysis

    Science.gov (United States)

    Woolfrey, John R.; Avery, Mitchell A.; Doweyko, Arthur M.

    1998-03-01

    Two three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, comparative molecular field analysis (CoMFA) and hypothetical active site lattice (HASL), were compared with respect to the analysis of a training set of 154 artemisinin analogues. Five models were created, including a complete HASL and two trimmed versions, as well as two CoMFA models (leave-one-out standard CoMFA and the guided-region selection protocol). Similar r2 and q2 values were obtained by each method, although some striking differences existed between CoMFA contour maps and the HASL output. Each of the four predictive models exhibited a similar ability to predict the activity of a test set of 23 artemisinin analogues, although some differences were noted as to which compounds were described well by either model.

  20. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk from complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. The failures often are due to excessive polyethylene wear. The glenoid liners are complex and hemispherical in shape and present challenges while assessing the damage. Therefore, the study on the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop the methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners, retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by several observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way and they had engineering backgrounds.

  1. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  2. Calibration of quantitative neutron radiography method for moisture measurement

    International Nuclear Information System (INIS)

    Nemec, T.; Jeraj, R.

    1999-01-01

    Quantitative measurements of moisture and hydrogenous matter in building materials by neutron radiography (NR) are regularly performed at the TRIGA Mark II research reactor of the 'Jozef Stefan' Institute in Ljubljana. Calibration of the quantitative method is performed using standard brick samples with known moisture content and also with a secondary standard, a plexiglas step wedge. In general, the contribution of scattered neutrons to the neutron image is not determined explicitly, which introduces an error into the measured signal. The influence of scattered neutrons is significant in regions with high gradients of moisture concentration, where the build-up of scattered neutrons causes distortion of the moisture concentration profile. In this paper a detailed analysis of the validity of our calibration method for different geometrical parameters is presented. The error in the measured hydrogen concentration is evaluated by an experiment and compared with results obtained by Monte Carlo calculation with the computer code MCNP 4B. Optimal conditions are determined for quantitative moisture measurements in order to minimize the error due to scattered neutrons. The method is tested on concrete samples with high moisture content. (author)

  3. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code for coarse network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  4. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in situations where qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, the Analytical Hierarchy Process. It then became possible to estimate the detection probability under integrated safeguards which had deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
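    The AHP step mentioned above reduces a pairwise comparison matrix to priority weights, conventionally taken as the normalized principal eigenvector. A minimal sketch with a hypothetical 3x3 comparison matrix on Saaty's 1-9 scale (illustrative values only):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix
    (normalized principal eigenvector)."""
    values, vectors = np.linalg.eig(pairwise)
    principal = np.real(vectors[:, np.argmax(np.real(values))])
    return principal / principal.sum()

# Hypothetical comparison of three safeguards factors.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
print(ahp_weights(A))  # weights sum to 1
```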

  5. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  6. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    International Nuclear Information System (INIS)

    Nakanaga, Kazue; Maeda Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko

    1999-01-01

    This study aimed to develop a specific method for the detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. mRNA of the 65 kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled anti-sense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5. The anti-sense RNA was synthesized from template DNA containing about 580 bp (194 to 762) of the hsp65 gene. When compared with the NBH method, the amount of probe required for detection by the RPA method was 1/30 or less and the detection sensitivity of RPA was also 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, radioactive degradation of the probe used in the RPA method might affect the results; therefore, 33P or 35P, whose decay energy is lower than that of 32P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective detection method for these mRNAs, but it seems necessary to further improve the sensitivity of detection for small amounts of test material. (M.N.)

  7. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    Energy Technology Data Exchange (ETDEWEB)

    Nakanaga, Kazue; Maeda Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko [National Inst. of Infectious Diseases, Tokyo (Japan)

    1999-02-01

    This study aimed to develop a specific method for the detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. mRNA of the 65 kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled anti-sense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5. The anti-sense RNA was synthesized from template DNA containing about 580 bp (194 to 762) of the hsp65 gene. When compared with the NBH method, the amount of probe required for detection by the RPA method was 1/30 or less and the detection sensitivity of RPA was also 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, radioactive degradation of the probe used in the RPA method might affect the results; therefore, {sup 33}P or {sup 35}P, whose decay energy is lower than that of {sup 32}P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective detection method for these mRNAs, but it seems necessary to further improve the sensitivity of detection for small amounts of test material. (M.N.)

  8. Standardization of a sulfur quantitative analysis method by X ray fluorescence in a leaching solution for bio-available sulfates in soil

    International Nuclear Information System (INIS)

    Morales S, E.; Aguilar S, E.

    1989-11-01

    A method for the analysis of bio-available sulfate in soils is described. A Ca(H2PO4)2 leaching solution was used to treat the soil samples. A standard Na2SO4 solution was used for preparing a calibration curve, and the fundamental-parameters approach was also employed. An Am-241 (100 mCi) source and a Si-Li detector were employed. Analysis could be done in 5 minutes; good reproducibility (5%) and accuracy (5%) were obtained. The method is very competitive with conventional nephelometry, where good and reproducible suspensions are difficult to obtain. (author)

  9. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    Science.gov (United States)

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative-baseline measurement was proposed for the elimination of complex matrix interference, which is mainly caused by unknown components and/or background, in the analysis of derivative spectra. This novel method is particularly applicable when the matrix interfering components show a broad spectral band, which is common in practical analysis. The derivative baseline was established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method were demonstrated through both theoretical simulation and practical application. Firstly, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of the interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method was remarkably better than that of other conventional methods such as peak-to-zero, tangent, and peak-to-peak measurements. The proposed baseline-correction method was then applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. Satisfactory results were obtained by using this new method to analyze a certified reference material (coconut oil, BCR(®)-458), with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
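    For context, the standard addition quantification that underlies the SAM series above extrapolates a linear fit of signal versus added concentration back to zero signal; the analyte concentration is the magnitude of the x-intercept. The sketch below illustrates only this generic extrapolation with hypothetical numbers, not the authors' derivative-baseline construction:

```python
import numpy as np

def standard_addition_concentration(added, signal):
    """Fit signal = m * added + b and return the analyte concentration |b / m|."""
    m, b = np.polyfit(added, signal, 1)
    return abs(b / m)

# Hypothetical BaP standard additions and baseline-corrected derivative amplitudes.
added = np.array([0.0, 2.0, 4.0, 6.0])          # added concentration (arbitrary units)
signal = np.array([0.105, 0.208, 0.311, 0.414])
print(standard_addition_concentration(added, signal))  # approximately 2.0
```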

  10. Unrecorded Alcohol Consumption: Quantitative Methods of Estimation

    OpenAIRE

    Razvodovsky, Y. E.

    2010-01-01

    unrecorded alcohol; methods of estimation. In this paper we focus on methods for estimating the level of unrecorded alcohol consumption. Present methods allow only an approximate estimation of the unrecorded alcohol consumption level. Taking into consideration the extreme importance of such data, further investigation is necessary to improve the reliability of methods for estimating unrecorded alcohol consumption.

  11. Informatics methods to enable sharing of quantitative imaging research data.

    Science.gov (United States)

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and for a group of controls, obtained by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  13. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; M. Hoffman Forrest; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluate the degree of fit among any number of maps and quantify a GOF for each polygon, as well as the entire map. The Mapcurve method indicates a perfect fit even if...

  14. [Cloning and sequence analysis of the DHBV genome of the brown ducks in Guilin region and establishment of the quantitative method for detecting DHBV].

    Science.gov (United States)

    Su, He-Ling; Huang, Ri-Dong; He, Song-Qing; Xu, Qing; Zhu, Hua; Mo, Zhi-Jing; Liu, Qing-Bo; Liu, Yong-Ming

    2013-03-01

    Brown ducks carrying DHBV are widely used as a hepatitis B animal model in research on the activity and toxicity of anti-HBV drugs. Studies have shown that the proportion of DHBV carriers among the brown ducks of the Guilin region is relatively high. Nevertheless, the characteristics of the DHBV genome of the Guilin brown duck remain unknown. Here we report the cloning and sequence analysis of the genome of Guilin brown duck DHBV. The full length of the DHBV genome of the Guilin brown duck was 3 027 bp. Analysis using ORF finder found that, in addition to the S-ORF, P-ORF and C-ORF, the genome contains an ORF for an unknown peptide. Vector NTI 8.0 analysis revealed that the unknown peptide contains a motif which binds to HLA*0201. Alignment with DHBV sequences from different countries and regions indicated no obvious regional differences among the sequences. A fluorescence quantitative PCR assay for detecting DHBV was established based on the recombinant plasmid pGEM-DHBV-S that was constructed. This study lays the groundwork for using the Guilin brown duck as a hepatitis B animal model.

  15. A quantitative method to track protein translocation between intracellular compartments in real-time in live cells using weighted local variance image analysis.

    Directory of Open Access Journals (Sweden)

    Guillaume Calmettes

    Full Text Available The genetic expression of cloned fluorescent proteins coupled to time-lapse fluorescence microscopy has opened the door to the direct visualization of a wide range of molecular interactions in living cells. In particular, the dynamic translocation of proteins can now be explored in real time at the single-cell level. Here we propose a reliable, easy-to-implement, quantitative image processing method to assess protein translocation in living cells based on the computation of spatial variance maps of time-lapse images. The method is first illustrated and validated on simulated images of a fluorescently-labeled protein translocating from mitochondria to cytoplasm, and then applied to experimental data obtained with fluorescently-labeled hexokinase 2 in different cell types imaged by regular or confocal microscopy. The method was found to be robust with respect to cell morphology changes and mitochondrial dynamics (fusion, fission, movement) during the time-lapse imaging. Its ease of implementation should facilitate its application to a broad spectrum of time-lapse imaging studies.
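    A hedged sketch of the kind of spatial variance map the method is built on: the per-pixel local variance of a frame can be computed as E[x^2] - (E[x])^2 over a sliding window, here with SciPy's uniform filter. The weighting scheme of the published method is not reproduced:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(frame, window=7):
    """Per-pixel variance over a sliding window: E[x^2] - (E[x])^2."""
    frame = frame.astype(float)
    mean = uniform_filter(frame, size=window)
    mean_sq = uniform_filter(frame * frame, size=window)
    return mean_sq - mean * mean

# Variance maps of successive time-lapse frames can then be compared over time
# to follow the redistribution of a fluorescent signal between compartments.
rng = np.random.default_rng(1)
frame = rng.random((128, 128))
print(local_variance(frame).mean())
```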

  16. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As encountered in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one among them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelemental effects, if any, are evaluated by mathematical calculations. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
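    As a generic illustration of the multiple-linear-regression step described above (hypothetical intensities and concentrations, not the stainless steel data), interelement correction coefficients can be obtained by least squares from standards of known composition and then applied to an unknown:

```python
import numpy as np

# Hypothetical: intensities of three element lines measured on five standards,
# and the known analyte concentration (wt%) in each standard.
I = np.array([[1.00, 0.20, 0.05],
              [0.80, 0.35, 0.10],
              [0.60, 0.50, 0.20],
              [0.40, 0.65, 0.30],
              [0.20, 0.80, 0.40]])
c_known = np.array([18.2, 14.6, 11.1, 7.5, 3.9])

# Linear correction model: c = b0 + b1*I1 + b2*I2 + b3*I3, fitted by least squares.
X = np.hstack([np.ones((I.shape[0], 1)), I])
coeff, *_ = np.linalg.lstsq(X, c_known, rcond=None)

# Apply the fitted coefficients to the intensities measured on an unknown sample.
I_sample = np.array([0.70, 0.42, 0.15])
print(coeff[0] + coeff[1:] @ I_sample)
```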

  17. The method of quantitative X-ray microanalysis of fine inclusions in copper

    International Nuclear Information System (INIS)

    Morawiec, H.; Kubica, L.; Piszczek, J.

    1978-01-01

    The method of correction for the matrix effect in quantitative x-ray microanalysis was presented. The application of the method was discussed on the example of quantitative analysis of fine inclusions of Cu2S and Cu2O in copper. (author)

  18. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for the quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
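    To give the flavor of such quantitative characterization, the sketch below computes a bigram frequency distribution for a protein sequence and a Zipf's-law exponent as the slope of a log-log rank-frequency fit. It is plain Python for illustration and does not use the Quantiprot API; the sequence is an arbitrary example:

```python
import numpy as np
from collections import Counter

def ngram_counts(sequence, n=2):
    """Frequencies of overlapping n-grams in an amino acid sequence."""
    return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

def zipf_exponent(counts):
    """Slope of log(frequency) versus log(rank) over the observed n-grams."""
    freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
print(zipf_exponent(ngram_counts(seq, n=2)))
```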

  19. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  20. Practical application of qualitative and quantitative methods

    CSIR Research Space (South Africa)

    Funke, Nicola S

    2017-12-01

    Full Text Available Presentation slides on cost-benefit analysis (CBA) as part of the decision-making process. CBA can be applied to a myriad of socio-economic decisions in the public and/or private sphere. The slides cover direct costs and benefits, indirect effects, third-party effects and social adjustments (social prices), considered from both private and public perspectives, and include preparation for a group exercise based on a case study of community adoption of drip irrigation technology, covering the direct costs and benefits of a drip irrigation system.

  1. A Quantitative Method for the Analysis of Nomothetic Relationships between Idiographic Structures: Dynamic Patterns Create Attractor States for Sustained Posttreatment Change

    Science.gov (United States)

    Fisher, Aaron J.; Newman, Michelle G.; Molenaar, Peter C. M.

    2011-01-01

    Objective: The present article aimed to demonstrate that the establishment of dynamic patterns during the course of psychotherapy can create attractor states for continued adaptive change following the conclusion of treatment. Method: This study is a secondary analysis of T. D. Borkovec and E. Costello (1993). Of the 55 participants in the…

  2. Quantitative methods for studying design protocols

    CERN Document Server

    Kan, Jeff WT

    2017-01-01

    This book is aimed at researchers and students who would like to engage in and deepen their understanding of design cognition research. The book presents new approaches for analyzing design thinking and proposes methods of measuring design processes. These methods seek to quantify design issues and design processes that are defined based on notions from the Function-Behavior-Structure (FBS) design ontology and from linkography. A linkograph is a network of linked design moves or segments. FBS ontology concepts have been used in both design theory and design thinking research and have yielded numerous results. Linkography is one of the most influential and elegant design cognition research methods. In this book Kan and Gero provide novel and state-of-the-art methods of analyzing design protocols that offer insights into design cognition by integrating segmentation with linkography by assigning FBS-based codes to design moves or segments and treating links as FBS transformation processes. They propose and test ...

  3. Quantitative Analysis of First-Pass Contrast-Enhanced Myocardial Perfusion Multidetector CT Using a Patlak Plot Method and Extraction Fraction Correction During Adenosine Stress

    Science.gov (United States)

    Ichihara, Takashi; George, Richard T.; Silva, Caterina; Lima, Joao A. C.; Lardo, Albert C.

    2011-02-01

    The purpose of this study was to develop a quantitative method for myocardial blood flow (MBF) measurement that can be used to derive accurate myocardial perfusion measurements from dynamic multidetector computed tomography (MDCT) images by using a compartment model for calculating the first-order transfer constant (K1) with correction for the capillary transit extraction fraction (E). Six canine models of left anterior descending (LAD) artery stenosis were prepared and underwent first-pass contrast-enhanced MDCT perfusion imaging during adenosine infusion (0.14-0.21 mg/kg/min). K1, the first-order transfer constant from left ventricular (LV) blood to myocardium, was measured using the Patlak plot method applied to time-attenuation curve data of the LV blood pool and myocardium. The results were compared against microsphere MBF measurements, and the extraction fraction of the contrast agent was calculated. K1 is related to the regional MBF as K1 = E·F, with E = 1 - exp(-PS/F), where PS is the permeability-surface area product and F is myocardial flow. Based on the above relationship, a look-up table from K1 to MBF can be generated, and Patlak plot-derived K1 values can be converted to calculated MBF. The calculated MBF and microsphere MBF showed a strong linear association. The extraction fraction in dogs as a function of flow (F) was E = 1 - exp(-(0.2532·F + 0.7871)/F). Regional MBF can be measured accurately using the Patlak plot method based on a compartment model and a look-up table with extraction fraction correction from K1 to MBF.
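    A hedged numerical sketch of the two steps described, with synthetic time-attenuation curves: (1) K1 is estimated as the slope of the Patlak plot, i.e. C_myo(t)/C_LV(t) regressed against the cumulative integral of C_LV divided by C_LV(t); (2) flow is recovered by inverting K1 = E(F)·F with the reported extraction fraction E(F) = 1 - exp(-(0.2532·F + 0.7871)/F):

```python
import numpy as np
from scipy.optimize import brentq

def cumtrapz0(y, t):
    """Cumulative trapezoidal integral of y over t, starting at zero."""
    out = np.zeros_like(y, dtype=float)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))
    return out

def patlak_k1(t, c_blood, c_tissue):
    """Slope of C_tissue/C_blood versus (integral of C_blood)/C_blood."""
    x = cumtrapz0(c_blood, t) / c_blood
    y = c_tissue / c_blood
    slope, _ = np.polyfit(x, y, 1)
    return slope

def flow_from_k1(k1):
    """Invert K1 = E(F) * F with E(F) = 1 - exp(-(0.2532*F + 0.7871)/F)."""
    g = lambda f: (1.0 - np.exp(-(0.2532 * f + 0.7871) / f)) * f - k1
    return brentq(g, 1e-3, 20.0)

# Synthetic curves for illustration only (minutes, arbitrary attenuation units).
t = np.linspace(0.05, 0.5, 10)
c_blood = 100.0 * np.exp(-2.0 * t) + 20.0
c_tissue = 1.2 * cumtrapz0(c_blood, t)          # corresponds to K1 = 1.2
print(flow_from_k1(patlak_k1(t, c_blood, c_tissue)))
```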

  4. Quantitative method for determination of body inorganic iodine

    International Nuclear Information System (INIS)

    Filatov, A.A.; Tatsievskij, V.A.

    1991-01-01

    An original method of quantitation of body inorganic iodine, based upon a simultaneous administration of a known dose of stable and radioactive iodine with subsequent radiometry of the thyroid was proposed. The calculation is based upon the principle of the dilution of radiactive iodine in human inorganic iodine space. The method permits quantitation of the amount of inorganic iodine with regard to individual features of inorganic space. The method is characterized by simplicity and is not invasive for a patient

  5. Application of multivariable analysis methods to the quantitative detection of gas by tin dioxide micro-sensors; Application des methodes d'analyse multivariables a la detection quantitative de gaz par microcapteurs a base de dioxyde d'etain

    Energy Technology Data Exchange (ETDEWEB)

    Perdreau, N.

    2000-01-17

    The electric conductivity of tin dioxide depends on the temperature of the material and on the nature of the surrounding gas environment. This work shows that processing the electric conductance signals of a single sensor with multivariable analysis methods makes it possible to determine the concentrations of binary or ternary mixtures of ethanol (0-80 ppm), carbon monoxide (0-300 ppm) and methane (0-1000 ppm). Part of this study consisted of designing and implementing an automatic test bench for acquiring the electric conductance of four sensors under thermal and gas cycles. It also revealed some disturbing effects (e.g., humidity) on the measurement. Two sensor fabrication techniques were used to obtain conductances (as a function of temperature) that are distinct for each gas, reproducible across sensors and sufficiently stable over time to allow the signals to be exploited by multivariable analysis methods (tin dioxide in the form of thin layers obtained by reactive evaporation or of sintered powder bars). Finally, it was shown that quantitative gas determination by applying chemometric methods is possible even though the relationship between the electric conductances on the one hand and the temperatures and concentrations on the other is non-linear. Moreover, modelling with the Partial Least Squares method together with a pre-treatment yields performance comparable to that obtained with neural networks. (O.M.)
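
    As an illustration of the chemometric step mentioned above, a minimal Partial Least Squares sketch on synthetic data using scikit-learn's PLSRegression; the feature layout, component count and data are assumptions, not the thesis' actual pipeline.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_channels = 200, 40          # hypothetical: 40 conductance readings per thermal cycle
    X = rng.normal(size=(n_samples, n_channels))                  # stand-in for conductance features
    true_weights = rng.normal(size=(n_channels, 3))
    Y = X @ true_weights + 0.1 * rng.normal(size=(n_samples, 3))  # [ethanol, CO, CH4] concentrations

    X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)
    pls = PLSRegression(n_components=5)      # number of latent variables is a tuning choice
    pls.fit(X_train, Y_train)
    print("R^2 on held-out cycles:", pls.score(X_test, Y_test))
    ```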

  6. Fluorometric method of quantitative cell mutagenesis

    Science.gov (United States)

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  7. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to measure the darkening of spectral lines. By analyzing these lines, the elements contained in a sample and their concentrations can be determined; this analysis is known as quantitative spectrographic analysis. It is carried out in three steps, as follows. 1. Emulsion calibration: the photographic emulsion is calibrated to determine how its response varies with the incident radiation. A least-squares fit is applied to the measured data to obtain a calibration graph relating the density of a dark spectral line to the incident light intensity indicated by the microphotometer. 2. Working curves: values of known concentration of an element are plotted against incident light intensity. Since the sample contains several elements, a working curve is needed for each of them. 3. Analytical results: the calibration curve and the working curves are combined and the concentration of the element under study is determined. Automatic data acquisition, calculation and reporting of results are done by means of a personal computer and a computer program. Signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program.
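
    A minimal sketch of the three steps described above (emulsion calibration, working curve, analytical result) using least-squares fits; all numerical values are invented for illustration.

    ```python
    import numpy as np

    density = np.array([0.12, 0.25, 0.48, 0.85, 1.30])          # measured line density (invented)
    log_intensity = np.array([0.10, 0.32, 0.61, 0.95, 1.28])    # relative incident intensity, log scale

    # 1. Emulsion calibration: least-squares line relating density to log intensity
    cal_slope, cal_intercept = np.polyfit(density, log_intensity, 1)

    # 2. Working curve: log intensity vs. log concentration for a reference element (invented)
    log_conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    wc_slope, wc_intercept = np.polyfit(log_intensity, log_conc, 1)

    # 3. Analytical result: convert a new line density into a concentration estimate
    sample_density = 0.60
    sample_log_i = cal_slope * sample_density + cal_intercept
    print("estimated log10 concentration:", wc_slope * sample_log_i + wc_intercept)
    ```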

  8. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  9. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of X-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  10. Change of time methods in quantitative finance

    CERN Document Server

    Swishchuk, Anatoliy

    2016-01-01

    This book is devoted to the history of Change of Time Methods (CTM), the connections of CTM to stochastic volatilities and finance, fundamental aspects of the theory of CTM, basic concepts, and its properties. An emphasis is given on many applications of CTM in financial and energy markets, and the presented numerical examples are based on real data. The change of time method is applied to derive the well-known Black-Scholes formula for European call options, and to derive an explicit option pricing formula for a European call option for a mean-reverting model for commodity prices. Explicit formulas are also derived for variance and volatility swaps for financial markets with a stochastic volatility following a classical and delayed Heston model. The CTM is applied to price financial and energy derivatives for one-factor and multi-factor alpha-stable Levy-based models. Readers should have a basic knowledge of probability and statistics, and some familiarity with stochastic processes, such as Brownian motion, ...
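
    For reference, a minimal sketch of the closed-form Black-Scholes price for a European call mentioned above (the textbook formula, not the book's change-of-time derivation; the example parameters are invented).

    ```python
    from math import log, sqrt, exp, erf

    def _norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(spot, strike, rate, volatility, maturity):
        """Standard Black-Scholes price of a European call (no dividends)."""
        d1 = (log(spot / strike) + (rate + 0.5 * volatility ** 2) * maturity) / (volatility * sqrt(maturity))
        d2 = d1 - volatility * sqrt(maturity)
        return spot * _norm_cdf(d1) - strike * exp(-rate * maturity) * _norm_cdf(d2)

    # Example: at-the-money one-year call
    print(black_scholes_call(spot=100.0, strike=100.0, rate=0.02, volatility=0.25, maturity=1.0))
    ```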

  11. A Quantitative Method for Localizing User Interface Problems: The D-TEO Method

    Directory of Open Access Journals (Sweden)

    Juha Lamminen

    2009-01-01

    Full Text Available A large array of evaluation methods has been proposed to identify Website usability problems. In log-based evaluation, information about the performance of users is collected and stored in log files, and used to find problems and deficiencies in Web page designs. Most methods require the programming and modeling of large task models, which are cumbersome processes for evaluators. Also, because a large amount of statistical data is collected in log files, recognizing which Web pages require deeper usability analysis is difficult. This paper suggests a novel quantitative method, called the D-TEO, for locating problematic Web pages. This semiautomated method explores the decomposition of interaction tasks of directed information search into elementary operations, deploying two quantitative usability criteria, search success and search time, to reveal how a user navigates within a web of hypertext.

  12. Quantitative O-glycomics based on improvement of the one-pot method for nonreductive O-glycan release and simultaneous stable isotope labeling with 1-(d0/d5)phenyl-3-methyl-5-pyrazolone followed by mass spectrometric analysis.

    Science.gov (United States)

    Wang, Chengjian; Zhang, Ping; Jin, Wanjun; Li, Lingmei; Qiang, Shan; Zhang, Ying; Huang, Linjuan; Wang, Zhongfu

    2017-01-06

    Rapid, simple and versatile methods for the quantitative analysis of glycoprotein O-glycans are urgently required for current studies on protein O-glycosylation patterns and the search for disease O-glycan biomarkers. Relative quantitation of O-glycans using stable isotope labeling followed by mass spectrometric analysis represents an ideal and promising technique. However, it is hindered by the shortage of reliable nonreductive O-glycan release methods, as well as by the inconstant, too large or too small, mass difference between the light and heavy isotopic forms of the derivatives, which results in difficulties during the recognition and quantitative analysis of O-glycans by mass spectrometry. Herein we report a facile and versatile O-glycan relative quantification strategy, based on an improved one-pot method that can quantitatively achieve nonreductive release and in situ chromophoric labeling of intact mucin-type O-glycans in one step. In this study, the one-pot method is optimized and applied for quantitative O-glycan release and tagging with either non-deuterated (d0-) or deuterated (d5-) 1-phenyl-3-methyl-5-pyrazolone (PMP). The obtained O-glycan derivatives feature a permanent 10-Da mass difference between the d0- and d5-PMP forms, allowing complete discrimination and comparative quantification of these isotopically labeled O-glycans by mass spectrometric techniques. Moreover, the d0- and d5-PMP derivatives of O-glycans also have a relatively high hydrophobicity as well as a strong UV absorption, especially suitable for high-resolution separation and high-sensitivity detection by RP-HPLC-UV. We have refined the conditions for the one-pot reaction as well as the corresponding sample purification approach. The good quantitation feasibility, reliability and linearity of this strategy have been verified using bovine fetuin and porcine stomach mucin as model O-glycoproteins. Additionally, we have also successfully applied this method to the quantitative
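
    A minimal sketch of the pairing-and-ratio idea behind the relative quantitation described above; the 10-Da d0/d5 spacing comes from the abstract, while the peak-matching routine, tolerance and peak list are assumptions for illustration only.

    ```python
    # Each O-glycan appears as a d0-PMP / d5-PMP peak pair separated by a fixed 10-Da mass
    # difference; the intensity ratio of the pair gives the relative abundance between samples.
    MASS_SHIFT = 10.0   # Da between d0- and d5-PMP derivatives
    TOLERANCE = 0.02    # Da, assumed matching tolerance

    def pair_and_ratio(peaks):
        """peaks: list of (m/z, intensity); returns (m/z of d0 peak, d0/d5 intensity ratio)."""
        results = []
        for mz, inten in peaks:
            for mz2, inten2 in peaks:
                if abs((mz2 - mz) - MASS_SHIFT) <= TOLERANCE and inten2 > 0:
                    results.append((mz, inten / inten2))
        return results

    # Invented peak list: two glycans, each observed as a d0/d5 pair
    peaks = [(893.32, 1.8e5), (903.32, 2.1e5), (1257.45, 9.0e4), (1267.45, 4.4e4)]
    print(pair_and_ratio(peaks))   # ratios ~0.86 and ~2.0 for the two pairs
    ```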

  13. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and Maxquant outputs are presented in the supporting information. The generated list of proteins under the different fractionation regimes allows assessment of the nature of the identified proteins and of the variability in the quantitative analysis associated with the different sampling strategies, and allows a proper number of replicates to be defined for future quantitative analyses. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  14. Novel method for quantitative estimation of biofilms

    DEFF Research Database (Denmark)

    Syal, Kirtimaan

    2017-01-01

    Biofilm protects bacteria from stress and a hostile environment. The crystal violet (CV) assay is the most popular method for biofilm determination adopted by different laboratories so far. However, the biofilm layer formed at the liquid-air interface, known as a pellicle, is extremely sensitive to the washing and staining steps, and early-phase biofilms are also prone to damage by these steps. In bacteria like mycobacteria, biofilm formation occurs largely at the liquid-air interface, which is susceptible to loss. In the proposed protocol, loss of such a biofilm layer was prevented. Instead of inverting the vessel and discarding the media, which can lead to the loss of the aerobic biofilm layer in the CV assay, the media was removed from the formed biofilm with the help of a syringe and the biofilm layer was allowed to dry. The staining and washing steps were avoided, and an organic solvent, tetrahydrofuran (THF), was deployed...

  15. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast-paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from simple blind probes to planar and three-dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  16. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material Quantitative Proteomics and Data Analysis Course. 4 - 5 April 2016, Queen Hotel, Chester, UK Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  17. A new method of linkage analysis using LOD scores for quantitative traits supports linkage of monoamine oxidase activity to D17S250 in the Collaborative Study on the Genetics of Alcoholism pedigrees.

    Science.gov (United States)

    Curtis, David; Knight, Jo; Sham, Pak C

    2005-09-01

    Although LOD score methods have been applied to diseases with complex modes of inheritance, linkage analysis of quantitative traits has tended to rely on non-parametric methods based on regression or variance components analysis. Here, we describe a new method for LOD score analysis of quantitative traits which does not require specification of a mode of inheritance. The technique is derived from the MFLINK method for dichotomous traits. A range of plausible transmission models is constructed, constrained to yield the correct population mean and variance for the trait but differing with respect to the contribution to the variance due to the locus under consideration. Maximized LOD scores under homogeneity and admixture are calculated, as is a model-free LOD score which compares the maximized likelihoods under admixture assuming linkage and no linkage. These LOD scores have known asymptotic distributions and hence can be used to provide a statistical test for linkage. The method has been implemented in a program called QMFLINK. It was applied to data sets simulated using a variety of transmission models and to a measure of monoamine oxidase activity in 105 pedigrees from the Collaborative Study on the Genetics of Alcoholism. With the simulated data, the results showed that the new method could detect linkage well if the true allele frequency for the trait was close to that specified. However, it performed poorly on models in which the true allele frequency was much rarer. For the Collaborative Study on the Genetics of Alcoholism data set only a modest overlap was observed between the results obtained from the new method and those obtained when the same data were analysed previously using regression and variance components analysis. Of interest is that D17S250 produced a maximized LOD score under homogeneity and admixture of 2.6 but did not indicate linkage using the previous methods. However, this region did produce evidence for linkage in a separate data set
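
    For orientation, the generic LOD-score arithmetic underlying methods like the one above; QMFLINK's likelihood model is not reproduced here, and the numbers are invented.

    ```python
    from math import log10

    # A LOD score compares the maximized likelihood of the data assuming linkage with the
    # likelihood assuming no linkage (free recombination).
    def lod_score(likelihood_linked, likelihood_unlinked):
        return log10(likelihood_linked / likelihood_unlinked)

    # Example: the conventional threshold of 3 corresponds to 1000:1 odds in favour of linkage.
    print(lod_score(2.0e-5, 1.0e-8))   # ~3.3
    ```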

  18. Can qualitative and quantitative methods serve complementary purposes for policy research?

    OpenAIRE

    Maxwell, Daniel G.

    1998-01-01

    Qualitative and quantitative methods in social science research have long been separate spheres with little overlap. However, recent innovations have highlighted the complementarity of qualitative and quantitative approaches. The Accra Food and Nutrition Security Study was designed to incorporate the participation of a variety of constituencies in the research, and to rely on a variety of approaches — both qualitative and quantitative — to data collection and analysis. This paper reviews the ...

  19. VERIFICATION HPLC METHOD OF QUANTITATIVE DETERMINATION OF AMLODIPINE IN TABLETS

    Directory of Open Access Journals (Sweden)

    Khanin V. A

    2014-10-01

    Full Text Available Introduction. Amlodipine ((±)-2-[(2-aminoethoxy)methyl]-4-(2-chlorophenyl)-1,4-dihydro-6-methyl-3,5-pyridinedicarboxylic acid 3-ethyl 5-methyl ester), used as the besylate salt, belongs to the group of selective long-acting calcium channel blockers, dihydropyridine derivatives. In clinical practice it is used as an antianginal and antihypertensive agent for the treatment of cardiovascular diseases. It is produced as a powder substance and as finished dosage forms (tablets of 2.5, 5 and 10 mg). The scientific literature describes methods for the quantitative determination of the drug by spectrophotometry (by its own light absorption and by its reaction product with alloxan), by chromatographic techniques, by a kinetic-spectrophotometric method in substances and preparations, and by chromatography-mass spectrometry and stripping voltammetry. For the quantitative determination of amlodipine besylate, the British Pharmacopoeia and the European Pharmacopoeia recommend the liquid chromatography method. In connection with the preparation of the second edition of the SPhU, which includes monographs on finished products, we set out to analyze the validation characteristics of the chromatographic quantitative determination of amlodipine besylate in tablets and to verify the analytical procedure. Material & methods. The research used amlodipine besylate substance, batch number AB0401013. The analysis was performed on "Amlodipine" tablets, batch number 20113, manufactured by the pharmaceutical company "Zdorovye". The analytical equipment used was a Waters 2695 chromatograph with a 2996 diode array detector (Waters Corp., USA), a Nova-Pak C18 300 x 3.9 mm column with a particle size of 4 μm, an ER-182 balance (AND, Japan), and class A volumetric glassware. Preparation of the test solution: to an accurately weighed sample of powdered tablets equivalent to 50 mg of amlodipine, add 30 ml of methanol, shake for 30 minutes, dilute the solution to 50.0 ml with methanol and filter. 5 ml of the methanol solution adjusted to

  20. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  2. In vivo quantitative whole-brain diffusion tensor imaging analysis of APP/PS1 transgenic mice using voxel-based and atlas-based methods

    International Nuclear Information System (INIS)

    Qin, Yuan-Yuan; Li, Mu-Wei; Oishi, Kenichi; Zhang, Shun; Zhang, Yan; Zhao, Ling-Yun; Zhu, Wen-Zhen; Lei, Hao

    2013-01-01

    Diffusion tensor imaging (DTI) has been applied to characterize the pathological features of Alzheimer's disease (AD) in a mouse model, although little is known about whether these features are structure specific. Voxel-based analysis (VBA) and atlas-based analysis (ABA) are good complementary tools for whole-brain DTI analysis. The purpose of this study was to identify the spatial localization of disease-related pathology in an AD mouse model. VBA and ABA quantification were used for the whole-brain DTI analysis of nine APP/PS1 mice and wild-type (WT) controls. Multiple scalar measurements, including fractional anisotropy (FA), trace, axial diffusivity (DA), and radial diffusivity (DR), were investigated to capture the various types of pathology. The accuracy of the image transformation applied for VBA and ABA was evaluated by comparing manual and atlas-based structure delineation using kappa statistics. Following the MR examination, the brains of the animals were examined by microscopy. Extensive anatomical alterations were identified in APP/PS1 mice, in both the gray matter areas (neocortex, hippocampus, caudate putamen, thalamus, hypothalamus, claustrum, amygdala, and piriform cortex) and the white matter areas (corpus callosum/external capsule, cingulum, septum, internal capsule, fimbria, and optic tract), evidenced by an increase in FA or DA, or both, compared to WT mice (p < 0.05). The histopathological changes in the gray matter areas were confirmed by microscopy studies. DTI did, however, demonstrate significant changes in white matter areas, where the difference was not apparent by qualitative observation of a single-slice histological specimen. This study demonstrated the structure-specific nature of pathological changes in the APP/PS1 mouse, and also showed the feasibility of applying whole-brain analysis methods to the investigation of an AD mouse model. (orig.)
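
    A minimal sketch of the kappa-based agreement check mentioned above, comparing a manual and an atlas-based delineation voxel by voxel; the label volumes are random stand-ins, not the study's data.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(1)
    manual_labels = rng.integers(0, 3, size=10_000)       # e.g. 0=background, 1=cortex, 2=hippocampus
    atlas_labels = manual_labels.copy()
    flip = rng.random(manual_labels.size) < 0.05          # pretend 5% of voxels disagree
    atlas_labels[flip] = rng.integers(0, 3, size=flip.sum())

    print("kappa:", cohen_kappa_score(manual_labels, atlas_labels))
    ```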

  3. New 'ex vivo' radioisotopic method of quantitation of platelet deposition

    International Nuclear Information System (INIS)

    Badimon, L.; Mayo Clinic, Rochester, MN; Thrombosis and Atherosclerosis Unit, Barcelona; Mayo Clinic, Rochester, MN; Fuster, V.; Chesebro, J.H.; Dewanjee, M.K.

    1983-01-01

    We have developed a sensitive and quantitative method for the 'ex vivo' evaluation of platelet deposition on collagen strips, from rabbit Achilles tendon, superfused by flowing blood, and applied it to four animal species: cat, rabbit, dog and pig. Autologous platelets were labeled with indium-111-tropolone and injected into the animal 24 hr before the superfusion, and the number of deposited platelets was quantitated from the tendon gamma-radiation and the blood platelet count. We detected some platelet consumption with superfusion time when blood was reinfused into the contralateral jugular vein after collagen contact, but not when blood was discarded after the contact. Therefore, in order to have a more physiological animal model, we decided to discard blood after superfusion of the tendon. In all species except the cat there was a linear relationship between the increase of platelets on the tendon and the time of exposure to blood superfusion. The highest number of platelets deposited on the collagen was found in cats, the lowest in dogs. Ultrastructural analysis showed that the platelets were deposited as aggregates after only 5 min of superfusion. (orig.)
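
    Illustrative arithmetic for the quantitation described above, assuming deposited platelets are derived from the tendon gamma counts, the blood activity per ml and the blood platelet count; this is a sketch of the general indium-111 approach, not the authors' exact formula.

    ```python
    def platelets_deposited(tendon_counts_cpm, blood_counts_cpm_per_ml, platelet_count_per_ml):
        # ml of blood "worth" of label found on the tendon, converted to a platelet number
        blood_equivalent_ml = tendon_counts_cpm / blood_counts_cpm_per_ml
        return blood_equivalent_ml * platelet_count_per_ml

    # Example with invented numbers: 5,000 cpm on the tendon, 20,000 cpm/ml of blood,
    # 3e8 platelets/ml -> ~7.5e7 platelets deposited.
    print(f"{platelets_deposited(5_000, 20_000, 3e8):.2e}")
    ```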

  4. Quantitative analysis of sodium di-uranate for Al, Ca, Fe, Mg, Mn, Na by flame-atomic absorption spectrometric method

    International Nuclear Information System (INIS)

    Jat, J.R.; Balaji Rao, Y.; Subba Rao, Y.

    2015-01-01

    Nuclear Fuel Complex (NFC) receives Sodium Di-Uranate (SDU) from the Uranium Corporation of India Limited (UCIL) for producing sinterable UO2 pellets used in manufacturing fuel sub-assemblies. Several impurities present in the ore find their way into SDU during its conversion. Stringent specifications have been laid down by the reactor designers for achieving the optimum performance of the fuel, and several impurity elements such as Al, Ca, Fe, Mg, Mn and Na severely affect the performance of UO2 fuel. Most of the impurities, including the above-mentioned elements, are generally analysed by the ICP-OES method. However, determination of Al, Ca, Fe, Mg, Mn and Na by ICP-OES requires extensive dilution, as they are present at high levels in SDU. Apart from introducing dilution errors, the dilution process is tedious and time-consuming, and is not a preferred choice in an industrial laboratory such as a control lab, where a large analytical load exists and time-bound analysis is a requirement. To avoid these difficulties, a simple and reliable flame atomic absorption spectrometric technique has been developed for regular analysis. The present method involves dissolution of the SDU sample in concentrated HNO3; after complete dissolution, the sample solution is evaporated to near dryness on a hot plate. Subsequently the sample solution is brought into 4N HNO3 medium

  5. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a

  6. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.
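
    A minimal sketch of a generic qPCR copy-number estimate of the kind referred to above (a ddCt-style calculation with invented Ct values; not the assay used in the paper).

    ```python
    def copy_number(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control,
                    control_copies=2, efficiency=2.0):
        """Copy number of the target locus relative to a diploid reference locus."""
        d_ct_sample = ct_target_sample - ct_ref_sample
        d_ct_control = ct_target_control - ct_ref_control
        return control_copies * efficiency ** -(d_ct_sample - d_ct_control)

    # Example: the target amplifies ~2 cycles earlier in the tumour cell than expected -> ~8 copies
    print(copy_number(24.0, 26.0, 26.0, 26.0))
    ```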

  7. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  8. Corrosion of non-irradiated UAl{sub x}-Al fuel in the presence of clay pore solution. A quantitative XRD secondary phase analysis applying the DDM method

    Energy Technology Data Exchange (ETDEWEB)

    Neumann, Andreas [Halle-Wittenberg Univ. (Germany). Dept. of Mineralogy and Geochemistry; RWTH Aachen Univ. (Germany). Inst. of Crystallography; Klinkenberg, Martina; Curtius, Hildegard [Forschungszentrum Juelich GmbH (Germany). Inst. of Energy and Climate Research, IEK-6 Nuclear Waste Management

    2017-04-01

    Corrosion experiments with non-irradiated metallic UAl{sub x}-Al research reactor fuel elements were carried out in autoclaves to identify and quantify the corrosion products. With regard to the long-term safety assessment of final repositories, such compounds can interact with the released inventory and thus constitute a sink for radionuclide migration in formation waters. Therefore, the metallic fuel sample was subjected to clay pore solution to investigate its process of disintegration by analyzing the resulting products and the remnants, i.e. the secondary phases. Due to the fast corrosion rate, full sample disintegration was observed within the experimental period of 1 year at 90 °C. The obtained solids were subdivided into different grain size fractions and prepared for analysis. The elemental analysis of the suspension showed that uranium and aluminum were concentrated in the solids, whereas iron was mainly dissolved. Non-ambient X-ray diffraction (XRD) combined with the derivative difference minimization (DDM) method was applied for the qualitative and quantitative phase analysis (QPA) of the secondary phases. Gypsum and hemihydrate (bassanite), residues of non-corroded nuclear fuel, hematite, and goethite were identified. The quantitative phase analysis showed that goethite is the major crystalline phase. The amorphous content exceeded 80 wt% and hosted the uranium. All other compounds were present in minor amounts. The XRD results were well supported by complementary scanning electron microscopy (SEM) and energy dispersive X-ray spectroscopy (EDS) analysis.

  9. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...

  10. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  11. In vivo quantitative whole-brain diffusion tensor imaging analysis of APP/PS1 transgenic mice using voxel-based and atlas-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Qin, Yuan-Yuan [Huazhong University of Science and Technology, Department of Radiology, Tongji Hospital, Tongji Medical College, Wuhan (China); The Johns Hopkins University School of Medicine, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Li, Mu-Wei; Oishi, Kenichi [The Johns Hopkins University School of Medicine, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Zhang, Shun; Zhang, Yan; Zhao, Ling-Yun; Zhu, Wen-Zhen [Huazhong University of Science and Technology, Department of Radiology, Tongji Hospital, Tongji Medical College, Wuhan (China); Lei, Hao [Chinese Academy of Sciences, Wuhan Center for Magnetic Resonance, State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Wuhan (China)

    2013-08-15

    feasibility of applying whole-brain analysis methods to the investigation of an AD mouse model. (orig.)

  12. Quantitative methods in electroencephalography to access therapeutic response.

    Science.gov (United States)

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics or Quantitative Pharmacology aims to quantitatively analyze the interaction between drugs and patients through the tripod of pharmacokinetics, pharmacodynamics and disease monitoring, in order to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting methods to assess the therapeutic response of drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, pointing to them as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power of the theta frequency occurred mainly in the temporo-occipital-parietal region. It has been shown by Sampaio et al. (2007) that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions, reflected in smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines. There was a relation between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational
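
    A minimal sketch of one of the listed measures, the magnitude squared coherence between two EEG channels, computed with SciPy on synthetic signals; the sampling rate and signal content are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 256                                   # assumed sampling rate, Hz
    t = np.arange(0, 30, 1 / fs)               # 30 s of data
    shared = np.sin(2 * np.pi * 10 * t)        # a 10 Hz component common to both channels
    rng = np.random.default_rng(0)
    ch1 = shared + 0.5 * rng.normal(size=t.size)
    ch2 = shared + 0.5 * rng.normal(size=t.size)

    f, cxy = coherence(ch1, ch2, fs=fs, nperseg=fs * 2)
    print("coherence at ~10 Hz:", cxy[np.argmin(np.abs(f - 10))])
    ```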

  13. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...

  14. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Therefore chemical analysis, mostly performed by X-ray fluorescence (XRF), forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of the highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of the Bogue method, as well as of microscopic point counting, are well known. With XRD and Rietveld analysis, a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made the integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software, such as complexity, low performance and severe numerical instability, were prohibitive for automated use. The latest developments in on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated on-line phase analysis for production control and quality management, free of any human interaction
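
    For orientation, a minimal sketch of the standard Rietveld quantitative-phase-analysis relation (weight fraction of a phase from its refined scale factor and its ZMV product); the scale factors and ZMV values below are invented, and this is not TOPAS' internal implementation.

    ```python
    def weight_fractions(phases):
        """phases: dict name -> (scale_factor, Z, M, V); returns dict name -> weight fraction.

        Uses W_i = S_i * (Z*M*V)_i / sum_j S_j * (Z*M*V)_j.
        """
        szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
        total = sum(szmv.values())
        return {name: value / total for name, value in szmv.items()}

    # Invented refined scale factors and cell data for a clinker-like mixture
    clinker = {
        "alite (C3S)":     (2.75e-7, 9, 228.3, 1063.0),
        "belite (C2S)":    (8.40e-7, 4, 172.2, 345.0),
        "aluminate (C3A)": (4.00e-9, 24, 270.2, 3899.0),
        "ferrite (C4AF)":  (2.40e-7, 2, 486.0, 426.0),
    }
    for name, w in weight_fractions(clinker).items():
        print(f"{name}: {100 * w:.1f} wt%")
    ```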

  15. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and Maxquant outputs are presented in the supporting information. The generated list of proteins under the different fractionation regimes allows assessment of the nature of the identified proteins and of the variability in the quantitative analysis associated with the different sampling strategies, and allows a proper number of replicates to be defined for future quantitative analyses.

  16. A method for the quantitative determination of crystalline phases by X-ray

    Science.gov (United States)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  17. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  18. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the hexavalent to the tetravalent state was examined. The experiments showed that the decomposition was faster in alkaline solution: it takes 30 minutes in alkaline solution and 40 minutes in acid solution to reach constant absorption. A method for analyzing samples containing less than 5 ppm of tellurium was also studied. The experiments revealed that samples containing a very small amount of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into one-gram portions, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed good recovery upon standard addition and reproducibility within a relative error of 5%. Comparison between the calibration curve of the standard tellurium(IV) solution subjected to the reaction with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results obtained by the bismuthiol-2 method and by the atomic absorption method agreed quite well for the same samples. (Iwakiri, K.)

  19. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  20. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis was performed for CO, NO, C2H2, C6H6 and similar species, and quantitative analysis was performed for C3H5N, C10H10, C8H8N2 and similar species. The method used in this article is feasible. The results show that the detonation composition in this study had a negative oxygen balance, and there were many pollutants in the detonation products. (authors)

  1. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  2. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used as either part of a detective games or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  3. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  4. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short-axis and vertical long-axis tomograms, stress extent polar maps were generated with the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by the stress extent polar map was 95% in single-vessel disease, and 100% in double- and triple-vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.
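
    A minimal sketch of the territory-wise Spearman correlation reported above, using SciPy on invented MIS and SDE values.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    mis = np.array([0, 1, 2, 2, 3, 4, 4, 5, 6, 8])          # myocardial ischemic score (invented)
    sde = np.array([0, 5, 8, 12, 10, 18, 15, 22, 20, 35])   # stress defect extent, % (invented)

    rs, p_value = spearmanr(mis, sde)
    print(f"rs = {rs:.2f}, p = {p_value:.4f}")
    ```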

  5. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the individual risk and the public's general safety can be affected by the operation of a gas pipeline. In case the individual or societal risks are considered intolerable compared with international standards, measures are recommended for mitigating the risk associated with the operation down to levels that can be considered compatible with the best practices in the industry. A quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This calculation methodology centres on defining the frequencies of occurrence of events according to databases representative of each case under study. In addition, it establishes the consequences according to the particular considerations of each area and the different possibilities of interference with the gas pipeline under study. For each of the interferences, a typical curve of ignition probability is developed as a function of the distance to the pipe. (author)
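
    For orientation, a minimal sketch of the generic individual-risk arithmetic used in pipeline QRA; the scenario frequencies, ignition and fatality probabilities are invented, and this is not the paper's methodology.

    ```python
    scenarios = [
        # (failure frequency per km-year, probability of ignition, probability of fatality at the location)
        ("small leak",   1.0e-4, 0.04, 0.01),
        ("large leak",   2.0e-5, 0.10, 0.20),
        ("full rupture", 5.0e-6, 0.30, 0.80),
    ]

    exposed_length_km = 1.0    # assumed length of pipeline that can affect the location
    individual_risk = sum(freq * exposed_length_km * p_ign * p_fat
                          for _, freq, p_ign, p_fat in scenarios)
    print(f"individual risk: {individual_risk:.2e} per year")
    ```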

  6. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm-1 as a function of pressure for 100% HF; (2) absorbance at 3877 cm-1 as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm-1 for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm-1 can therefore be quantitatively analyzed via infrared methods.
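    As a rough illustration of how such a calibration could be used, the sketch below fits a straight line to absorbance readings at 3877 cm-1 taken in the ideal-gas region and converts a measured absorbance to mole percent HF at a known total pressure. All numbers are hypothetical placeholders, not data from the report.

```python
import numpy as np

# Illustrative calibration data (absorbance at 3877 cm^-1 vs. HF partial
# pressure in mm HgA); the numbers are placeholders, not measured values.
p_hf = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])   # mm HgA
absorbance = np.array([0.07, 0.14, 0.22, 0.29, 0.35, 0.43, 0.50])

# Linear (Beer-Lambert style) fit in the region where HF behaves ideally.
slope, intercept = np.polyfit(p_hf, absorbance, 1)

def mole_percent_hf(measured_absorbance, total_pressure_mm):
    """Convert a measured absorbance to mole percent HF at a known total pressure."""
    partial_pressure = (measured_absorbance - intercept) / slope
    return 100.0 * partial_pressure / total_pressure_mm

print(mole_percent_hf(0.25, total_pressure_mm=300.0))
```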

  7. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  8. A Quantitative Analytical Method to Test for Salt Effects on Giant Unilamellar Vesicles

    DEFF Research Database (Denmark)

    Hadorn, Maik; Bönzli, Eva; Eggenberger Hotz, Peter

    2011-01-01

    preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes....

  9. A CT-based method for fully quantitative Tl SPECT

    International Nuclear Information System (INIS)

    Willowson, Kathy; Bailey, Dale; Baldock, Clive

    2009-01-01

    Full text: Objectives: To develop and validate a method for quantitative 201Tl SPECT based on corrections derived from X-ray CT data, and to apply the method in the clinic for quantitative determination of recurrence of brain tumours. Method: A previously developed method for achieving quantitative SPECT with 99mTc based on corrections derived from X-ray CT data was extended to apply to 201Tl. Experimental validation was performed on a cylindrical phantom by comparing known injected activity and measured concentration to quantitative calculations. Further evaluation was performed on an RSI Striatal Brain Phantom containing three 'lesions' with activity-to-background ratios of 1:1, 1.5:1 and 2:1. The method was subsequently applied to a series of scans from patients with suspected recurrence of brain tumours (principally glioma) to determine an SUV-like measure (Standardised Uptake Value). Results: The total activity and concentration in the phantom were calculated to within 3% and 1% of the true values, respectively. The calculated values for the concentration of activity in the background and corresponding lesions of the brain phantom (in increasing ratios) were found to be within 2%, 10%, 1% and 2%, respectively, of the true concentrations. Patient studies showed that an initial SUV greater than 1.5 corresponded to a 56% mortality rate in the first 12 months, as opposed to a 14% mortality rate for those with an SUV less than 1.5. Conclusion: The quantitative technique produces accurate results for the radionuclide 201Tl. Initial investigation in clinical brain SPECT suggests a correlation between quantitative uptake and survival.
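    The SUV-like measure is not defined in detail in the abstract; the minimal sketch below shows one common way such an index is computed, namely tissue concentration normalised to injected activity per gram of body weight, assuming a tissue density of about 1 g/mL. Function name and values are illustrative only.

```python
def suv_like(tissue_conc_kbq_per_ml, injected_mbq, body_weight_kg):
    """SUV-like index: tissue concentration divided by injected activity per gram."""
    injected_kbq = injected_mbq * 1000.0        # MBq -> kBq
    body_weight_g = body_weight_kg * 1000.0     # kg -> g (approx. 1 g per mL of tissue)
    return tissue_conc_kbq_per_ml / (injected_kbq / body_weight_g)

# A value above 1.5 would be flagged as high uptake, mirroring the threshold above.
print(suv_like(5.2, injected_mbq=110.0, body_weight_kg=70.0))
```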

  10. Development of method quantitative content of dihydroquercetin. Report 1

    Directory of Open Access Journals (Sweden)

    Олександр Юрійович Владимиров

    2016-01-01

    Full Text Available Scientific interest in the study of flavonoids in plant materials is increasing markedly due to their high biological activity. In this regard, an urgent task of analytical chemistry is the development of accessible analytical techniques for the determination of flavonoids in plant materials. Aim. The aim was to develop a specific technique for the quantitative determination of dihydroquercetin and to determine its validation characteristics. Methods. A technique for the photocolorimetric quantification of DQW was elaborated, based on the specific reaction of cyanidin chloride formation when zinc powder is added to a dihydroquercetin solution in an acidic medium. Results. A photocolorimetric technique for determining flavonoids, calculated as DQW, has been developed and its basic validation characteristics have been determined. The metrological characteristics obtained for the photocolorimetric technique did not exceed the admissibility criteria set by the requirements of the State Pharmacopoeia of Ukraine. Conclusions. The results of the statistical analysis of the experimental data show that the developed technique can be used for the quantification of DQW. The metrological data indicate that, when the method was reproduced in two different laboratories, the assay value at a 95 % confidence level was 101.85 ± 2.54 %.

  11. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition used a commercially available whole slide scanner and an image-based quantitation algorithm...

  12. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review ☆

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  13. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described and crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions.
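    In the reference-intensity-ratio form of the internal standard method, the weight fraction of each phase is proportional to its measured peak intensity divided by its reference intensity ratio. A minimal sketch follows, with placeholder phases, intensities and ratios rather than the thesis data.

```python
import numpy as np

# Illustrative reference intensity ratios (I/Icor) and measured peak intensities
# for common stone phases; the values are placeholders, not the thesis data.
phases = ["whewellite", "weddellite", "apatite", "struvite", "uric acid"]
rir = np.array([1.2, 1.0, 1.1, 0.9, 1.4])
intensity = np.array([850.0, 120.0, 300.0, 0.0, 60.0])

# RIR (matrix-flushing) estimate: weight fractions proportional to I_i / RIR_i.
ratios = intensity / rir
weight_fraction = ratios / ratios.sum()
for name, w in zip(phases, weight_fraction):
    print(f"{name}: {100 * w:.1f} wt%")
```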

  14. qualitative and quantitative methods of suicide research in old age

    African Journals Online (AJOL)

    concludes that an integration of both the qualitative and quantitative research approaches may provide a better platform for unraveling the complex phenomenon of suicide in the elderly. Keywords: Psychological autopsy, Suicidal behaviours, Thematic analysis, Phenomenology the population, or a systematically generated ...

  15. Quantitative methods for stochastic high frequency spatio-temporal and non-linear analysis: Assessing health effects of exposure to extreme ambient temperature

    Science.gov (United States)

    Liss, Alexander

    Extreme weather events, such as heat waves and cold spells, cause substantial excess mortality and morbidity in the vulnerable elderly population, and cost billions of dollars. The accurate and reliable assessment of adverse effects of extreme weather events on human health is crucial for environmental scientists, economists, and public health officials to ensure proper protection of vulnerable populations and efficient allocation of scarce resources. However, the methodology for the analysis of large national databases is yet to be developed. The overarching objective of this dissertation is to examine the effect of extreme weather on the elderly population of the Conterminous US (ConUS) with respect to seasonality in temperature in different climatic regions by utilizing heterogeneous high frequency and spatio-temporal resolution data. To achieve these goals the author: 1) incorporated dissimilar stochastic high frequency big data streams and distinct data types into the integrated data base for use in analytical and decision support frameworks; 2) created an automated climate regionalization system based on remote sensing and machine learning to define climate regions for the Conterminous US; 3) systematically surveyed the current state of the art and identified existing gaps in the scientific knowledge; 4) assessed the dose-response relationship of exposure to temperature extremes on human health in relatively homogeneous climate regions using different statistical models, such as parametric and non-parametric, contemporaneous and asynchronous, applied to the same data; 5) assessed seasonal peak timing and synchronization delay of the exposure and the disease within the framework of contemporaneous high frequency harmonic time series analysis and modification of the effect by the regional climate; 6) modeled using hyperbolic functional form non-linear properties of the effect of exposure to extreme temperature on human health. The proposed climate

  16. Quantitative analysis of deuterium by gas chromatography

    International Nuclear Information System (INIS)

    Isomura, Shohei; Kaetsu, Hayato

    1981-01-01

    An analytical method for the determination of deuterium concentration in water and hydrogen gas by gas chromatography is described. HD and D2 in a hydrogen gas sample were separated from H2 by a column packed with Molecular Sieve 13X, using extra pure hydrogen gas as carrier. A thermal conductivity detector was used. Deuterium concentrations were determined by comparison with standard samples. The error inherent to the present method was less than 1% on the basis of the calibration curves obtained with the standard samples. The average time required for the analysis was about 3 minutes. (author)

  17. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors defined goals and subgoals and justified the evaluation criteria which, if achieved, lead to increased company value.

  18. DREAM: a method for semi-quantitative dermal exposure assessment

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Brouwer, D.H.; Kromhout, H.; Hemmen, J.J. van

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others,

  19. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
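    The hierarchical weighted average itself is not spelled out in the abstract; the sketch below shows one plausible reading, in which criteria are grouped, each group carries a weight and each criterion a weight within its group. All names, weights and scores are hypothetical.

```python
# Hypothetical alternatives scored on criteria grouped under weighted headings.
alternatives = {
    "design_A": {"performance": {"throughput": 8, "latency": 6}, "cost": {"hardware": 5, "maintenance": 7}},
    "design_B": {"performance": {"throughput": 6, "latency": 9}, "cost": {"hardware": 8, "maintenance": 6}},
}
group_weights = {"performance": 0.6, "cost": 0.4}
criterion_weights = {
    "performance": {"throughput": 0.5, "latency": 0.5},
    "cost": {"hardware": 0.7, "maintenance": 0.3},
}

def hierarchical_weighted_average(scores):
    # Weighted average within each group, then a weighted average across groups.
    total = 0.0
    for group, g_weight in group_weights.items():
        group_score = sum(criterion_weights[group][c] * scores[group][c]
                          for c in criterion_weights[group])
        total += g_weight * group_score
    return total

for name, scores in alternatives.items():
    print(name, hierarchical_weighted_average(scores))
```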

  20. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  1. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established earlier in an extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of nuclear reactions induced by the incident protons on the target's light elements were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results for chlorine are also presented, supporting the reliability of this PIGE method for thin film analysis.
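    A simplified picture of the yield integral such a code evaluates: the gamma-ray yield per target atom is proportional to the excitation function divided by the stopping power, integrated over the proton energy lost in the sample. The cross-section and stopping-power functions below are toy stand-ins, not the calibration data used in the paper.

```python
import numpy as np

def sigma(E_keV):
    # Illustrative excitation function (arbitrary units), peaked near 2 MeV.
    return np.exp(-((E_keV - 2000.0) / 300.0) ** 2)

def stopping_power(E_keV):
    # Illustrative stopping power (arbitrary units), decreasing with energy.
    return 50.0 / np.sqrt(E_keV)

# Integrate sigma(E)/S(E) from the exit energy up to the incident proton energy.
E = np.linspace(500.0, 2400.0, 2000)
dE = E[1] - E[0]
yield_per_atom = np.sum(sigma(E) / stopping_power(E)) * dE

# The concentration would follow from the ratio of measured to computed yield once
# detector efficiency and collected charge are folded in (omitted here).
print(yield_per_atom)
```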

  2. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with their budgeting process, and chief executive officers want more frequent updates. In these cases, the main problem lies in the costs versus the benefits. Companies seek simple and cheap forecasting methods without, at the same time, compromising the quality of the resulting information. This study aims to compare operating cost forecasting models to identify the ones that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that multivariate models fitted the data better and proved a better way to forecast costs than univariate models.
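    A minimal sketch of the univariate ARIMA part of such a comparison, assuming statsmodels is available; the cost series is synthetic and the (p, d, q) order is chosen purely for illustration, not taken from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly operating-cost series (arbitrary monetary units).
rng = pd.date_range("2015-01-01", periods=48, freq="MS")
costs = pd.Series(100 + np.cumsum(np.random.normal(0.5, 2.0, 48)), index=rng)

model = ARIMA(costs, order=(1, 1, 1))   # (p, d, q) chosen for illustration only
fitted = model.fit()
forecast = fitted.forecast(steps=6)      # six-month-ahead cost forecast
print(forecast)
```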

  3. Validation of an enhanced knowledge-based method for segmentation and quantitative analysis of intrathoracic airway trees from three-dimensional CT images

    International Nuclear Information System (INIS)

    Sonka, M.; Park, W.; Hoffman, E.A.

    1995-01-01

    Accurate assessment of airway physiology, evaluated in terms of geometric changes, is critically dependent upon the accurate imaging and image segmentation of the three-dimensional airway tree structure. The authors have previously reported a knowledge-based method for three-dimensional airway tree segmentation from high resolution CT (HRCT) images. Here, they report a substantially improved version of the method. In the current implementation, the method consists of several stages. First, the lung borders are automatically determined in the three-dimensional set of HRCT data. The primary airway tree is semi-automatically identified. In the next stage, potential airways are determined in individual CT slices using a rule-based system that uses contextual information and a priori knowledge about pulmonary anatomy. Using three-dimensional connectivity properties of the pulmonary airway tree, the three-dimensional tree is constructed from the set of adjacent slices. The method's performance and accuracy were assessed in five 3D HRCT canine images. Computer-identified airways matched 226/258 observer-defined airways (87.6%); the computer method failed to detect the airways in the remaining 32 locations. By visual assessment of rendered airway trees, the experienced observers judged the computer-detected airway trees as highly realistic

  4. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    Science.gov (United States)

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  5. A solid-phase microextraction GC/MS/MS method for rapid quantitative analysis of food and beverages for the presence of legally restricted biologically active flavorings.

    Science.gov (United States)

    Bousova, Katerina; Mittendorf, Klaus; Senyuva, Hamide

    2011-01-01

    A method was developed using automated headspace solid-phase microextraction coupled with GC/MS/MS to simultaneously determine the presence of seven biologically active flavoring substances whose levels of use in processed foods are controlled by statutory limits. The method can be applied to identify and quantify the presence of 1,2-benzopyrone (coumarin), beta-asarone, 1-allyl-4-methoxybenzene (estragole), menthofuran, 4-allyl-1,2-dimethoxybenzene (methyl eugenol), pulegone, and thujone at levels ranging from 0.5 to 3000 mg/kg. The method has been optimized and validated for three different generic food types categorized on the basis of composition and anticipated use levels of flavorings and food ingredients. The food categories are alcoholic and nonalcoholic beverages; semisolid processed foods (e.g., soups, sauces, confectionery, etc.); and solid foods (muesli, bakery products, etc.). The method is simple, inexpensive, and rapid, and eliminates the use of flammable and toxic solvents. There is no sample preparation, and using MS/MS, unequivocal confirmation of identification is achieved even in highly complex matrixes containing many potentially interfering volatiles. The method precision for spiked samples ranged from 2 to 21%, with the greatest variability associated with solid matrixes. The LOD and LOQ values were well below 0.1 and 0.5 mg/kg, respectively, in all cases for individual substances, fulfilling requirements for enforcement purposes. The robustness of the method was demonstrated in a small survey of retail samples of four spirits, five flavored milks, three energy drinks, five liqueurs, five soups, 10 sauces, five herbal teas, and three breakfast cereals.
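    LOD and LOQ figures of this kind are commonly estimated from the residual standard deviation and slope of a spiked calibration line (LOD ≈ 3.3·σ/S, LOQ ≈ 10·σ/S). A hedged sketch with placeholder calibration data, not the published values:

```python
import numpy as np

# Illustrative spiked-calibration data (concentration in mg/kg vs. detector response).
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
response = np.array([210.0, 1050.0, 2100.0, 10400.0, 21100.0, 104000.0])

slope, intercept = np.polyfit(conc, response, 1)
residuals = response - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # residual standard deviation of the fit

lod = 3.3 * sigma / slope                # common ICH-style estimates
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.3f} mg/kg, LOQ ~ {loq:.3f} mg/kg")
```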

  6. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method to clarify the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure, the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd) and a digital autoradiographic image is obtained. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC and radioactive concentrations in liquids such as blood are also discussed. (author)

  7. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Full Text Available Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means of data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  8. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  9. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon in plutonium standard is described, then the content of this substance is determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. Owing to the presence of this standard we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and finally method of cleaning plutonium samples [fr

  10. Development of quantitative analysis method for mRNA in Mycobacterium leprae and slow-growing acid-fast bacteria using radioisotope

    International Nuclear Information System (INIS)

    Nakanaga, Kazue; Maeda, Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko

    2000-01-01

    Since an RNase protection assay (RPA) system for the specific detection of mRNA from M. leprae was established in the previous year, modification of the system was attempted in this study to detect trace amounts of mRNA. Thus, RNA amplification was examined using the nucleic acid sequence-based amplification method (NASBA). Since 32P-CTP was used as the isotope for synthesis of the anti-sense RNA probe in the previous method, the label was exchanged for one with a lower energy in this study, with the result that the half-life of the probe was increased and handling of the probe became easier. Several short bands of 100-130 b were detected in total RNA samples of M. marinum and M. chelonae by RPA using the T1 probe (194-762, 580 b), whereas the new M1 probe detected longer bands of about 350 b from M. marinum RNA and of 250 b from M. chelonae, M. bovis BCG and M. kansasii. However, the T1 probe was more suitable than the M1 probe for the specific detection of M. leprae hsp65, because regions of high and low homology coexist in the gene. Specific mRNA was detectable from only 3 pg of total RNA by the use of NASBA. RNA recovery with QIAGEN was about 50%; however, the sensitivity of the NASBA method was estimated to be several tens to hundreds of thousands of times higher, suggesting that this method is very effective for the detection and determination of trace amounts of mRNA. (M.N.)

  11. Development of quantitative analysis method for mRNA in Mycobacterium leprae and slow-growing acid-fast bacteria using radioisotope

    Energy Technology Data Exchange (ETDEWEB)

    Nakanaga, Kazue; Maeda, Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko [National Inst. of Infectious Deseases, Tokyo (Japan)

    2000-02-01

    Since an RNase protection assay (RPA) system for the specific detection of mRNA from M. leprae was established in the previous year, modification of the system was attempted in this study to detect trace amounts of mRNA. Thus, RNA amplification was examined using the nucleic acid sequence-based amplification method (NASBA). Since 32P-CTP was used as the isotope for synthesis of the anti-sense RNA probe in the previous method, the label was exchanged for one with a lower energy in this study, with the result that the half-life of the probe was increased and handling of the probe became easier. Several short bands of 100-130 b were detected in total RNA samples of M. marinum and M. chelonae by RPA using the T1 probe (194-762, 580 b), whereas the new M1 probe detected longer bands of about 350 b from M. marinum RNA and of 250 b from M. chelonae, M. bovis BCG and M. kansasii. However, the T1 probe was more suitable than the M1 probe for the specific detection of M. leprae hsp65, because regions of high and low homology coexist in the gene. Specific mRNA was detectable from only 3 pg of total RNA by the use of NASBA. RNA recovery with QIAGEN was about 50%; however, the sensitivity of the NASBA method was estimated to be several tens to hundreds of thousands of times higher, suggesting that this method is very effective for the detection and determination of trace amounts of mRNA. (M.N.)

  12. A simple LC-MS/MS method for quantitative analysis of underivatized neurotransmitters in rat urine: assay development, validation and application in the CUMS rat model.

    Science.gov (United States)

    Zhai, Xue-jia; Chen, Fen; Zhu, Chao-ran; Lu, Yong-ning

    2015-11-01

    Many amino acid neurotransmitters in urine are associated with chronic stress as well as major depressive disorders. To better understand depression, an analytical LC-MS/MS method was developed for the simultaneous determination of 11 underivatized neurotransmitters (4-aminohippurate, 5-HIAA, glutamate, glutamine, hippurate, pimelate, proline, tryptophan, tyramine, tyrosine and valine) in a single analytical run. The advantage of this method is the simple preparation, in that there is no need to deconjugate the urine samples. The quantification range was 25-12,800 ng/mL with >85.8% recovery for all analytes. The nocturnal urine concentrations of the 11 neurotransmitters in chronic unpredictable mild stress (CUMS) model rats and a control group (n = 12) were analyzed. A series of significant changes in the urinary excretion of neurotransmitters could be detected: the urinary glutamate, glutamine, hippurate and tyramine concentrations were significantly lower in the CUMS group. In addition, the urinary concentrations of tryptophan as well as tyrosine were significantly higher in chronically stressed rats. This method allows the assessment of the neurotransmitters associated with CUMS in rat urine in a single analytical run, making it suitable for implementation as a routine technique in depression research.

  13. Liver volume, intrahepatic fat and body weight in the course of a lifestyle interventional study. Analysis with quantitative MR-based methods

    International Nuclear Information System (INIS)

    Bongers, M.N.; Stefan, N.; Fritsche, A.; Haering, H.U.; Nikolaou, K.; Schick, F.; Machann, J.

    2015-01-01

    The aim of this study was to investigate potential associations between changes in liver volume, the amount of intrahepatic lipids (IHL) and body weight during lifestyle interventions. In a prospective study, 150 patients with an increased risk of developing type 2 diabetes mellitus were included who followed a caloric restriction diet for 6 months. In the retrospective analysis, 18 women and 9 men (age range 22-71 years) with an average body mass index (BMI) of 32 kg/m2 were enrolled. The liver volume was determined at the beginning and after 6 months by three-dimensional magnetic resonance imaging (3D-MRI, gradient echo, opposed-phase) and IHL were quantified by volume-selective MR spectroscopy in single-voxel stimulated echo acquisition mode (STEAM). Univariable and multivariable correlation analyses between changes in liver volume (Δliver volume), intrahepatic lipids (ΔIHL) and body weight (ΔBW) were performed. Univariable correlation analysis in the whole study cohort showed associations between ΔIHL and ΔBW (r = 0.69; p < 0.0001), ΔIHL and Δliver volume (r = 0.66; p = 0.0002) as well as ΔBW and Δliver volume (r = 0.5; p = 0.0073). Multivariable correlation analysis revealed that changes in liver volume are primarily determined by changes in IHL, independent of changes in body weight (β = 0.0272; 95 % CI: 0.0155-0.034; p < 0.0001). Changes in liver volume during lifestyle interventions are thus primarily determined by changes in IHL, independent of changes in body weight. These results show the reversibility of augmented liver volume in steatosis if IHL can be reduced during lifestyle interventions. (orig.)
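    A minimal sketch of the multivariable step, assuming statsmodels: regress Δliver volume on ΔIHL and ΔBW and read off the β for ΔIHL with body weight held constant. The data below are synthetic stand-ins for the 27 paired measurements, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the 27 paired before/after changes.
rng = np.random.default_rng(0)
d_ihl = rng.normal(-5.0, 3.0, 27)                                  # change in IHL (%)
d_bw = rng.normal(-4.0, 2.0, 27)                                   # change in body weight (kg)
d_liver = 0.027 * d_ihl + 0.005 * d_bw + rng.normal(0, 0.02, 27)   # change in liver volume (L)

X = sm.add_constant(pd.DataFrame({"d_IHL": d_ihl, "d_BW": d_bw}))
model = sm.OLS(d_liver, X).fit()
print(model.params)        # beta for d_IHL with d_BW held constant
print(model.conf_int())    # 95 % confidence intervals
```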

  14. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. Conventional Fourier-transform-based methods used for post-processing, however, unscramble the frequencies with limited frequency resolution and therefore contribute to a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which further improves the depth resolution and allows the finest subsurface features to be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
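    The spectral zooming can be pictured as evaluating the discrete Fourier transform on a dense grid restricted to a narrow band; the chirp z-transform is a fast way of doing exactly that. The sketch below uses a direct (slow) evaluation on a toy signal, so it only illustrates the idea, not the authors' processing chain.

```python
import numpy as np

def zoom_dft(x, fs, f_lo, f_hi, n_bins=512):
    """Evaluate the DFT of x on a dense frequency grid restricted to [f_lo, f_hi] Hz.
    A direct stand-in for the spectral zooming a chirp z-transform provides."""
    n = np.arange(len(x))
    freqs = np.linspace(f_lo, f_hi, n_bins)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
    return freqs, kernel @ x

# Toy thermal response containing two closely spaced frequency components.
fs = 50.0
t = np.arange(0, 20, 1 / fs)
signal = np.cos(2 * np.pi * 0.48 * t) + 0.7 * np.cos(2 * np.pi * 0.52 * t)

freqs, spectrum = zoom_dft(signal, fs, 0.3, 0.7)
print(freqs[np.argmax(np.abs(spectrum))])   # dominant frequency located on the zoomed grid
```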

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
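    A dose/distance-to-agreement comparison of this kind is usually expressed as a gamma index, where a point passes if gamma ≤ 1 for, e.g., 3 %/3 mm. A simplified one-dimensional sketch with toy profiles follows; the paper's analysis parameter may differ in detail.

```python
import numpy as np

def gamma_index_1d(ref_dose, eval_dose, positions_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified 1D global gamma index (e.g. 3 %/3 mm); a point passes if gamma <= 1."""
    d_norm = dose_tol * ref_dose.max()          # global dose normalisation
    gamma = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions_mm, ref_dose)):
        dist = (positions_mm - x_r) / dist_tol_mm
        dose_diff = (eval_dose - d_r) / d_norm
        gamma[i] = np.sqrt(dist ** 2 + dose_diff ** 2).min()
    return gamma

x = np.linspace(-50, 50, 201)
reference = np.exp(-(x / 25.0) ** 2)            # toy measured profile
evaluated = np.exp(-((x - 1.0) / 25.0) ** 2)    # toy calculated profile, shifted by 1 mm
g = gamma_index_1d(reference, evaluated, x)
print(f"pass rate: {100 * (g <= 1).mean():.1f} %")
```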

  16. A simple and robust quantitative analysis of retinol and retinyl palmitate using a liquid chromatographic isocratic method

    Directory of Open Access Journals (Sweden)

    Satoshi Yokota

    2018-04-01

    Full Text Available Vitamin A is a vital nutritional substance that regulates biological activities including development, but is also associated with disease onset. The extent of vitamin A intake influences the retinoid content of the liver, the most important organ for the storage of vitamin A. Measurement of endogenous retinoids in biological samples is important for understanding retinoid homeostasis. Here we present a reliable, highly sensitive, and robust method for the quantification of retinol and retinyl palmitate using a reverse-phase HPLC/UV isocratic method. We determined the impact of chronic dietary vitamin A on retinoid levels in the livers of mice fed an AIN-93G semi-purified diet (4 IU/g) compared with an excess vitamin A diet (1000 IU/g) over the period from birth to 10 weeks of age. Coefficients of variation for intra-assays for both retinoids were less than 5%, suggesting a higher reproducibility than any other HPLC/UV gradient method. Limits of detection and quantification for retinol were 0.08 pmol and 0.27 pmol, respectively, which are remarkably higher than previous results. Supplementation with higher doses of vitamin A over the study period significantly increased liver retinol and retinyl palmitate concentrations in adult mice. The assays described here provide a sensitive and rigorous quantification of endogenous retinol and retinyl palmitate, which can be used to help determine retinoid homeostasis in disease states such as toxic hepatitis and liver cancer. Keywords: Vitamin A excess, Retinol, Retinyl palmitate, Liver

  17. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By applying Fourier analysis, additional criteria can be obtained for evaluating the volume curve as a whole; the entire information contained in the volume curve is then completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal method of smoothing because it converges to the minimum quadratic error for the type of function concerned. (orig./MG)
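    A minimal sketch of Fourier smoothing of a left-ventricular time-activity curve: keep the first few harmonics of the volume curve and resynthesise it by the inverse transform. The curve below is synthetic, and the number of harmonics kept is an illustrative choice.

```python
import numpy as np

# Toy left-ventricular time-activity curve over one cardiac cycle, with counting noise.
n_frames = 32
t = np.arange(n_frames)
volume = 100 - 35 * np.sin(np.pi * t / n_frames) ** 2 + np.random.normal(0, 3, n_frames)

# Fourier smoothing: keep the DC term plus the first few harmonics, zero the rest,
# then resynthesise the curve by the inverse transform.
n_harmonics = 4
spectrum = np.fft.rfft(volume)
spectrum[n_harmonics + 1:] = 0.0
smoothed = np.fft.irfft(spectrum, n=n_frames)

print(np.round(smoothed, 1))
```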

  18. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards

  19. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. The program uses the Lorentzian curve as the theoretical line shape of NMR spectra; the coefficients of the Lorentzians are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of the quantitative analysis of trace components has been improved by two significant figures. The program was applied to determining the abundance of 13C and the degree of saponification of PVA.
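    A hedged sketch of the core idea, fitting a sum of Lorentzian lines to a noisy spectrum by least squares (here with scipy's curve_fit rather than the original program); peak positions, widths and amplitudes are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, width):
    return amplitude * width**2 / ((x - center) ** 2 + width**2)

def two_lorentzians(x, a1, c1, w1, a2, c2, w2):
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

# Synthetic spectrum: a major peak overlapping a small trace-component peak.
x = np.linspace(0, 10, 500)
truth = two_lorentzians(x, 1.0, 4.8, 0.3, 0.05, 5.6, 0.3)
observed = truth + np.random.normal(0, 0.005, x.size)

p0 = [1.0, 4.7, 0.3, 0.1, 5.5, 0.3]   # rough initial guesses
params, _ = curve_fit(two_lorentzians, x, observed, p0=p0)
print(params)   # fitted amplitudes give the relative abundance of the trace component
```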

  20. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Full Text Available Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application. Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives and also enumerates metabolites, and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography

  1. Quantitative numerical method for analysing slip traces observed by AFM

    International Nuclear Information System (INIS)

    Veselý, J; Cieslar, M; Coupeau, C; Bonneville, J

    2013-01-01

    Atomic force microscopy (AFM) is used more and more routinely to study, at the nanometre scale, the slip traces produced on the surface of deformed crystalline materials. Taking full advantage of the quantitative height data of the slip traces that can be extracted from these observations requires, however, adequate and robust processing of the images. In this paper an original method is presented which allows AFM scan-lines to be fitted with a specific parameterized step function without any averaging treatment of the original data. This yields a full quantitative description of the changes in step shape along the slip trace. The strength of the proposed method is established on several typical examples met in plasticity by analysing nano-scale structures formed on the sample surface by emerging dislocations. (paper)

  2. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  3. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor-analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given, based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
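    A minimal PLS calibration sketch using scikit-learn, with a synthetic spectral data set standing in for the phosphosilicate-glass or blood spectra; it shows the kind of cross-validated prediction error one would typically report.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic calibration set: 40 spectra of 200 wavelengths whose absorbances depend
# linearly on an analyte concentration plus noise.
rng = np.random.default_rng(1)
concentration = rng.uniform(0.5, 5.0, 40)
pure_spectrum = np.exp(-((np.linspace(0, 1, 200) - 0.4) / 0.05) ** 2)
spectra = np.outer(concentration, pure_spectrum) + rng.normal(0, 0.02, (40, 200))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
print(f"RMSECV: {rmsecv:.3f}")
```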

  4. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  5. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  6. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  7. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, out of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance with respect to the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low values for vegetative growth duration and moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  8. Quantitative analysis of aortic regurgitation: real-time 3-dimensional and 2-dimensional color Doppler echocardiographic method--a clinical and a chronic animal study

    Science.gov (United States)

    Shiota, Takahiro; Jones, Michael; Tsujino, Hiroyuki; Qin, Jian Xin; Zetts, Arthur D.; Greenberg, Neil L.; Cardon, Lisa A.; Panza, Julio A.; Thomas, James D.

    2002-01-01

    BACKGROUND: For evaluating patients with aortic regurgitation (AR), regurgitant volumes, left ventricular (LV) stroke volumes (SV), and absolute LV volumes are valuable indices. AIM: The aim of this study was to validate the combination of real-time 3-dimensional echocardiography (3DE) and semiautomated digital color Doppler cardiac flow measurement (ACM) for quantifying absolute LV volumes, LVSV, and AR volumes using an animal model of chronic AR and to investigate its clinical applicability. METHODS: In 8 sheep, a total of 26 hemodynamic states were obtained pharmacologically 20 weeks after the aortic valve noncoronary (n = 4) or right coronary (n = 4) leaflet was incised to produce AR. Reference standard LVSV and AR volume were determined using the electromagnetic flow method (EM). Simultaneous epicardial real-time 3DE studies were performed to obtain LV end-diastolic volumes (LVEDV), end-systolic volumes (LVESV), and LVSV by subtracting LVESV from LVEDV. Simultaneous ACM was performed to obtain LVSV and transmitral flows; AR volume was calculated by subtracting transmitral flow volume from LVSV. In a total of 19 patients with AR, real-time 3DE and ACM were used to obtain LVSVs and these were compared with each other. RESULTS: A strong relationship was found between LVSV derived from EM and those from the real-time 3DE (r = 0.93, P <.001, mean difference (3D - EM) = -1.0 +/- 9.8 mL). A good relationship between LVSV and AR volumes derived from EM and those by ACM was found (r = 0.88, P <.001). A good relationship between LVSV derived from real-time 3DE and that from ACM was observed (r = 0.73, P <.01, mean difference = 2.5 +/- 7.9 mL). In patients, a good relationship between LVSV obtained by real-time 3DE and ACM was found (r = 0.90, P <.001, mean difference = 0.6 +/- 9.8 mL). CONCLUSION: The combination of ACM and real-time 3DE for quantifying LV volumes, LVSV, and AR volumes was validated by the chronic animal study and was shown to be clinically applicable.

  9. Evaluation of the Possibility of Applying Spatial 3D Imaging Using X-Ray Computed Tomography Reconstruction Methods for Quantitative Analysis of Multiphase Materials / Rentgenowska Analiza Ilościowa Materiałów Wielofazowych Z Wykorzystaniem Przestrzennego Obrazowania (3D Przy Użyciu Metod Rekonstrukcji Tomografii Komputerowej

    Directory of Open Access Journals (Sweden)

    Matysik P.

    2015-12-01

    Full Text Available In this paper the possibility of using X-ray computed tomography (CT) in quantitative metallographic studies of homogeneous and composite materials is presented. Samples of spheroidal cast iron, an Fe-Ti powder mixture compact and an epoxy composite reinforced with glass fibers were subjected to comparative structural tests. The volume fractions of each of the phase structure components were determined by conventional methods, using scanning electron microscopy (SEM) and X-ray diffraction (XRD) quantitative analysis. These results were compared with those obtained by spatial analysis of the reconstructed CT image. Based on the comparative analysis, taking into account the selectivity of the data verification methods and the accuracy of the obtained results, the authors conclude that computed tomography is suitable for quantitative analysis of several types of structural materials.

  10. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  11. Application of bias correction methods to improve U{sub 3}Si{sub 2} sample preparation for quantitative analysis by WDXRF

    Energy Technology Data Exchange (ETDEWEB)

    Scapin, Marcos A.; Guilhen, Sabine N.; Azevedo, Luciana C. de; Cotrim, Marycel E.B.; Pires, Maria Ap. F., E-mail: mascapin@ipen.br, E-mail: snguilhen@ipen.br, E-mail: lvsantana@ipen.br, E-mail: mecotrim@ipen.br, E-mail: mapires@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The determination of silicon (Si), total uranium (U) and impurities in uranium silicide (U3Si2) samples by the wavelength dispersive X-ray fluorescence technique (WDXRF) has already been validated and is currently implemented at IPEN's X-Ray Fluorescence Laboratory (IPEN-CNEN/SP) in São Paulo, Brazil. Sample preparation requires the use of approximately 3 g of H3BO3 as sample holder and 1.8 g of U3Si2. However, because boron is a neutron absorber, this procedure precludes recovery of the U3Si2 sample, which, over time and considering routine analysis, may account for a significant amount of unusable uranium waste. An estimated average of 15 samples per month is expected to be analyzed by WDXRF, resulting in approximately 320 g of U3Si2 that would not return to the nuclear fuel cycle. This not only results in production losses, but creates another problem: radioactive waste management. The purpose of this paper is to present the mathematical models that may be applied for the correction of systematic errors when the H3BO3 sample holder is replaced by cellulose acetate [C6H7O2(OH)3-m(OOCCH3)m, m = 0-3], thus enabling recovery of the U3Si2 sample. The results demonstrate that the adopted mathematical model is statistically satisfactory, allowing the optimization of the procedure. (author)

  12. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  13. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview...

  14. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
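
    The abstract does not give closed forms for the three indices, so the sketch below uses common stand-ins (Pearson correlation for shape conformity, mean signed error for bias, root-mean-square error for magnitude of deviation) only to illustrate how such a match-quality score could be assembled; all data are made up.

```python
import numpy as np

def match_quality(historical, simulated):
    """Illustrative match-quality indices for one history-matching run.

    Generic stand-ins, not the indices defined in the paper:
    - shape conformity: Pearson correlation between the two curves
    - bias error: mean signed deviation (simulated - historical)
    - magnitude of deviation: root-mean-square error
    """
    h = np.asarray(historical, dtype=float)
    s = np.asarray(simulated, dtype=float)
    shape = np.corrcoef(h, s)[0, 1]
    bias = float(np.mean(s - h))
    rmse = float(np.sqrt(np.mean((s - h) ** 2)))
    return shape, bias, rmse

# Invented example: observed vs. simulated water cut over five report times
historical = [0.10, 0.15, 0.22, 0.30, 0.41]
simulated = [0.12, 0.16, 0.21, 0.33, 0.44]
print(match_quality(historical, simulated))
```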

  15. Simple PVT quantitative method of Kr under high pure N2 condition

    International Nuclear Information System (INIS)

    Li Xuesong; Zhang Zibin; Wei Guanyi; Chen Liyun; Zhai Lihua

    2005-01-01

    A simple PVT quantitative method for Kr in high-purity N2 was studied. The pressure, volume and temperature of the sample gas were measured by three individual methods to obtain the total amount of sample gas with good uncertainty. The Kr/N2 ratio could be measured with a GAM 400 quadrupole mass spectrometer, so the quantity of Kr could be calculated from the two measurements above. This method is also suited to the quantitative analysis of other simply composed noble gas samples in a high-purity carrier gas. (authors)
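
    A minimal sketch of the arithmetic described above, assuming ideal-gas behaviour: the total amount of gas follows from the measured P, V and T, and the Kr amount from the mass-spectrometric Kr/N2 ratio (the function name and all numbers are illustrative).

```python
R = 8.314  # gas constant, J mol^-1 K^-1

def kr_amount(pressure_pa, volume_m3, temperature_k, kr_to_n2_ratio):
    """Estimate the amount of Kr (mol) in a high-purity N2 carrier gas.

    n_total = PV/(RT) from the measured pressure, volume and temperature;
    the Kr mole fraction comes from the mass-spectrometric Kr/N2 ratio.
    """
    n_total = pressure_pa * volume_m3 / (R * temperature_k)
    x_kr = kr_to_n2_ratio / (1.0 + kr_to_n2_ratio)  # trace Kr in N2
    return n_total * x_kr

# Illustrative numbers only
print(kr_amount(pressure_pa=1.0e5, volume_m3=1.0e-4,
                temperature_k=295.0, kr_to_n2_ratio=1.0e-3))
```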

  16. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  17. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.
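
    As a small illustration of the correlation step reported above, Spearman's test can be run as follows (the numbers are invented, not the study's data).

```python
from scipy.stats import spearmanr

# Illustrative necrotic fractions: MRI 3D estimate vs. fluid-displacement
# measurement of the specimen (invented values).
mri_3d = [0.12, 0.25, 0.31, 0.40, 0.55, 0.62]
specimen = [0.10, 0.27, 0.30, 0.43, 0.52, 0.65]

rho, p_value = spearmanr(mri_3d, specimen)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```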

  18. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  19. Comparaison de diverses méthodes de dosage des argiles d'un sable de gisement. Dosage des argiles Comparison of Different Methods of Determining Clays in a Reservoir Sand. Quantitative Analysis of Clays

    Directory of Open Access Journals (Sweden)

    Yvon J.

    2006-11-01

    Full Text Available Oil, gas and geothermal reservoirs all contain clayey fractions, no matter how small they may be. These have been blamed whenever operating or producing problems arise. They may be revealed by phenomena of mechanical resistance, permeability or interfacial properties (ion exchange, adsorption, etc.). Attempts to understand such phenomena then go via the quantitative mineralogical analysis of the clays present. This analysis must also be looked at in terms of methods; it is subject to constraints of cost, instrumentation, competence or deadlines. This article proposes: (a) a so-called conventional route (Dejou et al., 1977) based on chemical and gravimetric analyses; (b) an overall assessment method of the clay phase by difference (determination of two non-clay species); (c) a method based on the statistical processing of microanalytical data obtained with an electron microprobe. The material examined was a quartzose arenite made up mainly of quartz, jarosite, orthoclase, plagioclases, calcite, dolomite, muscovite, kaolinite, illite, montmorillonite, interstratified illite-montmorillonite, iron oxyhydroxides and accessory minerals such as rutile, zircon, garnet, tourmaline and hydroxylapatite. The arenite was subjected to an ultrasonic treatment (Letelier, 1986) to recover pellicular or weakly cemented clays. After this treatment, all the free clays were found in the < 40 µm fraction, which was used for the measurements. The so-called conventional method is based on the association of multiple techniques that are normally used for analyzing clays. They include X-ray diffraction, DTA, TGA, selective dissolution, CEC, adsorption of various reagents and gravimetric separations. They have been reviewed by Dejou et al. (1977). The results they give depend on the grain size, crystallochemistry, presence of amorphous elements and especially the typical chemical compositions assigned to the

  20. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  1. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with quantitative X-ray fractographic investigation of fatigue fractures of samples with sharp notches tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic instability zones restraining and controlling crack growth. In fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect, analogous to the zone in which the crack growth rate is controlled by microshear mechanisms, is singled out. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. The possibility of determining one of the threshold values of the cyclic fracture toughness of the material from the results of fatigue testing and fractography of notched sample fractures is shown. A correlation has been found between the size of the h_s crack-effect zone in the notched sample, the material yield limit and the cyclic fracture toughness characteristic K_s. Such correlation widens the possibilities of quantitative diagnostics of fractures by the methods of X-ray fractography.

  2. Transportation forecasting : analysis and quantitative methods

    Science.gov (United States)

    1983-01-01

    This Record contains the following papers: Development of Survey Instruments Suitable for Determining Non-Home Activity Patterns; Sequential, History-Dependent Approach to Trip-Chaining Behavior; Identifying Time and History Dependencies of Activity ...

  3. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  4. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    Science.gov (United States)

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  5. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated from near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  6. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044), "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.

  7. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Full Text Available Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) participatory mapping; 2) quantitative interviews; 3) sex work venue field observation; 4) time-location-activity diaries; 5) in-depth interviews about daily activity spaces. We found that the mixed methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  8. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
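
    For the simplest case mentioned above, a chemical-shift titration, KD is often extracted by fitting the standard 1:1 (ligand-depletion) binding isotherm; the sketch below shows that textbook fit with hypothetical data, not the specific relaxation-based procedures reviewed in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def shift_change(l_tot, kd, d_max, p_tot=1.0e-4):
    """Chemical-shift change for 1:1 binding at fixed protein concentration.

    Standard ligand-depletion isotherm:
    d_obs = d_max * ((P + L + Kd) - sqrt((P + L + Kd)**2 - 4*P*L)) / (2*P)
    """
    s = p_tot + l_tot + kd
    return d_max * (s - np.sqrt(s**2 - 4.0 * p_tot * l_tot)) / (2.0 * p_tot)

# Hypothetical titration: total ligand (M) vs. observed shift change (ppm)
l_tot = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6]) * 1e-3
d_obs = np.array([0.0, 0.021, 0.037, 0.058, 0.077, 0.089, 0.095])

(kd, d_max), _ = curve_fit(shift_change, l_tot, d_obs, p0=[1e-4, 0.1])
print(f"Kd ≈ {kd * 1e6:.0f} µM, maximal shift change ≈ {d_max:.3f} ppm")
```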

  9. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to movement of objects, imaging may also introduce shifting between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method where an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point-sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching demonstrates a robust capability to detect association of closely located vesicles in live cell microscopy, where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show the association method to perform comparably to traditional methods in colocalization studies for fixed cells and to perform favorably in association studies for live cells.
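
    The paper's point-pattern matching algorithm is not reproduced here; as a deliberately simplified stand-in, the fraction of channel-A objects with a channel-B counterpart within a tolerance, after compensating a known shift, can be computed as below (all coordinates are invented).

```python
import numpy as np
from scipy.spatial import cKDTree

def association_fraction(points_a, points_b, tolerance, shift=(0.0, 0.0)):
    """Toy object-based association measure (not the paper's algorithm).

    After compensating a known channel shift, returns the fraction of
    objects in channel A that have a counterpart in channel B within
    `tolerance` pixels. Real point-pattern matching also estimates the
    transformation between the sets rather than assuming it.
    """
    a = np.asarray(points_a, float) + np.asarray(shift, float)
    dist, _ = cKDTree(np.asarray(points_b, float)).query(a, k=1)
    return float(np.mean(dist <= tolerance))

# Invented vesicle centroids (pixels) in two channels
ch1 = [(10, 12), (40, 41), (75, 20), (90, 88)]
ch2 = [(12, 13), (41, 43), (90, 86), (30, 70)]
print(association_fraction(ch1, ch2, tolerance=3.0, shift=(1.0, 1.0)))
```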

  10. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. Both methods can be applied in practice, and the choice between them depends on the actual basic data of the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  11. The Quantitative Analysis of a team game performance made by men basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of a team game performance made by men's basketball teams at the Olympic Games 2008. Aims: To find the reasons for the successes and failures of teams in the Olympic Games play-off using quantitative (numerical) observation of selected game statistics. Method: The thesis was based on quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained describe the most essential events behind a team's win or loss. Keywords: basketball, team...

  12. Moessbauer methods and spectrometers specialized in qualitative and quantitative determinations

    International Nuclear Information System (INIS)

    Bibicu, I.

    1981-01-01

    A portable field analyser devoted to fast qualitative and quantitative determination of cassiterite, the only tin compound of economic use in industry, was made. This analyser differs from other similar ones described in the literature in that pulses are accumulated only as long as the vibrator speed ensures a complete cancellation of the resonance condition. A NIM analyser was also manufactured which allows a qualitative determination of 2-3 iron compounds in a given sample, as well as of the total iron content. Working in transmission geometry and constant-velocity regime, this analyser combines the capabilities of a laboratory Moessbauer spectrometer with the swiftness and simplicity of a specialized Moessbauer spectrometer. The main factors that affect the qualitative and quantitative cassiterite determinations were analysed: sample structural composition, spectrometer calibration, and so on. Determination accuracy is similar to that reported in the literature. The sample grain size must be smaller than 0.1 mm for the qualitative and quantitative determination of iron compounds or total iron. Data accuracy is similar to that reported before, but obtained by means of area measurements. As a consequence of the previous results, we have looked for the existence of some new phases in industrial Fe-C and Fe-Si steels and for the appearance of some new compounds in samples subjected to corrosion. (author)

  13. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    The Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  14. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method which is well established in science and industry. This task is solved by means of vector mass spectrometry, in which a mass spectrum is repeatedly measured, but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze underdetermined mass spectra of complex hydrogen-helium isotope mixtures. In this work experimental investigations are presented which show that there are different parameters which are suitable for the UMS method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3 %. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this task, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1 %) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10 %. Finally, this work deals with the effects of the measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both in general mass-spectrometric gas analysis and in the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  16. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    Science.gov (United States)

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  18. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the time of cooling and of the irradiation time. The variation of the ratio ( 144 Ce + 144 Pr activity)/ 137 Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141 Ce, 144 Ce + 144 Pr, 103 Ru, 106 Ru + 106 Rh, 137 Cs, 95 Zr + 95 Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of the least squares was used for the quantitative analysis of the afore-mentioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr
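
    The least-squares step mentioned above can be illustrated schematically: the measured spectrum is expressed as a linear combination of single-nuclide reference spectra and the coefficients are solved for (a generic sketch, not the original 1962 procedure; all numbers are invented).

```python
import numpy as np

def decompose_spectrum(measured, reference_spectra):
    """Estimate nuclide contributions to a gamma spectrum by least squares.

    The measured spectrum is modelled as a linear combination of
    single-nuclide reference spectra (one per nuclide); the least-squares
    coefficients are proportional to the individual activities.
    """
    A = np.column_stack(reference_spectra)  # shape: channels x nuclides
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(measured, float), rcond=None)
    return coeffs

# Tiny illustrative example: five "channels", two reference spectra
ref_cs137 = np.array([0.0, 0.1, 0.8, 0.1, 0.0])
ref_ce144 = np.array([0.6, 0.3, 0.1, 0.0, 0.0])
measured = 2.0 * ref_cs137 + 1.5 * ref_ce144 + np.array([0.01, -0.02, 0.0, 0.01, 0.0])
print(decompose_spectrum(measured, [ref_cs137, ref_ce144]))  # ≈ [2.0, 1.5]
```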

  20. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
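
    The basic 1:1 allelic-ratio test mentioned above reduces to a binomial test on the reference/alternate read counts; the sketch below shows only that simplified test, not QuASAR's full model with base-call error, over-dispersion and genotype uncertainty.

```python
from scipy.stats import binomtest

def ase_test(ref_reads, alt_reads):
    """Binomial test of the 1:1 allelic-ratio null at one heterozygous site."""
    total = ref_reads + alt_reads
    result = binomtest(ref_reads, total, p=0.5)
    return ref_reads / total, result.pvalue

# Invented read counts at one site: allelic ratio and p-value
print(ase_test(ref_reads=72, alt_reads=40))
```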

  1. An improved fast neutron radiography quantitative measurement method

    International Nuclear Information System (INIS)

    Matsubayashi, Masahito; Hibiki, Takashi; Mishima, Kaichiro; Yoshii, Koji; Okamoto, Koji

    2004-01-01

    The validity of a fast neutron radiography quantification method, the Σ-scaling method, which was originally proposed for thermal neutron radiography, was examined with Monte Carlo calculations and experiments conducted at the YAYOI fast neutron source reactor. Water and copper were selected as comparative samples for a thermal neutron radiography case and a dense object, respectively. Although different characteristics of the effective macroscopic cross-sections were implied by the simulation, the Σ-scaled experimental results with the fission neutron spectrum cross-sections fitted the measurements well for both the water and copper samples. This indicates that the Σ-scaling method could be successfully adopted for quantitative measurements in fast neutron radiography.
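
    The record does not spell out the Σ-scaling formula itself; as a rough sketch in the same spirit, assuming simple Beer-Lambert attenuation with an effective macroscopic cross-section, the material thickness can be recovered from the transmitted intensity as follows (all values are illustrative).

```python
import numpy as np

def material_thickness(i_transmitted, i_open_beam, sigma_eff_per_cm):
    """Thickness (cm) from neutron attenuation, assuming Beer-Lambert behaviour.

    With an effective macroscopic cross-section Sigma_eff,
    I = I0 * exp(-Sigma_eff * t), so t = -ln(I / I0) / Sigma_eff.
    Scattering and spectrum corrections of the real method are ignored here.
    """
    return -np.log(i_transmitted / i_open_beam) / sigma_eff_per_cm

# Illustrative numbers; the cross-section value is made up
print(material_thickness(i_transmitted=420.0, i_open_beam=1000.0,
                         sigma_eff_per_cm=0.35))
```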

  2. A quantitative method for assessing resilience of interdependent infrastructures

    International Nuclear Information System (INIS)

    Nan, Cen; Sansavini, Giovanni

    2017-01-01

    The importance of understanding system resilience and identifying ways to enhance it, especially for interdependent infrastructures our daily life depends on, has been recognized not only by academics, but also by the corporate and public sectors. During recent years, several methods and frameworks have been proposed and developed to explore applicable techniques to assess and analyze system resilience in a comprehensive way. However, they are often tailored to specific disruptive hazards/events, or fail to properly include all the phases such as absorption, adaptation, and recovery. In this paper, a quantitative method for the assessment of the system resilience is proposed. The method consists of two components: an integrated metric for system resilience quantification and a hybrid modeling approach for representing the failure behavior of infrastructure systems. The feasibility and applicability of the proposed method are tested using an electric power supply system as the exemplary infrastructure. Simulation results highlight that the method proves effective in designing, engineering and improving the resilience of infrastructures. Finally, system resilience is proposed as a proxy to quantify the coupling strength between interdependent infrastructures. - Highlights: • A method for quantifying resilience of interdependent infrastructures is proposed. • It combines multi-layer hybrid modeling and a time-dependent resilience metric. • The feasibility of the proposed method is tested on the electric power supply system. • The method provides insights to decision-makers for strengthening system resilience. • Resilience capabilities can be used to engineer interdependencies between subsystems.
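
    The paper's integrated resilience metric is not reproduced in this record; one widely used time-integrated stand-in, shown here only for illustration, integrates the normalised performance over the observation window (the supply profile is invented).

```python
import numpy as np

def resilience(time_h, performance, nominal=1.0):
    """Time-integrated resilience: mean normalised performance over the window.

    Equals 1.0 when no service is lost; lower values reflect deeper or longer
    degradation during the absorption, adaptation and recovery phases.
    """
    t = np.asarray(time_h, float)
    p = np.asarray(performance, float) / nominal
    return float(np.trapz(p, t) / (t[-1] - t[0]))

# Invented supply profile: disruption at t = 2 h, full recovery by t = 8 h
t = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
p = [1.0, 1.0, 0.55, 0.55, 0.65, 0.75, 0.85, 0.95, 1.0, 1.0, 1.0]
print(resilience(t, p))
```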

  3. Quantitative Clinical Imaging Methods for Monitoring Intratumoral Evolution.

    Science.gov (United States)

    Kim, Joo Yeun; Gatenby, Robert A

    2017-01-01

    images in landscape ecology and, with appropriate application of Darwinian first principles and sophisticated image analytic methods, can be used to estimate regional variations in the molecular properties of cancer cells. We have initially examined this technique in glioblastoma, a malignant brain neoplasm which is morphologically complex and notorious for a fast progression from diagnosis to recurrence and death, making it a suitable subject for noninvasive, rapidly repeated assessment of intratumoral evolution. Quantitative imaging analysis of routine clinical MRIs from glioblastoma has identified macroscopic morphologic characteristics which correlate with proteogenomics and prognosis. The key to the accurate detection and forecasting of intratumoral evolution using quantitative imaging analysis is likely to be in the understanding of the synergistic interactions between observable intratumoral subregions and the resulting tumor behavior.

  4. A comparison of ancestral state reconstruction methods for quantitative characters.

    Science.gov (United States)

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-07

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Transportation and quantitative analysis of socio-economic development of relations

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators which can measure the development of transportation and of the socio-economy, and uses correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to analyze the relationship between them quantitatively, so that it can provide practical guidance for future national development planning.
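
    One of the standard quantities in this kind of analysis is the transport elasticity coefficient, the ratio of traffic growth to economic growth; a trivial sketch with invented figures:

```python
def transport_elasticity(traffic_growth_rate, gdp_growth_rate):
    """Transport elasticity coefficient: traffic growth relative to GDP growth.

    Values above 1 mean transport demand grows faster than the economy.
    """
    return traffic_growth_rate / gdp_growth_rate

# Invented figures: freight turnover grew 9.2% while GDP grew 7.5%
print(transport_elasticity(0.092, 0.075))  # ≈ 1.23
```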

  6. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim of intensively characterizing the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  7. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  8. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescent quantitative analysis was carried out. Standard samples were prepared from a mixture of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% of Fe2O3. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easy to prepare. Pressed samples of 0.150 g/cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be conveniently calculated from the formula; in an unknown sample, however, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the content of Fe2O3 in these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.
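
    A textbook emission-transmission formulation (not necessarily the exact expression used in this work) estimates the transmission of the analyte line through the specimen from three measurements and converts it into a multiplicative absorption correction factor, e.g.:

```python
import numpy as np

def absorption_correction(i_sample_plus_target, i_sample, i_target):
    """Emission-transmission style absorption correction factor (sketch).

    The transmission T of the analyte line through the specimen is estimated
    from a target measured behind the specimen, the specimen alone and the
    target alone; the multiplicative correction factor then follows from
    F = -ln(T) / (1 - T).
    """
    t = (i_sample_plus_target - i_sample) / i_target
    return -np.log(t) / (1.0 - t)

# Invented FeKα count rates
print(absorption_correction(i_sample_plus_target=1450.0, i_sample=900.0,
                            i_target=1000.0))
```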

  9. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods which can capture the characteristics of the eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  10. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). For a given DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  11. A quantitative method to measure and evaluate the peelability of shrimps (Pandalus borealis)

    DEFF Research Database (Denmark)

    Gringer, Nina; Dang, Tem Thi; Orlien, Vibeke

    2018-01-01

    A novel, standardized method has been developed in order to provide a quantitative description of shrimp peelability. The peeling process was based on the measure of the strength of the shell-muscle attachment of the shrimp using a texture analyzer, and calculated into the peeling work. The self-consistent method, insensitive to shrimp size, was proven valid for assessment of ice maturation of shrimps. The quantitative peeling efficiency (peeling work) and performance (degree of shell removal) showed that the decrease in peeling work correlated with the amount of satisfactorily peeled shrimps, indicating an effective weakening of the shell-muscle attachment. The developed method provides the industry with a quantitative analysis for measurement of peeling efficiency and peeling performance of shrimps. It may be used for comparing different maturation conditions in relation to optimization of shrimp peeling.
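
    The record does not give the authors' exact formula; a plausible reading, in which the peeling work is the area under the texture-analyser force-displacement trace, is sketched below with invented numbers.

```python
import numpy as np

def peeling_work_mj(displacement_mm, force_n):
    """Peeling work (mJ) as the area under the force-displacement trace.

    N x mm integrates directly to millijoules.
    """
    return float(np.trapz(force_n, displacement_mm))

# Invented texture-analyser trace for one shrimp
displacement = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]  # mm
force = [0.0, 1.2, 2.4, 2.1, 0.8, 0.1]         # N
print(peeling_work_mj(displacement, force))
```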

  12. Biological characteristics of crucian by quantitative inspection method

    Science.gov (United States)

    Chu, Mengqi

    2015-04-01

    Through the quantitative inspection method, the biological characteristics of crucian were preliminarily researched. Crucian belongs to Cypriniformes, Cyprinidae, Carassius auratus; it is a mainly plant-eating omnivorous fish that is gregarious and shows selection and ranking behaviour. Crucian are widely distributed, and perennial waters all over the country support production. Indicators of crucian were determined in the experiment to understand the growth and reproduction of crucian in this area. The measured data (such as scale length, scale size, annulus diameter and so on) and the related functions were used to calculate the growth of crucian in any one year. The egg shape, colour, weight, etc. were used to determine maturity, and the mean egg diameter per 20 eggs and the number of eggs per 0.5 grams were used to calculate the relative and absolute fecundity of the fish. The measured crucian were females at puberty. Based on the relation between scale diameter and body length, the linear relationship between crucian scale diameter and length is y = 1.530 + 3.0649x. From the data, fecundity is closely related to the increase of age: the older the fish, the more mature the gonad development and the greater the number of eggs. In addition, absolute fecundity increases with the pituitary gland. Through quantitative checking of the bait organisms ingested by crucian, the main foods, secondary foods and chance foods of crucian are revealed, and the degree of preference of crucian for various bait organisms is clarified. Fish fecundity increases with weight gain; it has the characteristics of species and populations, and is at the same time influenced by the age of the individual, body length, body weight, environmental conditions (especially nutrition conditions), breeding habits, spawning times and the size of the eggs. This series of studies of the biological characteristics of crucian provides an ecological basis for local crucian feeding and breeding.

  13. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis in which human factors are included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  14. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  15. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  16. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  17. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  18. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ...

  19. Quantitative analysis of airway abnormalities in CT

    DEFF Research Database (Denmark)

    Petersen, Jens; Lo, Pechin Chien Pau; Nielsen, Mads

    2010-01-01

    A coupled surface graph cut algorithm for airway wall segmentation from Computed Tomography (CT) images is presented. Using cost functions that highlight both inner and outer wall borders, the method combines the search for both borders into one graph cut. The proposed method is evaluated on 173 ...

  20. A novel dual energy method for enhanced quantitative computed tomography

    Science.gov (United States)

    Emami, A.; Ghadiri, H.; Rahmim, A.; Ay, M. R.

    2018-01-01

    Accurate assessment of bone mineral density (BMD) is critically important in clinical practice, and conveniently enabled via quantitative computed tomography (QCT). Meanwhile, dual-energy QCT (DEQCT) enables enhanced detection of small changes in BMD relative to single-energy QCT (SEQCT). In the present study, we aimed to investigate the accuracy of QCT methods, with particular emphasis on a new dual-energy approach, in comparison to single-energy and conventional dual-energy techniques. We used a sinogram-based analytical CT simulator to model the complete chain of CT data acquisitions, and assessed performance of SEQCT and different DEQCT techniques in quantification of BMD. We demonstrate a 120% reduction in error when using a proposed dual-energy Simultaneous Equation by Constrained Least-squares method, enabling more accurate bone mineral measurements.

  1. Two quantitative forecasting methods for macroeconomic indicators in Czech Republic

    Directory of Open Access Journals (Sweden)

    Mihaela BRATU (SIMIONESCU)

    2012-03-01

    Full Text Available Econometric modelling and exponential smoothing techniques are two quantitative forecasting methods with good results in practice, but the objective of the research was to find out which of the two techniques is better for short-run predictions. Therefore, for inflation, unemployment and the interest rate in the Czech Republic, accuracy indicators were calculated for the predictions based on these methods. Short-run forecasts over a horizon of 3 months were made for December 2011-February 2012, the econometric models being updated. For the Czech Republic, the exponential smoothing techniques provided more accurate forecasts than the econometric models (VAR(2) models, ARMA procedure and models with lagged variables). One explanation for the better performance of the smoothing techniques would be that in the chosen countries the short-run predictions are more influenced by the recent evolution of the indicators.
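    As an illustration of the comparison described in this record, the sketch below contrasts a simple exponential smoothing forecast with a lagged-variable (AR(1)) model on a synthetic monthly series; the data, the smoothing constant and the 3-month horizon are assumptions for demonstration only, not the study's Czech series or models.

```python
# Minimal sketch: comparing a smoothing forecast with a lagged-variable model
# on a synthetic monthly series. The data and alpha value are illustrative
# assumptions, not the study's actual Czech macroeconomic series.
import numpy as np

rng = np.random.default_rng(0)
series = 3.0 + np.cumsum(rng.normal(0, 0.2, 48))  # synthetic "inflation" series

train, test = series[:45], series[45:]            # 3-month forecast horizon

# Simple exponential smoothing, last smoothed level extended over the horizon
def ses_forecast(y, horizon, alpha=0.3):
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return np.full(horizon, level)

# Lagged-variable (AR(1)) model fitted by ordinary least squares
def ar1_forecast(y, horizon):
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    preds, last = [], y[-1]
    for _ in range(horizon):
        last = beta[0] + beta[1] * last
        preds.append(last)
    return np.array(preds)

def rmse(pred, actual):
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

print("SES   RMSE:", rmse(ses_forecast(train, len(test)), test))
print("AR(1) RMSE:", rmse(ar1_forecast(train, len(test)), test))
```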

  2. Implementing quantitative analysis and its complement

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Nelson, W.R.; Shepherd, J.C.

    1982-01-01

    This paper presents an application of risk analysis for the evaluation of nuclear reactor facility operation. Common cause failure analysis (CCFA) techniques to identify potential problem areas are discussed. Integration of CCFA and response trees, a particular form of the path sets of a success tree, to gain significant insight into the operation of the facility is also demonstrated. An example illustrating the development of the risk analysis methodology, development of the fault trees, generation of response trees, and evaluation of the CCFA is presented to explain the technique

  3. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity

  4. A direct method for estimating the alpha/beta ratio from quantitative dose-response data

    International Nuclear Information System (INIS)

    Stuschke, M.

    1989-01-01

    A one-step optimization method based on a least squares fit of the linear quadratic model to quantitative tissue response data after fractionated irradiation is proposed. Suitable end-points that can be analysed by this method are growth delay, host survival and quantitative biochemical or clinical laboratory data. The functional dependence between the transformed dose and the measured response is approximated by a polynomial. The method allows for the estimation of the alpha/beta ratio and its confidence limits from all observed responses of the different fractionation schedules. Censored data can be included in the analysis. A method to test the appropriateness of the fit is presented. A computer simulation illustrates the method and its accuracy as exemplified by the growth delay end-point. A comparison with a fit of the linear quadratic model to interpolated isoeffect doses shows the advantages of the direct method. (orig./HP) [de

  5. Validation of quantitative {sup 1}H NMR method for the analysis of pharmaceutical formulations; Validacao de metodo quantitativo por RMN de {sup 1}H para analises de formulacoes farmaceuticas

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maiara da S. [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Instituto de Quimica; Colnago, Luiz Alberto, E-mail: luiz.colnago@embrapa.br [Embrapa Instrumentacao, Sao Carlos, SP (Brazil)

    2013-09-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of {sup 1}H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)

  6. A novel semi-quantitative method for measuring tissue bleeding.

    Science.gov (United States)

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
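    The percentage calculation described above is straightforward to reproduce; the sketch below assumes the polygon areas have already been measured (the numbers are placeholders standing in for the AutoCAD "polyline" measurements).

```python
# Minimal sketch of the reported percentage calculation: the polygon areas
# would in practice come from the AutoCAD "polyline" measurements; the
# numbers below are placeholders.
sample_areas = [12.4, 11.8, 13.1]                       # A: total area of each tissue sample (mm^2)
bleeding_areas = [[0.8, 0.3], [1.1], [0.5, 0.4, 0.2]]   # B: bleeding regions per sample

A_total = sum(sample_areas)
B_total = sum(sum(regions) for regions in bleeding_areas)

N = B_total * 100 / A_total                             # percentage of tissue surface affected
print(f"Bleeding extent N = {N:.1f}%")
```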

  7. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  8. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed in informing policy makers who make decisions based on risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  9. Method for the quantitation of steroids in umbilical cord plasma

    International Nuclear Information System (INIS)

    Schindler, A.E.; Sparke, H.

    1975-01-01

    A method for simultaneous quantitation of nine steroids in cord plasma is described which consisted of Amberlite XAD-2 column chromatography at a constant temperature of 45 °C, enzyme hydrolysis with β-glucuronidase/aryl sulfatase, addition of five radioactive internal standards, ethyl acetate extraction, thin-layer chromatography and quantitation by gas-liquid chromatography after trimethylsilyl ether derivative formation. Reliability criteria were established and the following steroid concentrations found: progesterone, 132.1 ± 102.5 μg/100 ml; pregnenolone, 57.3 ± 45.7 μg/100 ml; dehydroepiandrosterone, 46.5 ± 29.4 μg/100 ml; pregnanediol, 67.5 ± 46.6 μg/100 ml; 16-ketoandrostenediol, 19.8 ± 13.7 μg/100 ml; 16α-hydroxydehydroepiandrosterone, 126.3 ± 86.9 μg/100 ml; 16α-hydroxypregnenolone, 78.2 ± 56.5 μg/100 ml; androstenetriol, 22.2 ± 17.5 μg/100 ml and oestriol, 127.7 ± 116.9 μg/100 ml. (author)

  10. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components as compared to the standard principal component analysis (PCA) with sparse loadings in conjunction with Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
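    The record mentions a modified sparse PCA combined with Hotelling T2 analysis for fault detection. The sketch below illustrates only the general idea, using standard PCA scores and an empirical T2 control limit on invented metric data; it is not the authors' modified SPCA procedure.

```python
# Hedged sketch: flagging an out-of-family scan with PCA scores and a Hotelling
# T^2 statistic. The paper describes a *modified sparse* PCA; plain PCA is used
# here purely for illustration, and the metric data are random.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
baseline = rng.normal(0, 1, (60, 10))        # 60 reference scans x 10 IQ metrics
suspect = baseline[0] + np.array([4, 0, 0, 0, 0, 0, 0, 0, 0, 0])  # drifted scan

pca = PCA(n_components=3).fit(baseline)

def hotelling_t2(x):
    scores = pca.transform(x.reshape(1, -1))[0]
    return float(np.sum(scores ** 2 / pca.explained_variance_))

t2_baseline = np.array([hotelling_t2(row) for row in baseline])
threshold = np.percentile(t2_baseline, 97.5)  # empirical control limit

print("suspect T^2:", hotelling_t2(suspect), "limit:", threshold)
```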

  11. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  12. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exhaust the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  13. Quantitative gas analysis with FT-IR

    DEFF Research Database (Denmark)

    Bak, J.; Larsen, A.

    1995-01-01

    Calibration spectra of CO in the 2.38-5100 ppm concentration range (22 spectra) have been measured with a spectral resolution of 4 cm(-1), in the mid-IR (2186-2001 cm(-1)) region, with a Fourier transform infrared (FT-IR) instrument. The multivariate calibration method partial least-squares (PLS1...
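    A minimal illustration of a PLS1 calibration of the kind described, using synthetic spectra whose band intensity scales with CO concentration; the band shape, noise level and number of latent variables are assumptions, not the measured FT-IR data.

```python
# Illustrative PLS1 calibration in the spirit of the abstract: synthetic
# "spectra" whose band area scales with CO concentration. Real work would use
# the measured 2186-2001 cm^-1 absorbance spectra instead.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
wavenumbers = np.linspace(2001, 2186, 200)
conc = np.linspace(2.38, 5100, 22)                        # ppm, as in the abstract
band = np.exp(-0.5 * ((wavenumbers - 2093) / 15) ** 2)    # idealised CO band shape
spectra = np.outer(conc, band) * 1e-4 + rng.normal(0, 5e-4, (22, 200))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, conc, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - conc) ** 2))
print(f"RMSECV ~ {rmsecv:.1f} ppm")
```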

  14. Photoacoustic image reconstruction: a quantitative analysis

    Science.gov (United States)

    Sperl, Jonathan I.; Zell, Karin; Menzenbach, Peter; Haisch, Christoph; Ketzer, Stephan; Marquart, Markus; Koenig, Hartmut; Vogel, Mika W.

    2007-07-01

    Photoacoustic imaging is a promising new way to generate unprecedented contrast in ultrasound diagnostic imaging. It differs from other medical imaging approaches, in that it provides spatially resolved information about optical absorption of targeted tissue structures. Because the data acquisition process deviates from standard clinical ultrasound, choice of the proper image reconstruction method is crucial for successful application of the technique. In the literature, multiple approaches have been advocated, and the purpose of this paper is to compare four reconstruction techniques. Thereby, we focused on resolution limits, stability, reconstruction speed, and SNR. We generated experimental and simulated data and reconstructed images of the pressure distribution using four different methods: delay-and-sum (DnS), circular backprojection (CBP), generalized 2D Hough transform (HTA), and Fourier transform (FTA). All methods were able to depict the point sources properly. DnS and CBP produce blurred images containing typical superposition artifacts. The HTA provides excellent SNR and allows a good point source separation. The FTA is the fastest and shows the best FWHM. In our study, we found the FTA to show the best overall performance. It allows a very fast and theoretically exact reconstruction. Only a hardware-implemented DnS might be faster and enable real-time imaging. A commercial system may also perform several methods to fully utilize the new contrast mechanism and guarantee optimal resolution and fidelity.
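    Of the four reconstruction methods compared, delay-and-sum is the simplest to sketch. The toy example below back-projects a single simulated point absorber onto a 2D grid; the array geometry, sampling rate and speed of sound are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal delay-and-sum (DnS) sketch for one simulated point absorber and a
# linear sensor array; geometry, sampling and speed of sound are assumed.
import numpy as np

c = 1500.0                          # speed of sound (m/s)
fs = 20e6                           # sampling rate (Hz)
sensors = np.stack([np.linspace(-0.01, 0.01, 32), np.zeros(32)], axis=1)
source = np.array([0.002, 0.008])   # true absorber position (m)

n_samples = 1024
signals = np.zeros((len(sensors), n_samples))
for i, s in enumerate(sensors):
    idx = int(round(np.linalg.norm(source - s) / c * fs))
    signals[i, idx] = 1.0           # idealised pulse reduced to a spike

# Reconstruction grid: sum each sensor signal at the pixel's time of flight
xs = np.linspace(-0.005, 0.005, 101)
zs = np.linspace(0.004, 0.012, 81)
image = np.zeros((len(zs), len(xs)))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        pixel = np.array([x, z])
        total = 0.0
        for s, sig in zip(sensors, signals):
            idx = int(round(np.linalg.norm(pixel - s) / c * fs))
            if idx < n_samples:
                total += sig[idx]
        image[iz, ix] = total

peak = np.unravel_index(np.argmax(image), image.shape)
print("reconstructed peak at x=%.4f m, z=%.4f m" % (xs[peak[1]], zs[peak[0]]))
```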

  15. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  16. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and in investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or of samples submitted to treatments in aggressive media. (O.M.)

  17. Quantitative assessment of breast density: comparison of different methods

    International Nuclear Information System (INIS)

    Qin Naishan; Guo Li; Dang Yi; Song Luxin; Wang Xiaoying

    2011-01-01

    Objective: To compare different methods of quantitative breast density measurement. Methods: The study included sixty patients who underwent both mammography and breast MRI. The breast density was computed automatically on digital mammograms with an R2 workstation. Two experienced radiologists read the mammograms and assessed the breast density with the Wolfe and ACR classifications, respectively. The fuzzy C-means clustering algorithm (FCM) was used to assess breast density on MRI. Each assessment method was repeated after 2 weeks. Spearman and Pearson correlations of inter- and intrareader and intermodality agreement were computed for the density estimates. Results: Inter- and intrareader correlations of the Wolfe classification were 0.74 and 0.65, and they were 0.74 and 0.82 for the ACR classification, respectively. The correlation between the Wolfe and ACR classifications was 0.77. A high interreader correlation of 0.98 and intrareader correlation of 0.96 were observed with the MR FCM measurement. The correlation between digital mammograms and MRI was high in the assessment of breast density (r=0.81, P<0.01). Conclusion: The high correlation of breast density estimates on digital mammograms and MRI FCM suggested the former could be used as a simple and accurate method. (authors)
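    The agreement statistics reported above reduce to standard correlation calculations; a minimal sketch on invented density scores is shown below, with scipy's pearsonr and spearmanr standing in for the study's computations.

```python
# Sketch of the agreement statistics described above, on made-up density
# scores; the values are not the study's measurements.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(3)
mammo_density = rng.uniform(5, 80, 60)                    # % density, digital mammogram
mri_fcm_density = mammo_density + rng.normal(0, 8, 60)    # % density, MRI FCM estimate

r, p = pearsonr(mammo_density, mri_fcm_density)
rho, p_rho = spearmanr(mammo_density, mri_fcm_density)
print(f"Pearson r = {r:.2f} (p = {p:.3g}), Spearman rho = {rho:.2f}")
```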

  18. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  19. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....

  20. Digital integrated protection system: Quantitative methods for dependability evaluation

    International Nuclear Information System (INIS)

    Krotoff, H.; Benski, C.

    1986-01-01

    The inclusion of programmed digital techniques in the SPIN system provides the user with the capability of performing sophisticated processing operations. However, it makes the quantitative evaluation of the overall failure probabilities somewhat more intricate because: a single component may be involved in several functions; self-tests may readily be incorporated for the purpose of monitoring the dependable operation of the equipment at all times. This paper describes the methods implemented by MERLIN GERIN for evaluating: the probabilities that the protective actions are not initiated (dangerous failures); the probabilities that such protective actions are initiated accidentally. Although the communication is focused on the programmed portion of the SPIN (UAIP), it also deals with the evaluation performed within the scope of studies that do not exclusively cover the UAIPs

  1. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  2. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that share clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm{sup 2} and 5–10 mm{sup 2} and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DL{sub CO}/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DL{sub CO}/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  3. Balancing the Quantitative and Qualitative Aspects of Social Network Analysis to Study Complex Social Systems

    OpenAIRE

    Schipper, Danny; Spekkink, Wouter

    2015-01-01

    Social Network Analysis (SNA) can be used to investigate complex social systems. SNA is typically applied as a quantitative method, which has important limitations. First, quantitative methods are capable of capturing the form of relationships (e.g. strength and frequency), but they are less suitable for capturing the content of relationships (e.g. interests and motivations). Second, while complex social systems are highly dynamic, the representations that SNA creates of such systems are ofte...

  4. A quantitative assessment method for the NPP operators' diagnosis of accidents

    International Nuclear Information System (INIS)

    Kim, M. C.; Seong, P. H.

    2003-01-01

    In this research, we developed a quantitative model for the operators' diagnosis of the accident situation when an accident occurs in a nuclear power plant. After identifying the occurrence probabilities of accidents, the unavailabilities of various information sources, and the causal relationships between accidents and information sources, a Bayesian network is used to analyze the change in the occurrence probabilities of accidents as the operators receive information related to the status of the plant. The developed method is applied to a simple example case, and it turned out that the developed method is a systematic quantitative analysis method which can cope with the complex relationships between accidents and information sources and with various variables such as accident occurrence probabilities and unavailabilities of information sources
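    The Bayesian updating idea underlying this kind of model can be illustrated with a toy calculation: prior accident probabilities are revised as an information source reports. The accident classes, likelihoods and the folded-in unavailability figure below are invented for the example and are not the authors' model.

```python
# Toy illustration of Bayesian updating of accident probabilities given one
# reported indication. All numbers are invented for the example.
priors = {"LOCA": 0.02, "SGTR": 0.01, "MSLB": 0.005, "no_accident": 0.965}

# P(indication observed | hypothesis), already folded with an assumed 5% chance
# that the information source itself is unavailable or failed.
p_low_pressurizer_pressure = {"LOCA": 0.90, "SGTR": 0.60, "MSLB": 0.30,
                              "no_accident": 0.01}

def bayes_update(priors, likelihood):
    unnorm = {h: priors[h] * likelihood[h] for h in priors}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

posterior = bayes_update(priors, p_low_pressurizer_pressure)
for hypothesis, prob in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{hypothesis:12s} {prob:.3f}")
```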

  5. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  6. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  7. Influence of corrosion layers on quantitative analysis

    International Nuclear Information System (INIS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Roehrich, J.; Strub, E.

    2005-01-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed

  8. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  9. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity, but the dependency of human operators on I and C systems and the dependency of I and C systems on human operators are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. Unfortunately, it seems that we do not have appropriate methods for incorporating this interdependency in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and modeling the interdependency using conventional event tree analysis and fault tree analysis seems, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while being as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of the Ulchin 5 and 6 nuclear power plants located in the Republic of Korea. The latest version of the fault tree for DPPS, which was developed by the Integrated Safety Assessment team at the Korea Atomic Energy Research Institute (KAERI), consists of 64

  10. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com [Airbus Group Innovations, Munich (Germany); Grosse, Christian, E-mail: Grosse@tum.de [Technical University Munich (Germany)

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  11. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    International Nuclear Information System (INIS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-01-01

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented

  12. The discussion on the qualitative and quantitative evaluation methods for safety culture

    International Nuclear Information System (INIS)

    Gao Kefu

    2005-01-01

    The fundamental methods for safety culture evaluation are described. Combining them with the practice of the quantitative evaluation of safety culture in the Daya Bay NPP, the quantitative evaluation method for safety culture is discussed. (author)

  13. A Method for Quantitative Determination of Biofilm Viability

    Directory of Open Access Journals (Sweden)

    Maria Strømme

    2012-06-01

    Full Text Available In this study we present a scheme for quantitative determination of biofilm viability offering significant improvement over existing methods with metabolic assays. Existing metabolic assays for quantifying viable bacteria in biofilms usually utilize calibration curves derived from planktonic bacteria, which can introduce large errors due to significant differences in the metabolic and/or growth rates of biofilm bacteria in the assay media compared to their planktonic counterparts. In the presented method we derive the specific growth rate of Streptococcus mutans bacteria biofilm from a series of metabolic assays using the pH indicator phenol red, and show that this information could be used to more accurately quantify the relative number of viable bacteria in a biofilm. We found that the specific growth rate of S. mutans in biofilm mode of growth was 0.70 h−1, compared to 1.09 h−1 in planktonic growth. This method should be applicable to other bacteria types, as well as other metabolic assays, and, for example, to quantify the effect of antibacterial treatments or the performance of bactericidal implant surfaces.
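    A minimal sketch of how a specific growth rate can be extracted from exponential-phase assay readings, as the slope of ln(signal) versus time; the time points and readings below are fabricated, chosen so that the fitted rate comes out near the biofilm value quoted above.

```python
# Minimal sketch of extracting a specific growth rate from assay readings in
# the exponential phase: mu is the slope of ln(signal) versus time. The time
# points and readings are fabricated for illustration.
import numpy as np

t_hours = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
signal = np.array([0.05, 0.071, 0.101, 0.143, 0.202])   # proxy for viable count

mu, ln_n0 = np.polyfit(t_hours, np.log(signal), 1)
print(f"specific growth rate mu = {mu:.2f} 1/h, doubling time = {np.log(2)/mu:.2f} h")
```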

  14. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-Spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the high-resolution resultant models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface respectively. The experimental results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.
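    As a rough illustration of a sphere-deviation style metric, the sketch below fits a sphere to a synthetic, slightly elongated point cloud by linear least squares and reports how far the points deviate from it; this is an assumed stand-in for the paper's Sphere Distance Deviation on MRI-derived models, not its actual definition.

```python
# Hedged sketch of a sphere-deviation metric: fit a sphere to surface points
# by linear least squares and report the deviations. The point cloud here is
# synthetic, not an MRI-derived eyeball mesh.
import numpy as np

rng = np.random.default_rng(5)
u = rng.uniform(0, 2 * np.pi, 2000)
v = rng.uniform(0, np.pi, 2000)
pts = np.stack([12.0 * np.sin(v) * np.cos(u),
                12.0 * np.sin(v) * np.sin(u),
                13.0 * np.cos(v)], axis=1)          # mm, with axial elongation

# Algebraic sphere fit: |p|^2 = 2 p.c + (r^2 - |c|^2)
A = np.hstack([2 * pts, np.ones((len(pts), 1))])
b = np.sum(pts ** 2, axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
center, radius = sol[:3], np.sqrt(sol[3] + np.sum(sol[:3] ** 2))

deviation = np.linalg.norm(pts - center, axis=1) - radius
print(f"fitted radius {radius:.2f} mm, mean |deviation| {np.mean(np.abs(deviation)):.2f} mm")
```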

  15. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface

  16. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  17. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    Science.gov (United States)

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
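    The core of a distance-based readout is a calibration of signal length against known concentrations, inverted for unknown samples. The sketch below assumes an approximately linear response; the concentrations and lengths are illustrative only and do not come from any of the cited devices.

```python
# Sketch of the distance-readout idea: calibrate signal length against known
# concentrations, then invert the fit for an unknown sample. Values invented.
import numpy as np

known_conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # e.g. mM
distance_mm = np.array([3.1, 6.0, 11.8, 24.5, 48.9])   # measured colour-bar length

slope, intercept = np.polyfit(known_conc, distance_mm, 1)

unknown_distance = 17.0
estimated_conc = (unknown_distance - intercept) / slope
print(f"estimated concentration ~ {estimated_conc:.2f} mM")
```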

  18. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies accomplishing complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a similar way as genetic information is treated in a biological process. The authors carried out the algorithm until they reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material
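    A schematic genetic algorithm in the spirit described above: candidate parameter sets for an assumed detector-efficiency model evolve toward the best fit to noisy measurements. The efficiency model, population size and mutation settings are illustrative assumptions, not the authors' implementation.

```python
# Schematic genetic algorithm: selection, crossover and mutation of candidate
# parameters for a made-up efficiency curve eff(E) = a * exp(-b / E).
import numpy as np

rng = np.random.default_rng(4)
energies = np.linspace(2.0, 20.0, 15)                 # keV
true_params = np.array([0.8, 3.0])
measured = true_params[0] * np.exp(-true_params[1] / energies)
measured += rng.normal(0, 0.01, energies.size)

def model(params):
    return params[0] * np.exp(-params[1] / energies)

def fitness(params):
    return -np.sum((model(params) - measured) ** 2)   # higher is better

pop = rng.uniform([0.1, 0.5], [2.0, 10.0], size=(40, 2))
for generation in range(100):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]           # selection: keep best half
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(2) < 0.5, a, b)   # uniform crossover
        child = child + rng.normal(0, 0.05, 2)        # mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("best-fit parameters:", best)
```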

  19. Análise quantitativa do tratamento da escoliose idiopática com o método klapp por meio da biofotogrametria computadorizada Quantitative photogrammetric analysis of the klapp method for treating idiopathic scoliosis

    Directory of Open Access Journals (Sweden)

    Denise H. Iunes

    2010-04-01

    Full Text Available INTRODUCTION: Few studies have proved the efficacy of physical therapy techniques in the treatment of scoliosis. OBJECTIVE: To analyze the efficacy of the Klapp method in the treatment of scoliosis by means of a quantitative study using computerized biophotogrammetry. METHODS: Sixteen individuals with idiopathic scoliosis, with a mean age of 15 ± 2.61 years, were treated with the Klapp method. To analyze the treatment results, all were photographed before and after treatment following a standardized photographic protocol. All photographs were analyzed quantitatively by the same examiner using the ALCimagem 2000 software. Statistical analysis was performed using the paired t-test with a 5% significance level. RESULTS: The results show improvement after treatment in the acromioclavicular (AC, p=0.00) and sternoclavicular (EC, p=0.01) angles, which assess shoulder symmetry, and in the angle assessing the left triangle of Tales (ΔTe, p=0.02). In terms of flexibility, the tibiotarsal (ATT, p=0.01) and hip (CF, p=0.00) angles improved. There were no changes in the vertebral curvatures and no improvement in head positioning; only the lumbar curvature, assessed by the lumbar lordosis angle (LL, p=0.00), changed with treatment. CONCLUSION: The Klapp method was an effective therapeutic technique for treating trunk asymmetries and flexibility. It was not effective for pelvic asymmetries, changes in head position, cervical lordosis or thoracic kyphosis.

  20. Quantitative spectrographic analysis of impurities in antimonium

    International Nuclear Information System (INIS)

    Brito, J. de; Gomes, R.P.

    1978-01-01

    An emission spectrographic method is described for the determination of Ag, Al, As, Be, Bi, Cd, Cr, Cu, Ga, Ni, Pb, Sn, Si, and Zn in high-purity antimony metal. The metal sample is dissolved in nitric acid (1:1) and converted to oxide by calcination at 900 °C for one hour. The oxide so obtained is mixed with graphite, which is used as a spectroscopic buffer, and excited by a direct current arc. Many parameters are studied and optimum conditions are selected for the determination of the impurities mentioned. The spectrum is photographed in the second order of a 15,000 lines per inch grating and the most sensitive lines for the elements are selected. The impurities are determined in the concentration range of 1-0.01% with a precision of approximately 10% [pt

  1. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In a previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis for liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow from the disappearance rate multiplied by the percentage of hepatic uptake, given by (liver counts)/(total counts of the field). Our method of analysis automatically recorded the graphs of the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, and computed them using the BASIC language. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)
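    A rough sketch of the quantity described, computing an effective hepatic blood flow index as the blood disappearance rate coefficient multiplied by the hepatic uptake fraction; the frame counts and timings below are invented placeholders, not the study's data.

```python
# Rough sketch: effective hepatic blood flow index as
# (blood disappearance rate) x (hepatic uptake fraction). Counts are invented;
# the real study used 0.25-min frames over the full acquisition.
import numpy as np

t_min = np.arange(1, 6) * 0.25                    # early frame times (min)
heart_counts = np.array([9800, 8400, 7200, 6200, 5300])
liver_counts = np.array([15000, 17600, 19900, 21800, 23400])
field_counts = np.array([52000, 52100, 52300, 52200, 52400])

# Blood disappearance rate coefficient: negative slope of ln(heart counts) vs time
k_disappearance = -np.polyfit(t_min, np.log(heart_counts), 1)[0]

# Hepatic uptake fraction from the last frame considered
uptake_fraction = liver_counts[-1] / field_counts[-1]

effective_hepatic_blood_flow = k_disappearance * uptake_fraction
print(f"k = {k_disappearance:.2f} /min, uptake = {uptake_fraction:.2f}, "
      f"EHBF index = {effective_hepatic_blood_flow:.2f} /min")
```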

  2. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  3. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    Science.gov (United States)

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, the qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  4. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the lack of a strong chromophore or readily ionized functional group in the sterol structure. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were evaluated. The method quantifies sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.
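
    The quantification step behind charge-tagging with isotope-labelled reagents and deuterated internal standards reduces, in its simplest form, to an isotope-dilution calculation from peak-area ratios. The sketch below shows that generic calculation only; the peak areas, spiked amounts and analyte are hypothetical and the function is not part of the published method.

    def isotope_dilution_conc(area_analyte, area_internal_std,
                              amount_internal_std_ng, plasma_volume_ml):
        """Concentration (ng/mL) from the analyte/IS peak-area ratio and the spiked IS amount."""
        return (area_analyte / area_internal_std) * amount_internal_std_ng / plasma_volume_ml

    # Hypothetical oxysterol quantified against a deuterated internal standard
    conc = isotope_dilution_conc(area_analyte=8.2e5, area_internal_std=5.0e5,
                                 amount_internal_std_ng=20.0, plasma_volume_ml=0.2)
    print(f"Estimated concentration: {conc:.1f} ng/mL")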

  5. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and provided that adequate scaling of input data is performed. Once the network has been trained, analysis is carried out in a few seconds. When submitted to an intercomparison test between several well-known laboratories, in which unknown quantities of 57 Co, 58 Co, 85 Sr, 88 Y, 131 I, 139 Ce and 141 Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of relevant input parameters, input data scaling and network training. Main results are presented together with a statistical model allowing prediction of the network error.
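
    A minimal version of this idea can be sketched as a small feed-forward network trained on simulated spectra built from per-isotope response templates, with the input scaling the abstract emphasises. This is an illustration of the approach, not the authors' network or training data; the templates, peak positions, activities and the use of scikit-learn's MLPRegressor are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    channels = np.arange(256)

    def template(center, width=4.0):
        """Gaussian peak standing in for one isotope's detector response."""
        return np.exp(-0.5 * ((channels - center) / width) ** 2)

    # Response templates for five hypothetical isotopes, with deliberately overlapping peaks
    templates = np.stack([template(c) for c in (80, 90, 150, 158, 200)])

    # Training set: random activities -> noisy mixed spectra, then scaled inputs
    activities = rng.uniform(0, 100, size=(2000, templates.shape[0]))
    spectra = rng.poisson(activities @ templates + 5).astype(float)   # counting noise
    spectra /= spectra.sum(axis=1, keepdims=True)                     # input scaling

    net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    net.fit(spectra, activities)

    # "Unknown" sample with known true activities, for checking the idea
    true = np.array([20.0, 60.0, 0.0, 35.0, 10.0])
    unknown = rng.poisson(true @ templates + 5).astype(float)
    unknown /= unknown.sum()
    print("estimated activities:", np.round(net.predict(unknown[None, :])[0], 1))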

  6. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting the insight into the women's perspective. In this mixed-methods study, we include qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean number of 2 children, 82% ≥ some college education, and 56% with previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  7. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  8. Instrumentation and quantitative methods of evaluation. Progress report, January 15-September 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.

    1986-09-01

    This document reports progress under the grant entitled "Instrumentation and Quantitative Methods of Evaluation." Individual reports are presented on projects entitled the physical aspects of radionuclide imaging, image reconstruction and quantitative evaluation, PET-related instrumentation for improved quantitation, improvements in the FMI cyclotron for increased utilization, and methodology for quantitative evaluation of diagnostic performance.

  9. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    Science.gov (United States)

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its chromatographic properties, similar to those of the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (low within-run relative standard deviations [RSDs]) for the quantitation of clobenzorex.
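
    The precision figure of merit referred to, the within-run relative standard deviation of internal-standard-based quantitation, can be computed as in the sketch below; the peak areas, spiked amount and response factor are illustrative placeholders, not data from the study.

    import numpy as np

    amount_internal_std_ng = 100.0     # spiked 3-Cl-clobenzorex per aliquot (illustrative)
    response_factor = 1.05             # analyte/IS response factor from calibration (illustrative)

    # Replicate injections: (clobenzorex peak area, internal standard peak area)
    replicates = np.array([
        (5.12e5, 4.90e5),
        (5.05e5, 4.95e5),
        (5.20e5, 5.01e5),
        (5.08e5, 4.88e5),
    ])

    conc = (replicates[:, 0] / replicates[:, 1]) * amount_internal_std_ng / response_factor
    rsd_pct = conc.std(ddof=1) / conc.mean() * 100
    print(f"mean = {conc.mean():.1f} ng, within-run RSD = {rsd_pct:.1f}%")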

  10. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Because bone is a hard tissue, it has long been difficult to observe its interior in the living state. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the activities of the various cells that make up bone tissue. On the other hand, the growing volume of data and the diversity and complexity of the images make quantitative analysis by visual inspection difficult, and a methodology for processing microscopic images and analyzing the data has been needed. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline basic image processing techniques for the quantitative analysis of live imaging data of bone.
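
    As a concrete example of the basic image processing outlined, the sketch below thresholds a fluorescence-like image, labels connected objects and measures their areas with scipy.ndimage; the image is synthetic and the threshold rule is an arbitrary choice, not a recommendation from the article.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    image = rng.normal(100, 10, size=(256, 256))          # synthetic background
    yy, xx = np.mgrid[0:256, 0:256]
    for cy, cx in [(60, 70), (150, 180), (200, 60)]:      # three synthetic "cells"
        image += 200 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8 ** 2))

    mask = image > image.mean() + 3 * image.std()          # simple global threshold
    labels, n_cells = ndimage.label(mask)                  # connected-component labelling
    areas = ndimage.sum(mask, labels, index=list(range(1, n_cells + 1)))
    print(f"{n_cells} objects detected, areas (px): {areas.astype(int)}")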

  11. 4D-SPECT/CT in orthopaedics: a new method of combined quantitative volumetric 3D analysis of SPECT/CT tracer uptake and component position measurements in patients after total knee arthroplasty

    Energy Technology Data Exchange (ETDEWEB)

    Rasch, Helmut; Falkowski, Anna L.; Forrer, Flavio [Kantonsspital Baselland, Institute for Radiology and Nuclear Medicine, Bruderholz (Switzerland); Henckel, Johann [Imperial College London, London (United Kingdom); Hirschmann, Michael T. [Kantonsspital Baselland, Department of Orthopaedic Surgery and Traumatology, Bruderholz (Switzerland)

    2013-09-15

    The purpose was to evaluate the intra- and inter-observer reliability of combined quantitative 3D-volumetric single-photon emission computed tomography (SPECT)/CT analysis including size, intensity and localisation of tracer uptake regions and total knee arthroplasty (TKA) position. Tc-99m-HDP-SPECT/CT of 100 knees after TKA were prospectively analysed. The anatomical areas represented by a previously validated localisation scheme were 3D-volumetrically analysed. The maximum intensity was recorded for each anatomical area. Ratios between the respective value and the mid-shaft of the femur as the reference were calculated. Femoral and tibial TKA position (varus-valgus, flexion-extension, internal rotation-external rotation) were determined on 3D-CT. Two consultant radiologists/nuclear medicine physicians interpreted the SPECT/CTs twice with a 2-week interval. The inter- and intra-observer reliability was determined (ICCs). Kappa values were calculated for the area with the highest tracer uptake between the observers. The measurements of tracer uptake intensity showed excellent inter- and intra-observer reliabilities for all regions (tibia, femur and patella). Only the tibial shaft area showed ICCs <0.89. The kappa values were almost perfect (0.856, p < 0.001; 95 % CI 0.778, 0.922). For measurements of the TKA position, there was strong agreement within and between the readings of the two observers; the ICCs for the orientation of TKA components for inter- and intra-observer reliability were nearly perfect (ICCs >0.84). This combined 3D-volumetric standardised method of analysing the location, size and intensity of SPECT/CT tracer uptake regions ("hotspots") and the determination of the TKA position was highly reliable and represents a novel promising approach to biomechanics. (orig.)
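
    The agreement statistics reported (ICCs for repeated intensity measurements and kappa for the area with the highest uptake) can be illustrated with a small self-contained computation; the ICC(2,1) formulation and the invented ratings below are assumptions for demonstration and are not the study's data or necessarily the exact ICC model used.

    import numpy as np

    def icc_2_1(ratings):
        """Two-way random, single-measure ICC; rows = subjects, columns = observers/readings."""
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        grand = ratings.mean()
        ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        ss_err = ((ratings - ratings.mean(axis=1, keepdims=True)
                   - ratings.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    def cohens_kappa(a, b):
        """Chance-corrected agreement for two observers' categorical calls."""
        cats = sorted(set(a) | set(b))
        idx = {c: i for i, c in enumerate(cats)}
        m = np.zeros((len(cats), len(cats)))
        for x, y in zip(a, b):
            m[idx[x], idx[y]] += 1
        p_obs = np.trace(m) / m.sum()
        p_exp = (m.sum(axis=1) @ m.sum(axis=0)) / m.sum() ** 2
        return (p_obs - p_exp) / (1 - p_exp)

    uptake_ratios = [[2.1, 2.0], [3.4, 3.5], [1.2, 1.3], [4.0, 3.8], [2.7, 2.7]]   # invented
    obs1 = ["medial tibia", "patella", "medial tibia", "lateral femur", "patella"]
    obs2 = ["medial tibia", "patella", "lateral femur", "lateral femur", "patella"]
    print(f"ICC(2,1) = {icc_2_1(uptake_ratios):.2f}, kappa = {cohens_kappa(obs1, obs2):.2f}")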

  12. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Directory of Open Access Journals (Sweden)

    Melanie I Stefan

    2015-04-01

    Full Text Available The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  13. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Science.gov (United States)

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  14. Stochastic resonance is applied to quantitative analysis for weak chromatographic signal of glyburide in plasma

    International Nuclear Information System (INIS)

    Zhang Wei; Xiang Bingren; Wu Yanwei; Shang Erxin

    2005-01-01

    Based on the theory of stochastic resonance, a new method was developed for the quantitative analysis of the weak chromatographic signal of glyburide in plasma, which is embedded in background noise; the signal-to-noise ratio (SNR) of HPLC-UV is thereby enhanced remarkably. This method improves the quantification limit to 1 ng ml(-1), the same as that of HPLC-MS, and makes it possible to detect accurately by HPLC-UV a weak signal that previously could not be measured reliably. The results showed good recovery and a linear range from 1 to 50 ng ml(-1) of glyburide in plasma, and the method can be used for quantitative analysis of glyburide
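
    The core mechanism can be illustrated by driving an overdamped bistable system with a weak synthetic peak buried in noise; whether the peak contrast actually improves depends on tuning the system parameters and noise level, which is the essence of stochastic resonance. The sketch below is conceptual only; the parameters, peak shape and contrast measure are invented and it is not the authors' tuned algorithm.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 10, 5000)
    peak = 0.05 * np.exp(-0.5 * ((t - 5.0) / 0.15) ** 2)   # weak chromatographic-style peak
    noisy = peak + rng.normal(0, 0.05, t.size)             # buried in background noise

    def bistable_filter(signal, a=1.0, b=1.0, dt=t[1] - t[0]):
        """Euler integration of the overdamped bistable system dx/dt = a*x - b*x**3 + input."""
        x, out = 0.0, np.empty_like(signal)
        for i, s in enumerate(signal):
            x += dt * (a * x - b * x ** 3 + s)
            out[i] = x
        return out

    out = bistable_filter(noisy)
    window = (t > 4.5) & (t < 5.5)                         # region where the peak is expected

    def contrast(sig):
        """Crude peak contrast: (mean in peak window - baseline mean) / baseline std."""
        return (sig[window].mean() - sig[~window].mean()) / sig[~window].std()

    print(f"peak contrast before: {contrast(noisy):.2f}, after: {contrast(out):.2f}")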

  15. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitive UK economy, public policy making effectiveness and the status the UK has as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  16. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real-time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators.
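
    Two of the named measurements, image noise/contrast from regions of interest and bar-pattern modulation as a crude resolution indicator, are easy to illustrate; the sketch below uses synthetic ROIs and a synthetic bar-pattern profile and is not the authors' image quality indicator software.

    import numpy as np

    rng = np.random.default_rng(3)
    background = rng.normal(1000, 20, size=(64, 64))       # ROI over a uniform region
    object_roi = rng.normal(1100, 20, size=(64, 64))       # ROI over a contrast disc

    noise = background.std()
    contrast = (object_roi.mean() - background.mean()) / background.mean()
    cnr = (object_roi.mean() - background.mean()) / noise
    print(f"noise = {noise:.1f}, contrast = {contrast:.1%}, CNR = {cnr:.1f}")

    # Bar-pattern modulation: peak-to-trough amplitude relative to the mean level
    profile = (1000 + 80 * np.sign(np.sin(np.linspace(0, 10 * np.pi, 200)))
               + rng.normal(0, 10, 200))
    modulation = (profile.max() - profile.min()) / (profile.max() + profile.min())
    print(f"bar-pattern modulation = {modulation:.2f}")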

  17. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  18. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the used UPLC fingerprint were verified for its similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of liquid chromatography fingerprints of 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
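
    Similarity evaluation of chromatographic fingerprints is commonly implemented as a cosine (congruence) coefficient between the chromatogram vectors; the sketch below shows that generic calculation on synthetic chromatograms and is not the SFDA software's exact algorithm.

    import numpy as np

    rng = np.random.default_rng(4)
    time = np.linspace(0, 30, 3000)                        # retention time axis (min), synthetic

    def chromatogram(peak_heights, centers=(5, 9, 14, 20, 25), width=0.2):
        """Synthetic fingerprint: Gaussian peaks at fixed retention times plus noise."""
        signal = sum(h * np.exp(-0.5 * ((time - c) / width) ** 2)
                     for h, c in zip(peak_heights, centers))
        return signal + rng.normal(0, 0.01, time.size)

    reference = chromatogram([1.0, 0.6, 0.8, 0.4, 0.5])
    batch = chromatogram([0.95, 0.63, 0.78, 0.42, 0.48])   # a batch resembling the reference

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    print(f"similarity to reference: {cosine_similarity(batch, reference):.3f}")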

  19. Quantitative and regional evaluation methods for lung scintigraphs

    International Nuclear Information System (INIS)

    Fichter, J.

    1982-01-01

    New criteria are presented for the evaluation of perfusion lung scintigraphs, both for the quantitative evaluation and for the choice of regions. In addition to the usual approach of dividing each lung into upper, middle and lower levels and determining the per cent share of each region in the total activity, the following values were established: the median of the activity distribution, and the differences in per cent counting rate and in median between the corresponding regions of the right and left lung. The individual regions should describe the functional structures (lobe and segment structure). A corresponding computer program projects the lobe and segment regions in simplified form onto the scintigraph, taking the individual extent of the lungs into account. In a clinical study of 60 patients and 18 controls with 99mTc-MAA and 133 Xe-gas lung scintigraphs, the following results were obtained: depending on the combination of the 32 available evaluation parameters and the choice of regions, between 4 and 20 of the 60 patients were falsely classified as negative, and 1 to 2 of the 18 controls were falsely classified as positive. The accuracy of the Tc scintigraph proved to be better. Altogether, comparable results were attained using the best possible parameter combinations. (TRV) [de
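
    The regional quantities listed (per cent activity share, left-right differences and the median of the activity distribution) can be computed as in the sketch below; the regional counts and pixel values are invented for illustration and do not reproduce the study's regions or program.

    import numpy as np

    # Counts per region (upper, middle, lower) for each lung; values are invented
    regions = {
        "right": np.array([52000.0, 88000.0, 74000.0]),
        "left":  np.array([46000.0, 80000.0, 61000.0]),
    }
    total = sum(r.sum() for r in regions.values())

    percent = {side: counts / total * 100 for side, counts in regions.items()}
    for i, level in enumerate(("upper", "middle", "lower")):
        diff = percent["right"][i] - percent["left"][i]
        print(f"{level:6s}: right {percent['right'][i]:.1f}%  "
              f"left {percent['left'][i]:.1f}%  difference {diff:+.1f}%")

    # Median of a (synthetic) per-pixel activity distribution within one region
    pixel_counts = np.random.default_rng(5).poisson(30, size=500)
    print("median pixel activity (right middle):", np.median(pixel_counts))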

  20. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD) and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a "bullseye" functional map with the apex at the center and base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4 hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well-defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient (pt) data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets.
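
    The washout calculation implied here is, at its simplest, a per-sector comparison of exercise and redistribution circumferential-profile counts; the sketch below uses invented sector names and counts, and the linear percent-washout formula is a common simplification rather than the bi-exponential model used in the study.

    import numpy as np

    sectors = ["anterior", "septal", "inferior", "lateral"]
    ex_counts = np.array([850.0, 790.0, 820.0, 880.0])   # post-exercise profile maxima (invented)
    rd_counts = np.array([420.0, 400.0, 430.0, 440.0])   # redistribution profile maxima (invented)

    washout_pct = (ex_counts - rd_counts) / ex_counts * 100   # simple percent washout per sector
    for name, w in zip(sectors, washout_pct):
        print(f"{name:9s} washout: {w:.1f}%")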