WorldWideScience

Sample records for method shows quantitative

  1. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; Forrest M. Hoffman; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluates the degree of fit among any number of maps and quantifies a GOF for each polygon, as well as for the entire map. The Mapcurve method indicates a perfect fit even if...
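
    As a rough illustration of the goodness-of-fit idea in the abstract above, the sketch below scores one category of a raster map against every category of a second map as the sum of (shared area / category area in map B) x (shared area / category area in map A); the grids and category labels are invented for the example, and the exact published formula should be checked against the paper.

```python
import numpy as np

def mapcurves_gof(map_a, map_b, category):
    """Goodness-of-fit of one category of map_a against all categories of
    map_b, in the spirit of Mapcurves: sum over b of (C/B_b) * (C/A_a),
    where C is the overlap area, A_a the total area of the category in
    map_a, and B_b the total area of category b in map_b."""
    a_mask = (map_a == category)
    a_total = a_mask.sum()
    if a_total == 0:
        return 0.0
    gof = 0.0
    for b in np.unique(map_b):
        b_mask = (map_b == b)
        c = np.logical_and(a_mask, b_mask).sum()
        gof += (c / b_mask.sum()) * (c / a_total)
    return gof

# Two identical maps score a perfect GOF of 1.0 for any category present.
grid = np.array([[1, 1, 2], [1, 2, 2], [3, 3, 3]])
print(mapcurves_gof(grid, grid, 1))  # → 1.0
```

    Because the score is built from area fractions, it is independent of how either map labels its categories, which is what lets Mapcurves compare maps with entirely different legends.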

  2. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used as either part of a detective game or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  3. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Possessing huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing chemical modification of proteins as well as metabolic and enzymatic methods of isotope labeling.

  4. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    ...with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  5. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support fast development in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.
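
    The relative-quantification arithmetic common to most stable isotope labeling methods, comparing heavy and light isotope peak intensities, can be sketched as follows; the peptide intensities are hypothetical, and summarizing by the median is one common, robust choice rather than the only one.

```python
import statistics

def relative_abundance(peptide_pairs):
    """Relative protein abundance from stable-isotope labeling: each
    peptide contributes a heavy/light peak-intensity ratio, and the
    protein-level ratio is summarized as the median of those ratios
    (robust to single outlier peptides)."""
    ratios = [heavy / light for light, heavy in peptide_pairs]
    return statistics.median(ratios)

# Hypothetical (light, heavy) peak intensities for three peptides of one protein.
pairs = [(1.0e6, 2.1e6), (8.0e5, 1.5e6), (1.2e6, 2.6e6)]
print(relative_abundance(pairs))  # → 2.1
```

    A ratio near 1 means the protein is equally abundant in the two labeled samples; a ratio of 2.1 suggests roughly twofold enrichment in the heavy-labeled condition.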

  6. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and more thorough reasoning behind the events. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed using quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  7. A CT-based method for fully quantitative 201Tl SPECT

    International Nuclear Information System (INIS)

    Willowson, Kathy; Bailey, Dale; Baldock, Clive

    2009-01-01

    Full text: Objectives: To develop and validate a method for quantitative 201Tl SPECT data based on corrections derived from X-ray CT data, and to apply the method in the clinic for quantitative determination of recurrence of brain tumours. Method: A previously developed method for achieving quantitative SPECT with 99mTc based on corrections derived from X-ray CT data was extended to apply to 201Tl. Experimental validation was performed on a cylindrical phantom by comparing known injected activity and measured concentration to quantitative calculations. Further evaluation was performed on a RSI Striatal Brain Phantom containing three 'lesions' with activity to background ratios of 1:1, 1.5:1 and 2:1. The method was subsequently applied to a series of scans from patients with suspected recurrence of brain tumours (principally glioma) to determine an SUV-like measure (Standardised Uptake Value). Results: The total activity and concentration in the phantom were calculated to within 3% and 1% of the true values, respectively. The calculated values for the concentration of activity in the background and corresponding lesions of the brain phantom (in increasing ratios) were found to be within 2%, 10%, 1% and 2%, respectively, of the true concentrations. Patient studies showed that an initial SUV greater than 1.5 corresponded to a 56% mortality rate in the first 12 months, as opposed to a 14% mortality rate for those with an SUV less than 1.5. Conclusion: The quantitative technique produces accurate results for the radionuclide 201Tl. Initial investigation in clinical brain SPECT suggests correlation between quantitative uptake and survival.
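
    The SUV-like measure referred to in the abstract is conventionally the measured tissue concentration divided by the injected activity per unit body weight; a minimal sketch with invented numbers:

```python
def suv(tissue_conc_kbq_per_ml, injected_kbq, body_weight_g):
    """Standardised Uptake Value: tissue concentration divided by the
    injected activity distributed uniformly over body weight (assuming
    1 g/mL tissue density, the usual convention)."""
    return tissue_conc_kbq_per_ml / (injected_kbq / body_weight_g)

# Hypothetical numbers: 3 kBq/mL in the lesion, 140 MBq injected, 70 kg patient.
print(suv(3.0, 140_000.0, 70_000.0))  # → 1.5
```

    An SUV of 1 means the region took up exactly its "fair share" of the injected dose; the 1.5 threshold reported in the abstract marks uptake 50% above that.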

  8. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been opened for nationwide-common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 subjects of PIXE in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. Especially, a ''standard-free method for quantitative analysis'' made it possible to perform analysis of infinitesimal samples, powdered samples and untreated bio samples, which could not be well analyzed quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made the process for target preparation quite easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  9. Quantitative analysis of iodine in thyroidin. I. Methods of ''dry'' and ''wet'' mineralization

    International Nuclear Information System (INIS)

    Listov, S.A.; Arzamastsev, A.P.

    1986-01-01

    Comparative investigations on the quantitative determination of iodine in thyroidin using different modifications of ''dry'' and ''wet'' mineralization show that these methods must take into account difficulties arising both from the characteristic features of the object of investigation itself and from the mineralization method as a whole. The studies show that the most suitable method for the analysis of thyroidin is ''dry'' mineralization with potassium carbonate. A procedure is proposed for the quantitative determination of iodine in thyroidin.

  10. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
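
    One of the sample-normalization methods discussed in this literature, probabilistic quotient normalization, can be sketched as follows; the small intensity matrix is invented, and whether PQN is the right choice depends on the data set at hand.

```python
import numpy as np

def pqn_normalize(intensities):
    """Probabilistic quotient normalization: divide each sample by the
    median ratio of its features to a reference spectrum (the median
    across samples), correcting for differences in total sample amount."""
    x = np.asarray(intensities, dtype=float)
    reference = np.median(x, axis=0)          # median spectrum across samples
    quotients = x / reference                 # feature-wise ratios to reference
    factors = np.median(quotients, axis=1)    # one dilution factor per sample
    return x / factors[:, None]

# A sample diluted exactly 2x is rescaled back onto the others.
data = np.array([[10.0, 20.0, 30.0],
                 [ 5.0, 10.0, 15.0],   # 2x-diluted copy of the first sample
                 [12.0, 18.0, 33.0]])
normalized = pqn_normalize(data)
print(normalized[1])  # → [10. 20. 30.]
```

    Unlike simple total-sum normalization, the median quotient is insensitive to a few metabolites that genuinely change between samples, which is why PQN is often preferred when biological differences are expected.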

  11. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. Copyright © 2017 John Wiley & Sons, Inc.
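
    The isotope-dilution calculation behind the LC-MS/MS approach reduces to a peak-area ratio against a spiked, isotopically labeled standard; the peak areas, spike concentration, and unit response factor below are assumptions for illustration, not values from the article.

```python
def isotope_dilution_conc(analyte_area, istd_area, istd_conc, response_factor=1.0):
    """Stable-isotope-dilution quantification: analyte concentration equals
    the analyte/internal-standard peak-area ratio times the known spiked
    internal-standard concentration, times a response factor calibrated
    from standards (1.0 here assumes identical response, a simplification)."""
    return (analyte_area / istd_area) * istd_conc * response_factor

# Hypothetical LC-MS/MS peak areas with 50 umol/L labeled creatinine spiked in.
print(isotope_dilution_conc(analyte_area=8.4e5, istd_area=4.2e5, istd_conc=50.0))  # → 100.0
```

    Because the labeled standard co-elutes and ionizes like the analyte, matrix effects cancel in the ratio, which is what makes this approach suitable as a reference method.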

  12. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Full Text Available Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between events, with no access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  13. Quantitative Methods in Public Administration: their use and development through time

    NARCIS (Netherlands)

    Groeneveld, S.M.; Tummers, L.G.; Bronkhorst, B.A.C.; Ashikali, T.S.; van Thiel, S.

    2015-01-01

    This article aims to contribute to recent debates on research methods in public administration by examining the use of quantitative methods in public administration research. We analyzed 1,605 articles published between 2001-2010 in four leading journals: JPART, PAR, Governance and PA. Results show...

  14. A general method for bead-enhanced quantitation by flow cytometry

    Science.gov (United States)

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
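
    The bead-based arithmetic behind this kind of absolute quantitation reduces to a simple ratio: a known number of beads spiked into the sample converts relative event counts into a concentration. The event counts, bead dose, and volume below are hypothetical.

```python
def absolute_count(cell_events, bead_events, beads_added, sample_volume_ul):
    """Bead-enhanced absolute quantitation in the spirit of SBEC: the
    fraction of spiked beads actually acquired scales the acquired cell
    events up to an absolute concentration (cells per microliter)."""
    return (cell_events / bead_events) * beads_added / sample_volume_ul

# Hypothetical run: 12,000 CD4+ events and 6,000 bead events acquired,
# after 50,000 beads were added to a 100 uL sample.
print(absolute_count(12_000, 6_000, 50_000, 100.0))  # → 1000.0  (cells/uL)
```

    The same ratio works for eosinophils or apoptotic fragments, as the abstract notes, because nothing in the calculation depends on the optical properties of the counted events.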

  15. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question whether psychology is ready to enter the "hard science club" like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  16. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  17. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  18. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years, autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of irradiation quality, of backscattering in sample and detector materials, and of the sensitivity and fading of the detectors are considered. Furthermore, questions of the quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  19. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has been traditionally dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other forms of approaching knowledge. Aim To discuss the complementarities of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  20. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
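
    The abstract names the three roles of the indices but not the statistics themselves, so the sketch below uses plausible stand-ins: Pearson correlation for shape conformity, mean signed error for bias, and RMSE for magnitude of deviation.

```python
import math

def match_quality(simulated, observed):
    """Three match-quality indices in the spirit of the abstract (the exact
    statistics are not given there): Pearson correlation for shape
    conformity, mean signed error for bias, RMSE for magnitude of deviation."""
    n = len(simulated)
    ms, mo = sum(simulated) / n, sum(observed) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(simulated, observed))
    var_s = sum((s - ms) ** 2 for s in simulated)
    var_o = sum((o - mo) ** 2 for o in observed)
    shape = cov / math.sqrt(var_s * var_o)          # 1.0 = identical shape
    bias = sum(s - o for s, o in zip(simulated, observed)) / n
    rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)
    return shape, bias, rmse

# A simulated curve with the right shape but a constant -0.5 offset.
sim = [10.0, 12.0, 15.0, 19.0]
obs = [10.5, 12.5, 15.5, 19.5]
shape, bias, rmse = match_quality(sim, obs)
print(shape, bias, rmse)  # → 1.0 -0.5 0.5
```

    Separating the three indices matters: a run can have perfect shape conformity yet a systematic bias, which is exactly the diagnostic distinction the paper's method is meant to expose.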

  1. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of the efficiency of teaching within one group of students and comparison of teaching efficiency across two or more groups. Heterogeneity, stability and total variability indices, both for a single group and for comparing different groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits ranking of teaching review results which...

  2. Quantitative evaluation methods of skin condition based on texture feature parameters

    Directory of Open Access Journals (Sweden)

    Hui Pang

    2017-03-01

    Full Text Available In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that evaluating the skin condition with this method is both in line with biochemical skin evaluation indicators and fully consistent with human visual experience. The method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.

  3. Quantitative evaluation methods of skin condition based on texture feature parameters.

    Science.gov (United States)

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that evaluating the skin condition with this method is both in line with biochemical skin evaluation indicators and fully consistent with human visual experience. The method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.
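
    The GLCM computation and the four texture parameters named in the abstract can be sketched in a few lines; the 4 × 4 patch, the number of gray levels, and the single offset are simplifications of this sketch (the paper averages the features over offsets at 45° intervals).

```python
import numpy as np

def glcm_features(img, levels=4, offset=(0, 1)):
    """Gray level co-occurrence matrix at one pixel offset, plus the four
    parameters named in the abstract: angular second moment, contrast,
    entropy, and correlation."""
    dr, dc = offset
    glcm = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            glcm[img[r, c], img[r + dr, c + dc]] += 1
    p = glcm / glcm.sum()                       # normalize to joint probabilities
    i, j = np.indices(p.shape)
    asm = (p ** 2).sum()                        # angular second moment
    contrast = (p * (i - j) ** 2).sum()
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    correlation = ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j)
    return asm, contrast, entropy, correlation

# A tiny 4-level "skin patch"; rougher texture raises contrast and entropy.
patch = np.array([[0, 0, 1, 1],
                  [0, 2, 2, 1],
                  [3, 3, 2, 0],
                  [3, 1, 0, 0]])
asm, contrast, entropy, correlation = glcm_features(patch)
print(f"ASM={asm:.3f} contrast={contrast:.3f} entropy={entropy:.3f} corr={correlation:.3f}")
```

    In practice a library implementation such as scikit-image's `graycomatrix`/`graycoprops` would be used; the point here is only to make the four parameters concrete.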

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
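
    Combining independent relative uncertainty components mathematically is conventionally done in quadrature (root sum of squares); the component values below are invented for illustration, not the paper's.

```python
import math

def combined_relative_uncertainty(components):
    """Combine independent relative uncertainty components in quadrature,
    the usual root-sum-of-squares rule for uncorrelated sources."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical component uncertainties (as fractions): microorganism type,
# product matrix, and reading/interpreting error.
u = combined_relative_uncertainty([0.20, 0.15, 0.10])
print(round(u, 3))  # → 0.269
```

    Note how the largest component dominates: quadrature addition means a 20% source contributes far more to the ~27% total than the 10% source does.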

  6. A quantitative method to measure and evaluate the peelability of shrimps (Pandalus borealis)

    DEFF Research Database (Denmark)

    Gringer, Nina; Dang, Tem Thi; Orlien, Vibeke

    2018-01-01

    A novel, standardized method has been developed in order to provide a quantitative description of shrimp peelability. The peeling process was based on measuring the strength of the shell-muscle attachment of the shrimp using a texture analyzer, and calculating it into the peeling work. The self-consistent method, insensitive to shrimp size, was proven valid for assessment of ice maturation of shrimps. The quantitative peeling efficiency (peeling work) and performance (degree of shell removal) showed that the decrease in peeling work correlated with the amount of satisfactorily peeled shrimps, indicating an effective weakening of the shell-muscle attachment. The developed method provides the industry with a quantitative analysis for measurement of the peeling efficiency and peeling performance of shrimps. It may be used for comparing different maturation conditions in relation to optimization of shrimp peeling.
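
    Peeling work of this kind is the area under the texture analyzer's force-displacement curve; a sketch with an invented pull profile (the instrument settings and units are assumptions, not taken from the paper):

```python
def peeling_work(displacement_mm, force_n):
    """Work of peeling as the area under the force-displacement curve
    recorded by a texture analyzer, via the trapezoidal rule (N*mm)."""
    work = 0.0
    for k in range(1, len(displacement_mm)):
        dx = displacement_mm[k] - displacement_mm[k - 1]
        work += 0.5 * (force_n[k] + force_n[k - 1]) * dx
    return work

# Hypothetical pull: force ramps up, the shell-muscle attachment yields,
# and the force drops back to zero.
d = [0.0, 1.0, 2.0, 3.0, 4.0]
f = [0.0, 2.0, 3.0, 1.0, 0.0]
print(peeling_work(d, f))  # → 6.0
```

    A lower work value after maturation would indicate the weakened shell-muscle attachment that the abstract associates with easier peeling.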

  7. Quantitative method for determination of body inorganic iodine

    International Nuclear Information System (INIS)

    Filatov, A.A.; Tatsievskij, V.A.

    1991-01-01

    An original method for quantitation of body inorganic iodine, based upon simultaneous administration of a known dose of stable and radioactive iodine with subsequent radiometry of the thyroid, is proposed. The calculation is based upon the principle of dilution of radioactive iodine in the human inorganic iodine space. The method permits quantitation of the amount of inorganic iodine with regard to individual features of the inorganic iodine space. The method is simple and is not invasive for the patient.

  8. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    Science.gov (United States)

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMMs can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
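
    The latent-to-observed-scale integral described above can be evaluated with Gauss-Hermite quadrature; this is an independent sketch of the idea with a logit link and invented latent parameters (in practice the authors' QGglmm R package performs these calculations).

```python
import numpy as np

def mean_on_observed_scale(mu, sigma, inv_link, n_points=30):
    """Population mean on the observed (data) scale for a GLMM with latent
    mean mu and latent standard deviation sigma: E[g^-1(L)], L ~ N(mu, sigma^2),
    computed by Gauss-Hermite quadrature (the kind of latent-scale integral
    the paper describes)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    latent = mu + np.sqrt(2.0) * sigma * nodes
    return (weights * inv_link(latent)).sum() / np.sqrt(np.pi)

logistic = lambda x: 1.0 / (1.0 + np.exp(-x))

# With zero latent variance, the observed mean is just the inverse link at mu.
print(round(mean_on_observed_scale(0.0, 0.0, logistic), 3))  # → 0.5
```

    The quadrature also makes Jensen's inequality visible: for a binomial trait with positive latent mean, latent variance pulls the observed-scale mean below the naive `logistic(mu)`, which is why simply back-transforming latent estimates misstates observed-scale parameters.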

  9. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  10. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  11. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which reflects the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can help engineers identify the critical nodes and links in the network. Numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the proposed method portrays the real operating situation of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of demands and network structure in the transportation network.
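
    The record above redefines efficiency to include demands, travel choice, and travel costs; as a point of reference, the classical structural measure it generalizes can be sketched in a few lines. This is the standard global-efficiency formula E = (1/(n(n-1))) Σ 1/d_ij over shortest-path costs, on a hypothetical 4-node network, and deliberately omits the demand weighting the paper adds.

```python
# Classical global efficiency on a hypothetical 4-node network.
# The paper's measure additionally weights demands and travel choice,
# which this structural sketch omits.
INF = float("inf")
cost = [  # symmetric travel costs; INF = no direct link
    [0, 2, INF, 1],
    [2, 0, 3, INF],
    [INF, 3, 0, 1],
    [1, INF, 1, 0],
]
n = len(cost)
d = [row[:] for row in cost]
for k in range(n):            # Floyd-Warshall all-pairs shortest paths
    for i in range(n):
        for j in range(n):
            if d[i][k] + d[k][j] < d[i][j]:
                d[i][j] = d[i][k] + d[k][j]

efficiency = sum(1.0 / d[i][j] for i in range(n) for j in range(n) if i != j) / (n * (n - 1))
print(round(efficiency, 4))
```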

  12. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging

  13. A novel method for quantitative geosteering using azimuthal gamma-ray logging

    International Nuclear Information System (INIS)

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-01-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time upper and lower gamma-ray logs recorded as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the upper and lower gamma-ray logs as the tool approaches a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is calculated. - Highlights: • A new method is proposed for geosteering using azimuthal gamma-ray logging. • The new method can quantitatively determine the distance from the drill bit to the boundary surface, whereas the traditional geosteering method can only qualitatively guide the drill bit within reservoirs. • The response points of the real-time upper and lower gamma-ray logs as the tool approaches a highly radioactive formation are used to predict the relative dip angle, and then the distance from the drill bit to the boundary surface is calculated

  14. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  15. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline position. Visual grading was done by inspecting the shape of the diagrams and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. The quantitative parameters (SDx, SDv, A1, and MUD-MDD) differed among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff value was 0.11 for SDx (AUC: 0.982). These parameters provide quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
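
    The Poincaré-style dispersion measures mentioned above can be sketched directly: sample an amplitude trace, form the (x, v) phase-space cloud with v = dx/dt, and take the standard deviations of each coordinate. The 0.2 Hz sinusoidal trace and sampling rate below are hypothetical stand-ins for a real respiratory signal; a perfectly regular trace yields small, stable SDs.

```python
import math
import statistics

# Hypothetical 0.2 Hz sinusoidal breathing trace sampled at 10 Hz for 30 s;
# a real trace would come from the respiratory gating device.
fs = 10.0
x = [math.sin(2 * math.pi * 0.2 * i / fs) for i in range(300)]   # amplitude
v = [(x[i + 1] - x[i]) * fs for i in range(len(x) - 1)]          # velocity dx/dt

# Dispersion of the phase-space (x, v) cloud: small SDs -> regular breathing
sd_x = statistics.pstdev(x)
sd_v = statistics.pstdev(v)
print(round(sd_x, 3), round(sd_v, 3))
```

    An irregular trace (varying amplitude or baseline drift) would inflate these SDs, which is the basis for the cutoff-based classification described in the record.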

  16. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    Science.gov (United States)

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  17. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  18. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a reliability-based quantitative analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  19. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise, which ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  20. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  1. A Quantitative Method for Localizing User Interface Problems: The D-TEO Method

    Directory of Open Access Journals (Sweden)

    Juha Lamminen

    2009-01-01

    A large array of evaluation methods has been proposed to identify website usability problems. In log-based evaluation, information about the performance of users is collected and stored in log files, and used to find problems and deficiencies in web page designs. Most methods require the programming and modeling of large task models, which are cumbersome processes for evaluators. Also, because much statistical data is collected in log files, recognizing which web pages require deeper usability analysis is difficult. This paper suggests a novel quantitative method, called D-TEO, for locating problematic web pages. This semiautomated method explores the decomposition of interaction tasks of directed information search into elementary operations, deploying two quantitative usability criteria, search success and search time, to reveal how a user navigates within a web of hypertext.

  2. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning in the quantitative analysis of multicomponents by a single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established between the reference, chlorogenic acid, and 11 other active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were located by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of requiring many reference substances for quality control. The results showed that, within the linear ranges, no significant differences were found between the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and the external standard method (ESM) or standard curve method (SCM). This method is simpler and quicker than literature methods; the results were accurate and reliable, with good reproducibility. Positioning chromatographic peaks by the linear regression method was also more accurate than using the relative retention times reported in the literature. Slope correction and quantitative factor correction are feasible and accurate for controlling the quality of traditional Chinese medicine.
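
    The single-marker idea above can be sketched numerically. In the slope-based variant, a relative correction factor f is computed once from the calibration slopes (peak area per unit concentration) of the marker and of each other component; routine analysis then needs only the marker standard. All numbers below are hypothetical, and the convention (f = k_marker / k_component) is one common formulation, not necessarily the paper's exact one.

```python
# Slope-based QAMS sketch with hypothetical calibration slopes.
k_marker = 520.0   # area units per (mg/L) for the marker, e.g. chlorogenic acid
k_comp = 260.0     # slope for another component (assumed)

# Relative correction factor, determined once during method development
f = k_marker / k_comp

# Routine analysis: only the marker standard is run; the component's
# concentration is derived from its peak area via the stored factor.
area_comp = 1040.0                       # measured peak area (hypothetical)
conc_comp = area_comp * f / k_marker     # equivalent to area_comp / k_comp
print(f, conc_comp)  # 2.0, 4.0 mg/L
```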

  3. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    Science.gov (United States)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  4. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative ... 2. Basic principles. The mineralogical constitution of soil is rather complex. ... K2O, MgO, and TFe as variables for the calculation.

  5. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    NARCIS (Netherlands)

    Bosschaart, Nienke; van Leeuwen, Ton; Aalders, Maurice C.G.; Faber, Dirk

    2014-01-01

    We reply to the comment by Kraszewski et al. on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of

  6. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
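
    The measurement principle described above reduces to E ≈ ΔV / d for each antenna pair. A minimal sketch with hypothetical voltage readings and antenna spacings:

```python
# Each antenna pair gives a local field estimate E = dV / d.
# Voltages (V) and spacings (m) below are hypothetical readings.
pairs = [
    {"dV": 0.12, "d": 0.05},
    {"dV": 0.25, "d": 0.10},
    {"dV": 0.50, "d": 0.20},
]
fields = [p["dV"] / p["d"] for p in pairs]       # field strength, V/m
print([round(e, 3) for e in fields])             # -> [2.4, 2.5, 2.5]
```

    Arraying many such pairs across a region yields the quantitative spatial description of the field that the patent claims.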

  7. A direct method for estimating the alpha/beta ratio from quantitative dose-response data

    International Nuclear Information System (INIS)

    Stuschke, M.

    1989-01-01

    A one-step optimization method based on a least squares fit of the linear quadratic model to quantitative tissue response data after fractionated irradiation is proposed. Suitable end-points that can be analysed by this method are growth delay, host survival and quantitative biochemical or clinical laboratory data. The functional dependence between the transformed dose and the measured response is approximated by a polynomial. The method allows for the estimation of the alpha/beta ratio and its confidence limits from all observed responses of the different fractionation schedules. Censored data can be included in the analysis. A method to test the appropriateness of the fit is presented. A computer simulation illustrates the method and its accuracy as exemplified by the growth delay end-point. A comparison with a fit of the linear quadratic model to interpolated isoeffect doses shows the advantages of the direct method. (orig./HP) [de]
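
    The core of the direct approach is a least-squares fit of the linear-quadratic model y = α·d + β·d², after which α/β follows from the two coefficients. The sketch below solves the two-parameter normal equations on synthetic, noise-free data generated from assumed α = 0.2 and β = 0.02 (so the true ratio is 10); it illustrates the fit only, not the paper's confidence-limit or censoring machinery.

```python
# Least-squares fit of y = a*d + b*d**2 (no intercept) via normal equations.
# Doses and responses are synthetic, generated from assumed a=0.2, b=0.02.
doses = [1.0, 2.0, 4.0, 6.0, 8.0]
y = [0.2 * d + 0.02 * d * d for d in doses]

s11 = sum(d ** 2 for d in doses)
s12 = sum(d ** 3 for d in doses)
s22 = sum(d ** 4 for d in doses)
t1 = sum(d * yi for d, yi in zip(doses, y))
t2 = sum(d * d * yi for d, yi in zip(doses, y))
det = s11 * s22 - s12 * s12
a = (t1 * s22 - t2 * s12) / det
b = (t2 * s11 - t1 * s12) / det
print(round(a / b, 3))  # estimated alpha/beta ratio
```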

  8. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    Science.gov (United States)

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
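
    In distance-based readout, quantitation amounts to inverting a calibration of signal length against analyte concentration. A minimal sketch, with hypothetical linear calibration points (length in mm vs. concentration in µM):

```python
# Ordinary least-squares line through hypothetical calibration points,
# then inversion of a measured length into a concentration.
cal_conc = [0.0, 5.0, 10.0, 20.0]    # standards, uM (assumed)
cal_len = [0.0, 10.0, 20.0, 40.0]    # measured lengths, mm (assumed)

n = len(cal_conc)
mean_c = sum(cal_conc) / n
mean_l = sum(cal_len) / n
slope = sum((c - mean_c) * (l - mean_l) for c, l in zip(cal_conc, cal_len)) \
        / sum((c - mean_c) ** 2 for c in cal_conc)
intercept = mean_l - slope * mean_c

measured_mm = 14.0  # length read by eye on the device (hypothetical)
concentration = (measured_mm - intercept) / slope
print(concentration)  # uM
```

    Real devices may have nonlinear length-concentration responses, in which case the calibration model changes but the read-then-invert workflow stays the same.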

  9. [A new method of processing quantitative PCR data].

    Science.gov (United States)

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Today's standard PCR cannot satisfy the needs of biotechnique development and clinical research. After extensive kinetic studies, PE found that there is a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable. They therefore developed a quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnique development and clinical research; a better quantitative PCR technique is needed. The mathematical model submitted here draws on the achievements of the related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. This model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and can accurately reflect the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be made using this functional relation: the accumulated PCR product quantity can be obtained from the initial template number. Using this model for quantitative PCR analysis, the result error is related only to the accuracy of the fluorescence intensity measurement, i.e. the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the quantitative result accuracy will be more than 99%. The result error differs distinctly between analysis methods under the same conditions on the same instrument. Moreover, if this PCR quantitative analysis system is used to process the data, the result is about 80 times more accurate than the CT method.
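
    For context, the standard exponential model that cycle-threshold (CT) quantitation rests on can be sketched in a few lines: with amplification efficiency E, product grows as N = N0·(1 + E)^c, so a threshold crossed at cycle Ct implies N0 = N_threshold / (1 + E)^Ct. This illustrates the conventional model the record criticizes, not the paper's improved one; the threshold and Ct values are hypothetical.

```python
# Conventional exponential qPCR model (the "CT method" baseline).
# Efficiency, threshold, and Ct below are hypothetical example values.
E = 1.0              # perfect doubling per cycle (assumed)
N_threshold = 1.0e9  # molecules at the detection threshold (hypothetical)
Ct = 20.0            # cycle at which the threshold is crossed

N0 = N_threshold / (1 + E) ** Ct  # inferred initial template number
print(round(N0))
```

    Small errors in the assumed efficiency E compound over Ct cycles, which is one reason CT-based quantitation carries the large errors the record describes.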

  10. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...

  11. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    Science.gov (United States)

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This

  12. Quantitative methods in electroencephalography to access therapeutic response.

    Science.gov (United States)

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or Quantitative Pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. Being a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting methods to access the therapeutic response of drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, pointing to them as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power in the theta frequency occurred mainly in the temporo-occipito-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions, by means of smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines. There was a relation between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion, despite its potent analgesic effect. The statement of mathematical and computational
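
    The first of the listed measures, Fourier-based band power, can be sketched with a plain DFT: compute the power spectrum of a signal and sum the bins falling in a clinical band (e.g. theta, 4-8 Hz). The signal below is a synthetic two-tone mixture, not real EEG, and the sampling parameters are assumed.

```python
import cmath
import math

# Synthetic "EEG": a dominant 6 Hz (theta) tone plus a weaker 10 Hz (alpha)
# tone, sampled at an assumed 64 Hz for 256 samples (exact integer cycles,
# so there is no spectral leakage).
fs, n = 64, 256
sig = [math.sin(2 * math.pi * 6 * i / fs) + 0.5 * math.sin(2 * math.pi * 10 * i / fs)
       for i in range(n)]

def dft_power(x):
    """Plain O(n^2) DFT power spectrum over the positive frequencies."""
    m = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / m) for t in range(m))) ** 2 / m
            for k in range(m // 2)]

p = dft_power(sig)
freqs = [k * fs / n for k in range(n // 2)]
theta = sum(pw for f, pw in zip(freqs, p) if 4 <= f < 8)
alpha = sum(pw for f, pw in zip(freqs, p) if 8 <= f < 13)
print(theta > alpha)  # the 6 Hz component dominates
```

    Drug studies like those cited above compare such band powers before and after administration; in practice an FFT would replace the O(n²) loop.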

  13. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody

    Energy Technology Data Exchange (ETDEWEB)

    Yoshinari, Tomoya [Division of Microbiology, National Institute of Health Sciences, 1-18-1, Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena [Biomedical Division, Ushio Inc., 1-12 Minamiwatarida-cho, Kawasaki-ku, Kawasaki 210-0855 (Japan); Ohkawa, Hideo [Research Center for Environmental Genomics, Kobe University, 1-1 Rokkodai, Nada, Kobe 657-8501 (Japan); Sugita-Konishi, Yoshiko, E-mail: y-konishi@azabu-u.ac.jp [Department of Food and Life Science, Azabu University, 1-17-71 Fuchinobe, Chuo-ku, Sagamihara, Kanagawa 252-5201 (Japan)

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L⁻¹. The coefficients of variation were 7.9% at 0.003 mg L⁻¹, 5.0% at 0.03 mg L⁻¹ and 13.7% at 0.3 mg L⁻¹, respectively. The limit of detection was 0.006 mg L⁻¹ for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9–100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg⁻¹. The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R² = 0.9760) than the immunochromatographic assay kit (R² = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. - Highlights: • A rapid method for quantitation of DON using Q-body has been developed. • A recovery test using the anti-DON Q-body was performed. • The concentrations of DON in wheat
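
    The spike-and-recovery validation mentioned above is simple arithmetic: recovery (%) = measured / spiked × 100, checked against an acceptance window. The spiked and measured DON levels below are hypothetical values chosen to land near the 94.9-100.2% range the record reports, not the study's data.

```python
# Spike-and-recovery sketch with hypothetical DON levels (mg/kg).
spiked = [0.2, 0.5, 1.0, 2.0]
measured = [0.19, 0.48, 1.0, 1.99]

recovery = [100.0 * m / s for m, s in zip(measured, spiked)]
print([round(r, 1) for r in recovery])  # recoveries in percent
```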

  14. Validation of quantitative 1H NMR method for the analysis of pharmaceutical formulations

    International Nuclear Information System (INIS)

    Santos, Maiara da S.

    2013-01-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)

  15. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    Science.gov (United States)

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time upper and lower gamma-ray logs recorded as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the upper and lower gamma-ray logs as the tool approaches a highly radioactive formation can be used to predict the relative dip angle, and then the distance from the drill bit to the boundary surface is calculated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  17. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
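Ranking by the noise-to-slope ratio reduces to dividing each method's estimated noise standard deviation by its estimated slope and sorting ascending. The per-method numbers and reconstruction names below are hypothetical stand-ins for the parameters the NGS technique would estimate from patient data:

```python
# Sketch: ranking imaging methods by the noise-to-slope ratio (NSR)
# figure of merit. Lower NSR = better precision. All values hypothetical.

def rank_by_nsr(methods):
    """methods: {name: (slope, noise_sd)} -> (ranking, per-method NSR)."""
    nsr = {name: sd / slope for name, (slope, sd) in methods.items()}
    return sorted(nsr, key=nsr.get), nsr

ranking, nsr = rank_by_nsr({
    "recon_A": (0.95, 0.12),
    "recon_B": (1.02, 0.08),
    "recon_C": (0.88, 0.20),
})
print(ranking)  # most precise method first
```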

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  19. Quantitative Preparation in Doctoral Education Programs: A Mixed-Methods Study of Doctoral Student Perspectives on their Quantitative Training

    Directory of Open Access Journals (Sweden)

    Sarah L Ferguson

    2017-07-01

    Aim/Purpose: The purpose of the current study is to explore student perceptions of their own doctoral-level education and quantitative proficiency. Background: The challenges of preparing doctoral students in education have been discussed in the literature, but largely from the perspective of university faculty and program administrators. The current study directly explores the student voice on this issue. Methodology: Utilizing a sequential explanatory mixed-methods research design, the present study seeks to better understand doctoral-level education students’ perceptions of their quantitative methods training at a large public university in the southwestern United States. Findings: Results from both phases present the need for more application and consistency in doctoral-level quantitative courses. Additionally, there was a consistent theme of internal motivation in the responses, suggesting students perceive their quantitative training to be valuable beyond their personal interest in the topic. Recommendations for Practitioners: Quantitative methods instructors should emphasize practice in their quantitative courses and consider providing additional support for students through the inclusion of lab sections, tutoring, and/or differentiation. Pre-testing statistical ability at the start of a course is also suggested to better meet student needs. Impact on Society: The ultimate goal of quantitative methods in doctoral education is to produce high-quality educational researchers who are prepared to apply their knowledge to problems and research in education. Results of the present study can inform faculty and administrator decisions in doctoral education to best support this goal. Future Research: Using the student perspectives presented in the present study, future researchers should continue to explore effective instructional strategies and curriculum design within education doctoral programs. The inclusion of student voice can strengthen

  20. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As the two methods were found to be equivalent, either can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  1. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I and creatine kinase levels were higher in patients with AM than in those without. The AUCs of the quantitative analyses (ER 0.89 and global RE 0.80) were similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  2. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  3. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
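The group comparison above uses a 2x2 chi-square test on proportions of components present versus absent. A minimal sketch of the statistic; the cell counts are illustrative, not reconstructed from the study:

```python
# Sketch: Pearson chi-square statistic (1 df, no continuity correction)
# for a 2x2 table [[a, b], [c, d]] = [present, absent] x [group 1, group 2].
# Counts below are illustrative, not the study's data.

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

chi2 = chi_square_2x2(45, 160, 330, 371)  # hypothetical component counts
print(round(chi2, 2))
```

In practice one would compare the statistic against the chi-square critical value with 1 degree of freedom (3.84 at the .05 level) or use a library routine such as `scipy.stats.chi2_contingency`.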

  4. Calibration of quantitative neutron radiography method for moisture measurement

    International Nuclear Information System (INIS)

    Nemec, T.; Jeraj, R.

    1999-01-01

    Quantitative measurements of moisture and hydrogenous matter in building materials by neutron radiography (NR) are regularly performed at the TRIGA Mark II research reactor of the 'Jozef Stefan' Institute in Ljubljana. Calibration of the quantitative method is performed using standard brick samples with known moisture content and also with a secondary standard, a plexiglas step wedge. In general, the contribution of scattered neutrons to the neutron image is not determined explicitly, which introduces an error into the measured signal. The influence of scattered neutrons is significant in regions with high gradients of moisture concentration, where the build-up of scattered neutrons distorts the moisture concentration profile. In this paper a detailed analysis of the validity of our calibration method for different geometrical parameters is presented. The error in the measured hydrogen concentration is evaluated by experiment and compared with results obtained by Monte Carlo calculation with the computer code MCNP 4B. Optimal conditions are determined for quantitative moisture measurements in order to minimize the error due to scattered neutrons. The method is tested on concrete samples with high moisture content. (author)

  5. The discussion on the qualitative and quantitative evaluation methods for safety culture

    International Nuclear Information System (INIS)

    Gao Kefu

    2005-01-01

    The fundamental methods for safety culture evaluation are described. Drawing on the practice of quantitative safety culture evaluation at the Daya Bay NPP, quantitative evaluation methods for safety culture are discussed. (author)

  6. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  7. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  8. Quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    Energy Technology Data Exchange (ETDEWEB)

    Miyanaga, Hajime [Kyoto Prefectural Univ. of Medicine (Japan)

    1982-08-01

    A quantitative analysis of the thallium-201 myocardial perfusion image (MPI) was attempted by using the circumferential profile method (CPM), and the first purpose of this study is to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). The defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the change in perfusion defect in Tl-MPI produced by exercise. The results therefore indicate that CPM is a good method for analyzing Tl-MPI not only quantitatively but also objectively. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic changes in ECG are still under discussion as criteria. So the second purpose of this study is to evaluate these ischemic ECG changes against exercise Tl-MPI analyzed quantitatively. ST depression (ischemic, 1 mm, and junctional, 2 mm or more), ST elevation (1 mm or more), and coronary T wave reversion in exercise ECG were thought to be ischemic changes.
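The sensitivity and specificity figures quoted above (e.g. 9/12 and 9/9 for CPM) reduce to the standard definitions, sketched below with the abstract's counts:

```python
# Sketch: sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
# Counts taken from the abstract: CPM detected 9 of 12 ischemic patients
# and correctly classified all 9 non-ischemic patients.

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

print(round(sensitivity(9, 3), 2),   # 9 of 12 positives found
      round(specificity(9, 0), 2))   # 9 of 9 negatives correct
```

The same arithmetic gives the ECG figures (75% sensitivity, 7/9 = 78% specificity), which is the basis of the comparison in the abstract.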

  9. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    Science.gov (United States)

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its similar chromatographic properties to the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (within-run relative standard deviations [RSDs] clobenzorex.

  10. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  11. A comparison of ancestral state reconstruction methods for quantitative characters.

    Science.gov (United States)

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-07

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. Copyright © 2016 Elsevier Ltd. All rights reserved.
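Under a Brownian-motion model, the maximum-likelihood state at the ancestor of two leaves is the inverse-branch-length weighted mean of the leaf states, which is also the core step of Felsenstein's independent contrasts. A minimal sketch on a two-leaf "cherry"; the tree and trait values are illustrative:

```python
# Sketch: ML ancestral state for a cherry under Brownian motion.
# Variance of each leaf state about the ancestor is proportional to its
# branch length t, so leaves are weighted by 1/t. Values illustrative.

def cherry_ancestor(x1, t1, x2, t2):
    """ML ancestral state for leaves x1, x2 on branches of length t1, t2."""
    w1, w2 = 1.0 / t1, 1.0 / t2
    return (w1 * x1 + w2 * x2) / (w1 + w2)

# The leaf on the shorter branch (less accumulated variance) pulls the
# estimate toward itself:
print(cherry_ancestor(x1=4.0, t1=1.0, x2=8.0, t2=3.0))
```

Applying this weighted averaging recursively from the tips to the root is what the maximum likelihood, REML, and generalized-least-squares methods compared in the abstract effectively share, which is why they infer the same ancestral states.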

  12. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    International Nuclear Information System (INIS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-01-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM. (paper)
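The physical idea behind the LCS identification can be caricatured with a single-degree-of-freedom stand-in for the paper's equivalent-circuit model: the contact spring adds in parallel with the cantilever stiffness, so the contact resonance rises with the sample's local contact stiffness, and a void or delamination (lower LCS) shows up as a lower resonance. All parameter values below are illustrative assumptions, not the paper's:

```python
import math

# Sketch: single-degree-of-freedom caricature of the contact vibration
# system -- resonance of an effective mass on cantilever + contact
# springs in parallel. Illustrative values only.

def contact_resonance_hz(k_cantilever, k_contact, m_eff):
    """f = (1/2pi) * sqrt((k_cantilever + k_contact) / m_eff)."""
    return math.sqrt((k_cantilever + k_contact) / m_eff) / (2 * math.pi)

f_free  = contact_resonance_hz(200.0, 0.0,   1e-6)  # free vibration
f_void  = contact_resonance_hz(200.0, 150.0, 1e-6)  # over a defect: softer
f_sound = contact_resonance_hz(200.0, 400.0, 1e-6)  # sound material: stiffer
print(f_free < f_void < f_sound)
```

Scanning the tool over the surface and mapping the inferred stiffness point by point is what produces the LCS image described in the abstract.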

  13. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    Science.gov (United States)

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were separately estimated from the respiratory gated only and cardiac gated only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation; the effects of CM on RM estimation were neglected. Method 4 iteratively modeled the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVF using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two more noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case while Method 1 performed the worst for noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved an RMSE up to 35% lower than Method 1 for noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging.
Comparison of the performance of four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
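The RMSE criterion used above scores an estimated motion vector field against the simulated ground truth. A minimal sketch over a three-voxel toy field; the vectors are illustrative:

```python
import math

# Sketch: RMSE between an estimated motion vector field (MVF) and the
# ground-truth field, computed over per-voxel 3D displacement vectors.
# The 3-voxel toy fields below are illustrative.

def mvf_rmse(estimated, truth):
    """Root mean square error over per-voxel 3D motion vectors."""
    sq = [sum((e - t) ** 2 for e, t in zip(ve, vt))
          for ve, vt in zip(estimated, truth)]
    return math.sqrt(sum(sq) / len(sq))

truth = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 1.0)]
est   = [(1.1, 0.0, 0.0), (0.0, 1.8, 0.0), (0.0, 0.0, 1.2)]
print(round(mvf_rmse(est, truth), 3))
```

Averaging this score over noise realizations is how the four estimation methods would be ranked in the simulation study.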

  14. The method of quantitative X-ray microanalysis of fine inclusions in copper

    International Nuclear Information System (INIS)

    Morawiec, H.; Kubica, L.; Piszczek, J.

    1978-01-01

    The method of correction for the matrix effect in quantitative x-ray microanalysis was presented. The application of the method was discussed on the example of quantitative analysis of fine inclusions of Cu 2 S and Cu 2 O in copper. (author)

  15. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples’ experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the

  16. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and

  17. A Method for Quantitative Determination of Biofilm Viability

    Directory of Open Access Journals (Sweden)

    Maria Strømme

    2012-06-01

    In this study we present a scheme for quantitative determination of biofilm viability offering significant improvement over existing methods with metabolic assays. Existing metabolic assays for quantifying viable bacteria in biofilms usually utilize calibration curves derived from planktonic bacteria, which can introduce large errors due to significant differences in the metabolic and/or growth rates of biofilm bacteria in the assay media compared to their planktonic counterparts. In the presented method we derive the specific growth rate of Streptococcus mutans bacteria biofilm from a series of metabolic assays using the pH indicator phenol red, and show that this information could be used to more accurately quantify the relative number of viable bacteria in a biofilm. We found that the specific growth rate of S. mutans in biofilm mode of growth was 0.70 h−1, compared to 1.09 h−1 in planktonic growth. This method should be applicable to other bacteria types, as well as other metabolic assays, and, for example, to quantify the effect of antibacterial treatments or the performance of bactericidal implant surfaces.
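The specific growth rate at the heart of the calibration follows from exponential growth, N(t) = N0·exp(μt). A minimal sketch; the viable counts below are illustrative, chosen to reproduce a μ near the biofilm value reported in the abstract:

```python
import math

# Sketch: specific growth rate mu from two viable counts taken during
# exponential growth, N(t) = N0 * exp(mu * t). Counts are illustrative.

def specific_growth_rate(n0, n1, dt_hours):
    """mu in 1/h from counts n0 -> n1 over dt_hours."""
    return math.log(n1 / n0) / dt_hours

mu = specific_growth_rate(1.0e6, 4.055e6, 2.0)  # counts 2 h apart
print(round(mu, 2))
```

Using a μ measured for biofilm cells (0.70 h⁻¹) rather than the planktonic value (1.09 h⁻¹) when converting assay signal to cell number is exactly the correction the presented method introduces.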

  18. Industrial ecology: Quantitative methods for exploring a lower carbon future

    Science.gov (United States)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
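The levelized cost of energy mentioned above divides the discounted stream of costs by the discounted stream of generation. A minimal sketch, assuming a constant discount rate and treating the first list entry as year 0:

```python
def npv(cash_flows, r):
    """Net present value of a yearly series, first element at t = 0."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

def lcoe(costs, energy_mwh, r):
    """Levelized cost of energy: discounted costs / discounted generation."""
    return npv(costs, r) / npv(energy_mwh, r)

# Hypothetical plant: capital cost in year 0, then flat O&M and output.
costs = [1000.0] + [50.0] * 5      # currency units per year
energy = [0.0] + [100.0] * 5       # MWh per year
cost_per_mwh = lcoe(costs, energy, 0.05)
```

The same `npv` helper is the basis of the engineering-economics comparisons the survey describes.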

  19. A quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    International Nuclear Information System (INIS)

    Miyanaga, Hajime

    1982-01-01

    A quantitative analysis of thallium-201 myocardial perfusion images (MPI) was attempted using the circumferential profile method (CPM); the first purpose of this study was to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). Defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the change in perfusion defect in Tl-MPI produced by exercise. The results therefore indicate that CPM analyzes Tl-MPI not only quantitatively but also objectively. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic ECG changes are still under discussion as criteria, so the second purpose of this study was to evaluate these ischemic ECG changes against exercise Tl-MPI analyzed quantitatively. ST depression (ischemic, 1 mm, and junctional, 2 mm or more), ST elevation (1 mm or more), and coronary T-wave reversion in exercise ECG were thought to be ischemic changes. (J.P.N.)

  20. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    Science.gov (United States)

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified

  2. Cross-method validation as a solution to the problem of excessive simplification of measurement in quantitative IR research

    DEFF Research Database (Denmark)

    Beach, Derek

    2007-01-01

    The purpose of this article is to make IR scholars more aware of the costs of choosing quantitative methods. The article first shows that quantification can have analytical ‘costs’ when the measures created are too simple to capture the essence of the systematized concept that was supposed...... detail based upon a review of the democratic peace literature. I then offer two positive suggestions for a way forward. First, I argue that quantitative scholars should spend more time validating their measures, and in particular should engage in multi-method partnerships with qualitative scholars...... that have a deep understanding of particular cases in order to exploit the comparative advantages of qualitative methodology, using the more accurate qualitative measures to validate their own quantitative measures. Secondly, quantitative scholars should lower their level of ambition given the often poor...

  3. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the issues of integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors defined goals and subgoals and justified the evaluation criteria which lead to the increased company value if achieved.

  4. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
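One of the metrology areas named above, repeatability, is commonly summarised by the repeatability coefficient RC = 2.77 × wSD, where wSD is the within-subject standard deviation from test-retest pairs and 2.77 ≈ 1.96·√2. A generic sketch under that convention (not this document's specific metric definitions):

```python
import math

def within_subject_sd(pairs):
    """Within-subject SD from test-retest pairs: sqrt(mean(d^2) / 2)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs)))

def repeatability_coefficient(pairs):
    """RC = 2.77 * wSD: the difference two repeat measurements of the same
    subject are expected to stay within 95% of the time."""
    return 2.77 * within_subject_sd(pairs)

# Hypothetical test-retest biomarker readings for four subjects.
readings = [(10.1, 10.4), (8.7, 8.5), (12.0, 11.6), (9.9, 10.0)]
rc = repeatability_coefficient(readings)
```

A measured change smaller than `rc` cannot be distinguished from measurement noise, which is why the framework treats repeatability as a prerequisite for response assessment.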

  5. New 'ex vivo' radioisotopic method of quantitation of platelet deposition

    International Nuclear Information System (INIS)

    Badimon, L.; Mayo Clinic, Rochester, MN; Thrombosis and Atherosclerosis Unit, Barcelona; Mayo Clinic, Rochester, MN; Fuster, V.; Chesebro, J.H.; Dewanjee, M.K.

    1983-01-01

    We have developed a sensitive and quantitative method for 'ex vivo' evaluation of platelet deposition on collagen strips, from rabbit Achilles tendon, superfused by flowing blood, and have applied it to four animal species: cat, rabbit, dog and pig. Autologous platelets were labeled with indium-111-tropolone and injected into the animal 24 hr before the superfusion, and the number of deposited platelets was quantitated from the tendon gamma-radiation and the blood platelet count. We detected some platelet consumption with superfusion time when blood was reinfused into the contralateral jugular vein after collagen contact, but not if blood was discarded after the contact. Therefore, in order to have a more physiological animal model, we decided to discard blood after superfusion of the tendon. In all species except the cat there was a linear relationship between the increase in platelets on the tendon and the time of exposure to blood superfusion. The highest number of platelets deposited on the collagen was found in cats, the lowest in dogs. Ultrastructural analysis showed that the platelets were deposited as aggregates after only 5 min of superfusion. (orig.)

  6. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
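A hierarchical weighted average can be sketched as a recursive weighted mean over a criteria tree. The tree layout, criterion names, and weights below are hypothetical illustrations, since the report's exact scheme is not given in this abstract:

```python
def hierarchical_score(tree, scores):
    """Recursive weighted average.

    tree: {criterion: (weight, subtree_or_None)}; a None subtree marks a
    leaf whose raw score is read from `scores`. Weights need not sum to 1.
    """
    total_weight = sum(w for w, _ in tree.values())
    acc = 0.0
    for name, (weight, subtree) in tree.items():
        value = scores[name] if subtree is None else hierarchical_score(subtree, scores)
        acc += weight * value
    return acc / total_weight

# Hypothetical two-level criteria tree for comparing design alternatives.
criteria = {
    "performance": (2.0, {"throughput": (1.0, None), "latency": (1.0, None)}),
    "cost": (1.0, None),
}
alt_a = {"throughput": 8.0, "latency": 6.0, "cost": 4.0}
score_a = hierarchical_score(criteria, alt_a)
```

Each alternative gets one scalar score, so ranking them reduces to sorting; the weights document the rationale behind the decision.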

  7. A single qualitative study can show same findings as years of quantitative research: Obstructive sleep apnoea as an example

    Directory of Open Access Journals (Sweden)

    Howard Tandeter

    2016-06-01

    Full Text Available Background: Many years of quantitative research led to our present knowledge of the symptoms and associated features (S&AF) of the obstructive sleep apnoea (OSA) syndrome. Aims: 1. To show that a qualitative research approach can identify symptoms and associated features of OSA with less time and effort than a quantitative approach; 2. To describe the experience of patients with OSA and the effects of the syndrome on their quality of life and that of their spouses and families (issues that quantitative methods fail to recognize). Methods: We used a narrative inquiry methodology (qualitative research). The sample was selected using the "snowball" sampling technique and included 10 patients with moderate to severe OSA who had good adherence to CPAP and significant clinical improvement after treatment, and 3 of the patients' spouses. Results: The following issues were identified: a long pre-diagnosis phase of OSA (20 years in one of the patients); characteristic S&AF of the syndrome as experienced by patients and their spouses; and the need for increased awareness of this disorder among both the public and the medical establishment. Premature ejaculation (not reported previously) and nightmares (non-conclusive in the literature) were identified and improved with CPAP therapy. Conclusion: With quantitative research methods it took decades to discover findings that emerged from one simple qualitative study. We therefore urge scientists to use these qualitative methods more often when looking for the S&AF of diseases and syndromes.

  8. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    Science.gov (United States)

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus in comparison to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts-per-pixel for a specified number of pixels was processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. The inter- and intraobserver agreement was significantly increased when using semi-quantitative analysis (97.35% and 98.36%, respectively) or the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed higher inter- and intraobserver agreement than the other methods, which makes it a useful tool for the analysis of scintigraphic images.
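The intraclass correlation used above to quantify observer agreement can be computed, in its simplest one-way form ICC(1,1), from an n-subjects × k-raters table of scores. This is a generic sketch of that form, not necessarily the exact ICC model the study used (the abstract does not specify it):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).

    ratings: list of subjects, each a list of k ratings (one per rater).
    Returns (MSB - MSW) / (MSB + (k - 1) * MSW).
    """
    n = len(ratings)
    k = len(ratings[0])
    grand_mean = sum(sum(row) for row in ratings) / (n * k)
    # Between-subject mean square
    ms_between = k * sum((sum(row) / k - grand_mean) ** 2 for row in ratings) / (n - 1)
    # Within-subject mean square
    ms_within = sum((x - sum(row) / k) ** 2 for row in ratings for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical uptake-ratio grades from two observers for three sites.
icc = icc_oneway([[1.0, 1.2], [2.0, 2.1], [3.0, 2.8]])
```

Perfect agreement between raters drives the within-subject mean square to zero and the ICC to 1, which is the behaviour behind the high percentages reported for the semi-quantitative methods.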

  9. Can qualitative and quantitative methods serve complementary purposes for policy research?

    OpenAIRE

    Maxwell, Daniel G.

    1998-01-01

    Qualitative and quantitative methods in social science research have long been separate spheres with little overlap. However, recent innovations have highlighted the complementarity of qualitative and quantitative approaches. The Accra Food and Nutrition Security Study was designed to incorporate the participation of a variety of constituencies in the research, and to rely on a variety of approaches — both qualitative and quantitative — to data collection and analysis. This paper reviews the ...

  10. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

    Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary...... angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  11. Instrumentation and quantitative methods of evaluation. Progress report, January 15-September 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.

    1986-09-01

    This document reports progress under grant entitled ''Instrumentation and Quantitative Methods of Evaluation.'' Individual reports are presented on projects entitled the physical aspects of radionuclide imaging, image reconstruction and quantitative evaluation, PET-related instrumentation for improved quantitation, improvements in the FMI cyclotron for increased utilization, and methodology for quantitative evaluation of diagnostic performance

  12. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to extend the method through the introduction of quantitative analysis, which attempts to characterise the detected defect in detail and must be designed to handle a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The differences between the measurement systems can be attributed to these error factors. (Author)

  13. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions

  14. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out a qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of the phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  15. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    Science.gov (United States)

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
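The reported LOD and LOQ follow the usual ICH Q2 calibration-curve convention, LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope of the calibration line. A sketch of that convention (the example numbers are illustrative, not the paper's data):

```python
def linear_fit(xs, ys):
    """Least-squares slope and intercept of a calibration line."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def lod_loq(residual_sd, slope):
    """ICH Q2 detection and quantitation limits: 3.3*sigma/S and 10*sigma/S."""
    return 3.3 * residual_sd / slope, 10 * residual_sd / slope

# Hypothetical calibration: concentration (mg) vs absorbance response.
slope, intercept = linear_fit([1.0, 1.5, 2.0, 2.5, 3.0], [0.21, 0.31, 0.40, 0.50, 0.60])
```

Note how LOD/LOQ shrink as the calibration slope (sensitivity) grows, which is why a steep, linear response over 1.0-3.0 mg supports the low limits reported.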

  16. Embedding Quantitative Methods by Stealth in Political Science: Developing a Pedagogy for Psephology

    Science.gov (United States)

    Gunn, Andrew

    2017-01-01

    Student evaluations of quantitative methods courses in political science often reveal they are characterised by aversion, alienation and anxiety. As a solution to this problem, this paper describes a pedagogic research project with the aim of embedding quantitative methods by stealth into the first-year undergraduate curriculum. This paper…

  17. Effects of ROI definition and reconstruction method on quantitative outcome and applicability in a response monitoring trial

    International Nuclear Information System (INIS)

    Krak, Nanda C.; Boellaard, R.; Hoekstra, Otto S.; Hoekstra, Corneline J.; Twisk, Jos W.R.; Lammertsma, Adriaan A.

    2005-01-01

    Quantitative measurement of tracer uptake in a tumour can be influenced by a number of factors, including the method of defining regions of interest (ROIs) and the reconstruction parameters used. The main purpose of this study was to determine the effects of different ROI methods on quantitative outcome, using two reconstruction methods and the standard uptake value (SUV) as a simple quantitative measure of FDG uptake. Four commonly used methods of ROI definition (manual placement, fixed dimensions, threshold based and maximum pixel value) were used to calculate SUV (SUV[MAN], SUV[15 mm], SUV[50], SUV[75] and SUV[max], respectively) and to generate "metabolic" tumour volumes. Test-retest reproducibility of SUVs and of "metabolic" tumour volumes and the applicability of ROI methods during chemotherapy were assessed. In addition, SUVs calculated on ordered subsets expectation maximisation (OSEM) and filtered back-projection (FBP) images were compared. ROI definition had a direct effect on quantitative outcome. On average, SUV[MAN], SUV[15 mm], SUV[50] and SUV[75] were respectively 48%, 27%, 34% and 15% lower than SUV[max] when calculated on OSEM images. No statistically significant differences were found between SUVs calculated on OSEM and FBP reconstructed images. The highest reproducibility was found for SUV[15 mm] and SUV[MAN] (ICC 0.95 and 0.94, respectively) and for "metabolic" volumes measured with the manual and 50% threshold ROIs (ICC 0.99 for both). Manual, 75% threshold and maximum pixel ROIs could be used throughout therapy, regardless of changes in tumour uptake or geometry. SUVs showed the same trend in relative change in FDG uptake after chemotherapy, irrespective of the ROI method used. The method of ROI definition has a direct influence on quantitative outcome. In terms of simplicity, user-independence, reproducibility and general applicability, the threshold-based and fixed-dimension methods are the best ROI methods. Threshold methods are in
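The SUV used throughout this study is, in its common body-weight form, the tissue activity concentration divided by injected dose per unit body mass. A minimal sketch, assuming a tissue density of ~1 g/mL so that kBq/mL can be read as kBq/g (units and example values are illustrative):

```python
def suv_bw(tissue_kbq_per_ml, injected_mbq, weight_kg):
    """Body-weight-normalised standard uptake value.

    SUV = C_tissue [kBq/g] / (injected dose [kBq] / body weight [g]),
    assuming tissue density of about 1 g/mL.
    """
    injected_kbq = injected_mbq * 1000.0
    weight_g = weight_kg * 1000.0
    return tissue_kbq_per_ml / (injected_kbq / weight_g)

# Hypothetical example: ROI mean 5.0 kBq/mL, 370 MBq injected, 70 kg patient.
suv = suv_bw(5.0, 370.0, 70.0)
```

An SUV of 1 means uptake equal to a uniform distribution of the dose through the body, which is why different ROI definitions (mean over a manual ROI vs the single maximum pixel) shift the value so strongly.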

  18. Simple PVT quantitative method of Kr under high pure N2 condition

    International Nuclear Information System (INIS)

    Li Xuesong; Zhang Zibin; Wei Guanyi; Chen Liyun; Zhai Lihua

    2005-01-01

    A simple PVT quantitative method for Kr in high-purity N 2 was studied. The pressure, volume and temperature of the sample gas were measured by three individual methods to obtain the total sample amount with good uncertainty. The Kr/N 2 ratio was measured with a GAM 400 quadrupole mass spectrometer, so the quantity of Kr could be calculated from the two measurements above. This method is suited to the quantitative analysis of other simply composed noble gas samples in a high-purity carrier gas. (authors)
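The PVT step amounts to the ideal-gas relation n = PV/RT for the total sample, after which the Kr amount follows from the mass-spectrometric Kr/N2 mole ratio. A sketch under ideal-gas assumptions (the numbers below are illustrative, not the study's data):

```python
R = 8.314  # gas constant, J/(mol*K)

def total_moles(pressure_pa, volume_m3, temperature_k):
    """Total gas amount from the ideal-gas law: n = PV / (RT)."""
    return pressure_pa * volume_m3 / (R * temperature_k)

def kr_moles(pressure_pa, volume_m3, temperature_k, kr_to_n2_ratio):
    """Kr amount from the PVT total and the measured Kr/N2 mole ratio."""
    n_total = total_moles(pressure_pa, volume_m3, temperature_k)
    return n_total * kr_to_n2_ratio / (1.0 + kr_to_n2_ratio)

# Hypothetical sample: 1 atm, 22.4 L, 0 degrees C, Kr/N2 ratio 1e-4.
n_kr = kr_moles(101325.0, 0.0224, 273.15, 1e-4)
```

The overall uncertainty is then dominated by the three P, V, T measurements plus the mass-spectrometer ratio, which is the point of measuring each independently.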

  19. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  1. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy of conventional conjugate-view methods is limited by overlap of projections from different organs and background activity, and by attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum-likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image-degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millennium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of In-111 ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method, with accurate modelling of the physical factors, increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based
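The conventional conjugate-view estimate that QPlanar improves upon is typically the geometric mean of anterior and posterior counts with an attenuation correction. A textbook-style sketch of that baseline (without the scatter and organ-overlap corrections the abstract discusses; parameter values are illustrative):

```python
import math

def conjugate_view_activity(counts_ant, counts_post, mu_per_cm,
                            body_thickness_cm, calib_cps_per_mbq):
    """Geometric-mean conjugate-view activity estimate:

        A = sqrt(I_ant * I_post) * exp(mu * t / 2) / C

    where mu is the linear attenuation coefficient, t the body thickness
    along the projection, and C the system calibration factor.
    """
    geometric_mean = math.sqrt(counts_ant * counts_post)
    attenuation_correction = math.exp(mu_per_cm * body_thickness_cm / 2.0)
    return geometric_mean * attenuation_correction / calib_cps_per_mbq

# Hypothetical organ: 1200/900 cps ant/post, mu = 0.11 /cm, 22 cm thickness.
activity_mbq = conjugate_view_activity(1200.0, 900.0, 0.11, 22.0, 100.0)
```

The ML-based QPlanar approach replaces this closed-form correction with a projector that models attenuation, scatter, and overlap explicitly, which is where its accuracy gain comes from.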

  2. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...

  3. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  4. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    ""Quantitative Methods for Management and Economics"" is specially prepared for the MBA students in India and all over the world. It starts from the basics, such that even a beginner with out much mathematical sophistication can grasp the ideas and then comes forward to more complex and professional problems. Thus, both the ordinary students as well as ""above average: i.e., ""bright and sincere"" students would be benefited equally through this book.Since, most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  5. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors

    International Nuclear Information System (INIS)

    Cox, Annie M.; Goodwin, Kelly D.

    2013-01-01

    Highlights: • DNA extraction methods affected measured qPCR target recovery. • Recovery and variability differed, sometimes by more than an order of magnitude. • SCODA did not offer significant improvement with PCR-inhibited seawater. • Aggressive lysis did appear to improve target recovery. • Reliable and affordable correction methods are needed for quantitative PCR. -- Abstract: The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction efficiency and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, the data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications.

  6. Study of the quantitative analysis approach of maintenance by the Monte Carlo simulation method

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    2007-01-01

    This study examines the quantitative evaluation of maintenance activities at a nuclear power plant by the Monte Carlo simulation method. First, the concept of quantitative evaluation of maintenance, whose examination has been advanced by the Japan Society of Maintenology and the International Institute of Universality (IUU), is summarized. A basic examination of the quantitative evaluation of maintenance was then carried out for a simple feedwater system by the Monte Carlo simulation method. (author)
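A minimal sketch of this kind of Monte Carlo valuation, with an entirely hypothetical system layout and failure probabilities: estimate the failure probability of a redundant two-pump feedwater train with and without a maintenance option that improves pump reliability.

```python
import numpy as np

# Hypothetical 1-out-of-2 feedwater system: it fails only if BOTH pumps fail.
# Per-demand pump failure probabilities are invented for illustration.
rng = np.random.default_rng(3)

def system_failure_prob(p_pump, n=200_000):
    """Monte Carlo estimate of system failure probability over n trials."""
    fails_a = rng.random(n) < p_pump
    fails_b = rng.random(n) < p_pump
    return np.mean(fails_a & fails_b)

base = system_failure_prob(0.10)        # without the maintenance programme
maintained = system_failure_prob(0.05)  # maintenance halves pump failure prob
print(base, maintained)
```

The analytic answer for the base case is 0.10² = 0.01, so the simulation doubles as a sanity check; the same sampling scheme extends to repair times and more complex system logic where no closed form exists.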

  7. A quantitative method to evaluate mesenchymal stem cell lipofection using real-time PCR.

    Science.gov (United States)

    Ribeiro, S C; Mendes, R; Madeira, C; Monteiro, G A; da Silva, C L; Cabral, J M S

    2010-01-01

    Genetic modification of human mesenchymal stem cells (MSC) is a powerful tool to improve the therapeutic utility of these cells and to increase the knowledge on their regulation mechanisms. In this context, strong efforts have been made recently to develop efficient nonviral gene delivery systems. Although several studies addressed this question most of them use the end product of a reporter gene instead of the DNA uptake quantification to test the transfection efficiency. In this study, we established a method based on quantitative real-time PCR (RT-PCR) to determine the intracellular plasmid DNA copy number in human MSC after lipofection. The procedure requires neither specific cell lysis nor DNA purification. The influence of cell number on the RT-PCR sensitivity was evaluated. The method showed good reproducibility, high sensitivity, and a wide linear range of 75-2.5 x 10⁶ plasmid DNA copies per cell. RT-PCR results were then compared with the percentage of transfected cells assessed by flow cytometry analysis, which showed that flow cytometry-based results are not always proportional to plasmid cellular uptake determined by RT-PCR. This work contributed to the establishment of a rapid quantitative assay to determine intracellular plasmid DNA in stem cells, which will be extremely beneficial for the optimization of gene delivery strategies. © 2010 American Institute of Chemical Engineers
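Absolute quantification of this kind rests on a standard curve relating cycle threshold (Cq) to known copy numbers. A generic sketch with hypothetical Cq values (not the paper's data) shows the usual workflow: fit the curve, then convert a sample Cq into copies per cell.

```python
import numpy as np

# Hypothetical standard curve: Cq measured for known log10 plasmid copies.
# The ~3.3-cycle spacing per decade corresponds to ~100% PCR efficiency.
log_copies_std = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cq_std = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

slope, intercept = np.polyfit(cq_std, log_copies_std, 1)

def copies_per_cell(cq_sample, n_cells):
    """Convert a sample Cq to plasmid copies per cell via the standard curve."""
    total_copies = 10 ** (slope * cq_sample + intercept)
    return total_copies / n_cells

print(copies_per_cell(cq_sample=22.0, n_cells=1.0e4))
```

Because copies scale exponentially with Cq, a one-cycle error translates to roughly a two-fold error in copy number, which is why curve linearity and reproducibility are reported in the abstract.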

  8. An improved method for quantitative magneto-optical analysis of superconductors

    International Nuclear Information System (INIS)

    Laviano, F; Botta, D; Chiodoni, A; Gerbaldo, R; Ghigo, G; Gozzelino, L; Zannella, S; Mezzetti, E

    2003-01-01

    We report on the analysis method to extract quantitative local electrodynamics in superconductors by means of the magneto-optical technique. First of all, we discuss the calibration procedure to convert the local light intensity values into the magnetic induction field distribution, focusing on the role played by the generally disregarded magnetic induction components parallel to the indicator film plane (in-plane field effect). To establish the reliability of the whole technique, the method used to reconstruct the electrical current density distribution is reported, together with a numerical test example. The methodology is applied to measure local magnetic field and current distributions on a typical YBa2Cu3O7−x good quality film. We show how the in-plane field influences the MO measurements, after which we present an algorithm to account for the in-plane field components. The meaningful impact of the correction on the experimental results is shown. Afterwards, we discuss some aspects of the electrodynamics of the superconducting sample

  9. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. Experimentally it has been shown that the elastic recovery factor depends on the degree of recrystallization. In other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and expense than the conventional method

  10. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    Science.gov (United States)

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  11. DREAM: a method for semi-quantitative dermal exposure assessment

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Brouwer, D.H.; Kromhout, H.; Hemmen, J.J. van

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others,

  12. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Full Text Available Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives and also enumerates metabolites, and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography

  13. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    pages of fault trees which should be redrawn from the logical relations between the components in the DPPS. On the other hand, the RGGG model for the DPPS can be drawn on only one page, and the structure of the model closely resembles the actual structure of the DPPS. In addition, the RGGG model visually shows the state of information processed by each component. In this sense, I believe that the RGGG method is more intuitive and easier to use. Quantitative analysis of the fault tree model and the RGGG model shows that the two models produce equivalent results. Currently, one identified disadvantage is the calculation time, since many approximation algorithms have already been developed for fault tree analysis but not yet for the RGGG method. As a new method for HRA, I develop a quantitative situation assessment model for human operators, since human performance is mainly affected by situation assessment. In contrast to conventional HRA methods, which are mostly developed from expert opinion, the proposed situation assessment model for human operators is developed on the basis of mathematical theories, Bayesian inference and information theory, with the following two assumptions. 1. Human operators can perform Bayesian inference, even though the results cannot be as accurate as mathematical calculations. 2. In knowledge-driven monitoring, the probability that human operators select an indicator as the next indicator to monitor is proportional to the expected information from the indicator. (The expected information from each indicator can be calculated using information theory.) With an experiment, it is shown that the two assumptions are reasonable. The proposed mathematical model for the situation assessment of human operators is expected to be used as the basis for the development of a quantitative model for the situation assessment of actual human operators.
By combining the RGGG method and the mathematical model for the situation assessment of human operators, I develop a quantitative safety assessment method for nuclear I and C systems that includes human operators.
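The second assumption above (selection probability proportional to expected information) can be made concrete with the information-theoretic quantity it refers to: the mutual information between the plant state and each indicator's reading. The sketch below uses invented numbers for a two-state, two-indicator example; it is an illustration of the principle, not the thesis model.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical example: two plant states, two binary ("high"/"low") indicators.
prior = np.array([0.7, 0.3])               # P(state)
lik = np.array([[0.9, 0.2],                # P(high | state): discriminating
                [0.6, 0.5]])               # P(high | state): nearly uninformative

def expected_information(i):
    """Mutual information I(state; indicator i): expected entropy reduction."""
    mi = 0.0
    for reading_prob in (lik[i], 1.0 - lik[i]):   # readings "high" and "low"
        p_read = np.sum(reading_prob * prior)
        posterior = reading_prob * prior / p_read  # Bayes update for this reading
        mi += p_read * (entropy(prior) - entropy(posterior))
    return mi

ei = np.array([expected_information(i) for i in range(2)])
select_prob = ei / ei.sum()  # assumption 2: selection probability ∝ expected info
print(ei, select_prob)
```

The discriminating indicator carries far more expected information, so under assumption 2 the operator model monitors it with correspondingly higher probability.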

  14. A quantitative assessment method for the NPP operators' diagnosis of accidents

    International Nuclear Information System (INIS)

    Kim, M. C.; Seong, P. H.

    2003-01-01

    In this research, we developed a quantitative model for the operators' diagnosis of the accident situation when an accident occurs in a nuclear power plant. After identifying the occurrence probabilities of accidents, the unavailabilities of various information sources, and the causal relationships between accidents and information sources, a Bayesian network is used for the analysis of the change in the occurrence probabilities of accidents as the operators receive the information related to the status of the plant. The developed method is applied to a simple example case, and it turned out that the developed method is a systematic quantitative analysis method which can cope with complex relationships between the accidents and information sources and various variables such as accident occurrence probabilities and unavailabilities of various information sources
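The core update step of such a model is a Bayesian revision of accident probabilities as each piece of plant information arrives. The following sketch (hypothetical accidents, probabilities, and a simplistic treatment of source unavailability, not the paper's network) shows one observation of an alarm-type information source:

```python
import numpy as np

# Hypothetical accidents with relative prior occurrence probabilities.
accidents = ["LOCA", "SGTR", "MSLB"]
prior = np.array([0.5, 0.3, 0.2])
p_alarm_given_acc = np.array([0.95, 0.60, 0.10])  # P(alarm | accident)
unavailability = 0.02                             # chance the source is faulty

# Simplifying assumption: an unavailable source behaves like a coin flip,
# so the effective likelihood mixes the true response with 0.5.
lik = (1 - unavailability) * p_alarm_given_acc + unavailability * 0.5

posterior = prior * lik        # Bayes: P(acc | alarm) ∝ P(acc) P(alarm | acc)
posterior /= posterior.sum()
print(dict(zip(accidents, posterior)))
```

Repeating this update for each information source, with the causal structure encoded as conditional probability tables, is what the Bayesian network performs systematically.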

  15. On the use of quantitative methods in the Danish food industry

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Østergaard, Peder; Kristensen, Kai

    1997-01-01

    Executive summary 1. The paper examines the use of quantitative methods in the Danish food industry and a comparison is made between the food industry and other manufacturing industries. Data was collected in 1991 and 107 manufacturing companies filled in the questionnaire. 20 of the companies were...... orientation is expected to lead to a more intensive use of proactive methods. It will be obvious to compare results from the new investigation with the results presented in this report in order to identify any trends in the use of quantitative methods....... in this paper does not lead to any striking differences between food companies and other manufacturing companies. In both cases there is a heavy concentration on methods used to analyze internal processes. 4. The increasing focus on food products ready for consumption and the general increase in focus on market...

  16. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    Science.gov (United States)

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity acceleration (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
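The 2D PISA method referenced here computes ERO from the hemispheric flow-convergence formula: regurgitant flow = 2πr²·Va (hemisphere area times aliasing velocity), divided by the peak regurgitant velocity. A sketch with illustrative, not patient, values:

```python
import math

def pisa_ero(radius_cm, aliasing_velocity_cm_s, peak_velocity_cm_s):
    """ERO area (cm^2) by the hemispheric PISA model:
    flow = 2*pi*r^2 * Va; ERO = flow / Vmax."""
    flow_cm3_s = 2 * math.pi * radius_cm ** 2 * aliasing_velocity_cm_s
    return flow_cm3_s / peak_velocity_cm_s

# Illustrative values: PISA radius 1.0 cm, aliasing velocity 38.5 cm/s,
# peak MR jet velocity 500 cm/s.
print(round(pisa_ero(1.0, 38.5, 500.0), 2))  # ~0.48 cm^2
```

The hemispheric assumption is exactly what breaks down for eccentric or multiple jets, which is why direct planimetry of the 3D vena contracta area fares better in those valves.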

  17. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    Science.gov (United States)

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) St. George's 100 mcl input volume of CSF with five 1:10 serial dilutions, (2) AIDS Clinical Trials Group (ACTG) method using 1000, 100, 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did differ for the St. George versus 10 mcl loop comparison. Correlation between methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean −0.05 ± 0.07 log10 CFU/ml/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
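The two calculations underlying all three methodologies are the back-calculation of CFU/ml from a plate count (colonies × dilution factor / plated volume) and the clearance rate as the slope of log10 CFU/ml over treatment days. A sketch with invented counts:

```python
import numpy as np

def cfu_per_ml(colonies, dilution_factor, plated_volume_ml):
    """Back-calculate CFU/ml from one countable plate."""
    return colonies * dilution_factor / plated_volume_ml

# e.g. 52 colonies on a 1:100-diluted plate, 100 mcl (0.1 ml) plated
print(cfu_per_ml(52, 100, 0.1))  # CFU/ml

# Clearance rate: slope of log10(CFU/ml) versus day of treatment
# (hypothetical serial-lumbar-puncture data).
days = np.array([0, 3, 7, 10, 14])
log_cfu = np.array([5.8, 5.1, 4.2, 3.5, 2.6])
slope = np.polyfit(days, log_cfu, 1)[0]
print(slope)  # log10 CFU/ml/day; negative values indicate clearance
```

Differences in input volume and dilution scheme between the three methods change the detection limit and counting error of the first step, which is how otherwise correlated methods can still disagree on sterility and individual clearance.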

  18. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    International Nuclear Information System (INIS)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline
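The two stages named in the abstract can be sketched in a simplified form: shift each image's intensity centroid to a common point, then equalize the histogram. This sketch uses plain global equalization for brevity, whereas ICHE itself applies a modified contrast-limited adaptive (CLAHE) variant; images and batch statistics are synthetic.

```python
import numpy as np

def intensity_center(img, target_mean=128.0):
    """Stage 1: shift the image so its intensity centroid lands on a common point."""
    shifted = img.astype(float) + (target_mean - img.mean())
    return np.clip(shifted, 0, 255)

def equalize(img, bins=256):
    """Stage 2 (simplified): global histogram equalization via the CDF."""
    hist, edges = np.histogram(img.ravel(), bins=bins, range=(0, 255))
    cdf = hist.cumsum() / hist.sum()
    return np.interp(img.ravel(), edges[:-1], cdf * 255).reshape(img.shape)

# Two synthetic "staining batches" with very different brightness.
rng = np.random.default_rng(1)
batch_a = rng.normal(90, 20, (64, 64))    # dark staining batch
batch_b = rng.normal(170, 20, (64, 64))   # bright staining batch
norm_a = equalize(intensity_center(batch_a))
norm_b = equalize(intensity_center(batch_b))
print(abs(norm_a.mean() - norm_b.mean()))  # batch gap shrinks after normalization
```

After normalization the two batches share a common intensity distribution, so downstream features computed on them are no longer confounded by the staining difference.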

  19. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    Energy Technology Data Exchange (ETDEWEB)

    Tam, Allison [Stanford Institutes of Medical Research Program, Stanford University School of Medicine, Stanford, California 94305 (United States); Barker, Jocelyn [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 (United States); Rubin, Daniel [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 and Department of Medicine (Biomedical Informatics Research), Stanford University School of Medicine, Stanford, California 94305 (United States)

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.

  20. An improved fast neutron radiography quantitative measurement method

    International Nuclear Information System (INIS)

    Matsubayashi, Masahito; Hibiki, Takashi; Mishima, Kaichiro; Yoshii, Koji; Okamoto, Koji

    2004-01-01

    The validity of a fast neutron radiography quantification method, the Σ-scaling method, which was originally proposed for thermal neutron radiography, was examined with Monte Carlo calculations and experiments conducted at the YAYOI fast neutron source reactor. Water and copper were selected as comparative samples for a thermal neutron radiography case and a dense object, respectively. Although different characteristics of the effective macroscopic cross-sections were implied by the simulation, the Σ-scaled experimental results with the fission neutron spectrum cross-sections fitted the measurements well for both the water and copper samples. This indicates that the Σ-scaling method could be successfully adopted for quantitative measurements in fast neutron radiography
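The quantification underlying Σ-scaling is the Beer–Lambert attenuation relation: transmitted intensity I = I₀·exp(−Σt), so material thickness follows from the measured transmission and an effective macroscopic cross-section Σ. A sketch with an assumed (hypothetical) Σ value:

```python
import math

def sigma_scaled_thickness(i, i0, sigma_cm_inv):
    """Recover material thickness (cm) from neutron transmission I/I0
    via Beer-Lambert: I = I0 * exp(-Sigma * t)."""
    return -math.log(i / i0) / sigma_cm_inv

# Round trip with an assumed effective Sigma for water and a 2 cm layer.
sigma_water = 0.35  # cm^-1, illustrative effective value, not measured data
i0 = 10000.0
i = i0 * math.exp(-sigma_water * 2.0)
print(sigma_scaled_thickness(i, i0, sigma_water))  # recovers 2.0 cm
```

The whole question the paper examines is which effective Σ to insert here for a broad fission spectrum, since, unlike the monoenergetic case, the effective cross-section can vary with thickness and spectrum hardening.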

  1. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the available basic data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
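An index system of the kind described combines per-segment scores through weights into a single risk value. The sketch below is a generic weighted-index illustration with invented weights and scores, not the paper's actual index definitions:

```python
# Hypothetical weighted-index sketch: each pipeline segment receives
# causation / inherent-risk / consequence scores (0-10), combined by weights.
weights = {"causation": 0.4, "inherent": 0.3, "consequence": 0.3}

def risk_index(scores):
    """Qualitative risk value for one segment as a weighted sum of its scores."""
    return sum(weights[k] * scores[k] for k in weights)

segment = {"causation": 6.0, "inherent": 4.0, "consequence": 8.0}
print(risk_index(segment))  # 0.4*6 + 0.3*4 + 0.3*8 = 6.0
```

Ranking segments by such an index is what makes the qualitative method usable when the detailed failure-frequency and consequence data required by the quantitative method are unavailable.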

  2. Quantitative numerical method for analysing slip traces observed by AFM

    International Nuclear Information System (INIS)

    Veselý, J; Cieslar, M; Coupeau, C; Bonneville, J

    2013-01-01

    Atomic force microscopy (AFM) is used more and more routinely to study, at the nanometre scale, the slip traces produced on the surface of deformed crystalline materials. Taking full advantage of the quantitative height data of the slip traces, which can be extracted from these observations, requires however an adequate and robust processing of the images. In this paper an original method is presented, which allows the fitting of AFM scan-lines with a specific parameterized step function without any averaging treatment of the original data. This yields a quantitative and full description of the changes in step shape along the slip trace. The strength of the proposed method is established on several typical examples met in plasticity by analysing nano-scale structures formed on the sample surface by emerging dislocations. (paper)
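The fitting idea can be sketched with a generic parameterized step: a sigmoid of height h and width w on a tilted baseline, fitted to each raw scan line by nonlinear least squares. The exact parameterization used by the authors is not reproduced here; data and parameters below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def step(x, x0, h, w, base, tilt):
    """Generic step model: tilted baseline plus a sigmoid of height h at x0."""
    return base + tilt * x + h / (1.0 + np.exp(-(x - x0) / w))

# Synthetic AFM scan line: a ~12 nm slip step at x ~ 480 nm plus noise.
rng = np.random.default_rng(2)
x = np.linspace(0, 1000, 400)                      # nm along the scan line
y = step(x, 480.0, 12.0, 15.0, 2.0, 0.001) + rng.normal(0, 0.3, x.size)

popt, _ = curve_fit(step, x, y, p0=[500.0, 10.0, 20.0, 0.0, 0.0])
print(popt[1])  # fitted step height, ~12 nm
```

Fitting every scan line independently, rather than averaging lines first, is what preserves the variation of step height and shape along the slip trace.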

  3. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from the simple blind probes to planar and three dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  4. Different methods to quantify Listeria monocytogenes biofilm cells showed different profiles in their viability

    Directory of Open Access Journals (Sweden)

    Lizziane Kretli Winkelströter

    2015-03-01

    Full Text Available Listeria monocytogenes is a foodborne pathogen able to adhere and to form biofilms on several materials commonly present in food processing plants. The aim of this study was to evaluate the resistance of Listeria monocytogenes attached to an abiotic surface, after treatment with sanitizers, by culture method, microscopy and Quantitative Real Time Polymerase Chain Reaction (qPCR). Biofilms of L. monocytogenes were obtained on stainless steel coupons immersed in Brain Heart Infusion Broth, under agitation at 37 °C for 24 h. The methods selected for this study were based on plate count, microscopic count with the aid of viability dyes (CTC-DAPI), and qPCR. Results of the culture method showed that peroxyacetic acid was efficient in killing sessile L. monocytogenes populations, while sodium hypochlorite was only partially effective in killing attached L. monocytogenes (p < 0.05). When viability dyes (CTC/DAPI) combined with fluorescence microscopy and qPCR were used, lower counts were found after treatments (p < 0.05). Selective quantification of viable cells of L. monocytogenes by qPCR using EMA revealed that the pre-treatment with EMA was not appropriate since it also inhibited amplification of DNA from live cells by ca. 2 log. Thus, the use of CTC counts was the best method to count viable cells in biofilms.

  5. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    Science.gov (United States)

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  6. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com [Airbus Group Innovations, Munich (Germany); Grosse, Christian, E-mail: Grosse@tum.de [Technical University Munich (Germany)

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- and air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  7. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    International Nuclear Information System (INIS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-01-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- and air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  8. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models using the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.
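The RMSEP figures quoted above follow the standard root-mean-square formulation. A minimal sketch, using hypothetical reference and predicted concentrations rather than the study's data:

```python
import numpy as np

def rmsep(y_true, y_pred):
    """Root mean square error of prediction, used to compare calibration models."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Hypothetical reference vs. predicted element concentrations (mg/kg)
print(rmsep([100.0, 150.0, 200.0], [110.0, 140.0, 205.0]))  # ≈ 8.66
```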

  9. Validated ¹H and ¹³C Nuclear Magnetic Resonance Methods for the Quantitative Determination of Glycerol in Drug Injections.

    Science.gov (United States)

    Lu, Jiaxi; Wang, Pengli; Wang, Qiuying; Wang, Yanan; Jiang, Miaomiao

    2018-05-15

    In the current study, we employed high-resolution proton and carbon nuclear magnetic resonance spectroscopy (¹H and ¹³C NMR) for quantitative analysis of glycerol in drug injections without any complex pre-treatment or derivatization of samples. The established methods were validated with good specificity, linearity, accuracy, precision, stability, and repeatability. Our results revealed that the glycerol content could be conveniently calculated directly from the integration ratios of peak areas against an internal standard in ¹H NMR spectra, whereas integration of peak heights was appropriate for ¹³C NMR in combination with an external calibration of glycerol. The developed methods were both successfully applied to drug injections. Quantitative NMR methods show broad promise for glycerol determination in various liquid samples.

  10. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma-pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful for recording various parameters quantitatively and reproducibly. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily, using a relatively simple diagram to deduce prognosis.

  11. New 'ex vivo' radioisotopic method of quantitation of platelet deposition

    Energy Technology Data Exchange (ETDEWEB)

    Badimon, L.; Fuster, V.; Chesebro, J.H.; Dewanjee, M.K.

    1983-01-01

    We have developed a sensitive and quantitative method for the 'ex vivo' evaluation of platelet deposition on collagen strips, from rabbit Achilles tendon, superfused by flowing blood, and applied it to four animal species: cat, rabbit, dog and pig. Autologous platelets were labeled with indium-111-tropolone and injected into the animal 24 hr before the superfusion, and the number of deposited platelets was quantitated from the tendon gamma-radiation and the blood platelet count. We detected some platelet consumption with superfusion time when blood was reinfused into the contralateral jugular vein after collagen contact, but not if blood was discarded after the contact. Therefore, in order to have a more physiological animal model, we decided to discard blood after superfusion of the tendon. In all species except the cat there was a linear relationship between the increase of platelets on the tendon and the time of exposure to blood superfusion. The highest number of platelets deposited on the collagen was found in cats, the lowest in dogs. Ultrastructural analysis showed that the platelets were deposited as aggregates after only 5 min of superfusion.
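The quantitation step described above, deriving a platelet number from tendon gamma counts and the blood platelet count, can be sketched as follows; the function name and all values are hypothetical illustrations, not the authors' protocol:

```python
def platelets_deposited(tendon_cpm, blood_cpm_per_ml, platelets_per_ml):
    """Estimate the number of platelets deposited on the tendon.

    blood_cpm_per_ml / platelets_per_ml approximates the counts contributed
    by one labeled platelet; dividing the tendon counts by that value gives
    the number of deposited platelets.
    """
    cpm_per_platelet = blood_cpm_per_ml / platelets_per_ml
    return tendon_cpm / cpm_per_platelet

# Hypothetical values: 5e4 cpm on the tendon, 2e5 cpm/mL of blood,
# and a blood platelet count of 4e8 platelets/mL
print(platelets_deposited(5.0e4, 2.0e5, 4.0e8))  # 1e8 platelets
```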

  12. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    Science.gov (United States)

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  13. Quantitative functional scintigraphy of the salivary glands: A new method of interpreting and clinical results

    International Nuclear Information System (INIS)

    Schneider, P.; Trauring, G.; Haas, J.P.; Noodt, A.; Draf, W.

    1984-01-01

    Tc-99m pertechnetate is injected i.v. and the kinetics of the tracer in the salivary glands are analyzed using a gamma camera and a computer system. To visualize regional gland function, phase images as well as so-called gradient images are generated, which reflect the rate of tracer inflow and outflow. The time-activity curves for the individual glands, obtained with the ROI technique, show an initial rise which reflects the pertechnetate uptake potential of the gland and is superimposed on background activity. After a standardized dose of lemon juice the curve drops steeply, with the slope depending on the outflow potential of the gland and the background activity. In the past, attempts at quantifying the uptake and elimination functions have failed because of problems in allowing for the variable background component of the time-activity curves, which normally amounts to about 60%. In 25 patients in whom one gland had been removed surgically, the background activity was examined in terms of its time course and regional pattern, and a patient- and gland-specific subtraction method was developed for visualizing the time-activity curves of isolated glands free of any background activity and for describing the uptake and elimination potentials in quantitative terms. Using this new method we evaluated 305 salivary gland scans. Normal ranges for the quantitative parameters were established and their reproducibility was examined. Unlike qualitative functional images of the salivary glands, the new quantitative method offers accurate evidence of the extent of gland function and thus helps to decide whether a gland should be salvaged or not (conservative versus surgical treatment). However, quantitation does not furnish any clues as to the benign or malignant nature of a tumor. (Author)
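The patient- and gland-specific subtraction is not specified in detail here; a simplified sketch of removing a scaled background curve from a gland time-activity curve, assuming a known background fraction (about 60% per the text) at the first time point, might look like:

```python
import numpy as np

def subtract_background(gland_curve, background_curve, background_fraction=0.6):
    """Subtract a scaled background ROI curve from a gland time-activity curve.

    The background curve is scaled so that it accounts for background_fraction
    of the gland curve at the first time point, then subtracted point by point.
    """
    gland = np.asarray(gland_curve, dtype=float)
    bg = np.asarray(background_curve, dtype=float)
    scale = background_fraction * gland[0] / bg[0]
    return gland - scale * bg

# Hypothetical count-rate curves for one gland and a background ROI
corrected = subtract_background([100.0, 150.0, 80.0], [50.0, 55.0, 52.0])
print(corrected)  # background-free curve: 40, 84, 17.6
```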

  14. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.

  15. Report of the methods for quantitative organ evaluation in nuclear medicine

    International Nuclear Information System (INIS)

    Nakata, Shigeru; Akagi, Naoki; Mimura, Hiroaki; Nagaki, Akio; Takahashi, Yasuyuki

    1999-01-01

    The group for the methods in the title herein reported the summary of their investigations of the literature concerning brain, heart, liver and kidney evaluation. The report consisted of the history, kinetics of the agents, methods for quantitative evaluation, and a summary for each organ. As for the brain, quantitative evaluation of cerebral blood flow scintigraphy with 123I-IMP and 99mTc-HMPAO or -ECD was reviewed, with the conclusion that the present convenient methods have problems with precision, for which a novel method and/or tracer should be developed. For cardiac function, there is a method based either on the behavior of the tracer in the blood, which is excellent in reproducibility, or on the morphology of the cardiac wall, images of which can alternatively be analyzed by CT and MRI. For these, 131I-albumin, 99mTc-albumin, -red blood cells, -MIBI and -tetrofosmin have been used. For the myocardium, 201Tl has been used to evaluate the ischemic region and, with simultaneous use of 99mTc-MIBI or -tetrofosmin, the viability. 123I-BMIPP and -MIBG have been developed for myocardial fatty acid metabolism and for cardiac sympathetic nerve function. Liver function has been evaluated by the blood elimination rate, hepatic uptake, hepatic elimination and hepatic blood flow with use of 99mTc-labeled colloids, -PMT and -GSA. Quantitative evaluation of renal function is now well established with high precision, since the kinetic behavior of the tracers, like 99mTc-DTPA, -MAG3, -DMSA and 131I-OIH, is simple. (K.H.)

  16. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. The method may be used not only for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  17. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    Science.gov (United States)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectra of 39 teeth samples, classified by International Caries Detection and Assessment System level, were investigated at 405 nm excitation. The major differences among the different caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. A spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio were linearly related. Threshold values of the image component ratio (beginning at 1.62) were therefore used to quantitatively classify sound, early decay, established decay, and severe decay tissues. Finally, fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine numerical grades of caries using a fluorescence imaging system was thus proposed, and it can be applied to similar imaging systems.
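A minimal sketch of the R/(G + B) image component ratio and a threshold-based grading; only the 1.62 value appears above, so the binary classification below is a hypothetical simplification of the four-class scheme:

```python
import numpy as np

def image_component_ratio(rgb):
    """Mean R/(G + B) over an RGB image, the ratio used for grading."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r / (g + b)))

def classify(ratio, threshold=1.62):
    """Hypothetical binary grading around the 1.62 value given in the text."""
    return "decay" if ratio >= threshold else "sound"

# A single hypothetical pixel with strong red autofluorescence
pixel = [[[200.0, 60.0, 40.0]]]  # R/(G + B) = 200 / 100 = 2.0
print(classify(image_component_ratio(pixel)))  # decay
```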

  18. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  19. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    Science.gov (United States)

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have since fundamentally changed our understanding of knee OA pathology. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to continuous updating and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  20. A method for improved clustering and classification of microscopy images using quantitative co-localization coefficients

    LENUS (Irish Health Repository)

    Singan, Vasanth R

    2012-06-08

    Abstract. Background: The localization of proteins to specific subcellular structures in eukaryotic cells provides important information with respect to their function. Fluorescence microscopy approaches to determine localization distribution have proved to be an essential tool in the characterization of unknown proteins, and are now particularly pertinent as a result of the wide availability of fluorescently-tagged constructs and antibodies. However, there are currently very few image analysis options able to effectively discriminate proteins with apparently similar distributions in cells, despite this information being important for protein characterization. Findings: We have developed a novel method for combining two existing image analysis approaches, which results in highly efficient and accurate discrimination of proteins with seemingly similar distributions. We have combined image texture-based analysis with quantitative co-localization coefficients, a method that has traditionally only been used to study the spatial overlap between two populations of molecules. Here we describe and present a novel application for quantitative co-localization, as applied to the study of Rab family small GTP-binding proteins localizing to the endomembrane system of cultured cells. Conclusions: We show how quantitative co-localization can be used alongside texture feature analysis, resulting in improved clustering of microscopy images. The use of co-localization as an additional clustering parameter is non-biased and highly applicable to high-throughput image data sets.
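Quantitative co-localization is commonly expressed through the Manders coefficients M1 and M2; the abstract does not name the exact coefficients used, so the following is a generic sketch rather than the authors' implementation:

```python
import numpy as np

def manders_coefficients(ch1, ch2, thresh1=0.0, thresh2=0.0):
    """Manders co-localization coefficients M1 and M2 for two channels.

    M1 is the fraction of channel-1 intensity located where channel 2 is
    above its threshold; M2 is the converse. Both lie in [0, 1].
    """
    ch1 = np.asarray(ch1, dtype=float)
    ch2 = np.asarray(ch2, dtype=float)
    m1 = ch1[ch2 > thresh2].sum() / ch1.sum()
    m2 = ch2[ch1 > thresh1].sum() / ch2.sum()
    return m1, m2

# Two tiny synthetic intensity images with partial overlap
a = np.array([[10.0, 0.0], [5.0, 0.0]])
b = np.array([[8.0, 0.0], [0.0, 3.0]])
m1, m2 = manders_coefficients(a, b)
print(m1, m2)  # ≈ 0.667, ≈ 0.727
```

Such per-image coefficients can then be appended to a texture feature vector before clustering, which is the combination the paper describes.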

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Full Text Available Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) participatory mapping; 2) quantitative interviews; 3) sex work venue field observation; 4) time-location-activity diaries; 5) in-depth interviews about daily activity spaces. We found that the mixed methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment in drug and sexual risk behaviors among high-risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource-constrained contexts provides new opportunities for informing public health interventions.

  2. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T1) and spin-spin (T2) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods in pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  3. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    Science.gov (United States)

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  4. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    A quantitative method of X-ray diffraction phase analysis of building materials, using an internal standard, is presented. The errors involved in determining the content of the particular phases are also given. (author)

  5. An isotope-labeled chemical derivatization method for the quantitation of short-chain fatty acids in human feces by liquid chromatography–tandem mass spectrometry

    International Nuclear Information System (INIS)

    Han, Jun; Lin, Karen; Sequeira, Carita; Borchers, Christoph H.

    2015-01-01

    Highlights: • 3-Nitrophenylhydrazine was used to derivatize short-chain fatty acids (SCFAs) for LC-MS/MS. • 13C6 analogues were produced for use as isotope-labeled internal standards. • Isotope-labeled standards compensate for ESI matrix effects in LC-MS/MS. • Femtomolar sensitivities and 93–108% quantitation accuracy were achieved for human fecal SCFAs. - Abstract: Short-chain fatty acids (SCFAs) are produced by anaerobic gut microbiota in the large bowel. Qualitative and quantitative measurements of SCFAs in the intestinal tract and in fecal samples are important to understand the complex interplay between diet, gut microbiota, and host metabolic homeostasis. To develop a new LC-MS/MS method for sensitive and reliable analysis of SCFAs in human fecal samples, 3-nitrophenylhydrazine (3NPH) was employed for pre-analytical derivatization to convert ten C2–C6 SCFAs to their 3-nitrophenylhydrazones under a single set of optimized reaction conditions and without the need for reaction quenching. The derivatives showed excellent in-solution chemical stability. They were separated on a reversed-phase C18 column and quantitated by negative-ion electrospray ionization – multiple-reaction monitoring (MRM)/MS. To achieve accurate quantitation, the stable isotope-labeled versions of the derivatives were synthesized in a single reaction vessel from 13C6-3NPH and used as internal standards to compensate for the matrix effects in ESI. Method validation showed on-column limits of detection and quantitation ranging from low to high femtomoles for the ten SCFAs, and the intra-day and inter-day precision for determination of nine of the ten SCFAs in human fecal samples was ≤8.8% (n = 6). The quantitation accuracy ranged from 93.1% to 108.4% (CVs ≤ 4.6%, n = 6). This method was used to determine the SCFA concentrations and compositions in six human fecal samples. One of the six samples, which was collected from a clinically diagnosed type 2
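The internal-standard quantitation underlying the reported accuracy can be sketched as a single-point calculation; the peak areas, the 50 µM internal-standard concentration, and the unity response factor below are illustrative assumptions, not values from the study:

```python
def conc_by_internal_standard(area_analyte, area_is, conc_is, response_factor=1.0):
    """Single-point internal-standard quantitation.

    The analyte concentration is the analyte/IS peak-area ratio scaled by
    the known IS concentration; the response factor is taken as unity for
    a co-eluting 13C-labeled analogue (an assumption for this sketch).
    """
    return (area_analyte / area_is) * conc_is / response_factor

# Hypothetical MRM peak areas with a 50 uM labeled internal standard
print(conc_by_internal_standard(1.2e6, 8.0e5, 50.0))  # 75.0 uM
```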

  6. An isotope-labeled chemical derivatization method for the quantitation of short-chain fatty acids in human feces by liquid chromatography–tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Han, Jun; Lin, Karen; Sequeira, Carita [University of Victoria – Genome BC Proteomics Centre, University of Victoria, Vancouver Island Technology Park, 3101–4464 Markham Street, Victoria, BC V8Z 7X8 (Canada); Borchers, Christoph H., E-mail: christoph@proteincentre.com [University of Victoria – Genome BC Proteomics Centre, University of Victoria, Vancouver Island Technology Park, 3101–4464 Markham Street, Victoria, BC V8Z 7X8 (Canada); Department of Biochemistry and Microbiology, University of Victoria, Petch Building Room 207, 3800 Finnerty Road, Victoria, BC V8P 5C2 (Canada)

    2015-01-07

    Highlights: • 3-Nitrophenylhydrazine was used to derivatize short-chain fatty acids (SCFAs) for LC-MS/MS. • 13C6 analogues were produced for use as isotope-labeled internal standards. • Isotope-labeled standards compensate for ESI matrix effects in LC-MS/MS. • Femtomolar sensitivities and 93–108% quantitation accuracy were achieved for human fecal SCFAs. - Abstract: Short-chain fatty acids (SCFAs) are produced by anaerobic gut microbiota in the large bowel. Qualitative and quantitative measurements of SCFAs in the intestinal tract and the fecal samples are important to understand the complex interplay between diet, gut microbiota and host metabolism homeostasis. To develop a new LC-MS/MS method for sensitive and reliable analysis of SCFAs in human fecal samples, 3-nitrophenylhydrazine (3NPH) was employed for pre-analytical derivatization to convert ten C2–C6 SCFAs to their 3-nitrophenylhydrazones under a single set of optimized reaction conditions and without the need of reaction quenching. The derivatives showed excellent in-solution chemical stability. They were separated on a reversed-phase C18 column and quantitated by negative-ion electrospray ionization – multiple-reaction monitoring (MRM)/MS. To achieve accurate quantitation, the stable isotope-labeled versions of the derivatives were synthesized in a single reaction vessel from 13C6-3NPH, and were used as internal standard to compensate for the matrix effects in ESI. Method validation showed on-column limits of detection and quantitation over the range from low to high femtomoles for the ten SCFAs, and the intra-day and inter-day precision for determination of nine of the ten SCFAs in human fecal samples was ≤8.8% (n = 6). The quantitation accuracy ranged from 93.1% to 108.4% (CVs ≤ 4.6%, n = 6). This method was used to determine the SCFA concentrations and compositions in six human fecal samples. One of the six samples, which was collected from a

  7. A method for the quantitative determination of crystalline phases by X-ray

    Science.gov (United States)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  8. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Directory of Open Access Journals (Sweden)

    Melanie I Stefan

    2015-04-01

    Full Text Available The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  9. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Science.gov (United States)

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  10. VERIFICATION HPLC METHOD OF QUANTITATIVE DETERMINATION OF AMLODIPINE IN TABLETS

    Directory of Open Access Journals (Sweden)

    Khanin V. A

    2014-10-01

Full Text Available Introduction. Amlodipine ((±)-2-[(2-aminoethoxy)methyl]-4-(2-chlorophenyl)-1,4-dihydro-6-methyl-3,5-pyridinedicarboxylic acid 3-ethyl 5-methyl ester), used as the besylate salt, belongs to the group of selective long-acting calcium channel blockers, the dihydropyridine derivatives. In clinical practice it is used as an antianginal and antihypertensive agent for the treatment of cardiovascular diseases. It is produced in powder form, as a substance, and in finished dosage forms (tablets of 2.5, 5 and 10 mg). The scientific literature describes methods for quantitative determination of the drug by spectrophotometry (by its own light absorption and by its reaction product with alloxan), by chromatographic techniques, by a kinetic-spectrophotometric method in substances and preparations, and by chromatography-mass spectrometry and stripping voltammetry. For the quantitative determination of amlodipine besylate, the British Pharmacopoeia and the European Pharmacopoeia recommend the liquid chromatography method. In connection with the preparation of the second edition of the SPhU, which will include monographs on the finished product, we set out to analyze the validation characteristics of the chromatographic quantitative determination of amlodipine besylate in tablets and to verify the analytical procedure. Material & methods. The studies used amlodipine besylate substance, series number AB0401013. The analyzed tablets were "Amlodipine", series number 20113, manufactured by the Pharmaceutical Company "Zdorovye". The analytical equipment comprised a Waters Corp. (USA) 2695 chromatograph with a 2996 diode array detector, a Nova-Pak C18 column (300 x 3.9 mm, particle size 4 μm), an ER-182 balance (A&D, Japan), and class A volumetric glassware. Preparation of the test solution. To an accurately weighed sample of powdered tablets equivalent to 50 mg amlodipine, add 30 ml of methanol, shake for 30 minutes, dilute the solution to 50.0 ml with methanol and filter. 5 ml of the methanol solution adjusted to

  11. A Quantitative Analytical Method to Test for Salt Effects on Giant Unilamellar Vesicles

    DEFF Research Database (Denmark)

    Hadorn, Maik; Bönzli, Eva; Eggenberger Hotz, Peter

    2011-01-01

    preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes....

  12. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, and for studies of the relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and the size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and X-ray computed microtomography, which provides a direct three-dimensional image of the examined microstructure and is thus an interesting alternative to traditional methods of microstructure imaging.

  13. QUANTITATIVE EVALUATION METHOD OF ELEMENTS PRIORITY OF CARTOGRAPHIC GENERALIZATION BASED ON TAXI TRAJECTORY DATA

    Directory of Open Access Journals (Sweden)

    Z. Long

    2017-09-01

Full Text Available Considering the lack of quantitative criteria for the selection of elements in cartographic generalization, this study divided the passenger hotspot areas into three levels, assigned them different weights, and then classified the elements drawn from the different hotspots. On this basis, a method was proposed to quantify the priority of element selection, and the quantitative priorities of the different cartographic elements were summarized using this method. In cartographic generalization, the method can be used to preferentially select the significant elements and discard those that are relatively non-significant.
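The level-weighted priority idea can be sketched in a few lines. The hotspot levels, weights, and element counts below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch: score a cartographic element by how often it falls
# inside passenger hotspots of each level. Weights are assumed values.
HOTSPOT_WEIGHTS = {1: 3.0, 2: 2.0, 3: 1.0}  # level 1 = busiest hotspot areas

def element_priority(counts_by_level):
    """Weighted selection priority of one cartographic element."""
    return sum(HOTSPOT_WEIGHTS[level] * n for level, n in counts_by_level.items())

# An element observed 4 times in level-1 hotspots and twice in level-3 ones:
score = element_priority({1: 4, 3: 2})  # -> 14.0
```

Elements are then ranked by this score, and generalization keeps the highest-priority ones first.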

  14. An improved transmutation method for quantitative determination of the components in multicomponent overlapping chromatograms.

    Science.gov (United States)

    Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong

    2004-06-01

    An improved method is proposed for the quantitative determination of multicomponent overlapping chromatograms based on a known transmutation method. To overcome the main limitation of the transmutation method caused by the oscillation generated in the transmutation process, two techniques--wavelet transform smoothing and the cubic spline interpolation for reducing data points--were adopted, and a new criterion was also developed. By using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both the simulated and experimental overlapping chromatograms is successfully obtained.
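The data-reduction step can be illustrated with a cubic-spline rebuild of a down-sampled signal. This is a generic sketch of the technique, not the authors' algorithm; the two overlapping Gaussian peaks are synthetic:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic overlapping chromatogram: two Gaussian peaks on a dense grid.
t = np.linspace(0, 10, 1001)
signal = np.exp(-((t - 4.0) / 0.8) ** 2) + 0.6 * np.exp(-((t - 5.5) / 0.8) ** 2)

# Data reduction: keep every 10th point, then rebuild with a cubic spline.
keep = slice(None, None, 10)
spline = CubicSpline(t[keep], signal[keep])
rebuilt = spline(t)

# The spline tracks the smooth peak shapes closely, so the point count
# (and with it the oscillation-prone degrees of freedom) drops tenfold.
max_err = np.max(np.abs(rebuilt - signal))
```

For a smooth signal like this, the reconstruction error stays far below typical chromatographic noise.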

  15. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place on this subject. This article introduces an empirical investigation which demonstrates the logic of combining methodologies, as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotic discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they serve to support, make sense of, and give meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  16. A method of quantitative prediction for sandstone type uranium deposit in Russia and its application

    International Nuclear Information System (INIS)

    Chang Shushuai; Jiang Minzhong; Li Xiaolu

    2008-01-01

The paper presents the foundational principle of quantitative prediction for sandstone-type uranium deposits in Russia. Some key methods, such as construction of the physical-mathematical model and deposit prediction, are described. The method has been applied to deposit prediction in the Dahongshan region of the Chaoshui basin. It is concluded that the technique can strengthen the methodology of quantitative prediction for sandstone-type uranium deposits, and it could be used as a new technique in China. (authors)

  17. Biological characteristics of crucian by quantitative inspection method

    Science.gov (United States)

    Chu, Mengqi

    2015-04-01

The biological characteristics of the crucian carp were preliminarily investigated by a quantitative inspection method. The crucian carp (Carassius auratus, family Cyprinidae, order Cypriniformes) is a mainly plant-eating omnivorous fish that is gregarious and shows selection and ranking behavior. Crucian carp are widely distributed, and perennial waters all over the country support their production. Indicators were determined in the experiment to characterize the growth and reproduction of crucian carp in the study area. The measured data (scale length, scale size, annulus diameter, and so on) and the related functions were used to calculate the growth of crucian carp in any given year. Egg shape, color, weight, etc. were used to determine maturity, and the mean egg diameter (per 20 eggs) and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish. The measured crucian carp were females at puberty. From the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530 + 3.0649. The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with pituitary development. Quantitative inspection of the bait organisms ingested by crucian carp reveals its main, secondary, and incidental foods, and shows the degree to which crucian carp prefer the various bait organisms. Fish fecundity increases with weight gain; it is characteristic of species and populations, and is at the same time influenced by individual age, body length, body weight, environmental conditions (especially nutrition), breeding habits, spawning frequency, and egg size. This series of studies of the biological characters of crucian carp provides an ecological basis for local crucian carp feeding and breeding

  18. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    Science.gov (United States)

    Rong, Xing; Frey, Eric C

    2013-08-01

Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from what is typical for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume-of-interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, a more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). To make the optimization results for quantitative 90Y bremsstrahlung SPECT more
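The figure of merit described above, an inverse-mass-weighted RMSE that combines bias and variance of the VOI activity estimates, can be sketched as follows. All activities, masses, and estimates below are illustrative numbers, not the paper's data:

```python
import numpy as np

# Sketch of an inverse-mass-weighted RMSE figure of merit.
true_activity = np.array([10.0, 25.0, 5.0])   # true activity per VOI (assumed units)
voi_mass = np.array([0.5, 2.0, 0.25])         # mass per VOI (assumed)
estimates = np.array([[9.0, 24.0, 5.5],       # repeated noisy activity estimates,
                      [11.0, 26.5, 4.5],      # one row per noise realization
                      [10.5, 23.5, 5.2]])

bias = estimates.mean(axis=0) - true_activity
var = estimates.var(axis=0)
mse_per_voi = bias ** 2 + var                 # MSE = bias^2 + variance

weights = 1.0 / voi_mass                      # inverse-mass weighting for dosimetry
fom = np.sqrt(np.sum(weights * mse_per_voi) / np.sum(weights))
```

A collimator design that lowers this FOM trades off sensitivity (variance) against resolution and septal penetration (bias) in a single scalar.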

  19. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  20. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    International Nuclear Information System (INIS)

    Chen, Q G; Xu, Y; Zhu, H H; Chen, H; Lin, B

    2015-01-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565–750 nm. The spectral parameter, defined as the ratio of wavebands at 565–750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66–1.06, 1.06–1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems. (paper)
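The quoted grading thresholds translate directly into a classification rule. A minimal sketch, in which the function name and the sample RGB values are our own illustrations:

```python
# Grade caries tissue from the R/(G + B) image component ratio, using the
# cut-offs reported above: <0.66 sound, 0.66-1.06 early decay,
# 1.06-1.62 established decay, >1.62 severe decay.
def caries_grade(r, g, b):
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    elif ratio < 1.06:
        return "early decay"
    elif ratio <= 1.62:
        return "established decay"
    return "severe decay"

grade = caries_grade(120, 90, 80)   # ratio ~0.71 -> "early decay"
```

In the imaging system, the ratio would be computed per pixel (or per region) of the fluorescence color image before applying this rule.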

  1. A simple economic and quantitative method for the removal of uranium from Gogi mine water using powdered red brick

    International Nuclear Information System (INIS)

    Nathan, Usha; Cyriac, Bincy; Hegde, G.N.; Premadas, A.; Rai, A.K.

    2011-01-01

A simple and economical method for the removal of uranium from Gogi mine water, using powdered red brick as an adsorbent, is discussed. Preliminary studies on the removal of uranium with brick showed encouraging results. Further studies were carried out to determine the amount and particle size of brick needed for quantitative removal of uranium. These studies showed that 50 g of 10-mesh brick was enough to remove uranium quantitatively from 100 ml of mine water containing 1800 μg/L of uranium. The column studies, however, indicated a considerable decrease (to ∼5 g per 100 ml of mine water) in the amount of brick required to remove uranium from 100 ml of mine water

  2. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with internal standard method.

    Science.gov (United States)

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (by proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including the relaxation delay, the number of scans, and the pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for compound-specific reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
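Internal-standard qNMR assays of this kind rest on the standard purity relation, in which the analyte purity follows from the integral ratio, proton counts, molar masses, and weighed masses of analyte and standard. The sketch below uses round illustrative numbers, not the paper's octreotide data:

```python
# Standard internal-standard qNMR relation:
#   P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (m_std/m_a) * P_std
# I: signal integrals, N: protons per quantified signal, M: molar masses,
# m: weighed masses, P_std: purity of the internal standard.
def qnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std):
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

# Round-number example: twice the integral from twice the protons, equal
# molar masses and weighed masses -> the standard's purity is recovered.
p = qnmr_purity(I_a=2.0, I_std=1.0, N_a=2, N_std=1,
                M_a=100.0, M_std=100.0, m_a=10.0, m_std=10.0, P_std=0.999)
```

The choice of well-isolated quantitative signals (here ensured by deuterium exchange) is what makes the integrals in this relation reliable.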

  3. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  4. Informatics methods to enable sharing of quantitative imaging research data.

    Science.gov (United States)

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing of data and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electroceramic material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, passing through their origins, using the least squares method
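The slopes-ratio calculation can be sketched with synthetic data: a through-origin least-squares slope is b = Σxy / Σx², and the weight fraction follows from the ratio of the analysis-line slope to the reference-line slope. The data and the interpretation of the axes below are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

# Through-origin least-squares slope: b = sum(x*y) / sum(x*x).
def slope_through_origin(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sum(x * y) / np.sum(x * x)

# Synthetic example: the "analysis" line comes from the sample containing
# the phase of interest, the "reference" line from the pure phase.
x = np.array([0.1, 0.2, 0.3, 0.4])          # internal-standard axis (assumed)
analysis = slope_through_origin(x, 0.25 * x)   # sample measurements
reference = slope_through_origin(x, 1.00 * x)  # pure-phase calibration

weight_fraction = analysis / reference       # -> 0.25
```

Forcing both fits through the origin is what makes the ratio of slopes, rather than any intercept, carry the phase fraction.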

  6. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    Science.gov (United States)

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which can connect these molecular data with clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts remain largely non-quantitative, and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which the relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, each nucleus can be measured through image processing techniques, and each nucleus has its own measurable features, such as nucleus size, roundness, contour length, and intra-nucleus texture data (GLCM is one of the methods). In our cell-level analogue, each nucleus in the tissue image plays the role of one pixel. In this approach the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. Pleomorphism and heterogeneity are then determined quantitatively for each image. Because in our method one pixel corresponds to one nucleus feature, we named it the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features, and CFLCM proved to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological images.
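The pixel-level GLCM on which the cell-level variant is modeled can be sketched in a few lines. The image, the number of levels, and the choice of a single right-hand-neighbor offset are illustrative; the cell-level version would replace pixels with nuclei and intensity levels with binned nucleus features:

```python
import numpy as np

# Gray-level co-occurrence matrix for the "right-hand neighbor" offset,
# normalized to joint probabilities, plus one Haralick feature (contrast).
def glcm_right(image, levels):
    P = np.zeros((levels, levels))
    for row in image:
        for a, b in zip(row[:-1], row[1:]):
            P[a, b] += 1
    return P / P.sum()

def haralick_contrast(P):
    i, j = np.indices(P.shape)
    return float(np.sum(((i - j) ** 2) * P))

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)

P = glcm_right(img, levels=4)
contrast = haralick_contrast(P)   # large when neighboring levels differ a lot
```

In CFLCM the same matrix-and-feature machinery is applied, but the pairs counted are neighboring nuclei and their quantized feature levels instead of adjacent pixels.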

  7. Investigation of Stress Corrosion Cracking in Magnesium Alloys by Quantitative Fractography Methods

    Directory of Open Access Journals (Sweden)

    Sozańska M.

    2017-06-01

Full Text Available The article shows that the use of quantitative fracture description may lead to significant progress in research on the phenomenon of stress corrosion cracking of the WE43 magnesium alloy. Tests were carried out on samples in air and after hydrogenation in 0.1 M Na2SO4 with cathodic polarization. Fracture surfaces were analyzed after different variants of the Slow Strain Rate Test. It was demonstrated that the parameters for quantitative evaluation of fracture surface microcracks can be closely linked with the susceptibility of the WE43 magnesium alloy operating under a complex mechanical load in corrosive environments. The final result of the study was the determination of the quantitative relationship between the Slow Strain Rate Test parameters, the mechanical properties, and the parameters of the quantitative evaluation of the fracture surface (microcracks).

  8. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    Science.gov (United States)

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains, isolated from urine samples of patients in the clinics of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz between 2011 and 2012, was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains lacked this ability. It was shown that P. mirabilis rods are able to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  9. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    Directory of Open Access Journals (Sweden)

    Joanna Kwiecinska-Piróg

    2014-12-01

Full Text Available The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains, isolated from urine samples of patients in the clinics of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz between 2011 and 2012, was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains lacked this ability. It was shown that P. mirabilis rods are able to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  10. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    Science.gov (United States)

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  11. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  12. Conductance method for quantitative determination of Photobacterium phosphoreum in fish products

    DEFF Research Database (Denmark)

    Dalgaard, Paw; Mejlholm, Ole; Huss, Hans Henrik

    1996-01-01

    This paper presents the development of a sensitive and selective conductance method for quantitative determination of Photobacterium phosphoreum in fresh fish. A calibration curve with a correlation coefficient of -0.981 was established from conductance detection times (DT) for estimation of cell...

  13. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    Science.gov (United States)

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, there are few published comparative data. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation across assays was seen with clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
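
    The pairwise quantitative agreement reported above (R² of 0.83-0.95) can be reproduced for any two assays with a short log-log regression. The sketch below uses NumPy; the paired log10 viral loads are invented for illustration and are not data from the study.

    ```python
    import numpy as np

    def pairwise_r2(log_a, log_b):
        """Coefficient of determination from regressing assay B on assay A
        (viral loads in log10 copies/mL)."""
        a = np.asarray(log_a, dtype=float)
        b = np.asarray(log_b, dtype=float)
        slope, intercept = np.polyfit(a, b, 1)
        pred = slope * a + intercept
        ss_res = np.sum((b - pred) ** 2)
        ss_tot = np.sum((b - b.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    # Hypothetical paired results from two assays (log10 copies/mL):
    assay1 = [2.1, 3.0, 3.8, 4.5, 5.2, 6.1]
    assay2 = [2.3, 2.9, 3.9, 4.4, 5.4, 6.0]
    print(round(pairwise_r2(assay1, assay2), 3))
    ```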

  14. Deep neural nets as a method for quantitative structure-activity relationships.

    Science.gov (United States)

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting, etc.), they were superseded by more robust methods like support vector machine (SVM) and random forest (RF), which arose in the early 2000s. The last 10 years has witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advancements in computer hardware. In particular, deep neural nets (DNNs), i.e. neural nets with more than one hidden layer, have found great successes in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large diverse QSAR data sets that are taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphical processing units (GPUs) can make this issue manageable.

  15. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods of quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, peeling-off of overlapping peaks, external radiation, and spectral aberration. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research

  16. A quantitative method for risk assessment of agriculture due to climate change

    Science.gov (United States)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks owing to its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help us actively deal with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated with the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5% and 64.6%, respectively, for the maximum temperature increase (88.3%), and the corresponding risk was 2.2%. The maximum yield decrease and its probability were 14.1% and 56.1%, respectively, for the maximum precipitation decrease (35.2%), and the corresponding risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6% and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
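
    Risk as the product of the degree of loss and its probability of occurrence reduces to a one-line calculation. The sketch below reproduces the three risk figures quoted in the abstract from their loss/probability pairs, assuming both inputs are expressed in percent.

    ```python
    def climate_risk(loss_pct, probability_pct):
        """Risk as the product of the degree of loss and its probability of
        occurrence, both in percent (returns a risk in percent)."""
        return loss_pct * probability_pct / 100.0

    # Figures from the Wuchuan County spring-wheat example in the abstract:
    print(climate_risk(3.5, 64.6))   # temperature-driven yield decrease, ~2.2%
    print(climate_risk(14.1, 56.1))  # precipitation-driven yield decrease, ~7.9%
    print(climate_risk(17.6, 53.4))  # combined effect, ~9.4%
    ```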

  17. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    NARCIS (Netherlands)

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from

  18. Application of quantitative real-time PCR compared to filtration methods for the enumeration of Escherichia coli in surface waters within Vietnam.

    Science.gov (United States)

    Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W

    2017-02-01

    Surface water samples in Vietnam were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, Vietnam, and were processed to enumerate Escherichia coli. Quantification was done through membrane filtration and quantitative real-time polymerase chain reaction (PCR). Mean log colony-forming unit (CFU)/100 ml E. coli counts in the dry season for river/suburban canals and urban canals were log 2.8 and 3.7, respectively, using a membrane filtration method, while using Taqman quantitative real-time PCR they were log 2.4 and 2.8 for river/suburban canals and urban canals, respectively. For the wet season, mean counts determined by the membrane filtration method in river/suburban canal and urban canal samples were log 3.7 and 4.1, respectively, while mean log CFU/100 ml counts using quantitative PCR were log 3 and 2, respectively. Additionally, the quantitative PCR counts for the urban canal samples in the wet season were significantly lower than those determined by conventional culture methods. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application and it may be impacted by sources of runoff based on surveyed samples.

  19. A method of quantitative risk assessment for transmission pipeline carrying natural gas

    International Nuclear Information System (INIS)

    Jo, Young-Do; Ahn, Bum Jong

    2005-01-01

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment of natural gas pipelines and introduces the parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily from the pipeline geometry and the population density held in a Geographic Information System (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of the European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria for individual risk taken into account, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and for modification of a buried pipeline.
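
    The fatal length, as the fatality integrated along the pipeline, can be sketched as a numerical integral. The lethality profile below is purely hypothetical (the paper derives it from release and consequence models); the sketch only shows the integration step.

    ```python
    import numpy as np

    def fatal_length(lethality, x_min, x_max, n=10001):
        """Numerically integrate a per-unit-length fatality probability along
        the pipeline axis to obtain the fatal length (same units as x)."""
        x = np.linspace(x_min, x_max, n)
        return np.trapz(lethality(x), x)

    # Hypothetical lethality profile: certain death within 30 m of the accident
    # point, decaying linearly to zero at 100 m (illustrative only).
    def lethality(x):
        d = np.abs(x)
        return np.clip((100.0 - d) / 70.0, 0.0, 1.0)

    print(fatal_length(lethality, -200.0, 200.0))  # ~130 m for this profile
    ```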

  20. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches related to the issue of airport location are presented in this article, i.e. the question of optimizing such a choice and the issue of selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task, while the latter involves ranking the possible variants. Given their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one that currently has practical application. Results: Based on real-life examples, the authors present a multi-stage procedure, which renders it possible to solve the problem of airport location. Conclusions: Based on the overview of the literature on the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.
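
    Selecting a location from a predefined set by ranking can be illustrated with a simple weighted-sum score. The candidate sites, criteria, and weights below are invented for illustration; real studies use richer multi-criteria methods.

    ```python
    # Hypothetical criteria scores (0-1, higher is better) for three candidate
    # sites, and illustrative weights; a weighted-sum ranking of the kind used
    # when choosing an airport location from a predefined set.
    candidates = {
        "Site A": {"accessibility": 0.8, "land_cost": 0.4, "noise_impact": 0.6},
        "Site B": {"accessibility": 0.6, "land_cost": 0.9, "noise_impact": 0.7},
        "Site C": {"accessibility": 0.9, "land_cost": 0.5, "noise_impact": 0.3},
    }
    weights = {"accessibility": 0.5, "land_cost": 0.3, "noise_impact": 0.2}

    def score(site):
        """Weighted sum of a site's criterion scores."""
        return sum(weights[c] * v for c, v in candidates[site].items())

    ranking = sorted(candidates, key=score, reverse=True)
    print(ranking)  # best-to-worst order of the candidate sites
    ```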

  1. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage, 320 pages, £29, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  2. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review ☆

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  3. Comparison of two methods of quantitation in human studies of biodistribution and radiation dosimetry

    International Nuclear Information System (INIS)

    Smith, T.

    1992-01-01

    A simple method of quantitating organ radioactivity content for dosimetry purposes, based on relationships between organ count rate and the initial whole body count rate, has been compared with a more rigorous method of absolute quantitation using a transmission scanning technique. Comparisons were on the basis of organ uptake (% administered activity) and resultant organ radiation doses (mGy MBq^-1) in 6 normal male volunteers given a 99Tcm-labelled myocardial perfusion imaging agent intravenously at rest and following exercise. In these studies, estimates of individual organ uptakes by the simple method were in error by between +24 and -16% compared with the more accurate method. However, errors on organ dose values were somewhat less and the effective dose was correct to within 3%. (Author)
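
    The simple method described, estimating organ uptake from the ratio of organ count rate to the initial whole-body count rate, amounts to a single ratio; the count rates below are invented for illustration.

    ```python
    def organ_uptake_percent(organ_count_rate, initial_whole_body_count_rate):
        """Simple-method estimate of organ uptake as a percentage of the
        administered activity, from gamma-camera count rates."""
        return 100.0 * organ_count_rate / initial_whole_body_count_rate

    # Hypothetical count rates (counts per second):
    print(organ_uptake_percent(1500.0, 30000.0))  # → 5.0 (% administered activity)
    ```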

  4. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    Science.gov (United States)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the EOL stage of its lifecycle management. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tools' remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human thinking. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  5. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    Science.gov (United States)

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
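
    Weight-based GMO content in this family of methods is typically obtained from the event/endogenous copy-number ratio divided by a conversion factor. The sketch below assumes that convention; the copy numbers and Cf value are illustrative, not the validated LY038 figures.

    ```python
    def gmo_content_percent(event_copies, endogenous_copies, cf):
        """Weight-based GMO content (%) from real-time PCR copy numbers.
        cf is the conversion factor: the event/endogenous copy ratio
        measured for 100% GM material (value here is illustrative)."""
        return (event_copies / endogenous_copies) / cf * 100.0

    # Illustrative numbers only (the validated Cf for LY038 is given in the paper):
    print(round(gmo_content_percent(520.0, 100000.0, 0.52), 2))  # → 1.0 (% GMO)
    ```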

  6. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and development of optimal microstructure design and fabrication. Three dimensional microstructure characterization in particular holds great promise...... for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize...... the microstructure. The methods are exemplied by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment...

  7. A comparison of visual and quantitative methods to identify interstitial lung abnormalities

    OpenAIRE

    Kliment, Corrine R.; Araki, Tetsuro; Doyle, Tracy J.; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C.; Zazueta, Oscar E.; Fernandez, Isis E.; Nishino, Mizuki; Okajima, Yuka; Ross, James C.; Estépar, Raúl San José; Diaz, Alejandro A.; Lederer, David J.; Schwartz, David A.

    2015-01-01

    Background: Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density met...

  8. [Research on rapid and quantitative detection method for organophosphorus pesticide residue].

    Science.gov (United States)

    Sun, Yuan-Xin; Chen, Bing-Tai; Yi, Sen; Sun, Ming

    2014-05-01

    ... The above-mentioned experimental results show that the proposed method is effective and feasible for rapid quantitative detection of organophosphorus pesticide residues. In this method, the information in the full spectrum, especially the UV-Vis spectrum, is strengthened by the chromogenic reaction of a colorimetric reagent, which provides a new way of rapidly detecting pesticide residues in agricultural products in the future.

  9. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
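
    A minimal version of DE-based wavelength selection can be sketched as evolving a continuous vector that is thresholded into a wavelength-inclusion mask, with ordinary least squares standing in for the quantitative model. Everything below (the synthetic "spectra", fitness function, and DE settings) is illustrative, not the paper's configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(mask, X, y):
        """Training RMSE of ordinary least squares restricted to the
        selected wavelengths (columns of X where mask > 0.5)."""
        cols = mask > 0.5
        if not cols.any():
            return np.inf
        A = X[:, cols]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.sqrt(np.mean((y - A @ coef) ** 2)))

    def de_select(X, y, pop_size=30, n_gen=60, F=0.8, CR=0.9):
        """Minimal DE/rand/1/bin over a continuous [0, 1] vector that is
        thresholded into a wavelength-inclusion mask."""
        n = X.shape[1]
        pop = rng.random((pop_size, n))
        scores = np.array([fitness(p, X, y) for p in pop])
        for _ in range(n_gen):
            for i in range(pop_size):
                a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
                trial = np.where(rng.random(n) < CR, a + F * (b - c), pop[i])
                trial = np.clip(trial, 0.0, 1.0)
                s = fitness(trial, X, y)
                if s < scores[i]:  # greedy selection
                    pop[i], scores[i] = trial, s
        best = pop[np.argmin(scores)]
        return best > 0.5, float(scores.min())

    # Synthetic "spectra": only wavelengths 2 and 7 carry the analyte signal.
    X = rng.normal(size=(40, 10))
    y = 3.0 * X[:, 2] - 2.0 * X[:, 7] + 0.05 * rng.normal(size=40)
    mask, rmse = de_select(X, y)
    print(mask[2], mask[7], rmse)
    ```

    With the informative wavelengths included, the residual error falls to roughly the noise level, which is the signal the DE search exploits.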

  10. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    Science.gov (United States)

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  11. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    Science.gov (United States)

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  12. Clustering and training set selection methods for improving the accuracy of quantitative laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    Anderson, Ryan B.; Bell, James F.; Wiens, Roger C.; Morris, Richard V.; Clegg, Samuel M.

    2012-01-01

    We investigated five clustering and training set selection methods to improve the accuracy of quantitative chemical analysis of geologic samples by laser induced breakdown spectroscopy (LIBS) using partial least squares (PLS) regression. The LIBS spectra were previously acquired for 195 rock slabs and 31 pressed powder geostandards under 7 Torr CO2 at a stand-off distance of 7 m at 17 mJ per pulse to simulate the operational conditions of the ChemCam LIBS instrument on the Mars Science Laboratory Curiosity rover. The clustering and training set selection methods, which do not require prior knowledge of the chemical composition of the test-set samples, are based on grouping similar spectra and selecting appropriate training spectra for the partial least squares (PLS2) model. These methods were: (1) hierarchical clustering of the full set of training spectra and selection of a subset for use in training; (2) k-means clustering of all spectra and generation of PLS2 models based on the training samples within each cluster; (3) iterative use of PLS2 to predict sample composition and k-means clustering of the predicted compositions to subdivide the groups of spectra; (4) soft independent modeling of class analogy (SIMCA) classification of spectra, and generation of PLS2 models based on the training samples within each class; (5) use of the Bayesian information criterion (BIC) to determine an optimal number of clusters and generation of PLS2 models based on the training samples within each cluster. The iterative method and the k-means method using 5 clusters showed the best performance, improving the absolute quadrature root mean squared error (RMSE) by ∼ 3 wt.%. The statistical significance of these improvements was ∼ 85%. Our results show that although clustering methods can modestly improve results, a large and diverse training set is the most reliable way to improve the accuracy of quantitative LIBS. In particular, additional sulfate standards and specifically
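
    Method (2), clustering the spectra and building one regression model per cluster, can be sketched as follows. Plain k-means plus ordinary least squares stand in here for the paper's PLS2 regression, and the synthetic data are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def kmeans(X, k, n_iter=50, init=None):
        """Plain Lloyd's k-means; returns labels and centroids. `init` gives
        the indices of the rows used as starting centroids."""
        idx = np.asarray(init) if init is not None else rng.choice(len(X), k, replace=False)
        centroids = X[idx].astype(float)
        for _ in range(n_iter):
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if (labels == j).any():
                    centroids[j] = X[labels == j].mean(axis=0)
        return labels, centroids

    # Synthetic stand-in for LIBS spectra: two chemically distinct groups, each
    # with its own spectrum-to-composition relationship.
    n, p = 60, 5
    X = np.vstack([rng.normal(0.0, 1.0, (n, p)), rng.normal(4.0, 1.0, (n, p))])
    w1, w2 = rng.normal(size=p), rng.normal(size=p)
    y = np.concatenate([X[:n] @ w1, X[n:] @ w2])

    labels, centroids = kmeans(X, 2, init=[0, len(X) - 1])
    models = {j: np.linalg.lstsq(X[labels == j], y[labels == j], rcond=None)[0]
              for j in np.unique(labels)}

    # Predict a held-out "spectrum" with the model of its nearest cluster:
    x_new = rng.normal(4.0, 1.0, p)
    j = int(np.linalg.norm(centroids - x_new, axis=1).argmin())
    print(float(x_new @ models[j]))
    ```

    Because each cluster's model is fit only on spectra similar to the unknown, the per-cluster model can capture a relationship that a single global model would blur.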

  13. A REVIEW OF QUANTITATIVE METHODS FOR STUDIES OF MINERAL-CONTENT OF INTRAORAL INCIPIENT CARIES LESIONS

    NARCIS (Netherlands)

    ten Bosch, J.J.; Angmar-Månsson, B.

    Modern prospective caries studies require the measurement of small changes in tooth mineral content. Quantitative measurement of changes in mineral content in a single caries lesion is desirable. Quantitative methods can be either destructive or non-destructive. The latter type permits longitudinal

  14. SOCIOLOGICAL MEDIA: MAXIMIZING STUDENT INTEREST IN QUANTITATIVE METHODS VIA COLLABORATIVE USE OF DIGITAL MEDIA

    Directory of Open Access Journals (Sweden)

    Frederick T. Tucker

    2016-10-01

    Full Text Available College sociology lecturers are tasked with inspiring student interest in quantitative methods despite widespread student anxiety about the subject, and a tendency for students to relieve classroom anxiety through habitual web browsing. In this paper, the author details the results of a pedagogical program whereby students at a New York City community college used industry-standard software to design, conduct, and analyze sociological surveys of one another, with the aim of inspiring student interest in quantitative methods and enhancing technical literacy. A chi-square test of independence was performed to determine the effect of the pedagogical process on the students' ability to discuss sociological methods unrelated to their surveys in their final papers, compared with the author's students from the previous semester who did not undergo the pedagogical program. The relation between these variables was significant, χ2(3, N = 36) = 9.8, p = .02. Findings suggest that community college students, under lecturer supervision, with minimal prior statistical knowledge, and access to digital media can collaborate in small groups to create and conduct sociological surveys, and discuss methods and results in limited classroom time. College sociology lecturers, instead of combatting student desire to use digital media, should harness this desire to advance student mastery of quantitative methods.
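
    The reported statistic can be checked directly: for 3 degrees of freedom the chi-square survival function has a closed form, and evaluating it at 9.8 yields a p-value near .02, consistent with the abstract.

    ```python
    import math

    def chi2_sf_df3(x):
        """Survival function (p-value) of the chi-square distribution with
        3 degrees of freedom, via its closed form."""
        return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

    print(round(chi2_sf_df3(9.8), 3))  # p-value for chi2(3) = 9.8
    ```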

  15. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strong dielectric ceramic having piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since their development, various electronic parts utilizing piezoelectric characteristics have been put to practical use. These characteristics can be set by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium creates metal-ion vacancies in the crystals; these vacancies ease the movement of domain walls within crystal grains and increase resistivity. Accordingly, it is necessary to accurately determine the niobium content for research and development, quality control, and process control. Existing quantitative analysis methods for niobium each have drawbacks, so the authors examined the quantitative analysis of niobium in PZT by using an inductively coupled plasma emission spectro-analysis apparatus, which has developed remarkably in recent years. As a result, a method was established in which the specimen is dissolved with hydrochloric and hydrofluoric acids, unstable lead is masked with disodium ethylenediaminetetraacetate, and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  16. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    Energy Technology Data Exchange (ETDEWEB)

    Omata, Noriyasu; Shirayama, Susumu, E-mail: omata@nakl.t.u-tokyo.ac.jp, E-mail: sirayama@sys.t.u-tokyo.ac.jp [Department of Systems Innovation, School of Engineering, The University of Tokyo, Hongo 7-3-1, Bunkyo-ku, Tokyo, 113-8656 (Japan)

    2017-10-15

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  17. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    International Nuclear Information System (INIS)

    Omata, Noriyasu; Shirayama, Susumu

    2017-01-01

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  18. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    Science.gov (United States)

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
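The relationship between a ppm-based mass-extraction-window and the resulting m/z bounds of an extracted-ion chromatogram is simple arithmetic. The paper's own equations are not reproduced here; the m/z value and the 10 ppm window below are hypothetical, and the window is taken as a full width (some software instead specifies a ± ppm tolerance):

```python
def mew_bounds(mz, mew_ppm):
    """m/z extraction bounds for a target, given a MEW expressed in ppm.

    The MEW is treated as the *total* window width, split symmetrically
    around the target m/z (an assumption; conventions vary by vendor).
    """
    half = mz * mew_ppm / 1e6 / 2
    return mz - half, mz + half

# Illustrative: a 10 ppm total window around m/z 500.2500
lo, hi = mew_bounds(500.2500, 10)
print(f"{lo:.4f} - {hi:.4f}")
```

At the experimentally determined MEW values reported (5.6 to 16.5 ppm), the same function shows how the absolute window in Da scales linearly with the target m/z.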

  19. Performance of a new quantitative method for assessing dural ectasia in patients with FBN1 mutations and clinical features of Marfan syndrome

    International Nuclear Information System (INIS)

    Soeylen, Bahar; Schmidtke, Joerg; Arslan-Kirchner, Mine; Hinz, Kerstin; Prokein, Jana; Becker, Hartmut

    2009-01-01

This study presents a comparison of established methods for measuring dural ectasia with a new quantitative method of assessing this clinical feature. Seventeen patients with an identified mutation in FBN1 were examined for dural ectasia. The results were compared with 17 age- and sex-matched controls. Our images were also evaluated using the two methods of quantifying dural ectasia, namely those of Ahn et al. and of Oosterhof et al. With our method, 80% of MFS1 patients and 7% of controls fulfilled the criterion for dural ectasia. Using the method of Oosterhof et al., dural ectasia was found in 88% of patients with MFS1 and in 47% of controls. Using the method of Ahn et al., 76% of patients with Marfan syndrome and 29% of controls showed dural ectasia. We present a novel quantitative method of evaluating MRT images for dural ectasia which, in our own patient cohort, performed better than those previously described. (orig.)

  20. Performance of a new quantitative method for assessing dural ectasia in patients with FBN1 mutations and clinical features of Marfan syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Soeylen, Bahar; Schmidtke, Joerg; Arslan-Kirchner, Mine [Hannover Medical School, Institute of Human Genetics, Hannover (Germany); Hinz, Kerstin [Hannover Medical School, Institute of Diagnostic and Interventional Neuroradiology, Hannover (Germany); Vivantes Klinikum Neukoelln, Institut fuer Radiologie und Interventionelle Therapie, Berlin (Germany); Prokein, Jana [Hannover Medical School, Institute for Biometrics, Hannover (Germany); Becker, Hartmut [Hannover Medical School, Institute of Diagnostic and Interventional Neuroradiology, Hannover (Germany)

    2009-06-15

This study presents a comparison of established methods for measuring dural ectasia with a new quantitative method of assessing this clinical feature. Seventeen patients with an identified mutation in FBN1 were examined for dural ectasia. The results were compared with 17 age- and sex-matched controls. Our images were also evaluated using the two methods of quantifying dural ectasia, namely those of Ahn et al. and of Oosterhof et al. With our method, 80% of MFS1 patients and 7% of controls fulfilled the criterion for dural ectasia. Using the method of Oosterhof et al., dural ectasia was found in 88% of patients with MFS1 and in 47% of controls. Using the method of Ahn et al., 76% of patients with Marfan syndrome and 29% of controls showed dural ectasia. We present a novel quantitative method of evaluating MRT images for dural ectasia which, in our own patient cohort, performed better than those previously described. (orig.)

  1. A quantitative method to determine the orientation of collagen fibers in the dermis

    NARCIS (Netherlands)

    Noorlander, Maril L.; Melis, Paris; Jonker, Ard; van Noorden, Cornelis J. F.

    2002-01-01

We have developed a quantitative microscopic method to determine changes in the orientation of collagen fibers in the dermis resulting from mechanical stress. The method is based on the use of picrosirius red-stained cryostat sections of piglet skin in which collagen fibers reflect light strongly…

  2. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol…

  3. Method of quantitative analysis of superconducting metal-containing composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

A technique for quantitative analysis of superconducting metal-containing composite materials, SnO2-InSn, WO3-InW and ZnO-InZn in particular, has been developed. The method of determining the metal content in a composite is based on the dependence of the superconducting transition temperature on alloy composition. Sensitivity of temperature determination: 0.02 K; error of analysis for the InSn system: 0.5%

  4. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    Science.gov (United States)

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  5. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This…

  6. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    Science.gov (United States)

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than those estimated by the vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  7. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    Science.gov (United States)

    Neumann, Karl

    1987-06-01

In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, the research has generally to start out from complex reciprocal social interactions instead of unambiguous connections of causes. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.

  8. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors.

    Science.gov (United States)

    Cox, Annie M; Goodwin, Kelly D

    2013-08-15

    The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications. Published by Elsevier Ltd.
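One hedged way to see why variable DNA recovery matters for qPCR target quantification: copy numbers are back-calculated from a Cq through a standard curve, so any loss during extraction shifts the Cq upward and deflates the apparent count. The curve parameters and Cq value below are invented for illustration and are not from the study:

```python
def copies_from_cq(cq, intercept=38.0, slope=-3.32):
    """Back-calculate target copies from a Cq via an assumed standard curve
    (Cq = intercept + slope * log10(copies); parameters are illustrative).
    A slope of -3.32 corresponds to ~100% amplification efficiency."""
    return 10 ** ((cq - intercept) / slope)

def recovery_percent(cq_measured, copies_spiked):
    """Apparent extraction recovery: measured copies over spiked copies."""
    return 100.0 * copies_from_cq(cq_measured) / copies_spiked

# Hypothetical spike of 1e4 copies measured at Cq 25.7
r = recovery_percent(25.7, 1e4)
print(round(r, 1))  # roughly half of the spiked input recovered
```

The same arithmetic explains the paper's conclusion: unless recovery and inhibition are assessed and corrected for, the reported target quantity is only a lower bound.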

  9. Advantages of a Dynamic RGGG Method in Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2009-01-01

Various researches have been conducted in order to analyze dynamic interactions among components and process variables in nuclear power plants which cannot be handled by static reliability analysis methods such as conventional fault tree and event tree techniques. A dynamic reliability graph with general gates (RGGG) method was proposed for intuitive modeling of dynamic systems, enabling one to easily analyze huge and complex systems. In this paper, advantages of the dynamic RGGG method are assessed through two stages: system modeling and quantitative analysis. A software tool for the dynamic RGGG method is then introduced, and an application to a real dynamic system is presented

  10. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.
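The kind of aggregation SAPM performs with spreadsheet software can be sketched in a few lines. The scheme below assumes, hypothetically, that each fisher distributes 100 priority points over the grid cells they fish; the cell ids and scores are invented:

```python
from collections import defaultdict

# Hypothetical questionnaire output: one dict per fisher,
# mapping grid-cell id -> priority points (each fisher allocates 100).
responses = [
    {"A1": 60, "A2": 40},
    {"A1": 20, "B1": 80},
    {"A2": 50, "B1": 50},
]

totals = defaultdict(float)
for r in responses:
    for cell, pts in r.items():
        totals[cell] += pts

# Normalise so the whole map sums to 1, making maps from
# fleets of different sizes comparable.
grand = sum(totals.values())
priority = {cell: round(v / grand, 3) for cell, v in sorted(totals.items())}
print(priority)
```

The normalised per-cell scores are exactly the sort of quantitative layer that can be fed into planning software such as MARXAN to weigh management options against fishing-industry impact.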

  11. A method for three-dimensional quantitative observation of the microstructure of biological samples

    Science.gov (United States)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now try to study the mechanisms of all kinds of biological phenomena at the microcosmic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy, which excels in its low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with promising results.

  12. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    Science.gov (United States)

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal referring substances to investigate the influences of chemical structure, concentrations of quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (linear regression method), which was demonstrated to decrease the standard method differences of the QAMS method from 1.20%±0.02% - 23.29%±3.23% to 0.10%±0.09% - 8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and mostly below 10% for Rf, Rg2, R4 and N-K (the differences for these 4 constituents were bigger because their contents were lower) in all 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. The present study demonstrated the practicability of the application of the QAMS method for the quantitative analysis of multi-component with single marker.
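The linear-regression route to a relative correction factor (RCF) can be sketched as follows. The calibration data are invented, and plain ordinary least squares stands in for whatever regression variant the authors used; the point is only that the RCF is a ratio of calibration slopes, letting one analyte be quantified through another's calibration:

```python
def ols_fit(xs, ys):
    """Least-squares slope and intercept of a simple linear calibration."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative calibration data: peak area vs concentration (mg/mL)
conc = [0.1, 0.2, 0.4, 0.8]
area_marker = [105, 210, 398, 805]   # internal referring substance
area_analyte = [52, 99, 204, 401]    # analyte quantified via the marker

k_m, b_m = ols_fit(conc, area_marker)
k_a, b_a = ols_fit(conc, area_analyte)
rcf = k_a / k_m   # relative correction factor from the two regression slopes

# Quantify the analyte in a sample using the marker's slope scaled by the RCF
sample_area_analyte = 150.0
c_analyte = (sample_area_analyte - b_a) / (rcf * k_m)
print(round(rcf, 3), round(c_analyte, 3))
```

Because rcf * k_m equals k_a by construction, the example also shows why the accuracy of a QAMS result hinges on how well the regression captures each analyte's response, which is the parameter the study found most influential.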

  13. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    Science.gov (United States)

    Mansker, Linda Denise

This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted hundreds of hours at elevated pressures, using a new quantitative x-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active state. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly-reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction doesn't require analyte-matrix separation before analysis, and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under x-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace alpha-Fe, once reaction had commenced at elevated pressure. Amounts of Fe3O4 were constant and small, with carbide dpavg 65 wt%, regardless of pretreatment gas and pressure, with…

  14. Link-based quantitative methods to identify differentially coexpressed genes and gene Pairs

    Directory of Open Access Journals (Sweden)

    Ye Zhi-Qiang

    2011-08-01

Background: Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of different coexpression neighbors of a gene, and fails to differentiate significant differential coexpression changes from trivial ones. Especially, the correlation-reversal is easily missed although it probably indicates remarkable biological significance. Results: We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). Bearing the uniqueness of exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining of a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to additional discoveries beyond those from differential expression analysis. Conclusions: This work pointed out a critical weakness of current popular DCEA methods, and proposed two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum.
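A link-based score in the spirit of DCp/DCe can be illustrated by comparing a single gene pair's correlation across two conditions. The expression values are toy data, and the score shown is a generic |rA - rB| with a reversal flag, not the authors' exact statistic:

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Toy expression profiles of one gene pair under two conditions
g1_a = [1, 2, 3, 4, 5]; g2_a = [1.1, 2.0, 2.9, 4.2, 5.1]   # condition A
g1_b = [1, 2, 3, 4, 5]; g2_b = [5.0, 4.1, 3.0, 2.2, 0.9]   # condition B

r_a, r_b = pearson(g1_a, g2_a), pearson(g1_b, g2_b)
dC = abs(r_a - r_b)        # link-level coexpression change
reversal = r_a * r_b < 0   # correlation sign flipped between conditions
print(round(dC, 3), reversal)
```

Scoring each link by its quantitative correlation change, rather than counting neighbors, is precisely what lets a method of this kind catch the correlation-reversal case that connectivity-based strategies miss.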

  15. The value of quantitative methods for assessment of renal transplant and comparison with physician expertness

    International Nuclear Information System (INIS)

    Firouzi, F.; Fazeli, M.

    2002-01-01

Radionuclide renal diagnostic studies play an important role in assessing renal allografts. Various quantitative parameters have been derived from the radionuclide renogram to facilitate and confirm the changes in perfusion and/or function of a kidney allograft. These quantitative methods were divided into parameters used for assessing renal graft perfusion and parameters used for evaluating parenchymal function. The blood flow in renal transplants can be quantified by measuring the rate of activity appearance in the kidney graft and the ratio of the integral activity under the transplanted kidney and arterial curves, e.g. Hilton's perfusion index and Karachi's kidney/aortic ratio. Quantitative evaluation of graft extraction and excretion was assessed by parameters derived from 123I/131I-OH, 99mTc-DTPA or 99mTc-MAG3 renograms. In this study we retrospectively reviewed the scintigraphies of renal transplant patients, all of whom had undergone renal allograft needle biopsy close to the date of the allograft scan. We performed the quantitative methods for all patients. We observed that perfusion parameters were affected by the quality of the bolus injection, and that numerical deviations were related to changes in the site and size of the region of interest. Quantitative methods for renal parenchymal function were nonspecific and far from defining a specific cause of graft dysfunction. In conclusion, neither perfusion nor parenchymal parameters have enough diagnostic power for a specific diagnosis of graft dysfunction. Physician expertness using scintigraphic images and renogram curves is more sensitive and specific for the diagnosis of renal allograft dysfunction

  16. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    Science.gov (United States)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observations. However, it requires multi-focal image capturing, while mechanical and electrical scanning limits its real-time capacity in sample detection. Here, in order to break through this restriction, real-time quantitative phase microscopy based on a single-shot transport of intensity equation (ssTIE) method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can potentially be applied in various biological and medical applications, especially for live cell imaging.

  17. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    Directory of Open Access Journals (Sweden)

    Angelo Fontana

    2013-09-01

Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third-generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high-throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which drastically differ in the qualitative and quantitative composition of their fatty acid-based lipids.
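ERETIC quantitation rests on a synthetic electronic reference peak whose effective "concentration" is calibrated once against a standard of known concentration; thereafter any analyte integral can be converted to concentration by ratio. The integrals, proton counts, and concentrations below are illustrative assumptions, not values from the study:

```python
# Calibration run: a standard of known concentration relates integral
# per contributing proton to concentration via the ERETIC peak.
C_std, n_std = 10.0, 3          # mM, protons under the integrated peak
I_std, I_eretic_cal = 6.0, 2.0  # peak and ERETIC integrals in that spectrum

# Calibration factor: concentration * protons per unit relative integral
k = (C_std * n_std) / (I_std / I_eretic_cal)

# Sample run: a 2-proton signal integrated against the same ERETIC reference
I_sample, I_eretic = 1.6, 2.0
n_sample = 2
C_sample = k * (I_sample / I_eretic) / n_sample
print(C_sample)  # mM
```

Because the reference is electronic rather than a spiked compound, the same factor k applies across spectra as long as the ERETIC signal and acquisition conditions are unchanged, which is what makes the method fast and reproducible for screening.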

  18. Quantitative method of measuring cancer cell urokinase and metastatic potential

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  19. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  20. Quantitative measurement of blood circulation in testes of rats using nuclear medical methods

    International Nuclear Information System (INIS)

    Ripke, R.

    1980-01-01

The experiments show that it is possible to quantitatively assess the blood circulation and, within limits, the germinative function of the testes by measuring the counts of an incorporated radionuclide (99mTc-pertechnetate) using an uptake measuring instrument. This is a rapid and bloodless method to be adopted in human medicine. Acutely affected or pre-damaged testes can thus be exactly diagnosed. In the former case the circulation modification, and in the latter the evaluation of the germinative function ability, is of main interest. The most important measuring criterion is the 15-minute uptake U; it represents the blood circulation in the testes measured. The germinative function ability is evaluated on the basis of the accumulation activity Nsub(max). (orig./MG) [de
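The 15-minute uptake criterion can be sketched as a decay-corrected percentage of the injected activity. The count values below are invented; only the 99mTc half-life (about 6.01 h, i.e. 360.6 min) is a known physical constant:

```python
import math

def uptake_percent(counts_organ, counts_injected_std, t_min,
                   half_life_min=360.6):
    """Organ counts as a percentage of the injected-activity standard,
    decay-corrected to measurement time (99mTc half-life ~6.01 h)."""
    decay = math.exp(-math.log(2) * t_min / half_life_min)
    return 100.0 * counts_organ / (counts_injected_std * decay)

# Hypothetical measurement 15 min after injection
u = uptake_percent(counts_organ=820, counts_injected_std=100_000, t_min=15)
print(round(u, 2))  # 15-minute uptake U, in percent
```

In practice the injected-activity standard would be counted in the same geometry as the organ so that detector efficiency cancels out of the ratio.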

  1. Unbiased stereological methods used for the quantitative evaluation of guided bone regeneration

    DEFF Research Database (Denmark)

    Aaboe, Else Merete; Pinholt, E M; Schou, S

    1998-01-01

The present study describes the use of unbiased stereological methods for the quantitative evaluation of the amount of regenerated bone. Using the principle of guided bone regeneration, the amount of regenerated bone after placement of degradable or non-degradable membranes covering defects…

  2. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    Science.gov (United States)

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. 
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification
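
    dPCR achieves calibration-free quantification by counting positive partitions and applying a Poisson correction. A minimal sketch of that correction (function name and example figures are ours, not the study's):

```python
import math

def dpcr_copies_per_ul(positive, total_partitions, partition_vol_ul):
    """Estimate target concentration from digital PCR partition counts.

    Because a partition can hold more than one template molecule, the
    positive fraction p is Poisson-corrected: the mean number of copies
    per partition is lambda = -ln(1 - p).
    """
    p = positive / total_partitions
    lam = -math.log(1.0 - p)        # mean copies per partition
    return lam / partition_vol_ul   # copies per microlitre of reaction

# Hypothetical run: 12,000 of 20,000 droplets positive, 0.85 nL droplets
concentration = dpcr_copies_per_ul(12000, 20000, 0.00085)
```

    Note that no external standard enters the calculation, which is why dPCR results can agree across laboratories and instruments without calibration.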

  3. Clustering and training set selection methods for improving the accuracy of quantitative laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Ryan B., E-mail: randerson@astro.cornell.edu [Cornell University Department of Astronomy, 406 Space Sciences Building, Ithaca, NY 14853 (United States); Bell, James F., E-mail: Jim.Bell@asu.edu [Arizona State University School of Earth and Space Exploration, Bldg.: INTDS-A, Room: 115B, Box 871404, Tempe, AZ 85287 (United States); Wiens, Roger C., E-mail: rwiens@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663 MS J565, Los Alamos, NM 87545 (United States); Morris, Richard V., E-mail: richard.v.morris@nasa.gov [NASA Johnson Space Center, 2101 NASA Parkway, Houston, TX 77058 (United States); Clegg, Samuel M., E-mail: sclegg@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663 MS J565, Los Alamos, NM 87545 (United States)

    2012-04-15

    We investigated five clustering and training set selection methods to improve the accuracy of quantitative chemical analysis of geologic samples by laser induced breakdown spectroscopy (LIBS) using partial least squares (PLS) regression. The LIBS spectra were previously acquired for 195 rock slabs and 31 pressed powder geostandards under 7 Torr CO2 at a stand-off distance of 7 m at 17 mJ per pulse to simulate the operational conditions of the ChemCam LIBS instrument on the Mars Science Laboratory Curiosity rover. The clustering and training set selection methods, which do not require prior knowledge of the chemical composition of the test-set samples, are based on grouping similar spectra and selecting appropriate training spectra for the partial least squares (PLS2) model. These methods were: (1) hierarchical clustering of the full set of training spectra and selection of a subset for use in training; (2) k-means clustering of all spectra and generation of PLS2 models based on the training samples within each cluster; (3) iterative use of PLS2 to predict sample composition and k-means clustering of the predicted compositions to subdivide the groups of spectra; (4) soft independent modeling of class analogy (SIMCA) classification of spectra, and generation of PLS2 models based on the training samples within each class; (5) use of the Bayesian information criterion (BIC) to determine an optimal number of clusters and generation of PLS2 models based on the training samples within each cluster. The iterative method and the k-means method using 5 clusters showed the best performance, improving the absolute quadrature root mean squared error (RMSE) by ~3 wt.%. The statistical significance of these improvements was ~85%. Our results show that although clustering methods can modestly improve results, a large and diverse training set is the most reliable way to improve the accuracy of quantitative LIBS. In particular, additional sulfate standards and
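
    Method (2) above (k-means clustering with per-cluster models) can be sketched in miniature. This is an illustration only: the cluster-mean prediction below is a crude stand-in for the PLS2 sub-models actually used in the study, and all names are ours.

```python
from statistics import mean

def kmeans(points, k, iters=50):
    """Plain k-means on spectra given as equal-length lists of floats.
    Centers are initialized from evenly spaced training points."""
    centers = [points[i * len(points) // k] for i in range(k)]
    assign = [0] * len(points)
    for _ in range(iters):
        for n, p in enumerate(points):
            assign[n] = min(range(k),
                            key=lambda i: sum((a - b) ** 2
                                              for a, b in zip(p, centers[i])))
        for i in range(k):
            members = [p for p, a in zip(points, assign) if a == i]
            if members:
                centers[i] = [mean(col) for col in zip(*members)]
    return centers, assign

def predict_composition(spectrum, centers, assign, train_compositions):
    """Route an unknown spectrum to its nearest cluster, then predict with
    that cluster's sub-model (here simply the cluster-mean composition)."""
    j = min(range(len(centers)),
            key=lambda i: sum((a - b) ** 2
                              for a, b in zip(spectrum, centers[i])))
    return mean(c for c, a in zip(train_compositions, assign) if a == j)

# Two well-separated synthetic "spectra" groups with known compositions
spectra = [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]]
comps = [10, 11, 9, 50, 51, 49]
centers, assign = kmeans(spectra, 2)
pred = predict_composition([4.9, 5.05], centers, assign, comps)
```

    The point of the clustering step is that each sub-model is trained only on spectra resembling the unknown, rather than on the full heterogeneous training set.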

  4. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    Science.gov (United States)

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  5. Development of liquid chromatography-tandem mass spectrometry methods for the quantitation of Anisakis simplex proteins in fish.

    Science.gov (United States)

    Fæste, Christiane Kruse; Moen, Anders; Schniedewind, Björn; Haug Anonsen, Jan; Klawitter, Jelena; Christians, Uwe

    2016-02-05

    The parasite Anisakis simplex is present in many marine fish species that are directly used as food or in processed products. The anisakid larvae infect mostly the gut and inner organs of fish but have also been shown to penetrate into the fillet. Thus, human health can be at risk, either by contracting anisakiasis through the consumption of raw or under-cooked fish, or by sensitisation to anisakid proteins in processed food. A number of different methods for the detection of A. simplex in fish and products thereof have been developed, including visual techniques and PCR for larvae tracing, and immunological assays for the determination of proteins. The recent identification of a number of anisakid proteins by mass spectrometry-based proteomics has laid the groundwork for the development of the two quantitative liquid chromatography-tandem mass spectrometry methods for the detection of A. simplex in fish that are described in the present study. Both the label-free semi-quantitative nLC-nESI-Orbitrap-MS/MS (MS1) method and the heavy peptide-applying absolute-quantitative (AQUA) LC-TripleQ-MS/MS (MS2) method use unique reporter peptides derived from anisakid hemoglobin and SXP/RAL-2 protein as analytes. Standard curves in buffer and in salmon matrix showed limits of detection of 1 μg/mL and 10 μg/mL for MS1 and 0.1 μg/mL and 2 μg/mL for MS2. Preliminary method validation included the assessment of sensitivity, repeatability, reproducibility, and applicability to incurred and naturally contaminated samples for both assays. By further optimization and full validation in accordance with current recommendations, the LC-MS/MS methods could be standardized and used generally as confirmative techniques for the detection of A. simplex protein in fish. Copyright © 2016 Elsevier B.V. All rights reserved.
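
    Standard-curve quantitation and limit-of-detection estimation of the kind described can be sketched as follows. This is illustrative only: the function names and the 3.3·σ/slope LOD convention are our choices, not the study's validated protocol.

```python
from statistics import mean

def fit_calibration(concs, responses):
    """Least-squares line response = slope * conc + intercept, as would be
    fitted to spiked reporter-peptide standards."""
    mx, my = mean(concs), mean(responses)
    slope = (sum((x - mx) * (y - my) for x, y in zip(concs, responses))
             / sum((x - mx) ** 2 for x in concs))
    return slope, my - slope * mx

def quantify(response, slope, intercept):
    # Read an unknown's concentration off the standard curve
    return (response - intercept) / slope

def lod(blank_sd, slope):
    # A common convention: LOD ~= 3.3 * sd(blank response) / slope
    return 3.3 * blank_sd / slope

slope, intercept = fit_calibration([0, 1, 2, 4], [1, 3, 5, 9])
unknown_conc = quantify(7.0, slope, intercept)
```

    Matrix effects are why the abstract reports separate curves (and markedly different LODs) in buffer and in salmon matrix: the slope and blank noise both change with the matrix.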

  6. A novel semi-quantitative method for measuring tissue bleeding.

    Science.gov (United States)

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
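
    The formula translates directly into code; a minimal sketch (variable names are ours, not the authors'):

```python
def bleeding_percentage(sample_areas, bleeding_areas):
    """N = Bt * 100 / At, where At is the summed area of all tissue
    samples and Bt the summed area of all bleeding regions, both measured
    in the same units (e.g. AutoCAD drawing units from the polylines)."""
    at = sum(sample_areas)   # At: total tissue area
    bt = sum(bleeding_areas) # Bt: total bleeding area
    return bt * 100.0 / at

# Two hypothetical samples: areas 200 and 300, bleeding regions 20 and 30
n_percent = bleeding_percentage([200.0, 300.0], [20.0, 30.0])
```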

  7. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    Science.gov (United States)

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  8. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    Science.gov (United States)

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  9. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
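
    The anti-noise advantage of an m-sequence stems from its flat spectrum and two-valued periodic autocorrelation. As a hedged illustration (not the authors' code; the register length and taps are example choices), a Fibonacci linear-feedback shift register generating one period of a 4-bit m-sequence:

```python
def m_sequence(nbits, taps):
    """Generate one period (2**nbits - 1 chips) of a maximal-length
    sequence from a Fibonacci LFSR, mapped to +/-1 transmit levels.

    taps are 1-indexed register positions of a primitive polynomial,
    e.g. nbits=4 with taps (4, 1) for x^4 + x + 1.
    """
    state = 1
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(1 if state & 1 else -1)   # output chip as +/-1
        fb = 0
        for t in taps:                       # XOR of tapped bits
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return out

seq = m_sequence(4, (4, 1))   # 15-chip sequence
```

    Roughly speaking, the periodic autocorrelation peak of 2ⁿ−1 against sidelobes of −1 is what lets the Earth response be identified by correlating the received signal against the known code, spreading uncorrelated noise instead of accumulating it.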

  10. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    Science.gov (United States)

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between those two pieces is a key point to prevent bacterial proliferation, tissue inflammation and bone loss. The leak has been previously estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions Therefore, the presented new gas flow method appears to be a simple and robust method to compare different implant systems. It allows successive measures without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  11. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    International Nuclear Information System (INIS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-01-01

    A new LIBS quantitative analysis method based on analytical line adaptive selection and Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines used as input variables of the regression model are determined adaptively from both the training and testing samples. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a probabilistic confidence interval, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments were carried out on 23 certified standard high-alloy steel samples. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with the methods based on partial least squares regression, artificial neural network and standard support vector machine. - Highlights: • Both training and testing samples are considered for analytical line selection. • The analytical lines are auto-selected based on the built-in characteristics of spectral lines. • The new method can achieve better prediction accuracy and modeling robustness. • Model predictions are given with a probabilistic confidence interval

  12. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    International Nuclear Information System (INIS)

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities, including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  13. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    Science.gov (United States)

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life.
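
    The agreement values (rs) reported above are Spearman rank correlations. For tie-free data the coefficient reduces to the classic formula, sketched here (not the study's analysis code):

```python
def spearman_rs(x, y):
    """Spearman rank correlation for tie-free paired observations:
    rs = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
    where d_i is the difference between the ranks of x_i and y_i."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

    Rank correlation is the natural choice here because clinical rating scales such as the ETRS are ordinal, not interval-scaled.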

  14. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in aspects of quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope-labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  15. Method for the quantitation of steroids in umbilical cord plasma

    International Nuclear Information System (INIS)

    Schindler, A.E.; Sparke, H.

    1975-01-01

    A method for the simultaneous quantitation of nine steroids in cord plasma is described which consisted of Amberlite XAD-2 column chromatography at a constant temperature of 45 °C, enzyme hydrolysis with β-glucuronidase/aryl sulfatase, addition of five radioactive internal standards, ethyl acetate extraction, thin-layer chromatography and quantitation by gas-liquid chromatography after trimethylsilyl ether derivative formation. Reliability criteria were established and the following steroid concentrations found: progesterone, 132.1±102.5 μg/100 ml; pregnenolone, 57.3±45.7 μg/100 ml; dehydroepiandrosterone, 46.5±29.4 μg/100 ml; pregnanediol, 67.5±46.6 μg/100 ml; 16-ketoandrostenediol, 19.8±13.7 μg/100 ml; 16 α-hydroxydehydroepiandrosterone, 126.3±86.9 μg/100 ml; 16 α-hydroxypregnenolone, 78.2±56.5 μg/100 ml; androstenetriol, 22.2±17.5 μg/100 ml and oestriol, 127.7±116.9 μg/100 ml. (author)

  16. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Tran Khac, Bien Cuong; Chung, Koo-Hyun, E-mail: khchung@ulsan.ac.kr

    2016-02-15

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8–29% smaller than those obtained from the other two methods. This discrepancy decreased to 3–19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in the Lateral AFM Thermal-Sader method. - Highlights: • Quantitative assessment of three lateral force calibration methods for AFM. • Advantages and disadvantages of three different lateral force calibration methods. • Implementation of the Multi-Load Pivot method as a non-contact calibration technique. • The torsional mode correction for the Lateral AFM Thermal-Sader method.

  18. Network 'small-world-ness': a quantitative method for determining canonical network equivalence.

    Directory of Open Access Journals (Sweden)

    Mark D Humphries

    Full Text Available BACKGROUND: Many technological, biological, social, and information networks fall into the broad class of 'small-world' networks: they have tightly interconnected clusters of nodes, and a mean shortest path length that is similar to that of a matched random graph (same number of nodes and edges). This semi-quantitative definition leads to a categorical distinction ('small/not-small') rather than a quantitative, continuous grading of networks, and can lead to uncertainty about a network's small-world status. Moreover, systems described by small-world networks are often studied using an equivalent canonical network model--the Watts-Strogatz (WS) model. However, the process of establishing an equivalent WS model is imprecise and there is a pressing need to discover ways in which this equivalence may be quantified. METHODOLOGY/PRINCIPAL FINDINGS: We defined a precise measure of 'small-world-ness' S based on the trade-off between high local clustering and short path length. A network is now deemed a 'small-world' if S>1--an assertion which may be tested statistically. We then examined the behavior of S on a large data-set of real-world systems. We found that all these systems were linked by a linear relationship between their S values and the network size n. Moreover, we show a method for assigning a unique Watts-Strogatz (WS) model to any real-world network, and show analytically that the WS models associated with our sample of networks also show linearity between S and n. Linearity between S and n is not, however, inevitable, and neither is S maximal for an arbitrary network of given size. Linearity may, however, be explained by a common limiting growth process. CONCLUSIONS/SIGNIFICANCE: We have shown how the notion of a small-world network may be quantified. Several key properties of the metric are described and the use of WS canonical models is placed on a more secure footing.
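
    The S measure combines the two graph statistics named above: S = (C/C_rand)/(L/L_rand), where C is mean local clustering and L is mean shortest path length. A minimal pure-Python sketch (adjacency dicts are our representation; the reference values C_rand and L_rand for the matched random graph are assumed inputs, not derived here):

```python
from collections import deque

def clustering(adj):
    """Mean local clustering coefficient of an undirected graph
    given as {node: set_of_neighbours}."""
    cs = []
    for v, nb in adj.items():
        k = len(nb)
        if k < 2:
            cs.append(0.0)
            continue
        # count edges among v's neighbours (each pair once)
        links = sum(1 for a in nb for b in nb if a < b and b in adj[a])
        cs.append(2.0 * links / (k * (k - 1)))
    return sum(cs) / len(cs)

def mean_path_length(adj):
    """Mean shortest path length over all connected ordered pairs (BFS)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != s:
                total += d
                pairs += 1
    return total / pairs

def small_world_ness(adj, c_rand, l_rand):
    # S = (C / C_rand) / (L / L_rand); deemed 'small-world' when S > 1
    return (clustering(adj) / c_rand) / (mean_path_length(adj) / l_rand)
```

    In practice C_rand and L_rand are estimated from an ensemble of random graphs with the same number of nodes and edges, which is what makes the S > 1 assertion statistically testable.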

  19. An effective method for the quantitative detection of porcine endogenous retrovirus in pig tissues.

    Science.gov (United States)

    Zhang, Peng; Yu, Ping; Wang, Wei; Zhang, Li; Li, Shengfu; Bu, Hong

    2010-05-01

    Xenotransplantation shows great promise for providing a virtually limitless supply of cells, tissues, and organs for a variety of therapeutic procedures. However, the potential of porcine endogenous retrovirus (PERV) as a human-tropic pathogen, particularly as a public health risk, is a major concern for xenotransplantation. This study focused on the detection of PERV copy number in various tissues and organs of the Banna Minipig Inbred (BMI) from 2006 to 2007 in West China Hospital, Sichuan University. Real-time quantitative polymerase chain reaction (SYBR Green I) was performed in this study. The results showed that the pol gene had the highest copy number in tissues compared with gag, envA, and envB. Our experiment offers a rapid and accurate method for the detection of copy number in various tissues and is especially suitable for the selection of tissues or organs in future clinical xenotransplantation.
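
    Copy-number estimates from real-time qPCR of this kind are conventionally read off a dilution standard curve. A sketch of the standard relations (the parameter values here are illustrative, not taken from the study):

```python
def copies_from_ct(ct, slope, intercept):
    """Absolute quantification from a qPCR standard curve
    Ct = slope * log10(copies) + intercept, inverted to
    copies = 10 ** ((Ct - intercept) / slope)."""
    return 10 ** ((ct - intercept) / slope)

def amplification_efficiency(slope):
    # E = 10 ** (-1 / slope) - 1; a slope near -3.32 means ~100%
    # efficiency, i.e. the template doubles every cycle
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical assay: slope -3.3219, intercept 40 from serial dilutions
eff = amplification_efficiency(-3.3219)
copies = copies_from_ct(30.0, -3.3219, 40.0)
```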

  20. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    Directory of Open Access Journals (Sweden)

    Selcuk SAVAS

    Full Text Available ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF) pen. The data were statistically analyzed (α=0.05). Results The LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05). With regards to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and Ca/P ratio were higher compared to those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use.

  1. Comparison of visual scoring and quantitative planimetry methods for estimation of global infarct size on delayed enhanced cardiac MRI and validation with myocardial enzymes

    Energy Technology Data Exchange (ETDEWEB)

    Mewton, Nathan, E-mail: nmewton@gmail.com [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d' Applications en Traitement de l' Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Revel, Didier [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d' Applications en Traitement de l' Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Bonnefoy, Eric [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); Ovize, Michel [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); INSERM Unite 886 (France); Croisille, Pierre [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d' Applications en Traitement de l' Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France)

    2011-04-15

    Purpose: Although delayed enhanced CMR has become a reference method for infarct size quantification, there is no ideal method to quantify total infarct size in routine clinical practice. In a prospective study we compared the performance and post-processing time of a global visual scoring method to standard quantitative planimetry, and we compared both methods to the peak values of myocardial biomarkers. Materials and methods: This study had local ethics committee approval; all patients gave written informed consent. One hundred and three patients admitted with reperfused AMI to our intensive care unit had a complete CMR study with gadolinium-contrast injection 4 ± 2 days after admission. A global visual score was defined on a 17-segment model and compared with the quantitative planimetric evaluation of hyperenhancement. The peak values of serum Troponin I (TnI) and creatine kinase (CK) release were measured in each patient. Results: The mean percentage of total left ventricular myocardium with hyperenhancement determined by the quantitative planimetry method was (20.1 ± 14.6)% with a range of 1-68%. There was an excellent correlation between quantitative planimetry and visual global scoring for the measurement of hyperenhancement extent (r = 0.94; y = 1.093x + 0.87; SEE = 1.2; P < 0.001). The Bland-Altman plot showed a good concordance between the two approaches (mean of the differences = 1.9% with a standard deviation of 4.7). Mean post-processing time for quantitative planimetry was significantly longer than visual scoring post-processing time (23.7 ± 5.7 min vs 5.0 ± 1.1 min respectively, P < 0.001). Correlation between peak CK and quantitative planimetry was r = 0.82 (P < 0.001) and r = 0.83 (P < 0.001) with visual global scoring. Correlation between peak Troponin I and quantitative planimetry was r = 0.86 (P < 0.001) and r = 0.85 (P < 0.001) with visual global scoring. Conclusion: A visual approach based on a 17-segment model allows a rapid

  2. Comparison of visual scoring and quantitative planimetry methods for estimation of global infarct size on delayed enhanced cardiac MRI and validation with myocardial enzymes

    International Nuclear Information System (INIS)

    Mewton, Nathan; Revel, Didier; Bonnefoy, Eric; Ovize, Michel; Croisille, Pierre

    2011-01-01

    Purpose: Although delayed enhanced CMR has become a reference method for infarct size quantification, there is no ideal method to quantify total infarct size in routine clinical practice. In a prospective study we compared the performance and post-processing time of a global visual scoring method to standard quantitative planimetry, and we compared both methods to the peak values of myocardial biomarkers. Materials and methods: This study had local ethics committee approval; all patients gave written informed consent. One hundred and three patients admitted with reperfused AMI to our intensive care unit had a complete CMR study with gadolinium-contrast injection 4 ± 2 days after admission. A global visual score was defined on a 17-segment model and compared with the quantitative planimetric evaluation of hyperenhancement. The peak values of serum Troponin I (TnI) and creatine kinase (CK) release were measured in each patient. Results: The mean percentage of total left ventricular myocardium with hyperenhancement determined by the quantitative planimetry method was (20.1 ± 14.6)% with a range of 1-68%. There was an excellent correlation between quantitative planimetry and visual global scoring for the measurement of hyperenhancement extent (r = 0.94; y = 1.093x + 0.87; SEE = 1.2; P < 0.001). The Bland-Altman plot showed a good concordance between the two approaches (mean of the differences = 1.9% with a standard deviation of 4.7). Mean post-processing time for quantitative planimetry was significantly longer than visual scoring post-processing time (23.7 ± 5.7 min vs 5.0 ± 1.1 min respectively, P < 0.001). Correlation between peak CK and quantitative planimetry was r = 0.82 (P < 0.001) and r = 0.83 (P < 0.001) with visual global scoring. Correlation between peak Troponin I and quantitative planimetry was r = 0.86 (P < 0.001) and r = 0.85 (P < 0.001) with visual global scoring. Conclusion: A visual approach based on a 17-segment model allows a rapid and accurate
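The agreement statistics quoted above (mean of the differences together with a ±1.96 SD band on a Bland-Altman plot) are straightforward to recompute from paired measurements. A minimal sketch, using invented paired infarct-size values rather than the study's data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Return the bias (mean difference) and 95% limits of agreement
    for two paired measurement series a and b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired infarct sizes (% of LV) from two methods
visual = [10, 22, 35, 18, 50]
planimetry = [12, 20, 33, 17, 52]
bias, (lo, hi) = bland_altman(visual, planimetry)
```

The bias and limits correspond to the "mean of the differences" and standard-deviation band reported in the abstract.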

  3. Quantitative microbial risk assessment (QMRA) shows increased public health risk associated with exposure to river water under conditions of riverbed sediment resuspension

    CSIR Research Space (South Africa)

    Abia

    2016-10-01

    Science of The Total Environment, 556-557, pp. 1143-1151. Quantitative microbial risk assessment (QMRA) shows increased public health risk associated with exposure to river water under conditions of riverbed sediment resuspension. Akebe Luther King Abia ...

  4. A quantitative method to analyse an open answer questionnaire: A case study about the Boltzmann Factor

    International Nuclear Information System (INIS)

    Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2015-01-01

    This paper describes a quantitative method for analysing an open-ended questionnaire. Student responses to a specially designed written questionnaire are quantitatively analysed by a non-hierarchical clustering technique, the k-means method. Through this we can characterise students' behaviour with respect to their expertise in formulating explanations for phenomena or processes and/or in using a given model in different contexts. The physics topic is the Boltzmann factor, which allows the students to form a unifying view of different phenomena in different contexts.
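The non-hierarchical (k-means) clustering step can be sketched in one dimension, assuming the open-ended answers have already been coded to numeric scores; the scores and the coding scheme below are invented for illustration:

```python
def kmeans_1d(values, k=2, iters=50):
    """Minimal 1-D k-means: assign each value to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    lo, hi = min(values), max(values)
    # spread the initial centroids evenly across the data range
    cents = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(v - cents[j])) for v in values]
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                cents[j] = sum(members) / len(members)
    return cents, labels

# hypothetical per-student scores coding answer quality (0-10 scale)
scores = [1, 2, 1, 8, 9, 2, 9, 8]
cents, labels = kmeans_1d(scores, k=2)
```

Students in the same cluster would then be inspected together to characterise a shared explanation style.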

  5. Nuclear medicine and imaging research. Instrumentation and quantitative methods of evaluation. Progress report, January 15, 1984-January 14, 1985

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1984-09-01

    This program addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. Project I addresses problems associated with the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures

  6. A SVM-based quantitative fMRI method for resting-state functional network detection.

    Science.gov (United States)

    Song, Xiaomu; Chen, Nan-kuei

    2014-09-01

    Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variations in data characteristics across sessions and subjects, and can generate unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, resting-state network mapping is formulated as an outlier detection process implemented using a one-class support vector machine (SVM). The results are refined using a spatial-feature-domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected and unconnected, rather than by applying a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies. Copyright © 2014 Elsevier Inc. All rights reserved.
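The authors' one-class SVM is too involved for a short sketch, but the core idea, scoring each voxel as an outlier relative to the bulk of the data instead of thresholding a fixed correlation value, can be illustrated with a simple z-score stand-in (this is not the paper's method; the voxel feature values are invented):

```python
from statistics import mean, stdev

def outlier_scores(values):
    """Score each value by its distance from the bulk of the data
    in standard-deviation units (a z-score)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# hypothetical per-voxel connectivity features: mostly background,
# one strongly connected voxel
feats = [0.10, 0.12, 0.09, 0.11, 0.10, 0.85]
scores = outlier_scores(feats)
```

A one-class SVM generalises this by learning the boundary of the "normal" region in a multi-feature space rather than assuming a Gaussian background.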

  7. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    Science.gov (United States)

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing process of service consistency states. The service consistency evolution algorithms (SCEAs) based on EHS-FSA are then developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) the second, and service-version confusion (1.2%) the smallest. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  8. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    Directory of Open Access Journals (Sweden)

    Linjun Fan

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing process of service consistency states. The service consistency evolution algorithms (SCEAs) based on EHS-FSA are then developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) the second, and service-version confusion (1.2%) the smallest. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  9. Meta-Analysis of Quantification Methods Shows that Archaea and Bacteria Have Similar Abundances in the Subseafloor

    Science.gov (United States)

    May, Megan K.; Kevorkian, Richard T.; Steen, Andrew D.

    2013-01-01

    There is no universally accepted method to quantify bacteria and archaea in seawater and marine sediments, and different methods have produced conflicting results with the same samples. To identify best practices, we compiled data from 65 studies, plus our own measurements, in which bacteria and archaea were quantified with fluorescent in situ hybridization (FISH), catalyzed reporter deposition FISH (CARD-FISH), polyribonucleotide FISH, or quantitative PCR (qPCR). To estimate efficiency, we defined “yield” to be the sum of bacteria and archaea counted by these techniques divided by the total number of cells. In seawater, the yield was high (median, 71%) and was similar for FISH, CARD-FISH, and polyribonucleotide FISH. In sediments, only measurements by CARD-FISH in which archaeal cells were permeabilized with proteinase K showed high yields (median, 84%). Therefore, the majority of cells in both environments appear to be alive, since they contain intact ribosomes. In sediments, the sum of bacterial and archaeal 16S rRNA gene qPCR counts was not closely related to cell counts, even after accounting for variations in copy numbers per genome. However, qPCR measurements were precise relative to other qPCR measurements made on the same samples. qPCR is therefore a reliable relative quantification method. Inconsistent results for the relative abundance of bacteria versus archaea in deep subsurface sediments were resolved by the removal of CARD-FISH measurements in which lysozyme was used to permeabilize archaeal cells and qPCR measurements which used ARCH516 as an archaeal primer or TaqMan probe. Data from best-practice methods showed that archaea and bacteria decreased as the depth in seawater and marine sediments increased, although archaea decreased more slowly. PMID:24096423
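The "yield" metric defined above is a simple ratio of technique-specific counts to total cell counts. A sketch with hypothetical CARD-FISH counts (the numbers are illustrative, not the study's data):

```python
def count_yield(bacteria, archaea, total_cells):
    """Yield of a quantification technique: the sum of bacterial and
    archaeal counts divided by the total cell count from the same sample."""
    return (bacteria + archaea) / total_cells

# hypothetical CARD-FISH counts per cm^3 of sediment
y = count_yield(bacteria=4.2e8, archaea=3.8e8, total_cells=9.5e8)
```

A yield near 1 indicates the technique detects nearly all cells; the study's best-practice sediment methods reached a median of 84%.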

  10. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta 2 C has been considered because of specific features of the diffraction patterns of its components, namely overlapping of the most intensive reflexes of both phases. The method of the standard binary system has been used for quantitative analysis. Because of the overlapping of the intensive reflexes dsub(101)=2.36 Å (Ta 2 C) and dsub(110)=2.33 Å (Ta), other, most intensive, reflexes have been used for the quantitative determination of Ta 2 C and Ta: dsub(103)=1.404 Å for tantalum subcarbide and dsub(211)=1.35 Å for tantalum. Besides, the Ta and Ta 2 C phases have been determined quantitatively with the use of another pair of reflexes: dsub(102)=1.82 Å for Ta 2 C and dsub(200)=1.65 Å for tantalum. The agreement between the results obtained in the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta 2 C, it is expedient to carry out the analysis using the two above-mentioned pairs of reflexes, located in different regions of the diffraction spectrum. Thus, a procedure for quantitative analysis of Ta and Ta 2 C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the tendency of Ta 2 C to texture during sample preparation
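For a two-phase mixture, the standard binary-system method reduces to solving the weight fractions from the intensity ratio of one non-overlapping reflex per phase, w1/w2 = k·(I1/I2) with w1 + w2 = 1. A hedged sketch; the calibration constant must come from reference mixtures, and the values here are illustrative:

```python
def weight_fractions(i_phase1, i_phase2, k_calib):
    """Binary-mixture quantitation from one reflex intensity per phase:
    w1 / w2 = k_calib * (I1 / I2), together with w1 + w2 = 1."""
    ratio = k_calib * i_phase1 / i_phase2
    w1 = ratio / (1 + ratio)
    return w1, 1 - w1

# illustrative intensities for the Ta2C (103) and Ta (211) reflexes,
# with a hypothetical calibration constant
w_ta2c, w_ta = weight_fractions(i_phase1=3.0, i_phase2=1.0, k_calib=1.0)
```

Running the same computation on a second reflex pair, as the abstract recommends, gives an internal consistency check on the result.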

  11. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    International Nuclear Information System (INIS)

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Larin, Kirill V; Aglyamov, Salavat R; Twa, Michael D

    2015-01-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique that allows assessment of the biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of the available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse-induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. (paper)
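Of the models compared, the shear-wave relation is the simplest to write down: the shear modulus is μ = ρc_s², and for nearly incompressible soft tissue Young's modulus is approximately E ≈ 3μ. A minimal sketch with illustrative values, not the study's phantom data:

```python
def youngs_modulus_from_shear_speed(density_kg_m3, shear_speed_m_s):
    """Shear-wave-equation estimate of Young's modulus:
    mu = rho * c_s**2, and E ~= 3 * mu for nearly incompressible tissue."""
    mu = density_kg_m3 * shear_speed_m_s ** 2
    return 3.0 * mu

# illustrative: soft-tissue-like density, 2 m/s elastic-wave speed
e_pa = youngs_modulus_from_shear_speed(1000.0, 2.0)  # -> 12000.0 Pa (12 kPa)
```

The abstract's point is that this simplified relation is less robust than the RLFE or FEM reconstructions, which account for geometry and boundary conditions.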

  12. Examining Elementary Preservice Teachers’ Self-Efficacy Beliefs: Combination of Quantitative and Qualitative Methods

    Directory of Open Access Journals (Sweden)

    Çiğdem ŞAHİN-TAŞKIN

    2010-04-01

    This study examines elementary preservice teachers' self-efficacy beliefs. Quantitative and qualitative research methods were used. In the quantitative part, data were collected from 122 final-year preservice teachers. The instrument developed by Tschannen-Moran and Woolfolk-Hoy (2001) was administered to the preservice teachers. Findings of the quantitative part revealed that preservice teachers' self-efficacy towards the teaching profession was not fully adequate. There were no differences among preservice teachers' self-efficacy towards teaching with regard to gender and achievement. In the qualitative part of the study, preservice teachers responded to the factors involving Student Engagement and Classroom Management based on experiences they gained in teaching practice. However, their explanations of the Instructional Strategies factor relied on their theoretical knowledge. This could be explained by their lack of experience with this factor.

  13. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the designed equipment and with the working environment, taking into account human capabilities and limits. Within the qualitative methods for analysis of the human factor, we consider concepts and structural methods for classifying information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPP. Methods for quantitative assessment of human reliability are also considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  14. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    Science.gov (United States)

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
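The two-step point counting described above can be sketched directly: a Cavalieri-type estimator, V = ΣP · (a/p) · t, for defect volume from the systematic sections, then point fractions for tissue composition. The counts below are invented for illustration:

```python
def cavalieri_volume(points_per_section, area_per_point_mm2, section_spacing_mm):
    """Volume estimate from systematic sections:
    V = (total points hitting the defect) * (area per point) * (section spacing)."""
    return sum(points_per_section) * area_per_point_mm2 * section_spacing_mm

def tissue_fractions(category_counts):
    """Point-fraction estimate of tissue composition: each category's
    share of all counted points."""
    total = sum(category_counts.values())
    return {name: n / total for name, n in category_counts.items()}

# hypothetical counts from three sections, 0.04 mm^2 per point, 0.5 mm spacing
volume_mm3 = cavalieri_volume([10, 12, 8], 0.04, 0.5)
fractions = tissue_fractions({"hyaline": 30, "fibrocartilage": 10})
```

Multiplying each fraction by the defect volume gives the absolute volume of each repair-tissue type.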

  15. Quantitative analysis of multiple high-resolution mass spectrometry images using chemometric methods: quantitation of chlordecone in mouse liver.

    Science.gov (United States)

    Mohammadi, Saeedeh; Parastar, Hadi

    2018-05-15

    In this work, a chemometrics-based strategy is developed for quantitative mass spectrometry imaging (MSI). In this regard, quantification of chlordecone, a carcinogenic organochlorine pesticide (C10Cl10O), in mouse liver using matrix-assisted laser desorption ionization MSI (MALDI-MSI) is used as a case study. The MSI datasets corresponded to 1, 5 and 10 days of mouse exposure to standard chlordecone in the quantity range of 0 to 450 μg g-1. A binning approach in the m/z direction is used to group high-resolution m/z values and to reduce the large data size. To consider the effect of bin size on the quality of the results, three different bin sizes of 0.25, 0.5 and 1.0 were chosen. Afterwards, three-way MSI data arrays (two spatial and one m/z dimensions) for seven standards and four unknown samples were column-wise augmented with m/z values as the common mode. Then, these datasets were analyzed using multivariate curve resolution-alternating least squares (MCR-ALS) with proper constraints. The resolved mass spectra were used for identification of chlordecone in the presence of a complex background and interference. Additionally, the augmented spatial profiles were post-processed and 2D images for each component were obtained in the calibration and unknown samples. The sum of these profiles was utilized to set the calibration curve and to obtain the analytical figures of merit (AFOMs). Inspection of the results showed that the lowest bin size (i.e., 0.25) provides the most accurate results. Finally, the results obtained by MCR for the three datasets were compared with those of gas chromatography-mass spectrometry (GC-MS) and MALDI-MSI. The results showed that the MCR-assisted method gives a higher amount of chlordecone than MALDI-MSI and a lower amount than GC-MS. It is concluded that the combination of chemometric methods with MSI can be considered an alternative way to perform MSI quantification.
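The m/z binning step can be sketched as grouping peaks into fixed-width bins and summing the intensities within each bin; the peak list below is hypothetical:

```python
def bin_mz(peaks, bin_size=0.25):
    """Group (mz, intensity) pairs into fixed-width m/z bins,
    summing intensities within each bin. Returns {bin_start: intensity}."""
    bins = {}
    for mz, intensity in peaks:
        key = round(mz // bin_size * bin_size, 6)  # left edge of the bin
        bins[key] = bins.get(key, 0.0) + intensity
    return bins

# hypothetical high-resolution peaks near m/z 100
binned = bin_mz([(100.12, 5.0), (100.20, 3.0), (100.30, 2.0)], bin_size=0.25)
```

Smaller bins preserve more mass resolution at the cost of larger arrays, which is exactly the trade-off the study examines with bin sizes of 0.25, 0.5 and 1.0.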

  16. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that, although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk, by logistic regression and percentages, was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  17. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    Science.gov (United States)

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, many studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.

  18. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms

  19. Quantitative assessment of breast density: comparison of different methods

    International Nuclear Information System (INIS)

    Qin Naishan; Guo Li; Dang Yi; Song Luxin; Wang Xiaoying

    2011-01-01

    Objective: To compare different methods of quantitative breast density measurement. Methods: The study included sixty patients who underwent both mammography and breast MRI. The breast density was computed automatically on digital mammograms with the R2 workstation. Two experienced radiologists read the mammograms and assessed the breast density with the Wolfe and ACR classifications, respectively. The fuzzy C-means clustering algorithm (FCM) was used to assess breast density on MRI. Each assessment method was repeated after 2 weeks. Spearman and Pearson correlations of inter- and intrareader and intermodality density estimates were computed. Results: Inter- and intrareader correlations of the Wolfe classification were 0.74 and 0.65, and they were 0.74 and 0.82 for the ACR classification, respectively. The correlation between the Wolfe and ACR classifications was 0.77. A high interreader correlation of 0.98 and an intrareader correlation of 0.96 were observed with the MR FCM measurement. The correlation between digital mammograms and MRI in the assessment of breast density was also high (r=0.81, P<0.01). Conclusion: The high correlation between breast density estimates on digital mammograms and MRI FCM suggests the former could be used as a simple and accurate method. (authors)
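The inter- and intrareader agreement figures above are plain correlation coefficients between repeated density estimates. A minimal Pearson implementation (the paired estimates below are invented):

```python
def pearson(a, b):
    """Pearson correlation coefficient between two paired series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

# hypothetical percent-density estimates from two readings
first_read = [12.0, 25.0, 40.0, 18.0]
second_read = [14.0, 24.0, 41.0, 17.0]
r = pearson(first_read, second_read)
```

Spearman correlation, also used in the study, is the same computation applied to the ranks of the values rather than the values themselves.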

  20. Semi-quantitative evaluation of gallium-67 scintigraphy in lupus nephritis

    International Nuclear Information System (INIS)

    Lin Wanyu; Hsieh Jihfang; Tsai Shihchuan; Lan Joungliang; Cheng Kaiyuan; Wang Shyhjen

    2000-01-01

    Within nuclear medicine there is a trend towards quantitative analysis. Gallium renal scan has been reported to be useful in monitoring the disease activity of lupus nephritis. However, only visual interpretation using a four-grade scale has been performed in previous studies, and this method is not sensitive enough for follow-up. In this study, we developed a semi-quantitative method for gallium renal scintigraphy to find a potential parameter for the evaluation of lupus nephritis. Forty-eight patients with lupus nephritis underwent renal biopsy to determine World Health Organization classification, activity index (AI) and chronicity index (CI). A delayed 48-h gallium scan was also performed and interpreted by visual and semi-quantitative methods. For semi-quantitative analysis of the gallium uptake in both kidneys, regions of interest (ROIs) were drawn over both kidneys, the right forearm and the adjacent spine. The uptake ratios between these ROIs were calculated and expressed as the ''kidney/spine ratio (K/S ratio)'' or the ''kidney/arm ratio (K/A ratio)''. Spearman's rank correlation test and Mann-Whitney U test were used for statistical analysis. Our data showed a good correlation between the semi-quantitative gallium scan and the results of visual interpretation. K/S ratios showed a better correlation with AI than did K/A ratios. Furthermore, the left K/S ratio displayed a better correlation with AI than did the right K/S ratio. In contrast, CI did not correlate well with the results of semi-quantitative gallium scan. In conclusion, semi-quantitative gallium renal scan is easy to perform and shows a good correlation with the results of visual interpretation and renal biopsy. The left K/S ratio from semi-quantitative renal gallium scintigraphy displays the best correlation with AI and is a useful parameter in evaluating the disease activity in lupus nephritis. (orig.)
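The K/S and K/A ratios are simple ROI statistics: mean counts per pixel in the kidney ROI divided by the same quantity in the reference ROI (spine or forearm). A sketch with invented pixel counts:

```python
def roi_mean(pixel_counts):
    """Mean counts per pixel within a region of interest."""
    return sum(pixel_counts) / len(pixel_counts)

def uptake_ratio(kidney_roi, reference_roi):
    """Semi-quantitative uptake ratio, e.g. the kidney/spine (K/S) ratio."""
    return roi_mean(kidney_roi) / roi_mean(reference_roi)

# hypothetical per-pixel counts from a delayed 48-h gallium scan
ks = uptake_ratio(kidney_roi=[40, 60], reference_roi=[20, 30])
```

Higher ratios indicate greater renal gallium uptake relative to background, which the study found to track the histological activity index.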

  1. A method for quantitative analysis of clump thickness in cervical cytology slides.

    Science.gov (United States)

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal, but difficult to scan Pap stained, Thin-prep, cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 μm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 μm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 μm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 μm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 μm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitive UK economy, public policy making effectiveness and the status the UK has as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  3. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    Science.gov (United States)

    Manickum, Thavrin; John, Wilson

    2015-07-01

    requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, in South Africa is required. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentration values of the steroid estrogens appears to be appropriate for use in their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, raw and treated wastewater, the use of bioassays, with trigger values, is a useful screening tool option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority, with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods, like ELISA, used for compound quantitation, and standardization of the method for measuring E2 equivalents (EEQs) used for biological activity (endocrine: e.g., estrogenic) are some areas for future EDC research.

  4. Quantitative genetic methods depending on the nature of the phenotypic trait.

    Science.gov (United States)

    de Villemereuil, Pierre

    2018-01-24

    A consequence of the assumptions of the infinitesimal model, one of the most important theoretical foundations of quantitative genetics, is that phenotypic traits are predicted to be most often normally distributed (so-called Gaussian traits). But phenotypic traits, especially those interesting for evolutionary biology, might be shaped according to very diverse distributions. Here, I show how quantitative genetics tools have been extended to account for a wider diversity of phenotypic traits using first the threshold model and then more recently using generalized linear mixed models. I explore the assumptions behind these models and how they can be used to study the genetics of non-Gaussian complex traits. I also comment on three recent methodological advances in quantitative genetics that widen our ability to study new kinds of traits: the use of "modular" hierarchical modeling (e.g., to study survival in the context of capture-recapture approaches for wild populations); the use of aster models to study a set of traits with conditional relationships (e.g., life-history traits); and, finally, the study of high-dimensional traits, such as gene expression. © 2018 New York Academy of Sciences.
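    The threshold model mentioned above is easy to illustrate: a binary trait is modelled as an unobserved Gaussian "liability", split into additive genetic and environmental parts, that is expressed only above a cut-off. The heritability, threshold and sample size below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Liability-threshold sketch: liability = genetic + environmental effects,
# scaled so the total liability is standard normal (h2 is illustrative)
n = 100_000
h2 = 0.5                                   # heritability on the liability scale
genetic = rng.normal(0.0, np.sqrt(h2), n)
environment = rng.normal(0.0, np.sqrt(1.0 - h2), n)
liability = genetic + environment

threshold = 1.0                            # trait expressed above this point
trait = liability > threshold              # the observed binary phenotype

print(f"trait prevalence ≈ {trait.mean():.3f}")
```

With a threshold of one standard deviation, the simulated prevalence lands near P(Z > 1) ≈ 0.159; generalized linear mixed models extend this same idea beyond the probit/binary case.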

  5. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    Science.gov (United States)

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies per μl of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
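    The calibration-curve logic underlying such viral-load estimates can be sketched in a few lines: fit Ct against log10 input copies for the serial dilutions, then invert the fitted line for unknown samples. The dilution series and Ct values below are invented for illustration:

```python
import numpy as np

# Hypothetical 10-fold serial dilution of a cloned standard
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])        # input copies/reaction
ct     = np.array([15.1, 18.5, 21.8, 25.2, 28.6, 32.0])  # measured Ct values

# Linear calibration: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency from slope

def quantify(ct_unknown):
    """Invert the standard curve to estimate copies in an unknown sample."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"unknown at Ct 23.5 ≈ {quantify(23.5):.0f} copies")
```

A slope near -3.32 corresponds to 100% amplification efficiency; linearity of this curve across the dilution range is exactly what distinguished the spin-column extractions from ethanol precipitation in the study.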

  6. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  7. Quantitative sacroiliac scintigraphy. The effect of method of selection of region of interest

    International Nuclear Information System (INIS)

    Davis, M.C.; Turner, D.A.; Charters, J.R.; Golden, H.E.; Ali, A.; Fordham, E.W.

    1984-01-01

    Various authors have advocated quantitative methods of evaluating bone scintigrams to detect sacroiliitis, while others have not found them useful. Many explanations for this disagreement have been offered, including differences in the method of case selection, ethnicity, gender, and previous drug therapy. It would appear that one of the most important impediments to consistent results is the variability of selecting sacroiliac joint and reference regions of interest (ROIs). The effect of ROI selection would seem particularly important because of the normal variability of radioactivity within the reference regions that have been used (sacrum, spine, iliac wing) and the inhomogeneity of activity in the SI joints. We have investigated the effect of ROI selection, using five different methods representative of, though not necessarily identical to, those found in the literature. Each method produced unique mean indices that were different for patients with ankylosing spondylitis (AS) and controls. The method of Ayres (19) proved superior (largest mean difference, smallest variance), but none worked well as a diagnostic tool because of substantial overlap of the distributions of indices of patient and control groups. We conclude that ROI selection is important in determining results, and quantitative scintigraphic methods in general are not effective tools for diagnosing AS. Among the possible factors limiting success, difficulty in selecting a stable reference area seems of particular importance

  8. A quantitative method for assessing resilience of interdependent infrastructures

    International Nuclear Information System (INIS)

    Nan, Cen; Sansavini, Giovanni

    2017-01-01

    The importance of understanding system resilience and identifying ways to enhance it, especially for interdependent infrastructures our daily life depends on, has been recognized not only by academics, but also by the corporate and public sectors. During recent years, several methods and frameworks have been proposed and developed to explore applicable techniques to assess and analyze system resilience in a comprehensive way. However, they are often tailored to specific disruptive hazards/events, or fail to properly include all the phases such as absorption, adaptation, and recovery. In this paper, a quantitative method for the assessment of the system resilience is proposed. The method consists of two components: an integrated metric for system resilience quantification and a hybrid modeling approach for representing the failure behavior of infrastructure systems. The feasibility and applicability of the proposed method are tested using an electric power supply system as the exemplary infrastructure. Simulation results highlight that the method proves effective in designing, engineering and improving the resilience of infrastructures. Finally, system resilience is proposed as a proxy to quantify the coupling strength between interdependent infrastructures. - Highlights: • A method for quantifying resilience of interdependent infrastructures is proposed. • It combines multi-layer hybrid modeling and a time-dependent resilience metric. • The feasibility of the proposed method is tested on the electric power supply system. • The method provides insights to decision-makers for strengthening system resilience. • Resilience capabilities can be used to engineer interdependencies between subsystems.
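    One common way to collapse the absorption, adaptation and recovery phases into a single number is to integrate measured performance over the observation window and normalise by the target level. The paper's integrated metric and hybrid failure model are more elaborate, so the sketch below is only a minimal illustration with an invented power-supply performance curve:

```python
import numpy as np

def resilience(t, performance, target=1.0):
    """Area under the measured performance curve divided by the area under
    the target level over the window (a common time-integrated resilience
    formulation; the paper's integrated metric may differ in detail)."""
    area = np.sum((performance[1:] + performance[:-1]) / 2.0 * np.diff(t))
    return area / (target * (t[-1] - t[0]))

# Hypothetical power-supply performance: full service is 1.0; a disruption
# at t = 2 h drops capacity to 40%, with linear recovery completed by t = 8 h
t = np.linspace(0.0, 10.0, 101)
p = np.ones_like(t)
recovering = (t >= 2.0) & (t < 8.0)
p[recovering] = 0.4 + 0.6 * (t[recovering] - 2.0) / 6.0

print(f"resilience over the window = {resilience(t, p):.3f}")
```

A value of 1.0 would mean no performance was lost; deeper drops or slower recoveries shrink the integral and hence the metric.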

  9. The laboratory of quantitative methods in historic monument research at the CTU Prague

    International Nuclear Information System (INIS)

    Musilek, L.; Cechak, T.; Kubelik, M.; Pavelka, K.; Pavlik, M.

    2001-01-01

    A 'Laboratory of Quantitative Methods in Historic Monument Research' has been established at the Department of Dosimetry and Application of Ionizing Radiation of the CTU Prague. Its primary orientation is the investigation of historic architecture, although other objects of art can also be investigated. In the first phase, one investigative method was established for each of the above groups: X-ray fluorescence as the analytic method, thermoluminescence for dating and photogrammetry for surveying. The first results demonstrate the need and usefulness of these methods for investigations in the rich architectural heritage of the Czech Republic.

  10. Validation of the method of quantitative phase analysis by X-ray diffraction in API: case of Tibolone

    International Nuclear Information System (INIS)

    Silva, R P; Ambrósio, M F S; Epprecht, E K; Avillez, R R; Achete, C A; Kuznetsov, A; Visentin, L C

    2016-01-01

    In this study, different structural and microstructural models applied to X-ray analysis of powder diffraction data of polymorphic mixtures of known concentrations of Tibolone were investigated. The X-ray data obtained in different diffraction instruments were analysed via Rietveld method using the same analytical models. The results of quantitative phase analysis show that regardless of the instrument used, the values of the calculated concentrations follow the same systematics with respect to the final errors. The strategy to select a specific analytical model that leads to lower measurement errors is here presented. (paper)

  11. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    Science.gov (United States)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    Detailed description of stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or a system, along with the results of calculation of the set of required characteristics of image quality, are shown.

  12. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    Science.gov (United States)

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  13. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.

  14. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated in a (U+Fe+Y+UO2+ZrO2) mix with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix of heavy elements such as uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  15. A quantitative method for determining spatial discriminative capacity

    Directory of Open Access Journals (Sweden)

    Dennis Robert G

    2008-03-01

    Background: The traditional two-point discrimination (TPD) test, a widely used tactile spatial acuity measure, has been criticized as imprecise because it is based on subjective criteria and involves a number of non-spatial cues. A recent study showed that when two stimuli were delivered simultaneously, vibrotactile amplitude discrimination became worse as the two stimuli were positioned relatively close together and was significantly degraded when the probes were within a subject's two-point limen. The impairment of amplitude discrimination with decreasing inter-probe distance suggested that the metric of amplitude discrimination could provide a means of objective and quantitative measurement of spatial discrimination capacity. Methods: A two-alternative forced-choice (2AFC) tracking procedure was used to assess a subject's ability to discriminate the amplitude difference between two stimuli positioned at near-adjacent skin sites. Two 25 Hz flutter stimuli, identical except for a constant difference in amplitude, were delivered simultaneously to the hand dorsum. The stimuli were initially spaced 30 mm apart, and the inter-stimulus distance was modified on a trial-by-trial basis based on the subject's performance in discriminating the stimulus with higher intensity. The experiment was repeated via sequential, rather than simultaneous, delivery of the same vibrotactile stimuli. Results: Performance of the amplitude discrimination task was significantly degraded when the stimuli were delivered simultaneously and were near a subject's two-point limen. In contrast, subjects were able to discriminate correctly between the amplitudes of the two stimuli when they were delivered sequentially at all inter-probe distances (including those within the two-point limen), and performance improved when an adapting stimulus was delivered prior to simultaneously delivered stimuli. Conclusion
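    A 2AFC tracking procedure of the general kind described can be sketched in a few lines. The 2-down/1-up rule, step size, simulated observer and psychometric function below are illustrative assumptions, not the authors' exact protocol:

```python
import random

def simulate_2afc_track(p_correct, start=30.0, step=2.0, floor=0.0,
                        n_trials=60, seed=1):
    """Minimal 2AFC tracking sketch (not the paper's exact protocol):
    the inter-probe distance shrinks after two consecutive correct
    responses and grows after an error (a 2-down/1-up rule, which
    converges near 70.7% correct)."""
    random.seed(seed)
    d, streak, history = start, 0, []
    for _ in range(n_trials):
        correct = random.random() < p_correct(d)
        if correct:
            streak += 1
            if streak == 2:
                d, streak = max(floor, d - step), 0
        else:
            d, streak = d + step, 0
        history.append(d)
    return history

# Hypothetical psychometric function: discrimination degrades as the
# probes move closer together (saturating above 20 mm separation)
p = lambda d: 0.5 + 0.45 * min(d, 20.0) / 20.0
track = simulate_2afc_track(p)
print(f"final inter-probe distances: {[round(x, 1) for x in track[-5:]]}")
```

Run against a real subject instead of the simulated observer, the track converges on the separation at which amplitude discrimination starts to fail, which is what makes the metric a candidate objective replacement for the TPD limen.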

  16. The development of quantitative determination method of organic acids in complex poly herbal extraction

    Directory of Open Access Journals (Sweden)

    I. L. Dyachok

    2016-08-01

    Aim. The development of a sensitive, economical and rapid method for the quantitative determination of organic acids in a complex poly-herbal extract, calculated with reference to isovaleric acid, using digital technologies. Materials and methods. A model complex poly-herbal extract of sedative action was chosen as the research object. The extract is composed of these medicinal plants: Valeriana officinalis L., Crataegus, Melissa officinalis L., Hypericum, Mentha piperita L., Humulus lupulus, Viburnum. Based on the chemical composition of the plant components, the main pharmacologically active compounds in the extract are: polyphenolic substances (flavonoids), contained in Crataegus, Viburnum, Hypericum, Mentha piperita L. and Humulus lupulus; organic acids, including isovaleric acid, contained in Valeriana officinalis L., Mentha piperita L., Melissa officinalis L. and Viburnum; and amino acids, contained in Valeriana officinalis L. For the determination of organic acids at low concentration we applied an instrumental method of analysis, namely conductometric titration, based on the dependence of the conductivity of an aqueous solution of the complex poly-herbal extract on the content of organic acids. Results. The obtained analytical dependences, which describe the tangent lines to the conductometric curve before and after the equivalence point, allow the volume of solution expended on titration to be determined and the quantitative determination of organic acids to be carried out in digital mode. Conclusion. The proposed method enables the equivalence point to be determined and the quantitative determination of organic acids, calculated with reference to isovaleric acid, to be carried out using digital technologies, which allows the method as a whole to be computerized.
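    The tangent-line construction for locating the equivalence point can be sketched numerically: fit straight lines to the conductivity curve on either side of the break and intersect them. The volumes and conductivities below are invented for illustration:

```python
import numpy as np

def equivalence_point(v, kappa, split):
    """Fit straight lines to the conductivity curve before and after an
    assumed break index and return the titrant volume where they cross."""
    a1, b1 = np.polyfit(v[:split], kappa[:split], 1)   # descending branch
    a2, b2 = np.polyfit(v[split:], kappa[split:], 1)   # ascending branch
    return (b2 - b1) / (a1 - a2)

# Hypothetical titration of a weak-acid extract with base: conductivity
# falls until neutralisation, then rises again with excess titrant
v     = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])   # mL
kappa = np.array([9.0, 8.0, 7.0, 6.0, 5.0, 6.5, 8.0, 9.5, 11.0])  # mS/cm

ep = equivalence_point(v, kappa, split=5)
print(f"equivalence point ≈ {ep:.2f} mL")
```

The titrant volume at the intersection, together with the titrant concentration and a reference molar mass (isovaleric acid in the paper), then yields the organic-acid content.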

  17. Quantitative prediction of drug side effects based on drug-related features.

    Science.gov (United States)

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
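    The paper's quantitative score (a weighted sum over a drug's side-effect profile) and the average scoring ensemble are both one-liners; the profile, weights and per-model scores below are hypothetical, since the paper's weights are empirical:

```python
import numpy as np

def quantitative_score(side_effect_profile, weights):
    """Weighted sum of a binary side-effect profile. The weights here are
    illustrative stand-ins for the paper's empirical severity weights."""
    return float(np.dot(side_effect_profile, weights))

def average_scoring_ensemble(predictions):
    """Average-scoring ensemble: mean of the scores predicted by the
    individual feature models (substructures, targets, indications)."""
    return float(np.mean(predictions))

profile = np.array([1, 0, 1, 1, 0])            # presence/absence of 5 side effects
weights = np.array([0.9, 0.2, 0.5, 1.0, 0.3])  # hypothetical severity weights
score = quantitative_score(profile, weights)   # 0.9 + 0.5 + 1.0
print(f"quantitative score = {score:.1f}")

per_model = [2.1, 2.6, 2.5]   # scores from three feature-specific models
print(f"ensemble score = {average_scoring_ensemble(per_model):.1f}")
```

Turning a binary prediction task into a regression on such scores is what lets drugs be ranked by overall risk rather than judged one side effect at a time.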

  18. Quantitative methods of data analysis for the physical sciences and engineering

    CERN Document Server

    Martinson, Douglas G

    2018-01-01

    This book provides thorough and comprehensive coverage of most of the new and important quantitative methods of data analysis for graduate students and practitioners. In recent years, data analysis methods have exploded alongside advanced computing power, and it is critical to understand such methods to get the most out of data, and to extract signal from noise. The book excels in explaining difficult concepts through simple explanations and detailed explanatory illustrations. Most unique is the focus on confidence limits for power spectra and their proper interpretation, something rare or completely missing in other books. Likewise, there is a thorough discussion of how to assess uncertainty via use of Expectancy, and the easy to apply and understand Bootstrap method. The book is written so that descriptions of each method are as self-contained as possible. Many examples are presented to clarify interpretations, as are user tips in highlighted boxes.

  19. Identification of ginseng root using quantitative X-ray microtomography.

    Science.gov (United States)

    Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao

    2017-07-01

    The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. The X-ray phase-contrast microtomography quantitative imaging method was used to investigate the microstructure of ginseng, and a phase-retrieval method was employed to process the experimental data. Four different ginseng samples were collected and investigated; these were classified according to their species, production area, and sample growth pattern. The quantitative internal characteristic microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and amount of the COCCs in the different species of ginseng are obtained by quantitative analysis of the three-dimensional microstructures, which shows obvious differences among the four species of ginseng. This study is the first to provide evidence of the distribution characteristics of COCCs to identify four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.

  20. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    Science.gov (United States)

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  1. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    Science.gov (United States)

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  2. Qualitative and quantitative determination of ubiquinones by the method of high-efficiency liquid chromatography

    International Nuclear Information System (INIS)

    Yanotovskii, M.T.; Mogilevskaya, M.P.; Obol'nikova, E.A.; Kogan, L.M.; Samokhvalov, G.I.

    1986-01-01

    A method has been developed for the qualitative and quantitative determination of ubiquinones CoQ6-CoQ10, using high-efficiency reversed-phase liquid chromatography. Tocopherol acetate was used as the internal standard.

  3. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
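    Whatever excitation scheme is used, R1ρ is ultimately estimated from the mono-exponential decay of the spin-locked signal with spin-lock time (TSL). A minimal noise-free fitting sketch, with illustrative TSL values and a made-up T1ρ of 50 ms:

```python
import numpy as np

# Hypothetical spin-lock experiment: signal decays as S = S0 * exp(-TSL * R1rho)
tsl = np.array([2.0, 10.0, 20.0, 40.0, 60.0, 80.0])  # spin-lock times, ms
s0, r1rho_true = 100.0, 0.02                          # 1/ms, i.e. T1rho = 50 ms
signal = s0 * np.exp(-r1rho_true * tsl)

# Log-linear least-squares fit recovers R1rho from the decay curve;
# with real (noisy) data a nonlinear fit would usually be preferred
slope, intercept = np.polyfit(tsl, np.log(signal), 1)
r1rho = -slope
print(f"fitted R1rho = {r1rho:.4f} /ms  (T1rho = {1.0 / r1rho:.1f} ms)")
```

Mapping this fit voxel-by-voxel yields the R1ρ images that the review's pulse-design considerations aim to make fast and SAR-safe.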

  4. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    OpenAIRE

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from smartphones and survey (perception) data. We find that there are dimensions of domestication that explain how the use of smartphones affects our daily routines. Contributions are stronger for downloaded a...

  5. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    Science.gov (United States)

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply HBM have been largely quantitative and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from the qualitative and quantitative methods but not to link both the methods. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims to first gather individual level information through in-depth interviews and then to present the information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction. We thus capture both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies, first an explorative qualitative study (2003), second a larger study (2004-2005), including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative to quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  6. A novel dual energy method for enhanced quantitative computed tomography

    Science.gov (United States)

    Emami, A.; Ghadiri, H.; Rahmim, A.; Ay, M. R.

    2018-01-01

    Accurate assessment of bone mineral density (BMD) is critically important in clinical practice, and conveniently enabled via quantitative computed tomography (QCT). Meanwhile, dual-energy QCT (DEQCT) enables enhanced detection of small changes in BMD relative to single-energy QCT (SEQCT). In the present study, we aimed to investigate the accuracy of QCT methods, with particular emphasis on a new dual-energy approach, in comparison to single-energy and conventional dual-energy techniques. We used a sinogram-based analytical CT simulator to model the complete chain of CT data acquisitions, and assessed performance of SEQCT and different DEQCT techniques in quantification of BMD. We demonstrate a 120% reduction in error when using a proposed dual-energy Simultaneous Equation by Constrained Least-squares method, enabling more accurate bone mineral measurements.
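The dual-energy decomposition underlying approaches like the one above reduces to solving two simultaneous attenuation equations for two basis materials. A minimal sketch, with invented attenuation coefficients and a non-negativity constraint standing in for the paper's constrained least-squares step (not the authors' actual implementation):

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical mass attenuation coefficients (cm^2/g) of bone mineral and
# soft tissue at two tube energies -- illustrative numbers, not measured data.
# Rows: energies (low kVp, high kVp); columns: basis materials (bone, tissue).
A = np.array([[0.40, 0.18],
              [0.25, 0.15]])

true_densities = np.array([0.9, 1.0])   # g/cm^3 of each basis material
measured = A @ true_densities           # simulated attenuation at both energies

# Solve the two simultaneous equations with a non-negativity constraint,
# in the spirit of a constrained least-squares dual-energy decomposition.
est, residual = nnls(A, measured)
print(est)  # -> close to [0.9, 1.0]
```

With noisy measurements the constraint keeps the recovered densities physical, which is where constrained least squares pays off over a plain matrix inverse.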

  7. New chromatographic method for separating Omeprazole from its degradation components and the quantitatively determining it in its pharmaceutical products

    International Nuclear Information System (INIS)

    Touma, M.; Rajab, A.; Seuleiman, M.

    2007-01-01

A new chromatographic method for the quantitative determination of Omeprazole in its pharmaceutical products was developed. Omeprazole and its degradation components were well separated in the same chromatogram using high performance liquid chromatography (HPLC). The new analytical method has been validated for the following characteristics: accuracy, precision, range, linearity, specificity/selectivity, limit of detection (LOD) and limit of quantitation (LOQ). (author)

  8. New chromatographic Methods for Separation of Lansoprazole from its Degradation Components and The Quantitative Determination in its Pharmaceutical Products

    International Nuclear Information System (INIS)

    Touma, M.; Rajab, A.

    2009-01-01

A new chromatographic method was developed for the quantitative determination of Lansoprazole in its pharmaceutical products. Lansoprazole and its degradation components were well separated in the same chromatogram using high performance liquid chromatography (HPLC). The new analytical method has been validated for the following characteristics: accuracy, precision, range, linearity, specificity/selectivity, limit of detection (LOD) and limit of quantitation (LOQ). (author)

  9. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

analysis, our results show similar diagnostic accuracy for anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain owing to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessing myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.

  10. Quantitative phase analysis of uranium carbide from x-ray diffraction data using the Rietveld method

    International Nuclear Information System (INIS)

    Singh Mudher, K.D.; Krishnan, K.

    2003-01-01

Quantitative phase analysis of a uranium carbide sample was carried out from X-ray diffraction data by the Rietveld profile fitting method. The method does not require the addition of any reference material. The percentages of the UC, UC{sub 2} and UO{sub 2} phases in the sample were determined. (author)
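Standard-free Rietveld quantification of the kind described above conventionally converts refined scale factors to weight fractions with the Hill-and-Howard relation w_i = s_i(ZMV)_i / Σ_j s_j(ZMV)_j. A sketch with illustrative scale factors and cell data, not the values from this study:

```python
# Relative weight fractions from Rietveld scale factors via the
# Hill-and-Howard relation: w_i = s_i*(Z*M*V)_i / sum_j s_j*(Z*M*V)_j.
def rietveld_weight_fractions(phases):
    """phases maps name -> (scale factor s, formula units Z, molar mass M, cell volume V)."""
    zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(zmv.values())
    return {name: 100.0 * x / total for name, x in zmv.items()}

phases = {            # demo values: scale factor, Z, M (g/mol), V (A^3)
    "UC":  (1.2e-4, 4, 250.04, 97.1),
    "UC2": (0.4e-4, 2, 262.05, 91.0),
    "UO2": (0.2e-4, 4, 270.03, 163.8),
}
for name, w in rietveld_weight_fractions(phases).items():
    print(f"{name}: {w:.1f} wt%")
```

The fractions always sum to 100%, which is why no internal reference material is required as long as every crystalline phase is refined.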

  11. Quantitative evaluations of male pattern baldness.

    Science.gov (United States)

    Tsuji, Y; Ishino, A; Hanzawa, N; Uzuka, M; Okazaki, K; Adachi, K; Imamura, S

    1994-07-01

Several methods for the evaluation of hair growth have been reported; however, none of them is satisfactory for unbiased, double-blind studies of the efficacy of hair growth agents. In the present paper, we describe quantitative methods for evaluating hair growth by measuring the anagen ratio and hair diameters in 56 Japanese subjects aged 23-56, followed for 3 years. The average anagen ratio decreased by 3.8% over the 3 years. The average hair diameter showed a statistically significant decrease each year, totalling 3.4 microns. Subjects were sorted by anagen ratio into 4 groups, each showing a different distribution of hair diameters: the higher anagen-ratio group had a frequency peak at thicker hair diameters, and the lower anagen-ratio group at thinner diameters. Over the 3 years, the number of thicker hairs decreased and the frequency peak shifted towards thinner diameters. These methods are useful for evaluating both the progression of male pattern baldness and the effects of hair growth agents in unbiased, quantitative double-blind studies.

  12. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM{asterisk}SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM{asterisk}SEM results were compared with data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash, and normative calculation from the (high-temperature) ash analysis and carbonate CO{sub 2} data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  13. Quantitative analysis and efficiency study of PSD methods for a LaBr{sub 3}:Ce detector

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ming; Cang, Jirong [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Zeng, Zhi, E-mail: zengzhi@tsinghua.edu.cn [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli [Key Laboratory of Particle & Radiation Imaging(Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2016-03-21

The LaBr{sub 3}:Ce scintillator has been widely studied for nuclear spectroscopy because of its excellent energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr{sub 3}:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient potential method to suppress the alpha background from {sup 227}Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr{sub 3}:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature-value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using these equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application to low-activity, high-energy γ measurement by suppressing the alpha background.
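The charge comparison method itself is simple: the feature value is the ratio of tail charge to total charge, and alpha-like pulses carrying relatively more slow scintillation light sit above a discrimination threshold. A toy sketch with synthetic pulses (the gates and decay constants are invented, not the paper's parameters):

```python
import numpy as np

def ccm_feature(pulse, gate_total, gate_tail):
    """Charge comparison feature: integrated tail charge / integrated total charge."""
    return pulse[gate_tail].sum() / pulse[gate_total].sum()

t = np.arange(200)  # sample index of the digitized waveform
# Toy pulses: the alpha-like event mixes in a slower decay component.
gamma_pulse = np.exp(-t / 20.0)
alpha_pulse = 0.8 * np.exp(-t / 20.0) + 0.2 * np.exp(-t / 80.0)

total_gate = slice(0, 200)   # integration gate over the whole pulse
tail_gate = slice(60, 200)   # delayed gate over the pulse tail

f_gamma = ccm_feature(gamma_pulse, total_gate, tail_gate)
f_alpha = ccm_feature(alpha_pulse, total_gate, tail_gate)
print(f"gamma: {f_gamma:.3f}, alpha: {f_alpha:.3f}")
assert f_alpha > f_gamma  # alpha events fall above the CCM threshold
```

In practice the threshold between the two feature-value populations is made energy-dependent, as the abstract describes, rather than a single constant.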

  14. Validation of a microfluorimetric method for quantitation of L-Histidine in peripheral blood

    International Nuclear Information System (INIS)

    Contreras Roura, Jiovanna; Hernandez Cuervo, Orietta; Alonso Jimenez, Elsa

    2008-01-01

Histidinemia is a rare inherited metabolic disorder characterized by a deficient histidase enzyme, which results in elevated histidine levels in blood, urine and cerebrospinal fluid and, sometimes, hyperalaninemia. The clinical picture of histidinemia varies from mental retardation and speech disorders to the absence of any symptoms. The disease can be diagnosed by quantitating histidine levels in blood using different analytical methods, such as spectrofluorimetry and high pressure liquid chromatography. An analytical method using SUMA Technology was developed and validated at our laboratory to determine L-Histidine in blood (serum and dried blood spots, adult and neonatal) for use in histidinemia screening of children with speech disorders. This paper presents selectivity, linearity, precision and accuracy data. The calibration curve was linear over the range 1-12 mg/dL (64.5-774 μM), with correlation coefficients (r) and determination coefficients (r2) higher than 0.99 for each biological matrix studied. Precision (repeatability and intermediate precision assays) was demonstrated, with coefficients of variation below 20%. Accuracy was assessed by determining the absolute recovery percentage. Assay recoveries were 97.83-105.50% (serum), 93-121.50% (adult dried blood spot) and 86.50-104.50% (neonatal dried blood spot)
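The linearity and recovery figures reported in validations like this come from an ordinary least-squares calibration line and back-calculation of spiked samples against it. A sketch with invented fluorescence readings over the stated 1-12 mg/dL range (not the study's data):

```python
import statistics

# Hypothetical calibration data over the 1-12 mg/dL range described above.
conc = [1, 2, 4, 6, 8, 12]                       # mg/dL
signal = [10.2, 20.1, 40.5, 59.8, 80.3, 120.4]   # fluorescence units (demo)

mx, my = statistics.mean(conc), statistics.mean(signal)
sxx = sum((x - mx) ** 2 for x in conc)
syy = sum((y - my) ** 2 for y in signal)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
slope = sxy / sxx
intercept = my - slope * mx
r = sxy / (sxx * syy) ** 0.5                     # correlation coefficient
print(f"y = {slope:.3f}x + {intercept:.3f}, r = {r:.4f}")

# Absolute recovery: back-calculate a spiked sample against the curve.
spiked_signal, nominal = 60.5, 6.0               # demo spike at 6 mg/dL
recovery = 100.0 * ((spiked_signal - intercept) / slope) / nominal
print(f"recovery = {recovery:.1f}%")
```

Acceptance then checks r (and r²) against the 0.99 threshold and the recovery against the allowed range, matrix by matrix.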

  15. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    International Nuclear Information System (INIS)

    Conconi, M.S.; Gauna, M.R.; Serra, M.F.; Suarez, G.; Aglietti, E.F.; Rendtorff, N.M.

    2014-01-01

The firing transformations of traditional (clay-based) ceramics are of technological and archaeological interest, and are usually reported qualitatively or semi-quantitatively. Such systems present considerable complexity, especially for X-ray diffraction techniques, owing to the presence of fully crystalline, low-crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystalline (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range, between 900 and 1300 deg C. The methodology employed to determine the low-crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with firing temperature. Simultaneous thermogravimetric and differential thermal analysis was carried out to establish the actual temperatures at which the chemical changes occur. Finally, quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The decomposition of kaolinite into metakaolinite was determined quantitatively; the intermediate (980 deg C) spinel-type alumino-silicate formation was quantified; and the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the formation of the amorphous (glassy) phase. The methodology used to analyze the X-ray diffraction patterns proved suitable for quantitatively evaluating the thermal transformations that occur in a complex system like triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and materials
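The internal-standard step mentioned above can be illustrated numerically: if the Rietveld refinement overestimates a known crystalline spike, the excess is attributed to non-diffracting (amorphous) material. A sketch with invented numbers, assuming a 10 wt% spike that the refinement reports as 14 wt% of the crystalline fraction:

```python
# Internal-standard estimate of amorphous content. If Rietveld normalizes the
# crystalline phases to 100% and therefore overstates the known spike, the
# discrepancy measures the non-diffracting fraction of the original sample.
def amorphous_fraction(w_std_known, w_std_rietveld):
    """Both arguments in wt%; returns amorphous wt% of the unspiked sample."""
    return 100.0 * (1.0 - w_std_known / w_std_rietveld) / (1.0 - w_std_known / 100.0)

# 10 wt% standard added; Rietveld reports it as 14 wt% of the crystalline part.
print(round(amorphous_fraction(10.0, 14.0), 1))  # -> 31.7 (wt% amorphous)
```

A sanity check: with 10 g spike in 100 g of mixture, a reported 14 wt% implies only 71.4 g of crystalline material, so 28.6 g of the 90 g sample, i.e. 31.7%, is amorphous.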

  16. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Energy Technology Data Exchange (ETDEWEB)

    Conconi, M.S.; Gauna, M.R.; Serra, M.F. [Centro de Tecnologia de Recursos Minerales y Ceramica (CETMIC), Buenos Aires (Argentina); Suarez, G.; Aglietti, E.F.; Rendtorff, N.M., E-mail: rendtorff@cetmic.unlp.edu.ar [Universidad Nacional de La Plata (UNLP), Buenos Aires (Argentina). Fac. de Ciencias Exactas. Dept. de Quimica

    2014-10-15

The firing transformations of traditional (clay-based) ceramics are of technological and archaeological interest, and are usually reported qualitatively or semi-quantitatively. Such systems present considerable complexity, especially for X-ray diffraction techniques, owing to the presence of fully crystalline, low-crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystalline (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range, between 900 and 1300 deg C. The methodology employed to determine the low-crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with firing temperature. Simultaneous thermogravimetric and differential thermal analysis was carried out to establish the actual temperatures at which the chemical changes occur. Finally, quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The decomposition of kaolinite into metakaolinite was determined quantitatively; the intermediate (980 deg C) spinel-type alumino-silicate formation was quantified; and the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the formation of the amorphous (glassy) phase. The methodology used to analyze the X-ray diffraction patterns proved suitable for quantitatively evaluating the thermal transformations that occur in a complex system like triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and materials

  17. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

Introduction. This article results from the authors' research into the development of approaches to the validation of quantitative determination methods for forensic and toxicological analysis, and addresses the formation of acceptability criteria for the validation parameter «linearity/calibration model». The aim of research. The purpose of this paper is to analyse the present approaches to acceptability estimation of the calibration model chosen for method description according to the requirements of the international guidances, and to form our own approaches to acceptability estimation of the linear dependence when validating UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. A UV-spectrophotometric method for the quantitative determination of doxylamine in blood. Results. The approaches to acceptability estimation of calibration models in the validation of bioanalytical methods stated in international papers, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (EMA, 2011), have been analysed. It has been suggested to be guided by domestic developments in the validation of analysis methods for medicines and, in particular, by the approaches to validation in the calibration-curve variant, when forming the acceptability criteria for the linear dependences obtained in validating UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is

  18. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear ideal for pharmaceutical quantitative MSI analysis; however, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our previously developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method uses a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. Their consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Quantitative whole body scintigraphy - a simplified approach

    International Nuclear Information System (INIS)

    Marienhagen, J.; Maenner, P.; Bock, E.; Schoenberger, J.; Eilles, C.

    1996-01-01

In this paper we present investigations of a simplified method of quantitative whole body scintigraphy using a dual-head LFOV gamma camera and a calibration algorithm that requires no additional attenuation or scatter correction. Validation of this approach on an anthropomorphic phantom as well as in patient studies showed high accuracy in the quantification of whole body activity (102.8% and 97.72%, respectively); by contrast, organ activities were recovered with errors of up to 12%. The described method can easily be performed using commercially available software packages and is recommendable especially for quantitative whole body scintigraphy in a clinical setting. (orig.) [de
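Simplified whole-body quantification with a dual-head camera is commonly based on the conjugate-view geometric mean of the anterior and posterior counts, which is largely depth-independent, so a single system calibration factor converts counts to activity. Whether this is exactly the paper's calibration algorithm is not stated; the sketch below uses invented numbers:

```python
import math

# Conjugate-view quantification: geometric mean of opposed detector counts,
# divided by a system calibration factor (counts/s per MBq). Demo values only.
def whole_body_activity(anterior_counts, posterior_counts, cal_factor_cps_per_mbq):
    geometric_mean = math.sqrt(anterior_counts * posterior_counts)
    return geometric_mean / cal_factor_cps_per_mbq

# Hypothetical 1-s acquisition: 40000 anterior counts, 36000 posterior counts.
print(round(whole_body_activity(40000, 36000, 190.0), 1))  # estimated MBq
```

The geometric mean cancels the first-order source-depth dependence of each single view, which is what lets the method skip explicit attenuation correction for whole-body totals while organ-level figures remain less accurate.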

  20. Semi-quantitative evaluation of gallium-67 scintigraphy in lupus nephritis

    Energy Technology Data Exchange (ETDEWEB)

    Lin Wanyu [Dept. of Nuclear Medicine, Taichung Veterans General Hospital, Taichung (Taiwan); Dept. of Radiological Technology, Chung-Tai College of Medical Technology, Taichung (Taiwan); Hsieh Jihfang [Section of Nuclear Medicine, Chi-Mei Foundation Hospital, Yunk Kang City, Tainan (Taiwan); Tsai Shihchuan [Dept. of Nuclear Medicine, Show Chwan Memorial Hospital, Changhua (Taiwan); Lan Joungliang [Dept. of Internal Medicine, Taichung Veterans General Hospital, Taichung (Taiwan); Cheng Kaiyuan [Dept. of Radiological Technology, Chung-Tai College of Medical Technology, Taichung (Taiwan); Wang Shyhjen [Dept. of Nuclear Medicine, Taichung Veterans General Hospital, Taichung (Taiwan)

    2000-11-01

Within nuclear medicine there is a trend towards quantitative analysis. The gallium renal scan has been reported to be useful in monitoring the disease activity of lupus nephritis. However, previous studies have used only visual interpretation on a four-grade scale, a method not sensitive enough for follow-up. In this study, we developed a semi-quantitative method for gallium renal scintigraphy to find a potential parameter for the evaluation of lupus nephritis. Forty-eight patients with lupus nephritis underwent renal biopsy to determine the World Health Organization classification, activity index (AI) and chronicity index (CI). A delayed 48-h gallium scan was also performed and interpreted by visual and semi-quantitative methods. For semi-quantitative analysis of the gallium uptake in both kidneys, regions of interest (ROIs) were drawn over both kidneys, the right forearm and the adjacent spine. The uptake ratios between these ROIs were calculated and expressed as the "kidney/spine ratio (K/S ratio)" or the "kidney/arm ratio (K/A ratio)". Spearman's rank correlation test and the Mann-Whitney U test were used for statistical analysis. Our data showed a good correlation between the semi-quantitative gallium scan and the results of visual interpretation. K/S ratios correlated better with AI than did K/A ratios, and the left K/S ratio correlated better with AI than did the right K/S ratio. In contrast, CI did not correlate well with the results of the semi-quantitative gallium scan. In conclusion, the semi-quantitative gallium renal scan is easy to perform and correlates well with the results of visual interpretation and renal biopsy. The left K/S ratio displays the best correlation with AI and is a useful parameter in evaluating disease activity in lupus nephritis. (orig.)
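The semi-quantitative step reduces to drawing ROIs on the scan and ratioing their mean counts. A synthetic-image sketch of a K/S-style ratio (the image, ROI coordinates and uptake values are all invented for illustration):

```python
import numpy as np

# Synthetic 64x64 "scan": Poisson background plus a hypothetical
# region of elevated kidney uptake.
image = np.random.default_rng(0).poisson(20, size=(64, 64)).astype(float)
image[10:25, 8:20] += 30          # invented left-kidney uptake

kidney_roi = image[10:25, 8:20]   # ROI over the kidney
spine_roi = image[28:50, 30:36]   # reference ROI over the adjacent spine

# Semi-quantitative parameter: mean counts/pixel in kidney vs. spine ROI.
ks_ratio = kidney_roi.mean() / spine_roi.mean()
print(f"K/S ratio = {ks_ratio:.2f}")
```

Mean counts per pixel (rather than total ROI counts) keeps the ratio independent of ROI size, which matters when the kidney and reference ROIs differ in area.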

  1. Quantitative Characterization of Major Hepatic UDP-Glucuronosyltransferase Enzymes in Human Liver Microsomes: Comparison of Two Proteomic Methods and Correlation with Catalytic Activity.

    Science.gov (United States)

    Achour, Brahim; Dantonio, Alyssa; Niosi, Mark; Novak, Jonathan J; Fallon, John K; Barber, Jill; Smith, Philip C; Rostami-Hodjegan, Amin; Goosen, Theunis C

    2017-10-01

Quantitative characterization of UDP-glucuronosyltransferase (UGT) enzymes is valuable in glucuronidation reaction phenotyping and in predicting metabolic clearance and drug-drug interactions using extrapolation exercises based on pharmacokinetic modeling. Different quantitative proteomic workflows have been employed to quantify UGT enzymes in various systems, with reports indicating large variability in expression that cannot be explained by interindividual variability alone. To evaluate the effect of methodological differences on end-point UGT abundance quantification, eight UGT enzymes were quantified in 24 matched liver microsomal samples by two laboratories using stable isotope-labeled (SIL) peptides or a quantitative concatemer (QconCAT) standard, and measurements were assessed against catalytic activity in seven enzymes (n = 59). There was little agreement between individual abundance levels reported by the two methods; only UGT1A1 showed strong correlation [Spearman rank order correlation (Rs) = 0.73, P quantitative proteomic data should be validated against catalytic activity whenever possible. In addition, metabolic reaction phenotyping exercises should consider spurious abundance-activity correlations to avoid misleading conclusions. Copyright © 2017 by The American Society for Pharmacology and Experimental Therapeutics.

  2. Two quantitative forecasting methods for macroeconomic indicators in Czech Republic

    Directory of Open Access Journals (Sweden)

Mihaela BRATU (SIMIONESCU)

    2012-03-01

Econometric modelling and exponential smoothing are two quantitative forecasting methods with good results in practice, and the objective of this research was to find out which of the two techniques is better for short-run predictions. Accuracy indicators were therefore calculated for predictions of inflation, unemployment and the interest rate in the Czech Republic based on these methods. Short-run forecasts on a horizon of 3 months were made for December 2011-February 2012, with the econometric models being updated. For the Czech Republic, the exponential smoothing techniques provided more accurate forecasts than the econometric models (VAR(2) models, the ARMA procedure and models with lagged variables). One explanation for the better performance of the smoothing techniques would be that short-run predictions are more influenced by the recent evolution of the indicators.
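The exponential smoothing side of such a comparison can be sketched in a few lines: one-step-ahead simple exponential smoothing forecasts scored by RMSE against a naive last-value benchmark. The series and the smoothing constant below are invented, not the Czech data:

```python
# Simple exponential smoothing (SES) vs. a naive forecast, compared by RMSE.
def ses_forecasts(series, alpha=0.3):
    """One-step-ahead SES forecasts for series[1:], initialized at series[0]."""
    level, out = series[0], []
    for y in series[1:]:
        out.append(level)                    # forecast before seeing y
        level = alpha * y + (1 - alpha) * level
    return out

def rmse(actual, forecast):
    return (sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(forecast)) ** 0.5

inflation = [2.1, 2.3, 2.2, 2.5, 2.4, 2.6, 2.8, 2.7]   # invented monthly figures
ses_err = rmse(inflation[1:], ses_forecasts(inflation))
naive_err = rmse(inflation[1:], inflation[:-1])        # last value carried forward
print(f"SES RMSE = {ses_err:.3f}, naive RMSE = {naive_err:.3f}")
```

In a full study the same RMSE (or MAE, U-statistic) would be computed for each econometric model's forecasts over the identical horizon before declaring a winner.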

  3. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    Science.gov (United States)

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the protein recovery rate of acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation, but also did not interfere with the Bradford assay. We thus developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances. Copyright © 2015 Elsevier Inc. All rights reserved.
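After precipitation removes the interfering excipient, the redissolved protein is read off a Bradford standard curve, typically by interpolating its A595 between standards. A sketch with demo standard values (not measured data):

```python
import numpy as np

# Bradford quantitation: interpolate a sample's A595 on a standard curve.
# Standards and absorbances are illustrative; np.interp needs ascending xp.
standards_ugml = np.array([0, 125, 250, 500, 1000])   # BSA standards, ug/mL
a595 = np.array([0.00, 0.11, 0.21, 0.40, 0.75])       # demo absorbances

sample_a595 = 0.30
conc = float(np.interp(sample_a595, a595, standards_ugml))
print(f"~{conc:.0f} ug/mL")
```

Dividing the back-calculated concentration of a precipitated aliquot by its pre-precipitation value then gives the recovery rate the abstract discusses.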

  4. Study on a quantitative evaluation method of equipment maintenance level and plant safety level for giant complex plant system

    International Nuclear Information System (INIS)

    Aoki, Takayuki

    2010-01-01

In this study, a quantitative method for evaluating maintenance level, which is determined by two factors, the maintenance plan and the field-work implementation ability of the maintenance crew, is discussed, along with a quantitative evaluation method for the safety level of a giant complex plant system. The following results were obtained. (1) Equipment condition after maintenance work is determined by two factors: the maintenance plan and the field-work implementation ability of the maintenance crew. The equipment condition determined by these two factors was named the 'equipment maintenance level' and its quantitative evaluation method was clarified. (2) The CDF of a nuclear power plant evaluated using a failure rate that reflects the above maintenance level is quite different from the CDF evaluated using existing failure rates, which include a safety margin. The former CDF was named the 'plant safety level' of the plant system and its quantitative evaluation method was clarified. (3) Enhancing the equipment maintenance level means improving maintenance quality, which in turn enhances the plant safety level. Therefore, the plant safety level should always be watched as a plant performance indicator. (author)

  5. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

The amount of inspection effort for verification of declared nuclear material needs to be optimized in situations where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. The payoffs used in the theory were quantified for conventional safeguards and integrated safeguards using AHP, the Analytic Hierarchy Process. It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP involves some ambiguity in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering detection probabilities under integrated safeguards. (author)
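AHP derives priority weights from a pairwise-comparison matrix, conventionally via its principal eigenvector, with a consistency index to flag incoherent judgments. A minimal sketch with an invented 3x3 judgment matrix (not the safeguards payoffs from the study):

```python
import numpy as np

# AHP priority weights from a reciprocal pairwise-comparison matrix.
# A[i, j] says how strongly criterion i is preferred over criterion j.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)              # principal eigenvalue (lambda_max)
w = np.abs(vecs[:, k].real)
w /= w.sum()                          # normalized priority weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)     # consistency index; small = coherent
print(w.round(3), round(ci, 3))
```

A consistency index well below ~0.1 (relative to the random index for the matrix size) indicates the pairwise judgments are acceptably coherent; the weights then serve as the quantified payoffs.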

  6. Assessment of Intrathecal Free Light Chain Synthesis: Comparison of Different Quantitative Methods with the Detection of Oligoclonal Free Light Chains by Isoelectric Focusing and Affinity-Mediated Immunoblotting.

    Science.gov (United States)

    Zeman, David; Kušnierová, Pavlína; Švagera, Zdeněk; Všianský, František; Byrtusová, Monika; Hradílek, Pavel; Kurková, Barbora; Zapletalová, Olga; Bartoš, Vladimír

    2016-01-01

    We aimed to compare various methods for free light chain (fLC) quantitation in cerebrospinal fluid (CSF) and serum and to determine whether quantitative CSF measurements could reliably predict intrathecal fLC synthesis. In addition, we wished to determine the relationship between free kappa and free lambda light chain concentrations in CSF and serum in various disease groups. We analysed 166 paired CSF and serum samples by at least one of the following methods: turbidimetry (Freelite™, SPAPLUS), nephelometry (N Latex FLC™, BN ProSpec), and two different (commercially available and in-house developed) sandwich ELISAs. The results were compared with oligoclonal fLC detected by affinity-mediated immunoblotting after isoelectric focusing. Although the correlations between quantitative methods were good, both proportional and systematic differences were discerned. However, no major differences were observed in the prediction of positive oligoclonal fLC test. Surprisingly, CSF free kappa/free lambda light chain ratios were lower than those in serum in about 75% of samples with negative oligoclonal fLC test. In about a half of patients with multiple sclerosis and clinically isolated syndrome, profoundly increased free kappa/free lambda light chain ratios were found in the CSF. Our results show that using appropriate method-specific cut-offs, different methods of CSF fLC quantitation can be used for the prediction of intrathecal fLC synthesis. The reason for unexpectedly low free kappa/free lambda light chain ratios in normal CSFs remains to be elucidated. Whereas CSF free kappa light chain concentration is increased in most patients with multiple sclerosis and clinically isolated syndrome, CSF free lambda light chain values show large interindividual variability in these patients and should be investigated further for possible immunopathological and prognostic significance.

  7. Full quantitative phase analysis of hydrated lime using the Rietveld method

    Energy Technology Data Exchange (ETDEWEB)

    Lassinantti Gualtieri, Magdalena, E-mail: magdalena.gualtieri@unimore.it [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Romagnoli, Marcello; Miselli, Paola; Cannio, Maria [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Gualtieri, Alessandro F. [Dipartimento di Scienze della Terra, Universita Degli Studi di Modena e Reggio Emilia, I-41100 Modena (Italy)

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.

  8. Full quantitative phase analysis of hydrated lime using the Rietveld method

    International Nuclear Information System (INIS)

    Lassinantti Gualtieri, Magdalena; Romagnoli, Marcello; Miselli, Paola; Cannio, Maria; Gualtieri, Alessandro F.

    2012-01-01

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2–15 wt.%.
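The amorphous content in a spiked sample follows from the weighed spike fraction and its Rietveld-refined fraction via the standard internal-standard relation. A minimal sketch of that calculation; the spike fraction and refined value below are illustrative, not numbers from the study.

```python
# Internal-standard route to amorphous content, commonly used with Rietveld
# full quantitative phase analysis (the paper spikes hydrated lime with
# glass; the numbers below are illustrative, not the study's).
def amorphous_fraction(w_std, r_std):
    """w_std: weighed fraction of the crystalline internal standard in the mix.
    r_std: its Rietveld-refined fraction (crystalline phases normalized to 1).
    Returns the amorphous fraction of the original (unspiked) sample,
    a = (1 - w_std / r_std) / (1 - w_std)."""
    if not 0.0 < w_std < 1.0 or not 0.0 < r_std <= 1.0:
        raise ValueError("fractions must lie in (0, 1]")
    return (1.0 - w_std / r_std) / (1.0 - w_std)

# A 20 wt.% spike refined to 22 wt.% implies ~11.4 wt.% amorphous material,
# inside the 2-15 wt.% range the paper reports.
a = amorphous_fraction(0.20, 0.22)
print(f"{100 * a:.1f} wt.%")
```

The refined spike fraction exceeding the weighed fraction is the signature of amorphous material: the amorphous mass is invisible to the refinement, so every crystalline phase is overestimated.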

  9. Quantitative Method to Measure Thermal Conductivity of One-Dimensional Nanostructures Based on Scanning Thermal Wave Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Kyung Bae; Chung, Jae Hun; Hwang, Gwang Seok; Jung, Eui Han; Kwon, Oh Myoung [Korea University, Seoul (Korea, Republic of)

    2014-12-15

    We present a method to quantitatively measure the thermal conductivity of one-dimensional nanostructures by utilizing scanning thermal wave microscopy (STWM) at a nanoscale spatial resolution. In this paper, we explain the principle of measuring the thermal diffusivity of one-dimensional nanostructures using STWM and the theoretical analysis procedure for quantifying the thermal diffusivity. The STWM method obtains the thermal conductivity by measuring the thermal diffusivity, which is derived solely from the phase lag of the transferred thermal wave as a function of distance. The measurement is therefore not affected by the thermal contact resistances between the heat source and the nanostructure or between the nanostructure and the probe, so the heat flux applied to the nanostructure is accurately obtained. The proposed method is very simple and quantitative compared with conventional measurement techniques.
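Under the standard one-dimensional thermal-wave model, the phase lag grows linearly with distance at slope sqrt(pi*f/alpha), so the diffusivity follows from the measured slope alone; conductivity then needs the material's density and specific heat. A sketch under that assumption, with illustrative numbers that are not from the paper:

```python
import math

# Phase-lag analysis behind STWM, assuming the standard 1-D thermal-wave
# dispersion: phase(x) = -x * sqrt(pi * f / alpha), hence
# alpha = pi * f / slope^2. All numbers below are illustrative.
def thermal_diffusivity(freq_hz, phase_slope_rad_per_m):
    """Diffusivity (m^2/s) from the slope of phase lag vs. distance."""
    return math.pi * freq_hz / phase_slope_rad_per_m ** 2

def thermal_conductivity(alpha, density, specific_heat):
    """k = alpha * rho * c_p (W/m/K); rho, c_p must be known separately."""
    return alpha * density * specific_heat

alpha = thermal_diffusivity(1000.0, 5.0e4)        # 1 kHz wave, slope in rad/m
k = thermal_conductivity(alpha, 2330.0, 700.0)    # silicon-like rho and c_p
print(alpha, k)
```

Because only the *slope* of the phase enters, any constant phase offset added by contact resistances cancels, which is the robustness the abstract emphasizes.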

  10. The use of Triangulation in Social Sciences Research : Can qualitative and quantitative methods be combined?

    Directory of Open Access Journals (Sweden)

    Ashatu Hussein

    2015-03-01

    Full Text Available This article refers to a study in Tanzania on fringe benefits or welfare via the work contract, where we will work both quantitatively and qualitatively. My focus is on the vital issue of combining methods or methodologies. There have been mixed views on the use of triangulation in research. Some authors argue that triangulation serves only to widen and deepen understanding of the phenomenon under study, while others argue that triangulation is actually used to increase study accuracy; in this case, triangulation is one of the validity measures. Triangulation is defined as the use of multiple methods, mainly qualitative and quantitative, to study the same phenomenon for the purpose of increasing study credibility. This implies that triangulation is the combination of two or more methodological approaches, theoretical perspectives, data sources, investigators, and analysis methods to study the same phenomenon. However, using both qualitative and quantitative paradigms in the same study has led to debate, with some researchers arguing that the two paradigms differ epistemologically and ontologically. Nevertheless, both paradigms are designed to further understanding of a particular subject area, and both have strengths and weaknesses. Thus, when they are combined, there is a great possibility of neutralizing the flaws of one method and strengthening the benefits of the other, yielding better research results. To reap the benefits of the two paradigms and minimize the drawbacks of each, the combination of the two approaches is advocated in this article. The quality of our studies on welfare to combat poverty is crucial, especially when we want our conclusions to matter in practice.

  11. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  12. Evaluation of methods for oligonucleotide array data via quantitative real-time PCR.

    Science.gov (United States)

    Qin, Li-Xuan; Beyer, Richard P; Hudson, Francesca N; Linford, Nancy J; Morris, Daryl E; Kerr, Kathleen F

    2006-01-17

    There are currently many different methods for processing and summarizing probe-level data from Affymetrix oligonucleotide arrays. It is of great interest to validate these methods and identify those that are most effective. There is no single best way to do this validation, and a variety of approaches is needed. Moreover, gene expression data are collected to answer a variety of scientific questions, and the same method may not be best for all questions. Only a handful of validation studies have been done so far, most of which rely on spike-in datasets and focus on the question of detecting differential expression. Here we seek methods that excel at estimating relative expression. We evaluate methods by identifying those that give the strongest linear association between expression measurements by array and the "gold-standard" assay. Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is generally considered the "gold-standard" assay for measuring gene expression by biologists and is often used to confirm findings from microarray data. Here we use qRT-PCR measurements to validate methods for the components of processing oligo array data: background adjustment, normalization, mismatch adjustment, and probeset summary. An advantage of our approach over spike-in studies is that methods are validated on a real dataset that was collected to address a scientific question. We initially identify three of six popular methods that consistently produced the best agreement between oligo array and RT-PCR data for medium- and high-intensity genes. The three methods are generally known as MAS5, gcRMA, and the dChip mismatch mode. For medium- and high-intensity genes, we identified use of data from mismatch probes (as in MAS5 and dChip mismatch) and a sequence-based method of background adjustment (as in gcRMA) as the most important factors in methods' performances. However, we found poor reliability for methods using mismatch probes for low-intensity genes
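The validation criterion in this study is the strength of linear association between array and qRT-PCR measurements, i.e. a correlation coefficient computed per gene set. A minimal sketch with made-up log-expression values (not the study's data):

```python
# Pearson correlation between array and qRT-PCR log-expression values,
# the "strongest linear association" criterion the study uses to rank
# processing methods. Data below are illustrative.
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

array_log2 = [2.1, 4.0, 5.9, 8.2, 10.1]    # one gene across 5 samples (array)
qrtpcr_log2 = [2.0, 4.2, 6.1, 7.9, 10.0]   # same gene by qRT-PCR
r = pearson_r(array_log2, qrtpcr_log2)
print(round(r, 3))
```

A processing method that preserved relative expression well would yield r near 1 across many genes; the study ranks MAS5, gcRMA, and dChip-mismatch variants by exactly this kind of agreement.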

  13. Evaluation of methods for oligonucleotide array data via quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Morris Daryl E

    2006-01-01

    Full Text Available Abstract Background There are currently many different methods for processing and summarizing probe-level data from Affymetrix oligonucleotide arrays. It is of great interest to validate these methods and identify those that are most effective. There is no single best way to do this validation, and a variety of approaches is needed. Moreover, gene expression data are collected to answer a variety of scientific questions, and the same method may not be best for all questions. Only a handful of validation studies have been done so far, most of which rely on spike-in datasets and focus on the question of detecting differential expression. Here we seek methods that excel at estimating relative expression. We evaluate methods by identifying those that give the strongest linear association between expression measurements by array and the "gold-standard" assay. Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is generally considered the "gold-standard" assay for measuring gene expression by biologists and is often used to confirm findings from microarray data. Here we use qRT-PCR measurements to validate methods for the components of processing oligo array data: background adjustment, normalization, mismatch adjustment, and probeset summary. An advantage of our approach over spike-in studies is that methods are validated on a real dataset that was collected to address a scientific question. Results We initially identify three of six popular methods that consistently produced the best agreement between oligo array and RT-PCR data for medium- and high-intensity genes. The three methods are generally known as MAS5, gcRMA, and the dChip mismatch mode. For medium- and high-intensity genes, we identified use of data from mismatch probes (as in MAS5 and dChip mismatch) and a sequence-based method of background adjustment (as in gcRMA) as the most important factors in methods' performances. However, we found poor reliability for methods using mismatch probes for low-intensity genes.

  14. A method to forecast quantitative variables relating to nuclear public acceptance

    International Nuclear Information System (INIS)

    Ohnishi, T.

    1992-01-01

    A methodology is proposed for forecasting the future trend of quantitative variables profoundly related to the public acceptance (PA) of nuclear energy. The social environment influencing PA is first modeled by breaking it down into a finite number of fundamental elements; the interactive formulae between the quantitative variables that are attributed to and characterize each element are then determined using the actual values of the variables in the past. Inputting the estimated values of exogenous variables into these formulae, the forecast values of the endogenous variables can finally be obtained. Using this method, the problem of nuclear PA in Japan is treated as an example, with the context considered to comprise the public sector together with the general social environment and socio-psychology. The public sector is broken down into three elements: the general public, the inhabitants living around nuclear facilities, and the activists of anti-nuclear movements. The social environment and socio-psychological factors are broken down into several elements, such as news media and psychological factors. Twenty-seven endogenous and seven exogenous variables are introduced to quantify these elements. After quantitatively formulating the interactive features between them and extrapolating the exogenous variables into the future, estimates are made of the growth or attenuation of the endogenous variables, such as the pro- and anti-nuclear fractions in public opinion polls and the frequency of occurrence of anti-nuclear movements. (author)

  15. Antiadenoviral effects of N-chlorotaurine in vitro confirmed by quantitative polymerase chain reaction methods

    Directory of Open Access Journals (Sweden)

    Eiichi Uchio

    2010-11-01

    Full Text Available Eiichi Uchio¹, Hirotoshi Inoue¹, Kazuaki Kadonosono²; ¹Department of Ophthalmology, Fukuoka University School of Medicine, Fukuoka, Japan; ²Department of Ophthalmology, Yokohama City University Medical Center, Yokohama, Japan. Purpose: Adenoviral keratoconjunctivitis is recognized as one of the major pathogens of ophthalmological nosocomial infection worldwide. N-Chlorotaurine (Cl–HN–CH2–CH2–SO3H; NCT) is the N-chloro derivative of the amino acid taurine, which is an oxidant produced by human granulocytes and monocytes during inflammatory reactions. Using a conventional viral plaque assay, it was previously shown that NCT causes inactivation of several human adenovirus (HAdV) serotypes. In this study, we evaluated the antiadenoviral effect of NCT by quantitative polymerase chain reaction (PCR) methods. Methods: A549 cells were used for viral cell culture, and HAdV serotypes 3, 4, 8, 19, and 37 were used. After calculating the 50% cytotoxic concentration (CC50) of NCT by the MTS (3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium) method, HAdV was cultured with NCT for 7 days, and extracted adenoviral DNA was quantitatively measured by real-time PCR. Results: A statistically significant (P < 0.05) dose-dependent inhibition was indicated for all serotypes except HAdV type 4 (HAdV4), which was maximally inhibited by only ~50%. Among the serotypes, NCT was particularly effective against HAdV8, HAdV19a, and HAdV37. The 50% effective concentration (EC50) of NCT obtained by real-time PCR ranged between 49 and 256 µM. The EC50 of NCT against HAdV3 was slightly higher than that against serotypes of species D. The selective index (CC50/EC50) ranged between 41 and 60, except for HAdV4 (11.5). Conclusions: These results show that NCT has an antiviral effect against most serotypes of human HAdV inducing keratoconjunctivitis, indicating its possible therapeutic use. Keywords: adenovirus, N-chlorotaurine, epidemic keratoconjunctivitis, antiviral
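The abstract's key figures, EC50 from real-time PCR dose-response data and the selective index SI = CC50/EC50, reduce to a simple interpolation and a ratio. A sketch using log-linear interpolation between the two concentrations bracketing 50% inhibition; all concentrations and inhibition fractions below are illustrative, not the study's measurements.

```python
import math

# EC50 by log-linear interpolation of a dose-response series, then the
# selective index SI = CC50 / EC50 reported in the abstract. The data and
# the hypothetical CC50 below are illustrative only.
def ec50_loglinear(concs_um, inhibition):
    """concs_um sorted ascending; inhibition as fractions in [0, 1].
    Interpolates log10(concentration) between the bracketing points."""
    for k in range(len(concs_um) - 1):
        i1, i2 = inhibition[k], inhibition[k + 1]
        if i1 <= 0.5 <= i2:
            t = (0.5 - i1) / (i2 - i1)
            lg = math.log10(concs_um[k]) + t * (
                math.log10(concs_um[k + 1]) - math.log10(concs_um[k]))
            return 10.0 ** lg
    raise ValueError("50% inhibition not bracketed by the data")

concs = [10.0, 100.0, 1000.0]   # µM NCT, illustrative
inhib = [0.10, 0.45, 0.90]      # fraction of viral DNA copies suppressed
ec50 = ec50_loglinear(concs, inhib)
si = 10000.0 / ec50             # hypothetical CC50 of 10,000 µM
print(round(ec50, 1), round(si, 1))
```

In practice a four-parameter logistic fit is usual; the bracketing interpolation above is the minimal version of the same idea.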

  16. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The supervision of credit institutions by state authorities is mostly identified with systemic risk prevention. At present, this mission is oriented towards analyzing the risk profile of credit institutions and the mechanisms and systems in place as management tools, providing banks with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of assuring these objectives and thus of supporting successful banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can introduce subjectivism and heterogeneity into the rating. The problem can be solved by using, complementarily, quantitative techniques such as DEA (Data Envelopment Analysis).

  17. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, by which test solutions can be discriminated. After determining the variety of a test solution, the Spearman correlation test and principal component analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between these parameters and the concentrations, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified, and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of concentration quantitation.
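The final quantitation step above is a cubic spline between the representative parameter and concentration. A sketch of that step, assuming a natural cubic spline (the abstract does not specify the boundary conditions); the parameter values and concentrations are illustrative, not the study's data.

```python
import bisect

# Natural cubic spline between a representative COT parameter and known
# concentrations, then evaluation at a test solution's parameter. The
# second derivatives M are found with the Thomas algorithm under natural
# boundary conditions M[0] = M[n] = 0. Data are illustrative.
def natural_cubic_spline(xs, ys):
    """Return an evaluator for the natural cubic spline through (xs, ys);
    xs must be strictly increasing."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    diag = [2.0 * (h[i - 1] + h[i]) for i in range(1, n)]
    rhs = [6.0 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
           for i in range(1, n)]
    for j in range(1, n - 1):                      # forward elimination
        w = h[j] / diag[j - 1]
        diag[j] -= w * h[j]
        rhs[j] -= w * rhs[j - 1]
    M = [0.0] * (n + 1)
    for j in range(n - 2, -1, -1):                 # back substitution
        M[j + 1] = (rhs[j] - h[j + 1] * M[j + 2]) / diag[j]

    def evaluate(x):
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 1)
        t1, t2 = xs[i + 1] - x, x - xs[i]
        return ((M[i] * t1 ** 3 + M[i + 1] * t2 ** 3) / (6.0 * h[i])
                + (ys[i] / h[i] - M[i] * h[i] / 6.0) * t1
                + (ys[i + 1] / h[i] - M[i + 1] * h[i] / 6.0) * t2)
    return evaluate

params = [1.0, 1.8, 2.9, 4.1, 5.0]     # representative COT parameter values
concs = [5.0, 10.0, 20.0, 40.0, 80.0]  # known concentrations (illustrative)
predict = natural_cubic_spline(params, concs)
c_est = predict(3.5)                    # concentration of a test solution
print(round(c_est, 2))
```

The spline reproduces the library points exactly and interpolates smoothly between them, which is why the paper prefers it over a straight calibration line for nonlinear parameter-concentration relations.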

  18. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    Science.gov (United States)

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators which ranged from A: 0.225-1.350 to B: 9.0-180.0 mg/L were developed. These were evaluated for linearity (0.9992 and 0.9995), limit of detection of 0.018 mg/L, limit of quantitation of 0.099 mg/L (recovery 111.9%, CV 9.92%), and upper limit of linearity of 27,000.0 mg/L. Combined curve recovery results of a 98.0 mg/L DFE control that was prepared using an alternate technique was 102.2% with CV of 3.09%. No matrix interference was observed in DFE enriched blood, urine or brain specimens nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and due to the ease with which the calibration range can be adjusted. Perhaps more importantly it is also useful for research oriented studies because the removal of solvent from standard preparation eliminates the possibility for solvent induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.
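The validation figures above (linearity, limit of detection, limit of quantitation) come from calibration-curve statistics. A minimal sketch using ordinary least squares and the common LOD = 3.3·s/slope, LOQ = 10·s/slope convention, where s is the residual standard deviation; the calibrator responses below are made up to span curve A's range and are not the study's data.

```python
# Calibration-curve statistics of the kind behind the abstract's reported
# linearity, LOD, and LOQ. The responses are illustrative; the study's own
# figures (LOD 0.018 mg/L, LOQ 0.099 mg/L) came from its real calibrators.
def calibration_stats(concs, responses):
    """Ordinary least squares fit; returns slope, intercept, LOD, LOQ
    using the 3.3*s/slope and 10*s/slope convention."""
    n = len(concs)
    mx, my = sum(concs) / n, sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, responses))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(concs, responses)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual std dev
    return slope, intercept, 3.3 * s / slope, 10.0 * s / slope

concs = [0.225, 0.45, 0.675, 0.9, 1.125, 1.35]     # mg/L, curve A's range
resp = [0.051, 0.098, 0.149, 0.203, 0.247, 0.301]  # detector response (made up)
slope, b, lod, loq = calibration_stats(concs, resp)
print(slope, lod, loq)
```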

  19. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    Science.gov (United States)

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  20. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses
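The pooling idea described above, greatest weight going automatically to the peaks that adhere to Beer's Law, can be sketched in miniature: each peak yields its own concentration estimate, and the estimates are combined as a weighted mean. This is a simplified stand-in for the paper's full multicomponent least-squares fit; absorbances, absorptivities, and weights below are all illustrative.

```python
# Weighted pooling of per-peak Beer's-law concentration estimates, a
# simplified illustration of the pooling step described in the abstract
# (not the full multicomponent least-squares fit). Values are illustrative.
def pooled_estimate(absorbances, absorptivities, weights):
    """Per-peak estimates c_k = A_k / eps_k, combined as a weighted mean so
    peaks that follow Beer's Law dominate the result."""
    est = [a / e for a, e in zip(absorbances, absorptivities)]
    return sum(w * c for w, c in zip(weights, est)) / sum(weights)

A = [0.42, 0.85, 0.21]    # baseline-corrected peak absorbances
eps = [2.1, 4.2, 1.0]     # absorptivity x path length for each peak
w = [1.0, 1.0, 0.2]       # low weight for a peak deviating from Beer's Law
c = pooled_estimate(A, eps, w)
print(round(c, 4))
```

Down-weighting the deviating third peak keeps the pooled estimate close to the value implied by the two well-behaved peaks, which is the robustness the abstract claims for the method.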

  1. Knee Kinematic Improvement After Total Knee Replacement Using a Simplified Quantitative Gait Analysis Method

    Directory of Open Access Journals (Sweden)

    Hassan Sarailoo

    2013-10-01

    Full Text Available Objectives: The aim of this study was to extract suitable spatiotemporal and kinematic parameters to determine how Total Knee Replacement (TKR) alters patients’ knee kinematics during gait, using a rapid and simplified quantitative two-dimensional gait analysis procedure. Methods: Two-dimensional kinematic gait patterns of 10 participants were collected before and after TKR surgery, using a 60 Hz camcorder in the sagittal plane. The kinematic parameters were then extracted from the gait data. A Student's t-test was used to compare the group averages of spatiotemporal and peak kinematic characteristics in the sagittal plane. The knee condition was also evaluated using the Oxford Knee Score (OKS) questionnaire to ensure that each subject was placed in the right group. Results: The results showed a significant improvement in knee flexion during the stance and swing phases after TKR surgery. Walking speed increased as a result of improvements in stride length and cadence, but this increase was not statistically significant. Both the post-TKR and control groups showed an increase in spatiotemporal and peak kinematic characteristics between comfortable and fast walking speeds. Discussion: The objective kinematic parameters extracted from the 2D gait data were able to show significant improvements in the knee joint after TKR surgery. The patients with TKR surgery were also able to improve their knee kinematics at fast walking speed to the same degree as the control group. These results provide good insight into the capabilities of the presented method to evaluate knee functionality before and after TKR surgery and to define a more effective rehabilitation program.

  2. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography.

    Science.gov (United States)

    Zamir, Ehud; Kong, George Y X; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-07-01

    We hypothesize that: (1) anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side; and (2) such a correlation may be used as a simple quantitative tool for estimation of ACD. Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between (1) the lateral photographic temporal limbus to pupil distance ("E") and (2) the lateral photographic temporal limbus to cornea distance ("Z"). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. A strong linear correlation was found between EZR and ACD, R = -0.91, R² = 0.81. Bland-Altman analysis showed good agreement between the ACD predicted using this method and the optical biometric ACD. The mean error was -0.013 mm (range -0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Lateral digital photography with EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. The EZR may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations.
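The abstract's two-stage workflow, fit ACD against the E:Z ratio on a correlation series, predict on a prediction series, then assess agreement with Bland-Altman bias and 95% limits, can be sketched as follows. All EZR and ACD values here are illustrative (only the negative sign of the slope is taken from the reported R = -0.91).

```python
# Two-stage EZR -> ACD workflow: linear fit on a correlation series,
# prediction on new eyes, Bland-Altman agreement. Values are illustrative.
def fit_line(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def bland_altman(predicted, measured):
    """Mean difference (bias) and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [p - m for p, m in zip(predicted, measured)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

ezr = [0.30, 0.35, 0.40, 0.45, 0.50]   # E:Z ratios, correlation series
acd = [3.6, 3.3, 3.1, 2.8, 2.5]        # biometric ACD (mm); deeper -> lower EZR
slope, intercept = fit_line(ezr, acd)
pred = [slope * x + intercept for x in [0.33, 0.42]]   # prediction series
bias, lo, hi = bland_altman(pred, [3.45, 2.95])        # vs. biometric ACD
print(round(slope, 2), round(bias, 3), (round(lo, 3), round(hi, 3)))
```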

  3. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)

  4. Nuclear medicine and imaging research. Instrumentation and quantitative methods of evaluation. Progress report, January 15, 1985-January 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1985-09-01

    This program of research addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation, and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. These developments are designed to meet the needs imposed by new radiopharmaceuticals developed to solve specific biomedical problems, as well as to meet the instrumentation needs associated with radiopharmaceutical production and quantitative clinical feasibility studies of the brain with PET VI. Project I addresses problems associated with the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures. The original proposal covered work to be carried out over the three-year contract period. This report covers progress made during Year Three. 36 refs., 1 tab

  5. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    Science.gov (United States)

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. 
The semi-quantitative culture method showed higher sensitivity than the quantitative method.
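The sensitivity and specificity figures above follow the standard confusion-matrix definitions. A minimal sketch, using illustrative counts rather than the study's raw data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only (hypothetical, not taken from the study):
sens, spec = sens_spec(tp=16, fn=6, tn=44, fp=2)
```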

  6. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows the risk structure of the HCO to be quantitatively assessed by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
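The loss-distribution approach described above can be sketched with a toy frequency-severity Monte Carlo model. This is a simplified stand-in: the paper fits parametric and non-parametric models to actual claims data, while the example below assumes Poisson claim counts and lognormal severities with made-up parameters:

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method: count uniforms until their product drops below exp(-lam).
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < limit:
            return k
        k += 1

def annual_loss_distribution(n_sims, lam, mu, sigma, seed=1):
    """Monte Carlo aggregate loss: Poisson(lam) claims per year,
    lognormal(mu, sigma) severity per claim."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        n_claims = poisson(rng, lam)
        totals.append(sum(rng.lognormvariate(mu, sigma) for _ in range(n_claims)))
    totals.sort()
    expected = sum(totals) / n_sims
    var95 = totals[int(0.95 * n_sims)]  # 95th-percentile annual loss
    return expected, var95
```

The expected annual loss converges to lam·exp(mu + sigma²/2), and the gap between the 95th percentile and the mean is one way to express the unexpected-loss component mentioned above.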

  7. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    Science.gov (United States)

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unification of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  8. A Simple Linear Regression Method for Quantitative Trait Loci Linkage Analysis With Censored Observations

    OpenAIRE

    Anderson, Carl A.; McRae, Allan F.; Visscher, Peter M.

    2006-01-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using...

  9. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    Science.gov (United States)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines microanalytical techniques that allow the analysis of major and minor elements at high spatial resolution (e.g., electron microbeam analysis) with 2D mapping of samples, providing unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained by energy-dispersive X-ray spectrometry (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used not only to accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section) but, critically, to enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps, and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
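Once every pixel of an element map has been classified into a phase, computing mineral modes reduces to pixel counting. A minimal sketch of that step (hypothetical; this is not Quack's actual implementation):

```python
from collections import Counter

def mineral_modes(phase_map):
    """Modal abundance (area %) of each phase in a classified 2D phase map."""
    counts = Counter(phase for row in phase_map for phase in row)
    total = sum(counts.values())
    return {phase: 100.0 * n / total for phase, n in counts.items()}
```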

  10. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    Science.gov (United States)

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
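The two model-free metrics named above, the lactate-to-pyruvate AUC ratio and the lactate time-to-peak, can be computed directly from the dynamic signal-time curves. A minimal sketch, assuming trapezoidal integration of the sampled time courses (function names are assumptions):

```python
def trapz(y, t):
    """Trapezoidal area under curve y sampled at times t."""
    return sum((y[i] + y[i + 1]) * (t[i + 1] - t[i]) / 2 for i in range(len(y) - 1))

def model_free_metrics(t, lactate, pyruvate):
    """Lactate/pyruvate AUC ratio and lactate time-to-peak."""
    auc_ratio = trapz(lactate, t) / trapz(pyruvate, t)
    ttp = t[max(range(len(lactate)), key=lactate.__getitem__)]
    return auc_ratio, ttp
```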

  11. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  12. Comparison of two label-free global quantitation methods, APEX and 2D gel electrophoresis, applied to the Shigella dysenteriae proteome

    Directory of Open Access Journals (Sweden)

    Fleischmann Robert D

    2009-06-01

    Full Text Available Abstract The in vitro stationary phase proteome of the human pathogen Shigella dysenteriae serotype 1 (SD1) was quantitatively analyzed in Coomassie Blue G250 (CBB)-stained 2D gels. More than four hundred and fifty proteins, of which 271 were associated with distinct gel spots, were identified. In parallel, we employed 2D-LC-MS/MS followed by the label-free computationally modified spectral counting method APEX for absolute protein expression measurements. Of the 4502 genome-predicted SD1 proteins, 1148 proteins were identified with a false positive discovery rate of 5% and quantitated using 2D-LC-MS/MS and APEX. The dynamic range of the APEX method was approximately one order of magnitude higher than that of CBB-stained spot intensity quantitation. A squared Pearson correlation analysis revealed a reasonably good correlation (R2 = 0.67) for protein quantities surveyed by both methods. The correlation was decreased for protein subsets with specific physicochemical properties, such as low Mr values and high hydropathy scores. Stoichiometric ratios of subunits of protein complexes characterized in E. coli were compared with APEX quantitative ratios of orthologous SD1 protein complexes. A high correlation was observed for subunits of soluble cellular protein complexes in several cases, demonstrating versatile applications of the APEX method in quantitative proteomics.

  13. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    Science.gov (United States)

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event; A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
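The role of the conversion factor C(f) can be made concrete: the GMO amount is the event-specific to taxon-specific copy-number ratio divided by C(f). A minimal sketch of this standard calculation (variable names are assumptions; the validated protocol is the one described in the record):

```python
def gmo_percent(event_copies, ref_copies, cf=0.98):
    """GMO amount (%) from event-specific and taxon-specific (endogenous
    reference) copy numbers, using the conversion factor Cf = 0.98
    reported above."""
    return (event_copies / ref_copies) / cf * 100.0
```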

  14. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    International Nuclear Information System (INIS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-01-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature

  15. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    Science.gov (United States)

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  16. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  17. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift is widely encountered; it is generated by the fluctuation of laser energy, inhomogeneity of sample surfaces and background noise, and has aroused the interest of many researchers. Most of the prevalent algorithms require key parameters to be preset, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS spectra, namely the sparsity of spectral peaks and the low-pass filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technology utilizes a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is adopted to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure convergence. To validate the proposed method, the concentration analysis of Chromium (Cr), Manganese (Mn) and Nickel (Ni) contained in 23 certified high alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the method proposed in this paper has better accuracy in quantitative analysis compared with other methods, and fully reflects its adaptive ability.
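The baseline model described, sparse peaks riding on a smooth low-pass baseline and penalized asymmetrically, belongs to the same family as classic asymmetric least squares (AsLS) smoothing. The sketch below is that generic AsLS scheme, not the authors' exact convex formulation or iterative algorithm: points above the current baseline estimate (peaks) get a small weight p, points below get weight 1−p, and a second-difference penalty weighted by lam enforces smoothness.

```python
import numpy as np

def asymmetric_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Generic AsLS baseline estimate (a sketch in the spirit of the
    paper's convex model, not its exact formulation)."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)  # second-difference operator, (n-2) x n
    w = np.ones(n)
    for _ in range(n_iter):
        # Weighted penalized least squares: (W + lam * D'D) z = W y
        z = np.linalg.solve(np.diag(w) + lam * D.T @ D, w * y)
        w = np.where(y > z, p, 1 - p)  # down-weight points above the baseline
    return z
```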

  18. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    Science.gov (United States)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods built on these inconsistent definitions are poor in operability; the current quantitative research methods of water resources carrying capacity do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationship among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity (that is, through the compilation of the water resources balance sheet) to get a grasp of regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), quantify the pressure that socioeconomic activities place on the environment, and discuss the quantitative calculation methods and technical route of water resources carrying capacity that embody the substance of sustainable development.

  19. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    Science.gov (United States)

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
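Step (1) of the quantitation pipeline, normalizing ion intensity to an external quadratic regression curve, amounts to fitting the calibration standards and then inverting the curve for each pixel. A minimal sketch (function names are assumptions; isomeric-interference correction and LOQ filtering are omitted):

```python
import numpy as np

def fit_quadratic(conc, intensity):
    """Fit intensity = a*c**2 + b*c + d to external calibration standards."""
    return np.polyfit(conc, intensity, 2)

def surface_density(intensity, coeffs):
    """Invert the calibration curve for one pixel, keeping the
    smallest non-negative (physically meaningful) root."""
    a, b, d = coeffs
    roots = np.roots([a, b, d - intensity])
    real = roots[np.isreal(roots)].real
    return float(min(r for r in real if r >= 0))
```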

  20. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Patriarca, Riccardo; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-01-01

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risks factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define dynamically the system structure. We present also an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  1. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    Energy Technology Data Exchange (ETDEWEB)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-03-15

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risks factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define dynamically the system structure. We present also an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  2. Nuclear medicine and image research: instrumentation and quantitative methods of evaluation. Comprehensive 3-year progress report, January 15, 1983-January 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1985-09-01

    This program of research addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation, and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. Project I addresses problems associated with the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures

  3. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
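The "discrete uniform assumptions to all risk characteristics" experiment can be sketched as follows. OWASP's risk rating averages 8 likelihood factors and 8 impact factors, each scored 0-9; the simulation below samples every factor uniformly and multiplies mean likelihood by mean impact as a simple severity score. This is a hedged sketch, not OWASP's lookup-table severity rating or the paper's modified method, and it uses far fewer rounds than the paper's 1.5 million:

```python
import random

def simulate_owasp_risk(n_sims=20_000, seed=42):
    """Monte Carlo over OWASP-style risk factors under a 'no prior
    knowledge' assumption: each factor ~ discrete uniform on 0..9."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n_sims):
        likelihood = sum(rng.randint(0, 9) for _ in range(8)) / 8
        impact = sum(rng.randint(0, 9) for _ in range(8)) / 8
        risks.append(likelihood * impact)
    risks.sort()
    return sum(risks) / n_sims, risks[n_sims // 2]  # mean and median risk
```

Under these assumptions the mean risk converges to 4.5 × 4.5 = 20.25 on the 0-81 scale.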

  4. Systemic Errors in Quantitative Polymerase Chain Reaction Titration of Self-Complementary Adeno-Associated Viral Vectors and Improved Alternative Methods

    Science.gov (United States)

    Fagone, Paolo; Wright, J. Fraser; Nathwani, Amit C.; Nienhuis, Arthur W.; Davidoff, Andrew M.

    2012-01-01

    Abstract Self-complementary AAV (scAAV) vector genomes contain a covalently closed hairpin derived from a mutated inverted terminal repeat that connects the two monomer single-stranded genomes into a head-to-head or tail-to-tail dimer. We found that during quantitative PCR (qPCR) this structure inhibits the amplification of proximal amplicons and causes the systemic underreporting of copy number by as much as 10-fold. We show that cleavage of scAAV vector genomes with restriction endonuclease to liberate amplicons from the covalently closed terminal hairpin restores quantitative amplification, and we implement this procedure in a simple, modified qPCR titration method for scAAV vectors. In addition, we developed and present an AAV genome titration procedure based on gel electrophoresis that requires minimal sample processing and has low interassay variability, and as such is well suited for the rigorous quality control demands of clinical vector production facilities. PMID:22428975

  5. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  6. A selective and sensitive method for quantitation of lysergic acid diethylamide (LSD) in whole blood by gas chromatography-ion trap tandem mass spectrometry.

    Science.gov (United States)

    Libong, Danielle; Bouchonnet, Stéphane; Ricordel, Ivan

    2003-01-01

    A gas chromatography-ion trap tandem mass spectrometry (GC-ion trap MS-MS) method for detection and quantitation of LSD in whole blood is presented. The sample preparation process, including a solid-phase extraction step with Bond Elut cartridges, was performed with 2 mL of whole blood. Eight microliters of the purified extract was injected with a cold on-column injection method. Positive chemical ionization was performed using acetonitrile as reagent gas; LSD was detected in the MS-MS mode. The chromatograms obtained from blood extracts showed the great selectivity of the method. GC-MS quantitation was performed using lysergic acid methylpropylamide as the internal standard. The response of the MS was linear for concentrations ranging from 0.02 ng/mL (detection threshold) to 10.0 ng/mL. Several parameters such as the choice of the capillary column, the choice of the internal standard and that of the ionization mode (positive CI vs. EI) were rationalized. Decomposition pathways under both ionization modes were studied. Within-day and between-day stability were evaluated.
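
Internal-standard quantitation of this kind amounts to a linear calibration of the analyte/internal-standard response ratio against concentration, then inverting the fit for unknowns. A minimal sketch; the calibration points and peak-area ratios below are illustrative, not data from the paper:

```python
# Hypothetical calibration: LSD concentration (ng/mL) vs. the
# analyte/internal-standard peak-area ratio.
conc  = [0.02, 0.1, 0.5, 1.0, 5.0, 10.0]
ratio = [0.004, 0.021, 0.098, 0.205, 1.01, 1.99]

# Ordinary least-squares line through the calibration points.
n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(ratio) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, ratio)) / \
        sum((x - mean_x) ** 2 for x in conc)
intercept = mean_y - slope * mean_x

def quantify(area_ratio):
    """Back-calculate a blood concentration from a measured area ratio."""
    return (area_ratio - intercept) / slope

print(f"slope = {slope:.4f}, intercept = {intercept:.4f}")
print(f"area ratio 0.50 -> {quantify(0.50):.2f} ng/mL")
```

Ratioing against the internal standard (here lysergic acid methylpropylamide in the paper) cancels run-to-run variations in extraction recovery and injection volume.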

  7. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can readily secure traceability to the SI unit system, and discussion of its accuracy and uncertainty has begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both the NMR measurement conditions and actual cases of analysis. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve precision sufficient for the evaluation of pure substances and standard solutions. Because the external reference method readily avoids contamination of the sample and allows the sample to be recovered, many reported cases concern the quantitative analysis of biologically derived samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussion is also progressing on how to adopt it as an official method in various countries around the world. In Japan, the method is listed in the Pharmacopoeia and the Japanese Standard of Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)
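
The internal reference method rests on a standard proportionality between integrated signal area, proton count, molar mass and weighed mass. A minimal sketch, with illustrative numbers rather than values from the review:

```python
def qnmr_purity(I_x, I_std, N_x, N_std, M_x, M_std, m_x, m_std, P_std):
    """Purity of analyte x by the qNMR internal-standard relation:
    P_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (m_std/m_x) * P_std
    I: integrated signal area, N: protons giving the signal,
    M: molar mass (g/mol), m: weighed mass, P: purity (mass fraction)."""
    return (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std

# Illustrative numbers (not from the paper): maleic acid as internal
# standard (2 olefinic protons, M = 116.07) quantifying a 1-proton
# signal of an analyte with M = 206.28.
purity = qnmr_purity(I_x=0.512, I_std=1.000, N_x=1, N_std=2,
                     M_x=206.28, M_std=116.07, m_x=10.2, m_std=5.6,
                     P_std=0.999)
print(f"analyte purity: {purity:.3f}")
```

Traceability to the SI follows because the only calibrated quantities are the two weighed masses and the certified purity of the reference substance.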

  8. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation alone. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with these methods.
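
As a toy stand-in for the extended cellular automata described above, a minimal Greenberg-Hastings excitable-medium automaton (an assumption of this sketch; the paper's model attaches quantitative state to each cell rather than three discrete states) shows how an excitation wavefront propagates from a single abnormal site:

```python
# States: 0 resting, 1 excited, 2 refractory. Von Neumann neighborhood,
# periodic boundaries for simplicity.
def step(grid):
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = grid[i][j]
            if s == 1:
                new[i][j] = 2          # excited -> refractory
            elif s == 2:
                new[i][j] = 0          # refractory -> resting
            else:                      # resting: excite if any neighbor excited
                nbrs = [grid[(i + di) % n][(j + dj) % n]
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                new[i][j] = 1 if 1 in nbrs else 0
    return new

grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1                          # a single ectopic excitation
for t in range(3):
    grid = step(grid)
excited = sum(row.count(1) for row in grid)
print(f"after 3 steps: {excited} excited cells")
```

The refractory state behind the wavefront is what prevents re-excitation backward, producing the expanding ring (and, with obstacles, the re-entrant spirals implicated in arrhythmia).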

  9. Development of a quantitative method for trace elements determination in ores by XRF: an application to phosphorite from Olinda (PE), Brazil

    International Nuclear Information System (INIS)

    Imakuma, K.; Sato, I.M.; Cretella Neto, J.; Costa, M.I.

    1976-01-01

A quantitative analytical method based on X-ray fluorescence was established to determine trace amounts of Zn, Cu and Ni in a phosphorite ore from Olinda, PE, Brazil. The double-dilution method, with borax as the melting flux, was chosen because ores diluted in borax as melted samples show matrix effects with respect to the element to be analysed; it was possible to identify the elements already present in the ore that interfered with the Zn, Cu and Ni determinations. These elements were Fe and Ca, and their quantities were subsequently determined. The addition of appropriate quantities of Fe and Ca to the standards made it possible to minimize the matrix effects without the undesired introduction of extraneous elements into the ore. Moreover, the need to know the exact amounts of Fe and Ca present in the ore led to the simultaneous development of another analytical method, also based on dilution with melting, suitable for measuring medium to high contents. These methods offer advantages such as quantitative analysis with highly reproducible results and extension to the routine determination of all kinds of ores. The main sources of error can be controlled, allowing an accuracy as high as ±1 ppm for Cu, ±4 ppm for Ni, ±6 ppm for Zn and ±1% for both Fe and Ca under the most unfavorable conditions.

  10. PCR-free quantitative detection of genetically modified organism from raw materials – A novel electrochemiluminescence-based bio-barcode method

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.

    2018-01-01

Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures, including the preparation and release of barcode DNA probes from the target-nanoparticle complex and the immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) content in raw materials. It consists of tris(2,2′-bipyridyl)ruthenium (TBR)-labeled barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of the ECL emission of TBR. The assay quantitatively detects target nucleic acids with high speed and sensitivity, and can be used to quantify GMO fragments in real GMO products. PMID:18386909

  11. A Quantitative Method to Screen Common Bean Plants for Resistance to Bean common mosaic necrosis virus.

    Science.gov (United States)

    Strausbaugh, C A; Myers, J R; Forster, R L; McClean, P E

    2003-11-01

A quantitative method to screen common bean (Phaseolus vulgaris) plants for resistance to Bean common mosaic necrosis virus (BCMNV) is described. Four parameters were assessed in developing the quantitative method: symptoms associated with systemic virus movement, plant vigor, virus titer, and plant dry weight. Based on these parameters, two rating systems (V and VV ratings) were established. Plants from 21 recombinant inbred lines (RILs) from a Sierra (susceptible) x Olathe (partially resistant) cross inoculated with the BCMNV-NL-3 K strain were used to evaluate this quantitative approach. In all, 11 RILs exhibited very susceptible reactions and 10 RILs expressed partially resistant reactions, fitting a 1:1 susceptible/partially resistant ratio (χ² = 0.048, P = 0.827) and suggesting that the response is mediated by a single gene. Using the classical qualitative approach based only on symptom expression, the RILs were difficult to separate into phenotypic groups because of a continuum of responses. By plotting the mean percent reduction in either the V rating (based on visual symptoms) or the VV rating (based on visual symptoms and vigor) against enzyme-linked immunosorbent assay (ELISA) absorbance values, the RILs could be separated clearly into different phenotypic groups. The utility of this quantitative approach was also evaluated on plants from 12 cultivars or pure lines inoculated with one of three strains of BCMNV. Using the mean VV rating and ELISA absorbance values, significant differences were established not only in cultivar and pure-line comparisons but also in virus strain comparisons. This quantitative system should be particularly useful for evaluating the independent action of bc genes, discovering new genes associated with partial resistance, and assessing the virulence of virus strains.
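
The reported 1:1 segregation test can be reproduced directly from the counts in the abstract:

```python
import math

# Chi-square goodness-of-fit for 11 susceptible : 10 partially
# resistant RILs against an expected 1:1 ratio over 21 lines.
observed = [11, 10]
expected = [10.5, 10.5]
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
p_value = math.erfc(math.sqrt(chi2 / 2))   # chi-square tail, df = 1
print(f"chi-square = {chi2:.3f}, P = {p_value:.3f}")  # paper: 0.048, 0.827
```

The high P value means the observed split is entirely consistent with single-gene (1:1) segregation.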

12. Development of a method for the quantitative determination of dihydroquercetin. Report 1

    Directory of Open Access Journals (Sweden)

    Олександр Юрійович Владимиров

    2016-01-01

Scientific interest in the study of flavonoids in plant material is increasing markedly because of their high biological activity. An urgent task of analytical chemistry is therefore the development of accessible techniques for the determination of flavonoids in plant material. Aim. To develop a specific technique for the quantitative determination of dihydroquercetin (DQW) and to establish its validation characteristics. Methods. A photocolorimetric technique for the quantification of DQW was elaborated, based on the specific reaction forming cyanidin chloride when zinc powder is added to a dihydroquercetin solution in an acidic medium. Results. A photocolorimetric technique for determining flavonoids, expressed as DQW, was developed, and its basic validation characteristics were determined. The metrological characteristics obtained for the technique did not exceed the admissibility criteria of the State Pharmacopoeia of Ukraine. Conclusions. Statistical analysis of the experimental data showed that the developed technique can be used for the quantification of DQW. The metrological data indicate that, when the method was reproduced in two different laboratories, the assay value was 101.85 ± 2.54% at a confidence level of 95%.

  13. Secondary dentine as a sole parameter for age estimation: Comparison and reliability of qualitative and quantitative methods among North Western adult Indians

    Directory of Open Access Journals (Sweden)

    Jasbir Arora

    2016-06-01

The indestructibility of teeth under most environmental insults makes them useful in disaster victim identification (DVI). The present study examines the reliability of Gustafson's qualitative method and Kedici's quantitative method of measuring secondary dentine for age estimation among North Western adult Indians. 196 (M = 85; F = 111) single-rooted teeth were collected from the Department of Oral Health Sciences, PGIMER, Chandigarh. Ground sections were prepared, and the amount of secondary dentine formed was scored qualitatively according to Gustafson's 0-3 scoring system (method 1) and measured quantitatively following Kedici's micrometric method (method 2). Of the 196 teeth, 180 samples (M = 80; F = 100) were found suitable for measuring secondary dentine by Kedici's method. The absolute mean error of age was calculated for both methodologies. The results clearly showed that, on the pooled data, method 1 gave an error of ±10.4 years whereas method 2 exhibited an error of approximately ±13 years. A statistically significant difference was noted in the absolute mean error of age between the two methods of measuring secondary dentine for age estimation. Further, teeth extracted for periodontal reasons severely decreased the accuracy of Kedici's method, whereas the disease had no effect on ages estimated by Gustafson's method. No significant gender differences were noted in the absolute mean error of age by either method, suggesting that there is no need to separate data on the basis of gender.

  14. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    Science.gov (United States)

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing require the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for the detection and quantitation of egg, milk, and peanut allergens was developed and evaluated in an allergen-incurred baked sugar cookie matrix. Method parameters, including sample extraction, concentration, and digestion, were systematically optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies at concentrations as low as 5 ppm of allergen-incurred ingredient.

  15. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

Conclusion: The proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive, mathematically based approach. It can therefore be used as an alternative to qualitative and semi-quantitative methods.

  16. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
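
The resputtering loss described above can be illustrated with a deliberately simplified closed-form model. The key assumption of this sketch, which is not necessarily the report's formulation, is that implanted ions are uniformly mixed within the sputter-affected surface layer, so the loss rate is proportional to their local concentration:

```python
import math

# N: retained areal density (ions/cm^2), phi: implanted dose (ions/cm^2),
# Y: sputtering yield (atoms/ion), Sigma: areal atom density of the
# mixed surface layer. The uniform-mixing assumption gives
#   dN/dphi = 1 - Y*N/Sigma  =>  N(phi) = (Sigma/Y)*(1 - exp(-Y*phi/Sigma))
def retained(phi, Y, sigma):
    return (sigma / Y) * (1.0 - math.exp(-Y * phi / sigma))

Y = 2.5            # hypothetical sputtering yield, atoms per ion
sigma = 1.0e17     # hypothetical layer areal density, atoms/cm^2
for phi in (1e13, 1e15, 1e17):
    loss = 1.0 - retained(phi, Y, sigma) / phi
    print(f"dose {phi:.0e} ions/cm^2: fractional loss {loss:.2%}")
```

The sketch reproduces the report's qualitative point: at sufficiently small doses the resputtering loss is negligible, so the implanted dose can be portioned accurately by beam-current integration, while at high doses retention saturates.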

  17. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations.

    Science.gov (United States)

    Wagner, Karla D; Davidson, Peter J; Pollini, Robin A; Strathdee, Steffanie A; Washburn, Rachel; Palinkas, Lawrence A

    2012-01-01

Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, whilst conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors' research on HIV risk amongst injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a Needle/Syringe Exchange Program in Los Angeles, CA, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts.

  18. Digital integrated protection system: Quantitative methods for dependability evaluation

    International Nuclear Information System (INIS)

    Krotoff, H.; Benski, C.

    1986-01-01

The inclusion of programmed digital techniques in the SPIN system provides the user with the capability of performing sophisticated processing operations. However, it makes the quantitative evaluation of the overall failure probabilities somewhat more intricate, because: a single component may be involved in several functions; self-tests may readily be incorporated for the purpose of monitoring the dependable operation of the equipment at all times. This paper describes the methods implemented by MERLIN GERIN for evaluating: the probabilities that protective actions are not initiated (dangerous failures); the probabilities that such protective actions are initiated accidentally. Although the paper focuses on the programmed portion of the SPIN (UAIP), it also deals with evaluations performed within the scope of studies that do not exclusively cover the UAIPs.

  19. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Directory of Open Access Journals (Sweden)

    M. S. Conconi

    2014-12-01

The firing transformations of traditional (clay-based) ceramics are of technological and archeological interest, and are usually reported qualitatively or semi-quantitatively. Systems of this kind are notably complex, especially for X-ray diffraction techniques, owing to the presence of fully crystalline, low-crystallinity and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystallinity (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range, between 900 and 1300 ºC. The methodology employed to determine the low-crystallinity and glassy phase abundances is based on a combination of the internal standard method and, for the glassy phase, the use of a nanocrystalline model in which long-range order is lost. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with firing temperature. Simultaneous thermogravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The decomposition of kaolinite into metakaolinite was determined quantitatively; the formation of the intermediate (980 ºC) spinel-type alumino-silicate was also quantified; and the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the formation of the amorphous (glassy) phase. The methodology used to analyze the X-ray diffraction patterns proved suitable for quantitatively evaluating the thermal transformations that occur in a complex system like triaxial ceramics. The evaluated phases can be easily correlated with the processing variables.
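
The internal-standard route to the amorphous (glassy) abundance can be sketched as follows. Spiking a known mass fraction of a crystalline standard lets the Rietveld result, which normalizes crystalline phases to 100%, be rescaled to absolute fractions; the spike level and Rietveld output below are illustrative numbers, not values from the article:

```python
def amorphous_fraction(w_spike, r_spike):
    """Amorphous mass fraction by the internal-standard method.
    w_spike: known weighed mass fraction of the crystalline standard.
    r_spike: standard's fraction as reported by Rietveld (crystalline
             phases normalized to 1).
    Rietveld overestimates every crystalline phase by r_spike/w_spike,
    so the true crystalline total is w_spike/r_spike; the remainder is
    amorphous, rescaled here to a spike-free basis."""
    amorph_with_spike = 1.0 - w_spike / r_spike
    return amorph_with_spike / (1.0 - w_spike)

# Illustrative: 20 wt% corundum spike, reported by Rietveld as 32 wt%
# of the crystalline phases.
glassy = amorphous_fraction(0.20, 0.32)
print(f"glassy phase: {glassy:.1%} of the original ceramic")
```

If the reported spike fraction equals the weighed fraction, the sample is fully crystalline and the formula returns zero, as expected.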

  20. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    Science.gov (United States)

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which unlike PCR, work at a single temperature. These 'isothermal' methods, reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors and could also be used for quantitative molecular analysis. However there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.
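
One way to read an IDT-style metric, by analogy with qPCR efficiency computed from a dilution-series slope, is sketched below. This is an interpretation for illustration, with hypothetical qLAMP times, not the authors' exact protocol:

```python
import math

copies = [1e6, 1e5, 1e4, 1e3]       # dilution series (input copies)
t_pos  = [8.0, 11.4, 14.7, 18.1]    # hypothetical times-to-positive (min)

# Regress time-to-positive against log10(input copies); the slope in
# minutes per decade, times log10(2), gives minutes per target doubling.
x = [math.log10(c) for c in copies]
n = len(x)
mx, my = sum(x) / n, sum(t_pos) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, t_pos)) / \
        sum((xi - mx) ** 2 for xi in x)
idt = -slope * math.log10(2)        # minutes per target doubling
print(f"slope {slope:.2f} min/decade -> IDT {idt:.2f} min per doubling")
```

A larger IDT under matrix interference, relative to clean template, then quantifies the inhibition in the same way a drop in efficiency does for qPCR.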

  2. A combined usage of stochastic and quantitative risk assessment methods in the worksites: Application on an electric power provider

    International Nuclear Information System (INIS)

    Marhavilas, P.K.; Koulouriotis, D.E.

    2012-01-01

No single method can build either a realistic forecasting model or a risk assessment process for worksites; future perspectives should focus on combined forecasting/estimation approaches. The main purpose of this paper is to gain insight into a risk prediction and estimation methodological framework that combines three different methods: the proportional quantitative-risk-assessment technique (PRAT), the time-series stochastic process (TSP), and the method of estimating societal risk (SRE) by F-N curves. To demonstrate the usefulness of the combined usage of stochastic and quantitative risk assessment methods, an application to an electric power provider is presented, using empirical data.
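
Societal risk by F-N curves, as used in the SRE step, amounts to tabulating the cumulative frequency F of accidents with N or more fatalities. A generic sketch with hypothetical accident records, not the paper's data:

```python
from collections import Counter

fatalities = [1, 1, 2, 1, 3, 5, 2, 1, 10]   # hypothetical accident records
years_observed = 20

# F(N) = annual frequency of accidents causing N or more fatalities.
counts = Counter(fatalities)
fn_curve = {}
for n in sorted(counts):
    events_ge_n = sum(c for k, c in counts.items() if k >= n)
    fn_curve[n] = events_ge_n / years_observed
for n in sorted(fn_curve):
    print(f"N >= {n:2d}: F = {fn_curve[n]:.2f} per year")
```

Plotted on log-log axes, the resulting points are compared against societal-risk acceptance lines to judge whether the worksite's collective risk is tolerable.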

  3. Forecasting with quantitative methods the impact of special events in time series

    OpenAIRE

    Nikolopoulos, Konstantinos

    2010-01-01

Quantitative methods are very successful at producing baseline forecasts of time series; however, these models fail to forecast either the timing or the impact of special events such as promotions or strikes. In most cases the timing of such events is not known, so they are usually referred to as shocks (economics) or special events (forecasting). Sometimes the timing of such events is known a priori (i.e. a future promotion); but even then the impact of the forthcom...

  4. Methods for quantitative measurement of tooth wear using the area and volume of virtual model cusps.

    Science.gov (United States)

    Kim, Soo-Hyun; Park, Young-Seok; Kim, Min-Kyoung; Kim, Sulhee; Lee, Seung-Pyo

    2018-04-01

Clinicians must examine tooth wear to make a proper diagnosis. However, qualitative methods of measuring tooth wear have many disadvantages. Therefore, this study aimed to develop and evaluate quantitative parameters using the cusp area and volume of virtual dental models. The subjects of this study were the same virtual models that were used in our former study. The same age group classification and new tooth wear index (NTWI) scoring system were also reused. A virtual occlusal plane was generated with the highest cusp points and lowered vertically from 0.2 to 0.8 mm to create offset planes. The area and volume of each cusp was then measured and added together. In addition to the former analysis, the differential features of each cusp were analyzed. The scores of the new parameters differentiated the age and NTWI groups better than those analyzed in the former study. The Spearman ρ coefficients between the total area and the area of each cusp also showed higher scores at the levels of 0.6 mm (0.6A) and 0.8A. The mesiolingual cusp (MLC) showed a statistically significant difference (P < 0.01) from the other cusps in the paired t-test. Additionally, the MLC exhibited the highest percentage of change at 0.6A in some age and NTWI groups. Regarding the age groups, the MLC showed the highest score in groups 1 and 2. For the NTWI groups, the MLC was not significantly different in groups 3 and 4. These results support the proposal that the lingual cusp exhibits rapid wear because it serves as a functional cusp. Although this study has limitations due to its cross-sectional nature, it suggests better quantitative parameters and analytical tools for the characteristics of cusp wear.

  5. Effects of calibration methods on quantitative material decomposition in photon-counting spectral computed tomography using a maximum a posteriori estimator.

    Science.gov (United States)

    Curtis, Tyler E; Roeder, Ryan K

    2017-10-01

    Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in
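
The linear model behind supervised material decomposition, in which each energy bin's attenuation is a weighted sum of the basis materials' calibrated attenuations, can be sketched with a toy three-bin, three-material system. The paper uses five bins and a maximum a posteriori estimator; the attenuation values below are hypothetical:

```python
# Toy image-domain decomposition: solve the square system A x = b,
# where column j of A holds material j's calibrated attenuation in each
# energy bin and b is the measured attenuation of one voxel.
def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Hypothetical basis matrix: rows = 3 energy bins,
# columns = gadolinium, calcium (bone), water.
A = [[12.0, 2.0, 0.20],
     [ 5.0, 1.5, 0.18],
     [ 2.0, 0.8, 0.15]]
truth = [0.03, 0.10, 0.87]      # fractions in a simulated test voxel
b = [sum(a * t for a, t in zip(row, truth)) for row in A]

est = solve3(A, b)
print("decomposed fractions:", [round(v, 3) for v in est])
```

The study's point is that the accuracy of the columns of A, which come from the calibration phantom's concentration range and count, directly bounds the accuracy of the recovered fractions.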

  6. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    Science.gov (United States)

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  7. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A., E-mail: clayton.pereira.silva@usp.b [Instituto de Pesquisas Energeticas e Nucleares (CQMA/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente

    2011-07-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements related to interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements such as S, Cl, Zn, Zr and Mo overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr and Mo in uranium compounds. The detection limits were below 50 μg·g⁻¹ and the uncertainty was below 10% for the determined elements. (author)
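
    The record does not specify the exact deconvolution algorithm, but one common approach, sketched below under that assumption, treats the overlapping emission lines as peaks of known position and shape, so that recovering their amplitudes is a linear least-squares problem. All channel positions, widths and amplitudes are made up for illustration.

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Two hypothetical overlapping emission lines on a channel axis
# (positions, widths and amplitudes are illustrative only).
x = np.linspace(0, 100, 400)
centers, widths = (45.0, 55.0), (6.0, 6.0)
true_amps = np.array([10.0, 4.0])

spectrum = sum(a * gaussian(x, m, s)
               for a, m, s in zip(true_amps, centers, widths))
spectrum += np.random.default_rng(1).normal(0, 0.05, x.size)

# With known line positions and shapes, "deconvolving" the overlap reduces
# to solving a linear least-squares problem for the line amplitudes.
design = np.column_stack([gaussian(x, m, s)
                          for m, s in zip(centers, widths)])
amps, *_ = np.linalg.lstsq(design, spectrum, rcond=None)
print(amps.round(2))
```

    Even though the two lines overlap heavily, the fitted amplitudes separate the contributions that a simple peak-area integration would confound.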

  8. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    International Nuclear Information System (INIS)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A.

    2011-01-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements related to interaction between the sample and radiation, allowing direct and nondestructive analysis. However, in uranium matrices these devices are inefficient because the characteristic emission lines of elements such as S, Cl, Zn, Zr and Mo overlap the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr and Mo in uranium compounds. The detection limits were below 50 μg·g⁻¹ and the uncertainty was below 10% for the determined elements. (author)

  9. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    Science.gov (United States)

    Stern, K I; Malkova, T L

    The objective of the present study was the development and validation of methods for the quantitative determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with a flame ionization detector was used for the quantitative determination of the above substances in dietary supplements. Conditions were proposed for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, that achieve efficient separation. The method has the necessary sensitivity, specificity, linearity, accuracy and precision (on an intra-day and inter-day basis), indicating good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.

  10. Usefulness of quantitative determination of cerebral blood flow by 123I-IMP SPECT reference sample method in various cerebrovascular disorders

    International Nuclear Information System (INIS)

    Fukuda, Tadaharu; Hasegawa, Kouichi; Yamanaka, Shigehito; Hasue, Masamichi; Ohtubo, Yutaka; Wada, Atsushi; Nakanishi, Hisashi; Nakamura, Tatuya; Itou, Hiroshi.

    1992-01-01

    Cerebral blood flow (CBF) was quantitatively determined by N-isopropyl-p-[¹²³I]iodoamphetamine (IMP) single photon emission computed tomography (SPECT) with a rotating gamma camera. A ZLC 7500 unit (SIEMENS Inc.) was used for emission CT, and a SCINTIPAK-2400 (Shimadzu Corp. Ltd.) for data processing. For the quantitative determination of CBF, arterial blood samples were collected for 5 minutes during the intravenous injection of 111 MBq of IMP, and a reference sample method corrected by the time-activity curve was used. The determination was carried out in 90 patients with various cerebrovascular diseases and 5 normal volunteers. Mean cerebral blood flow (m-CBF) in the normal cases as determined by the above method was 42.4±6.0 ml/100 g/min. In patients with acute-phase subarachnoid hemorrhage (SAH), severity on CT was marked in patients with intracerebral hematomas greater than 45 mm in diameter. Patients with non-hemorrhagic arteriovenous malformation (AVM) whose nidi were 30 mm or more in diameter showed a decrease in CBF on the afferent side. This decrease was caused by a steal phenomenon affecting CBF around the AVM. The size of cerebral infarction on CT was closely correlated with the decrease in CBF, and CBF in patients with stenosis and obstruction of the main trunks was less than that in patients without them. CBF was increased by 10-20% in patients who underwent carotid endarterectomy or superficial temporal artery-middle cerebral artery anastomosis for obstruction or stenosis of the internal carotid artery or the middle cerebral artery. The quantitative determination of CBF by the IMP SPECT reference sample method was useful for evaluating the morbid condition, estimating the prognosis of cerebrovascular diseases, and evaluating the effects of therapy. (J.P.N.)

  11. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
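
    The record does not give the exact definition of the IDT metric, but by analogy with PCR efficiency one can estimate a doubling time from a dilution series: each halving of the input should delay the time-to-positivity by one doubling time, so the IDT is the slope of detection time against -log2(input). The dilution series below is entirely hypothetical.

```python
import numpy as np

# Hypothetical qLAMP dilution series: input copies and time-to-positivity
# (minutes); all values are made up for illustration.
copies = np.array([1e6, 1e5, 1e4, 1e3])
tt = np.array([8.0, 10.1, 12.0, 14.2])

# Isothermal doubling time (IDT): each halving of the input delays
# detection by one doubling time, so the IDT is the fitted slope of
# time-to-positivity versus -log2(input copies).
slope, intercept = np.polyfit(-np.log2(copies), tt, 1)
print(round(slope, 2))  # → 0.62 (minutes per doubling)
```

    A matrix inhibitor that slows amplification would show up as an increased IDT relative to the clean-buffer calibration.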

  12. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles presented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. In the first article the possible use of the proton spin-lattice relaxation time T1 in tissue characterization, tumor recognition and monitoring tissue response to radiotherapy is explored. The next article addresses the question whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T1, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T1 pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)
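
    The pixel-by-pixel T1 calculation from sign-restored inversion-recovery data can be sketched as a fit of the standard model S(TI) = S0(1 - 2·exp(-TI/T1)) at each pixel. The sketch below uses noise-free synthetic data and a simple grid search, not the thesis's actual fitting procedure; the TI and T1 values are made up.

```python
import numpy as np

# Synthetic sign-restored inversion-recovery signal for a single pixel:
# S(TI) = S0 * (1 - 2 * exp(-TI / T1)); T1 and the TI values are made up.
t1_true, s0 = 800.0, 1.0
ti = np.array([50.0, 150.0, 400.0, 800.0, 1600.0, 3200.0])  # ms
sig = s0 * (1 - 2 * np.exp(-ti / t1_true))

# Pixel-wise T1 estimate by least squares over a grid of candidate values
grid = np.arange(100.0, 3000.0, 1.0)
models = s0 * (1 - 2 * np.exp(-ti[None, :] / grid[:, None]))
t1_est = grid[np.argmin(((models - sig) ** 2).sum(axis=1))]
print(t1_est)  # → 800.0
```

    Without the sign restoration described in the fourth article, magnitude images would fold the negative early-TI samples upward and this fit would be biased.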

  13. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods, such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux

  14. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    Science.gov (United States)

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  15. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    Science.gov (United States)

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.

  16. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  17. Development and single-laboratory validation of a UHPLC-MS/MS method for quantitation of microcystins and nodularin in natural water, cyanobacteria, shellfish and algal supplement tablet powders.

    Science.gov (United States)

    Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda

    2018-02-01

    A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. Microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5-minute chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies covering the full set of method performance characteristics were undertaken on each matrix-analyte combination following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new…

  18. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    Science.gov (United States)

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil from its factual thickness, using basic geometry. To find a practicable, fast and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether) sections and of 1-3 μm thick plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within a few seconds. Compared with the section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability…
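
    The "basic geometry" of the calibration-foil correction can be made concrete: oblique sectioning inflates every apparent thickness by 1/cos(angle), so the ratio of the foil's known thickness to its apparent thickness gives the cosine of the section-plane angle. The measurement values below are hypothetical, not from the study.

```python
import math

# Hypothetical measurements (µm): the calibration foil's known thickness,
# its apparent thickness in the re-embedded section, and the apparent
# thickness of the analyzed section.
foil_nominal = 25.0
foil_measured = 25.9
section_measured = 1.55

# Oblique sectioning inflates every apparent thickness by 1/cos(angle),
# so the foil yields both the section-plane angle and the correction factor.
cos_angle = foil_nominal / foil_measured
angle_deg = math.degrees(math.acos(cos_angle))
section_corrected = section_measured * cos_angle
print(round(angle_deg, 1), round(section_corrected, 3))
```

    Even a visually subtle tilt of about 15° inflates the measured thickness by several percent, which propagates directly into disector volume estimates.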

  19. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choices for a radioisotope neutron source are ²⁵²Cf and ²⁴¹Am-Be. In this study, ²⁵²Cf with a neutron flux of 6.3×10⁶ n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained using an in-house built radioisotopic neutron spectrometric system equipped with a ³He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ∼0.947 g/cc and area of 40 cm×25 cm) was used for the determination of hydrogen content using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in the quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system is promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei.
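
    The multivariate calibration step can be illustrated with a minimal PLS1 (NIPALS) regression: many spectral channels are reduced to a few latent components that predict the property of interest. The sketch below is a generic numpy implementation on synthetic "spectra", not the study's calibration data; all values are made up.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal PLS1 (NIPALS): returns mean values and a regression vector."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)          # weight vector
        t = Xk @ w                      # scores
        tt = t @ t
        p = Xk.T @ t / tt               # X loadings
        W.append(w); P.append(p); q.append((yk @ t) / tt)
        Xk = Xk - np.outer(t, p)        # deflate X
        yk = yk - q[-1] * t             # deflate y
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(q))
    return x_mean, y_mean, B

def pls1_predict(model, X):
    x_mean, y_mean, B = model
    return y_mean + (X - x_mean) @ B

# Hypothetical calibration set: 20 pulse-height "spectra" (8 channels)
# responding linearly to hydrogen content, plus noise (all values made up).
rng = np.random.default_rng(2)
h = rng.uniform(5, 15, 20)
X = np.outer(h, rng.uniform(0.5, 1.5, 8)) + rng.normal(0, 0.1, (20, 8))

model = pls1_fit(X, h, n_comp=2)
rmse = float(np.sqrt(np.mean((pls1_predict(model, X) - h) ** 2)))
print(round(rmse, 3))
```

    Unlike a single-channel linear calibration, the PLS regression vector pools information across all channels, which is what gives it the edge reported in the abstract.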

  20. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    Science.gov (United States)

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (…) and higher-grade stenosis groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002) and between the 2.3-2.8 mm and ≥3.5 mm groups (p …). Semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaques <3.5 mm because of motion artifacts.

  1. Cerenkov radiation imaging as a method for quantitative measurements of beta particles in a microfluidic chip

    International Nuclear Information System (INIS)

    Cho, Jennifer S; Taschereau, Richard; Olma, Sebastian; Liu Kan; Chen Yichun; Shen, Clifton K-F; Van Dam, R Michael; Chatziioannou, Arion F

    2009-01-01

    It has been observed that microfluidic chips used for synthesizing ¹⁸F-labeled compounds demonstrate visible light emission without nearby scintillators or fluorescent materials. The origin of the light was investigated and found to be consistent with the emission characteristics of Cerenkov radiation. Since ¹⁸F decays through the emission of high-energy positrons, the energy threshold for beta particles, i.e. electrons or positrons, to generate Cerenkov radiation was calculated for water and polydimethylsiloxane (PDMS), the most commonly used polymer-based material for microfluidic chips. Beta particles emitted from ¹⁸F have a continuous energy spectrum, with a maximum energy that exceeds this energy threshold for both water and PDMS. In addition, the spectral characteristics of the emitted light from ¹⁸F in distilled water were also measured, yielding a broad distribution from 300 nm to 700 nm, with higher intensity at shorter wavelengths. A photograph of the ¹⁸F solution showed a bluish-white light emitted from the solution, further suggesting Cerenkov radiation. In this study, the feasibility of using this Cerenkov light emission as a method for quantitative measurements of the radioactivity within the microfluidic chip in situ was evaluated. A detector previously developed for imaging microfluidic platforms was used. The detector consisted of a charge-coupled device (CCD) optically coupled to a lens. The system spatial resolution, minimum detectable activity and dynamic range were evaluated. In addition, the calibration of the Cerenkov signal versus activity concentration in the microfluidic chip was determined. This novel method of Cerenkov radiation measurement will provide researchers with a simple yet robust quantitative imaging tool for microfluidic applications utilizing beta particles.
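
    The threshold calculation mentioned in the abstract follows from the Cerenkov condition v > c/n: a beta particle radiates once β = 1/n, giving a kinetic-energy threshold of mc²(γ - 1). The sketch below assumes n = 1.33 for water and, as an assumption not stated in the record, n ≈ 1.41 for PDMS.

```python
import math

M_E_C2 = 0.511  # electron rest energy in MeV

def cerenkov_threshold_mev(n):
    """Kinetic-energy threshold for Cerenkov emission in a medium of index n."""
    beta = 1.0 / n                           # threshold speed: v = c/n
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return M_E_C2 * (gamma - 1.0)

print(round(cerenkov_threshold_mev(1.33), 3))  # water → 0.264
print(round(cerenkov_threshold_mev(1.41), 3))  # assumed PDMS index → 0.214
```

    Both thresholds sit well below the ≈0.634 MeV endpoint of the ¹⁸F positron spectrum, which is why a substantial fraction of the emitted positrons produce Cerenkov light in either material.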

  2. US detection and classification of hepatic disease: Comparison of quantitative algorithms with clinical readings

    International Nuclear Information System (INIS)

    Insana, M.F.; Garra, B.S.; Shawker, T.H.; Wagner, R.F.; Bradford, M.; Russell, M.A.

    1986-01-01

    A method of quantitative digital analysis of US B-scans is used to differentiate between normal and diseased liver in vivo. The tissue signature is based on five measured parameters: four describe the tissue structure and scattering properties, the fifth is the US attenuation. The patient groups studied included 31 healthy subjects, 97 patients with chronic active hepatitis, 62 with Gaucher disease, and 10 with lymphomas. Receiver operating characteristic curve analysis was used to compare the diagnostic performance of the quantitative method with the clinical reading of trained observers. The quantitative method showed greater diagnostic capability for detecting and classifying diffuse and some focal disease
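
    The receiver operating characteristic analysis used to compare the quantitative method against clinical readings can be sketched with the Mann-Whitney formulation of the ROC area: the AUC is the probability that a randomly chosen diseased case scores higher than a randomly chosen normal one. The scores below are invented for illustration, not data from the study.

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney U statistic (ties counted as 1/2)."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

# Hypothetical tissue-signature scores for diseased vs. normal livers
diseased = [0.90, 0.80, 0.75, 0.60, 0.55]
normal = [0.50, 0.45, 0.40, 0.60, 0.30]
print(round(roc_auc(diseased, normal), 2))  # → 0.94
```

    Comparing such AUCs between the five-parameter tissue signature and the trained observers' readings is exactly the kind of comparison the abstract describes.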

  3. Determination of avermectins by the internal standard recovery correction - high performance liquid chromatography - quantitative Nuclear Magnetic Resonance method.

    Science.gov (United States)

    Zhang, Wei; Huang, Ting; Li, Hongmei; Dai, Xinhua; Quan, Can; He, Yajuan

    2017-09-01

    Quantitative Nuclear Magnetic Resonance (qNMR) is widely used to determine the purity of organic compounds. For compounds of lower purity, especially those with molecular weights above 500, qNMR risks errors in the purity because the impurity peaks are likely to be incompletely separated from the peak of the major component. In this study, an offline ISRC-HPLC-qNMR (internal standard recovery correction - high performance liquid chromatography - qNMR) method was developed to overcome this problem. It is accurate because it excludes the influence of impurities; it is low-cost because it uses common mobile phases; and it extends the applicable scope of qNMR. In this method, a mixed solution of the sample and an internal standard was separated by HPLC with common mobile phases, and only the eluents of the analyte and the internal standard were collected in the same tube. After evaporation and re-dissolution, the solution was determined by qNMR. A recovery correction factor was determined by comparison of the solutions before and after these procedures. After correction, the mass fraction of the analyte was constant, accurate and precise, even though the sample loss varied during these procedures, and even with poor HPLC resolution. Avermectin B1a, with a purity of ~93% and a molecular weight of 873, was analyzed. Moreover, the homologues of avermectin B1a were determined based on identification and quantitative analysis by tandem mass spectrometry and HPLC, and the results were consistent with those of the traditional mass balance method. The results showed that the method could be widely used for organic compounds, and could further promote qNMR to become a primary method in the international metrological systems.
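
    The underlying arithmetic is the standard qNMR internal-standard purity equation, scaled by a recovery correction. The sketch below uses the usual mass-balance form with entirely hypothetical integrals, masses and correction factor (only the 873 g/mol molar mass echoes avermectin B1a); the exact correction procedure in the paper may differ.

```python
# Standard qNMR mass-balance purity equation with an added recovery
# correction; every number below is illustrative, not from the paper.
I_a, I_s = 1.00, 5.00      # signal integrals: analyte, internal standard
N_a, N_s = 3, 9            # protons contributing to each signal
M_a, M_s = 873.1, 298.5    # molar masses (g/mol); 873 matches avermectin B1a
m_a, m_s = 10.0, 5.3       # weighed masses (mg)
P_s = 0.999                # purity of the internal standard

purity = (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

# Hypothetical recovery correction factor: response ratio of the internal
# standard before vs. after the HPLC collection / evaporation steps.
K = 0.97
purity_corrected = purity / K
print(round(purity, 3), round(purity_corrected, 3))
```

    The point of the correction is that the final mass fraction stays constant even when the collection and evaporation steps lose a variable amount of sample.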

  4. PETROGRAPHY AND APPLICATION OF THE RIETVELD METHOD TO THE QUANTITATIVE ANALYSIS OF PHASES OF NATURAL CLINKER GENERATED BY COAL SPONTANEOUS COMBUSTION

    Directory of Open Access Journals (Sweden)

    Pinilla A. Jesús Andelfo

    2010-06-01

    Fine-grained, mainly reddish, compact and slightly brecciated and vesicular pyrometamorphic rocks (natural clinker) are associated with the spontaneous combustion of coal seams of the Cerrejón Formation exploited by Carbones del Cerrejón Limited in the La Guajira Peninsula (Caribbean Region of Colombia). These rocks constitute the remaining inorganic materials derived from claystones, mudstones and sandstones originally associated with the coal and are essentially a complex mixture of various amorphous and crystalline inorganic constituents. In this paper, a petrographic characterization of natural clinker, as well as the application of the X-ray diffraction (Rietveld) method to the quantitative analysis of its mineral phases, was carried out. The RIQAS program was used for the refinement of X-ray powder diffraction profiles, analyzing the importance of using the correct isostructural models for each of the existing phases, which were obtained from the Inorganic Crystal Structure Database (ICSD). The results obtained in this investigation show that the Rietveld method can be used as a powerful tool in the quantitative analysis of phases in polycrystalline samples, which has been a traditional problem in geology.
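
    Once the Rietveld refinement has converged, quantitative phase fractions are conventionally obtained from the refined scale factors via the Hill & Howard relation, W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j. The paper does not list its refined values, so the numbers below are purely illustrative.

```python
# Rietveld quantitative phase analysis: the Hill & Howard relation converts
# refined scale factors into weight fractions. Phase names, scale factors
# and Z*M*V products below are illustrative, not values from the paper.
phases = {
    # name: (refined scale factor S, Z * M * V product)
    "quartz":   (1.2e-4, 1.13e5),
    "hematite": (0.8e-4, 1.51e5),
    "mullite":  (0.3e-4, 7.10e5),
}

total = sum(s * zmv for s, zmv in phases.values())
fractions = {name: s * zmv / total for name, (s, zmv) in phases.items()}
print({k: round(v, 3) for k, v in fractions.items()})
# → {'quartz': 0.289, 'hematite': 0.257, 'mullite': 0.454}
```

    Note that a small scale factor does not imply a small weight fraction: the ZMV weighting can dominate, as the mullite entry shows.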

  5. Comprehensive comparison of preselected regions for a high level radioactive waste repository: a subjective quantitative evaluation method

    International Nuclear Information System (INIS)

    Wang Ju; Zong Zihua; Jin Yuanxin; Zhu Pengfei; Su Rui; Chen Weiming

    2012-01-01

    Based on the comprehensive features of the 6 preselected regions (Northwest China, Southwest China, East China, South China, Inner Mongolia and Xinjiang) for China's high level radioactive waste repository, this paper uses a subjective quantitative method to evaluate the weight of each site selection criterion and provides the scores of each region. The results show that future natural changes and hydrogeological conditions are considered the most important natural siting criteria, while social impact and human activities are the most important social siting criteria. According to the scores, the priority order of the regions is Northwest China, Xinjiang, Inner Mongolia, South China, East China, Southwest China. On the whole, the scores of the regions in western China (Northwest China, Xinjiang and Inner Mongolia) are higher than those in eastern China (South China, East China, Southwest China), which shows that the participating experts consider the disposal of high level waste in western China more favorable than in eastern China. (authors)
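
    A subjective quantitative evaluation of this kind is, at its core, a weighted-sum ranking: expert-assigned criterion weights multiply per-region scores, and regions are ordered by the weighted total. The weights, scores and region subset below are invented for illustration, not the paper's data.

```python
import numpy as np

# Illustrative weighted-sum siting evaluation (weights and scores made up):
# criterion weights sum to 1; each region gets a score per criterion.
criteria_weights = np.array([0.30, 0.25, 0.25, 0.20])
# e.g. future natural changes, hydrogeology, social impact, human activities
regions = ["Northwest", "Xinjiang", "Inner Mongolia", "East China"]
scores = np.array([
    [9, 8, 8, 9],
    [8, 9, 7, 8],
    [8, 7, 8, 7],
    [6, 6, 7, 6],
])

totals = scores @ criteria_weights
ranking = [regions[i] for i in np.argsort(-totals)]
print(ranking)  # → ['Northwest', 'Xinjiang', 'Inner Mongolia', 'East China']
```

    The sensitivity of such a ranking to the weights is why the weight-elicitation step (here, expert judgment) matters as much as the scores themselves.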

  6. What Is in Your Wallet? Quantitation of Drugs of Abuse on Paper Currency with a Rapid LC-MS/MS Method

    Science.gov (United States)

    Parker, Patrick D.; Beers, Brandon; Vergne, Matthew J.

    2017-01-01

    Laboratory experiments were developed to introduce students to the quantitation of drugs of abuse by high performance liquid chromatography-tandem mass spectrometry (LC-MS/MS). Undergraduate students were introduced to internal standard quantitation and the LC-MS/MS method optimization for cocaine. Cocaine extracted from paper currency was…
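
    Internal standard quantitation of the kind the students perform computes the analyte concentration from the analyte/internal-standard peak-area ratio and a response factor obtained from a calibration standard. A hedged sketch; the function names and numbers are illustrative, not the experiment's values:

```python
def response_factor(area_analyte, area_is, conc_analyte, conc_is):
    """RF from a calibration standard: RF = (A_a / A_IS) / (C_a / C_IS)."""
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    """Unknown analyte concentration: C_a = (A_a / A_IS) * C_IS / RF."""
    return (area_analyte / area_is) * conc_is / rf

# Calibrate on a standard, then quantify an unknown (illustrative areas):
rf = response_factor(1000.0, 500.0, 20.0, 10.0)
conc = quantify(2000.0, 1000.0, 10.0, rf)
```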

  7. Interlaboratory validation of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Takabatake, Reona; Koiwa, Tomohiro; Kasahara, Masaki; Takashima, Kaori; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Oguchi, Taichi; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    To reduce the cost and time required to routinely perform the genetically modified organism (GMO) test, we developed a duplex quantitative real-time PCR method for a screening analysis simultaneously targeting an event-specific segment of GA21 and the Cauliflower Mosaic Virus 35S promoter (P35S) segment [Oguchi et al., J. Food Hyg. Soc. Japan, 50, 117-125 (2009)]. To confirm the validity of the method, an interlaboratory collaborative study was conducted. In the collaborative study, conversion factors (Cfs), which are required to calculate the GMO amount (%), were first determined for two real-time PCR instruments, the ABI PRISM 7900HT and the ABI PRISM 7500. A blind test was then conducted. The limit of quantitation for both GA21 and P35S was estimated to be 0.5% or less. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), respectively. The determined bias and RSD(R) were each less than 25%. We believe the developed method would be useful for the practical screening analysis of GM maize.
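
    Screening methods of this kind convert the measured copy-number ratio into a GMO amount (%) using the conversion factor Cf, i.e. the same ratio measured in 100% GM reference material. A sketch under that assumption; the copy numbers are illustrative:

```python
def gmo_percent(gm_copies, taxon_copies, cf):
    """GMO amount (%) = (GM target copies / taxon-specific gene copies) / Cf * 100,
    where Cf is the same copy ratio measured in 100% GM reference material."""
    return (gm_copies / taxon_copies) / cf * 100.0
```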

  8. NNAlign: A Web-Based Prediction Method Allowing Non-Expert End-User Discovery of Sequence Motifs in Quantitative Peptide Data

    DEFF Research Database (Denmark)

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity...... to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs...... associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can...

  9. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients who had undergone PET/CT. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstructions. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used was demonstrated in the form of frequency distribution curves, box plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference was observed between OSEM and UHD reconstruction for all SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL and their normalized values were, on average, up to 60 % higher after UHD reconstruction than after OSEM reconstruction. OSEM and UHD reconstruction yielded significantly different SUV and SUL values, and the difference remained constantly high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)
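
    The body-weight SUV underlying this comparison is the tissue activity concentration divided by the injected dose per unit body weight; liver normalization is then a simple ratio of SUVs. A minimal sketch with illustrative values:

```python
def suv_bw(tissue_kbq_per_ml, injected_mbq, body_weight_kg):
    """SUV (body weight): tissue concentration (kBq/mL) divided by
    injected dose per gram of body weight (kBq/g); with a tissue
    density of ~1 g/mL the ratio is dimensionless."""
    dose_per_gram = injected_mbq * 1000.0 / (body_weight_kg * 1000.0)
    return tissue_kbq_per_ml / dose_per_gram

def liver_normalized(suv_lesion, suv_liver):
    """Lesion uptake expressed relative to the liver reference region."""
    return suv_lesion / suv_liver
```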

  10. Quantitative measurement of cerebral blood flow on patients with early syphilis

    International Nuclear Information System (INIS)

    Zhong Jijun; Wu Jinchang; Yang Yi; Tang Jun; Liu Zengli; Shi Xin

    2005-01-01

    To study the quantitative change of cerebral blood flow (CBF) in patients with early syphilis, the authors established a method for the absolute measurement of rCBF using SPECT with ethyl cysteinate dimer (ECD) as the imaging agent, and applied the method to measure rCBF in patients with early syphilis. The rCBF values measured by this method are highly consistent with the values measured by other classical methods such as SPECT (123I-IMP) and PET (15O-H2O). The rCBF values for early syphilis patients and the normal controls show some statistical differences. A routine quantitative absolute measurement of rCBF featuring simple procedures is therefore approaching maturity. (authors)

  11. Quantitative cerebral H2 15O perfusion PET without arterial blood sampling, a method based on washout rate

    International Nuclear Information System (INIS)

    Treyer, Valerie; Jobin, Mathieu; Burger, Cyrill; Buck, Alfred; Teneggi, Vincenzo

    2003-01-01

    The quantitative determination of regional cerebral blood flow (rCBF) is important in certain clinical and research applications. The disadvantage of most quantitative methods using H2 15O positron emission tomography (PET) is the need for arterial blood sampling. In this study a new non-invasive method for rCBF quantification was evaluated. The method is based on the washout rate of H2 15O following intravenous injection. All results were obtained with Alpert's method, which yields maps of the washin parameter K1 (rCBF_K1) and the washout parameter k2 (rCBF_k2). Maps of rCBF_K1 were computed with measured arterial input curves. Maps of rCBF_k2* were calculated with a standard input curve which was the mean of eight individual input curves. The mean of grey matter rCBF_k2* (CBF_k2*) was then compared with the mean of rCBF_K1 (CBF_K1) in ten healthy volunteer smokers who underwent two PET sessions on day 1 and day 3. Each session consisted of three serial H2 15O scans. Reproducibility was analysed using the rCBF difference scan 3 - scan 2 in each session. The perfusion reserve (PR = rCBF_acetazolamide - rCBF_baseline) following acetazolamide challenge was calculated with rCBF_k2* (PR_k2*) and rCBF_K1 (PR_K1) in ten patients with cerebrovascular disease. The difference CBF_k2* - CBF_K1 was 5.90±8.12 ml/min/100 ml (mean±SD, n=55). The SD of the scan 3 - scan 1 difference was 6.1% for rCBF_k2* and rCBF_K1, demonstrating a high reproducibility. Perfusion reserve values determined with rCBF_K1 and rCBF_k2* were in high agreement (difference PR_k2* - PR_K1 = -6.5±10.4%, PR expressed as percentage increase from baseline). In conclusion, a new non-invasive method for the quantitative determination of rCBF is presented. The method is in good agreement with Alpert's original method and the reproducibility is high. It does not require arterial blood sampling, yields quantitative voxel-by-voxel maps of rCBF, and is computationally efficient and easy to implement
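
    The washout-rate idea can be illustrated with a one-compartment sketch: fit k2 from the mono-exponential tail of the tissue curve and scale by an assumed brain-water partition coefficient. The log-linear fitting scheme and the partition value of 0.9 are assumptions for illustration, not the paper's implementation of Alpert's method:

```python
import math

def k2_from_washout(times_min, activity):
    """Washout rate k2 (1/min) from the mono-exponential tail
    A(t) = A0 * exp(-k2 * t): least-squares slope of ln(A) vs t."""
    logs = [math.log(a) for a in activity]
    n = len(times_min)
    mt = sum(times_min) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times_min, logs))
             / sum((t - mt) ** 2 for t in times_min))
    return -slope

def rcbf_from_k2(k2_per_min, partition_coeff=0.9):
    """rCBF in ml/min/100 ml, assuming a brain-water partition coefficient."""
    return k2_per_min * partition_coeff * 100.0
```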

  12. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    Science.gov (United States)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  13. Quantitative X-ray methods of amorphous content and crystallinity determination of SiO2 in Quartz and Opal mixture

    International Nuclear Information System (INIS)

    Ketabdari, M.R.; Ahmadi, K.; Esmaeilnia Shirvani, A.; Tofigh, A.

    2001-01-01

    The X-ray diffraction technique is commonly used for qualitative analysis of minerals, and has also been successfully used for quantitative measurements. In this research, the matrix flushing method and a new X-ray diffraction method were used to determine the crystallinity and amorphous content of an Opal and Quartz mixture. The PCAPD program was used to perform the quantitative analysis of these two minerals

  14. Fundamental and clinical studies on simultaneous, quantitative analysis of hepatobiliary and gastrointestinal scintigrams using double isotope method

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Y; Kakihara, M; Sasaki, M; Tabuse, Y; Takei, N [Wakayama Medical Coll. (Japan)

    1981-04-01

    The double isotope method was applied to carry out simultaneous, quantitative analysis of hepatobiliary and gastrointestinal scintigrams. A scintillation camera with a parallel collimator for medium energy was connected to a computer to distinguish the two isotopes simultaneously. 4 mCi of 99mTc-(Sn)-pyridoxylideneisoleucine (Tc-PI) and 200 μCi of 111In-diethylenetriaminepentaacetic acid (In-DTPA) were administered by i.v. injection and orally, respectively. Three normal subjects (two women and a man) and 16 patients operated on for gastric cancer (10 reconstructed by the Roux-en-Y method after total gastrectomy, and 6 reconstructed by interposing the jejunum between the esophagus and duodenum) were investigated. The process of bile secretion and its mixing with food was followed quantitatively by scanning. The analysis of the time-activity variation at each organ indicated that the interposition operation gave a more physiological recovery than the Roux-en-Y method. This method is noninvasive to patients and is promising for following the process and activity of digestion in any digestive organ after surgery.

  15. ExSTA: External Standard Addition Method for Accurate High-Throughput Quantitation in Targeted Proteomics Experiments.

    Science.gov (United States)

    Mohammed, Yassene; Pan, Jingxi; Zhang, Suping; Han, Jun; Borchers, Christoph H

    2018-03-01

    Targeted proteomics using MRM with stable-isotope-labeled internal-standard (SIS) peptides is the current method of choice for protein quantitation in complex biological matrices. Better quantitation can be achieved with the internal standard-addition method, where successive increments of synthesized natural form (NAT) of the endogenous analyte are added to each sample, a response curve is generated, and the endogenous concentration is determined at the x-intercept. Internal NAT-addition, however, requires multiple analyses of each sample, resulting in increased sample consumption and analysis time. To compare the following three methods, an MRM assay for 34 high-to-moderate abundance human plasma proteins is used: classical internal SIS-addition, internal NAT-addition, and external NAT-addition-generated in buffer using NAT and SIS peptides. Using endogenous-free chicken plasma, the accuracy is also evaluated. The internal NAT-addition outperforms the other two in precision and accuracy. However, the curves derived by internal vs. external NAT-addition differ by only ≈3.8% in slope, providing comparable accuracies and precision with good CV values. While the internal NAT-addition method may be "ideal", this new external NAT-addition can be used to determine the concentration of high-to-moderate abundance endogenous plasma proteins, providing a robust and cost-effective alternative for clinical analyses or other high-throughput applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
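
    The NAT-addition response curve described above is a straight line in the added amount, and the endogenous concentration is read off at the x-intercept. A minimal least-squares sketch; the data values are illustrative, not from the assay:

```python
def standard_addition_conc(added, responses):
    """Fit response = m*added + b by least squares; the endogenous
    concentration is the magnitude of the x-intercept, b/m."""
    n = len(added)
    mx = sum(added) / n
    my = sum(responses) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(added, responses))
         / sum((x - mx) ** 2 for x in added))
    b = my - m * mx
    return b / m

# Illustrative curve: three NAT increments and their measured responses.
endogenous = standard_addition_conc([0.0, 1.0, 2.0], [10.0, 12.0, 14.0])
```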

  16. Methods and instrumentation for quantitative microchip capillary electrophoresis

    NARCIS (Netherlands)

    Revermann, T.

    2007-01-01

    The development of novel instrumentation and analytical methodology for quantitative microchip capillary electrophoresis (MCE) is described in this thesis. Demanding only small quantities of reagents and samples, microfluidic instrumentation is highly advantageous. Fast separations at high voltages

  17. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  18. A method to quantitate regional wall motion in left ventriculography using Hildreth algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Terashima, Mikio [Hyogo Red Cross Blood Center (Japan); Naito, Hiroaki; Sato, Yoshinobu; Tamura, Shinichi; Kurosawa, Tsutomu

    1998-06-01

    Quantitative measurement of ventricular wall motion is indispensable for objective evaluation of cardiac function associated with coronary artery disease. We have modified Hildreth's algorithm to estimate excursions of the ventricular wall on left ventricular images yielded by various imaging techniques. Tagging cine-MRI was carried out on 7 healthy volunteers. The original Hildreth method, the modified Hildreth method and the centerline method were applied to the outlines of the images obtained, to estimate excursion of the left ventricular wall and regional shortening and to evaluate the accuracy of these methods when measuring these parameters, compared to the values of these parameters measured actually using the attached tags. The accuracy of the original Hildreth method was comparable to that of the centerline method, while the modified Hildreth method was significantly more accurate than the centerline method (P<0.05). Regional shortening as estimated using the modified Hildreth method differed less from the actually measured regional shortening than did the shortening estimated using the centerline method (P<0.05). The modified Hildreth method allowed reasonable estimation of left ventricular wall excursion in all cases where it was applied. These results indicate that when applied to left ventriculograms for ventricular wall motion analysis, the modified Hildreth method is more useful than the original Hildreth method. (author)

  19. Proceedings First Workshop on Quantitative Formal Methods : theory and applications (QFM'09, Eindhoven, The Netherlands, November 3, 2009)

    NARCIS (Netherlands)

    Andova, S.; McIver, A.; D'Argenio, P.R.; Cuijpers, P.J.L.; Markovski, J.; Morgan, C.; Núñez, M.

    2009-01-01

    This volume contains the papers presented at the 1st workshop on Quantitative Formal Methods: Theory and Applications, which was held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. This volume contains the final versions of all contributions accepted

  20. Can't Count or Won't Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment.

    Science.gov (United States)

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2016-06-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and more likely to appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through 'doing' quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a 'magic bullet' and that a wider programme of content and assessment diversification across the curriculum is preferable.

  1. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    Science.gov (United States)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over purely ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we experimentally present a dual-modality system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.

  2. Comparison of fundamental analytical methods for quantitative determination of copper(II) ion

    Directory of Open Access Journals (Sweden)

    Ačanski Marijana M.

    2008-01-01

    Copper is a ductile metal with excellent electrical conductivity, and finds extensive use as an electrical conductor, heat conductor, building material, and component of various alloys. In this work the accuracy of methods for the quantitative determination (gravimetric and titrimetric methods of analysis) of copper(II) ion was studied. Gravimetric methods do not require a calibration or standardization step (as all other analytical procedures except coulometry do) because the results are calculated directly from the experimental data and molar masses. Thus, when only one or two samples are to be analyzed, a gravimetric procedure may be the method of choice because it involves less time and effort than a procedure that requires preparation of standards and calibration. In this work, in the gravimetric analysis the concentration of copper(II) ion is established through the measurement of the mass of CuSCN and CuO. A titrimetric method is a process in which a standard reagent is added to a solution of an analyte until the reaction between the analyte and reagent is judged to be complete. In this work, in the titrimetric analysis the concentration of copper(II) ion is established through the measurement of the volume of different standard reagents: Km, Na2S2O3 and AgNO3. Results were discussed individually and mutually with respect to accuracy, reproducibility and rapidity. The relative error was calculated for all methods.
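
    In gravimetric determinations like those above, the copper mass is recovered from the weighed precipitate through a gravimetric factor, the stoichiometrically weighted ratio of the analyte's molar mass to that of the weighed form. A sketch for the CuO case; the molar masses are standard tabulated values and the precipitate mass is illustrative:

```python
CU_MOLAR_MASS = 63.55   # g/mol
CUO_MOLAR_MASS = 79.55  # g/mol

def gravimetric_factor(m_analyte, m_precipitate, n_analyte=1, n_precipitate=1):
    """GF = (n_a * M_analyte) / (n_p * M_precipitate); the analyte mass in
    a sample is GF times the weighed precipitate mass."""
    return (n_analyte * m_analyte) / (n_precipitate * m_precipitate)

# Copper recovered from an illustrative 0.5000 g CuO precipitate:
gf = gravimetric_factor(CU_MOLAR_MASS, CUO_MOLAR_MASS)
copper_mass = gf * 0.5000
```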

  3. Two quantitative methods for assessment of [Tc-99m]-MAA in the lung in the treatment of liver metastases: a case study

    International Nuclear Information System (INIS)

    Willowson, Kathy P.; Bailey, Dale L.; Baldock, Clive

    2009-01-01

    Objective: The use of Y-90 microspheres to treat metastatic liver cancer is becoming widely utilized. Despite the fact that the microspheres are delivered directly to the liver, some activity may bypass the liver capillaries and be shunted to the lungs. To evaluate the percentage of pulmonary breakthrough, a pre-therapy test is performed using Tc-99m labeled spheres. The aim of this project was to compare two quantitative methods for assessing lung uptake, and consider the possibility of organ-specific quantification. Method: A previously validated method for achieving CT-based quantitative SPECT [1] was compared to a simple planar approach. A 44-year-old man suffering from metastatic liver sarcoma was referred to the clinic for pre-therapy evaluation. After injection of Tc-99m labeled microspheres and routine imaging, a SPECT/CT was acquired and specific organ uptake values calculated. A further calibrated injection of [Tc-99m]-MAA was then given as a simplified alternative to quantify lung uptake by comparing pre and post counts. Results: The quantitative SPECT/CT method correctly accounted for all injected activity and found 80% of the dose was retained in the liver and 4% in the lungs. The planar method found ~4% of the dose in the lungs. Conclusion: The quantitative technique we have developed allows for accurate calculation of organ-specific uptake, which has important implications for treatment. The additional MAA injection offers a simplified but accurate method to quantify lung uptake. [1] K.P. Willowson, D.L. Bailey and C. Baldock (2008) Quantitative SPECT reconstruction using CT-derived corrections. Phys Med Biol 53:3099-3112.
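
    The pulmonary-breakthrough estimate described above amounts to a lung shunt fraction computed from organ counts. A minimal sketch; the counts are illustrative, not the patient's data:

```python
def lung_shunt_percent(lung_counts, liver_counts):
    """Percentage of [Tc-99m]-MAA counts shunted to the lungs, from
    lung and liver region-of-interest counts (planar or SPECT)."""
    return 100.0 * lung_counts / (lung_counts + liver_counts)
```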

  4. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis in which human factors are included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  5. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    Directory of Open Access Journals (Sweden)

    Marcello Manfredi

    2014-07-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects result mostly from an optical and subjective assessment, through the comparison of the previous and subsequent states of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images, while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.

  6. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images, and it is widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, checking the homogeneity of a compound material is an important step before using it in a nuclear system. In our previous study, SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images.

  7. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images, and it is widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, checking the homogeneity of a compound material is an important step before using it in a nuclear system. In our previous study, SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images
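
    One simple stochastic statistic on SEM grayscales, offered here as an illustrative sketch rather than the authors' exact method, is a homogeneity index based on the relative dispersion of pixel values:

```python
def grayscale_homogeneity(pixels):
    """Homogeneity index in [0, 1] from SEM grayscale values:
    1 - (standard deviation / mean); 1.0 means a perfectly uniform image."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return 1.0 - (var ** 0.5) / mean
```

    A uniform region scores 1.0; a strongly segregated (e.g. two-phase) region scores lower.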

  8. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    Science.gov (United States)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability to perform high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information on flow velocities, and velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information on flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify blood flow velocities as well as map the vascular network in vivo.
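
    The RSD statistic itself is just the standard deviation of repeated OCT intensity samples at a voxel divided by their mean: static tissue gives a low RSD, while flowing blood decorrelates between A-scans and raises it. A minimal sketch:

```python
def relative_standard_deviation(samples):
    """RSD = sigma / mu of repeated OCT intensity samples at one voxel;
    higher values indicate decorrelation consistent with flow."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return (var ** 0.5) / mean
```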

  9. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs such as:Comparative approaches (graph similarity or distance)Graph measures to characterize graphs quantitat

  10. SU-F-J-112: Clinical Feasibility Test of An RF Pulse-Based MRI Method for the Quantitative Fat-Water Segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Yee, S; Wloch, J; Pirkola, M [William Beaumont Hospital, Royal Oak, MI (United States)

    2016-06-15

    Purpose: Quantitative fat-water segmentation is important not only because of the clinical utility of fat-suppressed MRI images in better detecting lesions of clinical significance (in the midst of bright fat signal), but also because of the possible physical need, in which CT-like images based on the materials' photon attenuation properties may have to be generated from MR images; particularly, as in the case of an MR-only radiation oncology environment to obtain radiation dose calculations, or as in the case of a hybrid PET/MR modality to obtain an attenuation correction map for quantitative PET reconstruction. The majority of such fat-water quantitative segmentations have been performed by utilizing the Dixon method and its variations, which have to enforce proper settings (often predefined) of echo time (TE) in the pulse sequences. Therefore, such methods have been unable to be directly combined with ultrashort TE (UTE) sequences that, taking advantage of very low TE values (~10s of microseconds), might be beneficial for directly detecting bone. Recently, an RF pulse-based method (http://dx.doi.org/10.1016/j.mri.2015.11.006), termed the PROD pulse method, was introduced as a method of quantitative fat-water segmentation that does not depend on predefined TE settings. Here, the clinical feasibility of this method is verified in brain tumor patients by combining the PROD pulse with several sequences. Methods: In a clinical 3T MRI, the PROD pulse was combined with turbo spin echo (e.g. TR=1500, TE=16 or 60, ETL=15) or turbo field echo (e.g. TR=5.6, TE=2.8, ETL=12) sequences without specifying TE values. Results: The fat-water segmentation was possible without having to set specific TE values. Conclusion: The PROD pulse method is clinically feasible. Although not yet combined with UTE sequences in our laboratory, the method is potentially compatible with UTE sequences, and thus might be useful to directly segment fat, water, bone and air.

  11. Simultaneous quantitative analysis of main components in linderae reflexae radix with one single marker.

    Science.gov (United States)

    Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing

    2016-05-08

    A quantitative analysis of multi-components by single marker (QAMS) method was established for quality evaluation of Linderae Reflexae Radix, and its feasibility was validated by the simultaneous quantitative assay of four main components. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes for quality evaluation by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results of the external standard method and QAMS across different HPLC systems. The results showed no significant differences between the contents of the four components of Linderae Reflexae Radix determined by the external standard method and by QAMS (RSD < 3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined against the single marker pinosylvin. The fingerprint spectra were acquired on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
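
The QAMS calculation itself reduces to simple arithmetic: a relative correction factor is established once with reference standards, after which only the single marker needs an external standard in routine runs. A minimal sketch (all peak areas, concentrations, and the resulting factor are hypothetical, not the paper's data):

```python
# QAMS sketch: quantify an analyte against a single marker (pinosylvin) using
# a relative correction factor established once with reference standards.
# All peak areas and concentrations below are hypothetical.

def relative_correction_factor(area_m, conc_m, area_k, conc_k):
    """f = (A_m / C_m) / (A_k / C_k), from reference-standard runs."""
    return (area_m / conc_m) / (area_k / conc_k)

def qams_content(area_k, f, marker_response):
    """Analyte content in a sample: C_k = A_k * f / (A_m / C_m)."""
    return area_k * f / marker_response

# Calibration step (reference standards of marker and analyte)
f_pinocembrin = relative_correction_factor(1200.0, 0.10, 900.0, 0.10)

# Routine sample: only the pinosylvin marker is calibrated externally
marker_response = 1180.0 / 0.098          # A_m / C_m (area per mg/mL)
content = qams_content(850.0, f_pinocembrin, marker_response)
```

Under this scheme the external standard method and QAMS agree whenever the relative response factors are stable across instruments, which is what the RSD < 3% comparison in the study checks.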

  12. Quantifying social norms: by coupling the ecosystem management concept and semi-quantitative sociological methods

    Science.gov (United States)

    Zhang, D.; Xu, H.

    2012-12-01

    Over recent decades, human-induced environmental changes have grown steadily and rapidly in intensity and impact, to the point where they now often exceed natural impacts. As an important component of human activity, social norms play a key role in environmental and natural resource management. But the lack of relevant quantitative data about social norms greatly limits our scientific understanding of the complex linkages between humans and nature, and hampers the solving of pressing environmental and social problems. In this study, we built a quantitative method by coupling the ecosystem management concept, semi-quantitative sociological methods, and mathematical statistics. The quantified value of a social norm comes from two parts: whether its content coincides with the concept of ecosystem management (content value), and how it performs once put into implementation (implementation value). First, we separately identified 12 core elements of ecosystem management and 16 indexes of social norms, and matched them one by one; the degree of matching gave the content value of each social norm. Second, we selected 8 key factors representing the performance of social norms after implementation, and obtained the implementation value by the Delphi method. Adding these two parts gave the final value of each social norm. Third, we conducted a case study in the Heihe River Basin, the second largest inland river basin in China, selecting 12 official edicts related to its ecosystem management. By doing so, we first obtained quantified data on social norms that can be directly applied to research involving observational or experimental data collection of natural processes. Second, each value is supported by specific contents, so it can assist in creating a clear road map for building or revising management and policy guidelines. For example, in this case study
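
The additive scoring described above can be sketched in a few lines; the match scores, Delphi ratings, and equal weighting below are illustrative assumptions, not values from the study:

```python
# Sketch of the two-part value of a social norm: a content value (degree of
# match with ecosystem-management elements) plus an implementation value
# (expert ratings of post-implementation performance). Illustrative numbers.

def content_value(match_scores):
    """Average matched degree of a norm's indexes against the core
    elements of ecosystem management (each score in [0, 1])."""
    return sum(match_scores) / len(match_scores)

def implementation_value(delphi_ratings):
    """Mean expert rating over the 8 key performance factors."""
    return sum(delphi_ratings) / len(delphi_ratings)

def norm_value(match_scores, delphi_ratings):
    """Final value = content value + implementation value."""
    return content_value(match_scores) + implementation_value(delphi_ratings)

matches = [1.0, 0.5, 1.0, 0.0, 1.0, 1.0]                # illustrative
ratings = [0.8, 0.6, 0.7, 0.9, 0.5, 0.7, 0.8, 0.6]      # 8 Delphi factors
value = norm_value(matches, ratings)
```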

  13. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-01-01

    With visual assessment by three-dimensional (3D) brain image analysis methods based on a stereotactic brain coordinate system, such as 3D stereotactic surface projections (3D-SSP) and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Using this quantitative local abnormality assessment, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). From twenty-five cases with DAT (mean age, 68.9 years), all diagnosed as probable Alzheimer's disease based on the NINCDS-ADRDA criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-SSP program was compared with data from 20 age-matched control subjects. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. For each segment we assessed the extent of the abnormal region (the proportion of coordinates within the segment whose Z-value exceeds the threshold) and its severity (the average Z-value of the coordinates exceeding the threshold). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution. (author)
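
The two segment-level indices are straightforward to compute once each coordinate in a segment has a Z-value relative to the control database. A minimal sketch (the Z-values and threshold below are illustrative):

```python
# Per-segment extent/severity indices as described above.
def extent_and_severity(z_values, threshold=2.0):
    """extent   = fraction of coordinates in the segment with Z > threshold
       severity = mean Z over those supra-threshold coordinates (0 if none)."""
    above = [z for z in z_values if z > threshold]
    extent = len(above) / len(z_values)
    severity = sum(above) / len(above) if above else 0.0
    return extent, severity

segment_z = [0.5, 2.5, 3.0, 1.0, 4.5]   # Z-scores vs. age-matched controls
extent, severity = extent_and_severity(segment_z)
```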

  14. Can’t Count or Won’t Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment

    Science.gov (United States)

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2015-01-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. By measuring student attitudes before and after the intervention, alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and more likely to appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through ‘doing’ quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a ‘magic bullet’ and that a wider programme of content and assessment diversification across the curriculum is preferable. PMID:27330225

  15. Rapid and simple method for quantitative evaluation of neurocytotoxic effects of radiation on developing medaka brain

    International Nuclear Information System (INIS)

    Yasuda, Takako; Maeda, Keiko; Matsumoto, Atsuko; Maruyama, Kouichi; Ishikawa, Yuji; Yoshimoto, Masami

    2008-01-01

    We describe a novel method for rapid and quantitative evaluation of the degree of radiation-induced apoptosis in the developing brain of medaka (Oryzias latipes). Embryos at stage 28 were irradiated with 1, 2, 3.5, or 5 Gy of x-rays. Living embryos were stained with a vital dye, acridine orange (AO), for 1-2 h, and whole-mount brains were examined under an epifluorescence microscope. From 7 to 10 h after irradiation with 5 Gy, we found two morphologically different types of AO-stained structures, namely, small single nuclei and rosette-shaped nuclear clusters. Electron microscopy revealed that these two distinct types of structures were single apoptotic cells with condensed nuclei and aggregates of apoptotic cells, respectively. From 10 to 30 h after irradiation, a similar AO-staining pattern was observed. The numbers of AO-stained rosette-shaped nuclear clusters and AO-stained single nuclei increased in a dose-dependent manner in the optic tectum. We used the number of AO-stained rosette-shaped nuclear clusters per optic tectum as an index of the degree of radiation-induced brain cell death at 20-24 h after irradiation. The number of rosette-shaped nuclear clusters per optic tectum in embryos exposed to 2 Gy or higher doses was significantly greater than in nonirradiated control embryos, whereas no difference was detected at 1 Gy. Thus, the threshold dose for brain cell death in medaka embryos was taken to be between 1 and 2 Gy, which may not be extraordinarily large compared to the thresholds for rodents and humans. These results show that medaka embryos are useful for quantitative evaluation of the developmental neurocytotoxic effects of radiation. (author)

  16. Quantitative SPECT reconstruction for brain distribution with a non-uniform attenuation using a regularizing method

    International Nuclear Information System (INIS)

    Soussaline, F.; Bidaut, L.; Raynaud, C.; Le Coq, G.

    1983-06-01

    An analytical solution to the SPECT reconstruction problem, in which the actual attenuation effect can be included, was developed using a regularizing iterative method (RIM). The potential of this approach in quantitative brain studies using a tracer for cerebrovascular disorders is now under evaluation. Mathematical simulations of a distributed activity in the brain surrounded by the skull, and physical phantom studies, were performed using a rotating-camera-based SPECT system, allowing calibration of the system and evaluation of the adapted method. In the simulation studies, for a uniform emission distribution with a mean of 100 counts per pixel and two attenuation coefficients of μ = 0.115 cm⁻¹ and 0.5 cm⁻¹, the contrast obtained along a profile was less than 5%, the standard deviation 8%, and the quantitative accuracy 13%. Clinical data obtained after injection of ¹²³I (AMPI) were reconstructed using the RIM in subjects with and without cerebrovascular diseases or lesion defects. Contour-finding techniques were used to delineate the brain and the skull, and measured attenuation coefficients were assumed within these two regions. Using volumes of interest selected on homogeneous regions of one hemisphere and mirrored symmetrically, the statistical uncertainty for 300 K events in the tomogram was found to be 12%, and the index of symmetry was 4% for a normal distribution. These results suggest that quantitative SPECT reconstruction of brain distributions is feasible, and that, combined with an adapted tracer and an adequate model, physiopathological parameters could be extracted.

  17. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
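
The generative model behind QuaSSE, a birth-death process whose speciation rate is a function of a diffusing trait, can be illustrated with a toy forward simulation. The rate function, diffusion parameters, and time discretisation below are arbitrary illustrative choices, not part of the likelihood method itself:

```python
import random

# Toy forward simulation of trait-dependent diversification in the spirit of
# QuaSSE: each lineage's trait evolves by diffusion, and its speciation rate
# is a function of the trait value. All parameters are illustrative.
random.seed(1)

def speciation_rate(x, base=0.1, slope=0.05):
    """Hypothetical monotonic trait-speciation relationship."""
    return max(0.0, base + slope * x)

def simulate(n_steps=200, dt=0.1, sigma=0.5, extinction=0.02):
    lineages = [0.0]                      # trait values of living lineages
    for _ in range(n_steps):
        next_gen = []
        for x in lineages:
            if random.random() < extinction * dt:
                continue                  # lineage goes extinct
            x += random.gauss(0.0, sigma) * dt ** 0.5   # trait diffusion
            next_gen.append(x)
            if random.random() < speciation_rate(x) * dt:
                next_gen.append(x)        # speciation: daughter inherits trait
        lineages = next_gen or [0.0]      # keep at least one lineage
    return lineages

tips = simulate()                          # trait values at the tips
```

QuaSSE works in the opposite direction: given the tip traits and the phylogeny, it evaluates the likelihood of candidate rate functions such as `speciation_rate` above.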

  18. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice, although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging approaches are presented. Planar imaging is limited to single photon emitters. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  19. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical, and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  20. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    Science.gov (United States)

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research methods and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  1. Handling large numbers of observation units in three-way methods for the analysis of qualitative and quantitative two-way data

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Marchetti, G.M.

    1994-01-01

    Recently, a number of methods have been proposed for the exploratory analysis of mixtures of qualitative and quantitative variables. In these methods for each variable an object by object similarity matrix is constructed, and these are consequently analyzed by means of three-way methods like

  2. Quantitative SPECT brain imaging: Effects of attenuation and detector response

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Bowsher, J.E.; Turkington, T.G.; Liang, Z.; Greer, K.L.; Coleman, R.E.

    1993-01-01

    Two physical factors that substantially degrade quantitative accuracy in SPECT imaging of the brain are attenuation and detector response. In addition to the physical factors, random noise in the reconstructed image can greatly affect the quantitative measurement. The purpose of this work was to implement two reconstruction methods that compensate for attenuation and detector response, a 3D maximum likelihood-EM method (ML) and a filtered backprojection method (FB) with Metz filter and Chang attenuation compensation, and compare the methods in terms of quantitative accuracy and image noise. The methods were tested on simulated data of the 3D Hoffman brain phantom. The simulation incorporated attenuation and distance-dependent detector response. Bias and standard deviation of reconstructed voxel intensities were measured in the gray and white matter regions. The results with ML showed that in both the gray and white matter regions as the number of iterations increased, bias decreased and standard deviation increased. Similar results were observed with FB as the Metz filter power increased. In both regions, ML had smaller standard deviation than FB for a given bias. Reconstruction times for the ML method have been greatly reduced through efficient coding, limited source support, and by computing attenuation factors only along rays perpendicular to the detector
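
The ML reconstruction in such comparisons is typically the multiplicative EM update: forward-project the current estimate, compare with the measured data, and correct by a back-projected ratio. A toy 1-D sketch (the system matrix and phantom are illustrative; the study's implementation is 3-D with attenuation and detector-response modelling):

```python
# Minimal ML-EM sketch for emission tomography: data = A @ activity, with A a
# known system matrix. Toy 2-pixel problem; values are illustrative.
def mlem(A, data, n_iter=200):
    n_pix = len(A[0])
    x = [1.0] * n_pix                     # uniform initial estimate
    for _ in range(n_iter):
        # forward-project the current estimate
        proj = [sum(A[i][j] * x[j] for j in range(n_pix))
                for i in range(len(A))]
        for j in range(n_pix):
            sens = sum(A[i][j] for i in range(len(A)))   # sensitivity
            ratio = sum(A[i][j] * data[i] / proj[i]
                        for i in range(len(A)) if proj[i] > 0)
            x[j] *= ratio / sens          # multiplicative EM update
    return x

A = [[0.9, 0.1], [0.2, 0.8]]              # toy system matrix
true_x = [4.0, 2.0]                       # toy phantom
data = [sum(A[i][j] * true_x[j] for j in range(2)) for i in range(2)]
est = mlem(A, data)                       # converges toward true_x
```

The bias/noise trade-off reported above corresponds to stopping this loop early (more bias, less noise) or running it longer (less bias, more noise).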

  3. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test using the rubric from the Association of American Colleges and Universities, and complex thinking in plant anatomy was tested according to Marzano. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students was better than that of biology students.

  4. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    Science.gov (United States)

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
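
For reference, the baseline the grouped method builds on is ordinary least-squares regression of the phenotype on the allele count at a marker. A minimal sketch of that baseline (genotype codes and phenotypes are made up; this version ignores censoring, which is exactly the limitation the grouped method addresses):

```python
# Simple regression-based QTL test: regress phenotype on the number of
# copies of a marker allele (0/1/2). Data below are illustrative.
def regress(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return my - slope * mx, slope         # intercept, additive QTL effect

genotype = [0, 1, 2, 1, 0, 2, 1, 2]       # allele counts at the marker
phenotype = [10.1, 12.0, 14.2, 11.8, 9.9, 14.1, 12.2, 13.8]
intercept, effect = regress(genotype, phenotype)
```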

  5. The Advantages and Disadvantages of Using Qualitative and Quantitative Approaches and Methods in Language "Testing and Assessment" Research: A Literature Review

    Science.gov (United States)

    Rahman, Md Shidur

    2017-01-01

    Researchers in various disciplines often use qualitative and quantitative research methods and approaches in their studies. Some of these researchers like to be known as qualitative researchers; others like to be regarded as quantitative researchers. The researchers are thus sharply polarised, and they engage in a competition of pointing…

  6. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices, and processes. Nowadays, its range of applications has been greatly extended to include the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, presenting new optical configurations and examining the numerical issues to be tackled in the processing and display of quantitative data.

  7. Semi-quantitative prediction of a multiple API solid dosage form with a combination of vibrational spectroscopy methods.

    Science.gov (United States)

    Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T

    2016-05-30

    Quality control (QC) in the pharmaceutical industry is a key activity in ensuring that medicines have the required quality, safety, and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products as well as of all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (APIs), bisoprolol and hydrochlorothiazide, in different commercially available dosages were analysed by Raman and NIR spectroscopy. The goal was to build multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and to predict the dosage (semi-quantitative). Furthermore, the combination of the spectroscopic techniques was investigated using two different PLS-based multiblock techniques: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results than Raman spectroscopy for both identification and quantitation. The multiblock techniques showed that each spectroscopy contains information not present or captured in the other, demonstrating a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness, and repeatability precision were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% for green beans and from 81.99% to 107.85% for peas. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% for green beans and from 80.77% to 100.91% for peas. According to these results, the method has proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.
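
Trueness in such validations is reported as percent recovery, i.e. the measured concentration as a fraction of the spiked concentration. A minimal sketch (the measured values below are hypothetical, not the study's raw data):

```python
# Recovery calculation used in residue-method validation. Numbers are
# illustrative only.
def percent_recovery(measured_mg_per_kg, spiked_mg_per_kg):
    return 100.0 * measured_mg_per_kg / spiked_mg_per_kg

def mean_recovery(measured, spiked):
    recs = [percent_recovery(m, s) for m, s in zip(measured, spiked)]
    return sum(recs) / len(recs)

spiked = [0.1, 0.5, 3.0]                  # the three spiking levels (mg/kg)
measured = [0.086, 0.44, 2.70]            # hypothetical determined values
avg = mean_recovery(measured, spiked)
```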

  9. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    Science.gov (United States)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.

  10. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    Science.gov (United States)

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  11. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    Science.gov (United States)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the composition of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions under which calibration curves are applicable to quantification of the composition of solid samples, and their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements must be satisfied for the calculated chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) and partial least squares (PLS) regression, which can extract composition-related information from all spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and in planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method, and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. To accelerate this process, it is recommended that accuracy be described using common figures of merit expressing the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
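
As an example of the recommended figure of merit, the NRMSE can be computed as the root mean square error normalised by the range of the reference values; normalisation by the mean or standard deviation also appears in the literature, and the values below are illustrative:

```python
import math

# NRMSE: RMSE normalised by the range of the reference values.
def nrmse(reference, predicted):
    rmse = math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted))
                     / len(reference))
    return rmse / (max(reference) - min(reference))

ref = [10.0, 20.0, 30.0, 40.0]            # e.g. wt% of an element in standards
pred = [11.0, 19.0, 31.0, 39.0]           # predictions from a LIBS model
score = nrmse(ref, pred)
```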

  12. Validation of HPLC method for the simultaneous and quantitative determination of 12 UV-filters in cosmetics.

    Science.gov (United States)

    Nyeborg, M; Pissavini, M; Lemasson, Y; Doucet, O

    2010-02-01

    The aim of the study was the validation of a high-performance liquid chromatography (HPLC) method for the simultaneous quantitative determination of twelve commonly used organic UV filters (phenylbenzimidazole sulfonic acid, benzophenone-3, isoamyl p-methoxycinnamate, diethylamino hydroxybenzoyl hexyl benzoate, octocrylene, ethylhexyl methoxycinnamate, ethylhexyl salicylate, butyl methoxydibenzoylmethane, diethylhexyl butamido triazone, ethylhexyl triazone, methylene bis-benzotriazolyl tetramethylbutylphenol, and bis-ethylhexyloxyphenol methoxyphenyl triazine) contained in suncare products. The separation and quantitative determination were performed in <30 min, using a Symmetry Shield® C18 (5 μm) column from Waters and a mobile phase (gradient mode) consisting of ethanol and acidified water. UV measurements were carried out at multiple wavelengths, according to the absorption of the analytes.

  13. [Study on ethnic medicine quantitative reference herb,Tibetan medicine fruits of Capsicum frutescens as a case].

    Science.gov (United States)

    Zan, Ke; Cui, Gan; Guo, Li-Nong; Ma, Shuang-Cheng; Zheng, Jian

    2018-05-01

    The high price and limited availability of reference substances have become obstacles to HPLC assays of ethnic medicines. A new method based on a quantitative reference herb (QRH) was proposed. Specific chromatograms of the fruits of Capsicum frutescens were employed to determine peak positions, and an HPLC quantitative reference herb was prepared from the fruits of C. frutescens. The contents of capsaicin and dihydrocapsaicin in the quantitative reference herb were determined by HPLC. Eleven batches of fruits of C. frutescens were analyzed with the quantitative reference herb and with reference substances, respectively. The results showed no difference. The present method is feasible for the quality control of ethnic medicines, and a quantitative reference herb is suitable to replace reference substances in assays. Copyright© by the Chinese Pharmaceutical Association.

  14. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost-effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN), started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging in clinical decision making in oncology through the development and validation of data acquisition and analysis methods and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of the QIN are addressing a wide variety of cancers (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  15. THE STUDY OF SOCIAL REPRESENTATIONS BY THE VIGNETTE METHOD: A QUANTITATIVE INTERPRETATION

    Directory of Open Access Journals (Sweden)

    Ж В Пузанова

    2017-12-01

    Full Text Available The article focuses on the prospects of vignettes as a new method in empirical sociology and a good alternative to conventional mass survey methods. The article consists of a few sections differing in focus. The vignette method is not popular among Russian scientists but has a long history abroad. A wide range of problems can be studied with this method (e.g. the prospects for guardianship and its evaluation, international students' adaptation to the educational system, social justice studies, marketing and business research, etc.). As one of the projective techniques, the vignette method can also be used for studying sensitive questions (e.g. HIV, drugs, psychological trauma, etc.). Projective techniques allow researchers to obtain more reliable information, because the respondent projects one situation onto another while still responding to his own stimulus. The article considers the advantages and disadvantages of the method, and the authors describe its limitations. The article presents the key principles for designing and developing vignettes depending on the research type, and the authors provide examples of their own vignettes tested in the course of their own empirical research. They highlight the advantages of logical-combinatorial approaches (especially the JSM method with its dichotomy) for the analysis of data in quantitative research. They also consider another method of data analysis that relies on a "stepping" technique, i.e. the respondent gets new information step by step, extending his previous knowledge.

  16. A new method for quantitative assessment of resilience engineering by PCA and NT approach: A case study in a process industry

    International Nuclear Information System (INIS)

    Shirali, Gh.A.; Mohammadfam, I.; Ebrahimipour, V.

    2013-01-01

    In recent years, resilience engineering (RE) has attracted widespread interest from industry as well as academia because it presents a new way of thinking about safety and accidents. Although the concept of RE has been defined in various areas, only a few studies focus specifically on how to measure RE; there is thus a gap in assessing resilience by quantitative methods. This research aimed at presenting a new method for the quantitative assessment of RE using a questionnaire and principal component analysis (PCA). Six resilience indicators (top management commitment, just culture, learning culture, awareness and opacity, preparedness, and flexibility) were chosen, and questionnaire data on these indicators were gathered in 11 units of a process industry. The data were analyzed with the PCA approach, which also yields scores for the resilience indicators and the process units; the process units were ranked using these scores. Consequently, the prescribed approach can identify the poor indicators and process units. This is the first study in the RE area that conducts a quantitative assessment through PCA. Implementation of the proposed method would enable managers to recognize the current weaknesses of, and challenges to, the resilience of their system. -- Highlights: •We quantitatively measure the potential of resilience. •The results are more tangible to understand and interpret. •The method facilitates comparison of resilience state among various process units. •The method facilitates comparison of units' resilience state with the best practice
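
The unit-scoring step can be illustrated with a standardized PCA computed via SVD. The indicator matrix below is invented (rows = process units, columns = the six resilience indicators); the real study derives its scores from questionnaire data:

```python
import numpy as np

def pca_scores(X):
    """Score units on the first principal component of standardized indicator data."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize each indicator
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)  # PCA via SVD
    pc1 = Z @ Vt[0]                                   # unit scores on PC1
    explained = s[0] ** 2 / (s ** 2).sum()            # variance share of PC1
    return pc1, explained

# Hypothetical questionnaire scores for 4 units on 6 indicators
X = np.array([[4, 5, 4, 3, 4, 5],
              [2, 2, 3, 2, 1, 2],
              [5, 5, 5, 4, 5, 4],
              [3, 3, 2, 3, 3, 3.0]])
scores, ratio = pca_scores(X)
ranking = np.argsort(-scores)  # order units by PC1 score
```

Note that the sign of a principal component is arbitrary, so the ranking may need to be flipped so that higher scores mean more resilient.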

  17. A novel quantitative analysis method of three-dimensional fluorescence spectra for vegetable oils contents in edible blend oil

    Science.gov (United States)

    Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei

    2015-04-01

    Edible blend oil is a mixture of vegetable oils. An eligible blend oil can meet the daily human need for the two essential fatty acids and thus provide balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in an edible blend oil determine its nutritional components. A high-precision quantitative analysis method to detect the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity and high efficiency; efficient extraction and full use of the information in three-dimensional fluorescence spectra improve the accuracy of the measurement. A novel quantitative analysis is proposed based on Quasi-Monte-Carlo integration to improve the measurement sensitivity and reduce random error. The partial least squares method is used to solve the nonlinear equations and avoid the effect of multicollinearity. The recovery rates of a blend oil mixed from peanut oil, soybean oil and sunflower oil are calculated to verify the accuracy of the method; they are improved compared with the linear method commonly used for component concentration measurement.
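
The paper combines Quasi-Monte-Carlo integration of the 3-D spectra with partial least squares; as a deliberately simplified stand-in, the unmixing step can be sketched as ordinary least squares on synthetic, noise-free feature vectors (all spectra and fractions below are invented):

```python
import numpy as np

# Columns: fluorescence feature vectors of three pure oils (hypothetical values)
S = np.array([[1.0, 0.2, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.4, 1.0],
              [0.5, 0.3, 0.2]])
true_frac = np.array([0.5, 0.3, 0.2])  # mass fractions of the three oils
blend = S @ true_frac                  # simulated blend spectrum (noise-free)

# Least-squares unmixing of the blend spectrum into component fractions
frac, *_ = np.linalg.lstsq(S, blend, rcond=None)
recovery = frac / true_frac * 100.0    # recovery rate, %
```

With real spectra the system is ill-conditioned (multicollinearity), which is why the authors use PLS instead of plain least squares.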

  18. Quantitative Image Restoration in Bright Field Optical Microscopy.

    Science.gov (United States)

    Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús

    2017-11-07

    Bright field (BF) optical microscopy is regarded as a poor method to observe unstained biological samples due to intrinsically low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantifying the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
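
QRBF rests on deconvolution with a theoretically modeled point spread function. As a loose illustration of the underlying idea (not the authors' implementation), a 1-D Wiener deconvolution with a known PSF looks like this; the signal, PSF and regularization constant are all invented:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-4):
    """Frequency-domain Wiener deconvolution with regularization constant k."""
    n = len(blurred)
    H = np.fft.fft(psf, n)
    G = np.fft.fft(blurred)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)  # regularized inverse filter
    return np.real(np.fft.ifft(F))

# Simulate a sharp object circularly blurred by a known PSF
x = np.zeros(64)
x[30:34] = 1.0
psf = np.zeros(64)
psf[:3] = [0.5, 0.25, 0.25]  # normalized blur kernel
blurred = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(psf)))
restored = wiener_deconvolve(blurred, psf)
```

The regularization constant k trades noise amplification against sharpness; real microscope images need a 2-D PSF and noise-aware tuning.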

  19. Frozen history : limitations and possibilities of quantitative diffusion studies

    NARCIS (Netherlands)

    Mom, G.P.A.; Albert de la Bruhèze, A.A.; Oldenziel, R.

    2009-01-01

    In this contribution I have tried to show how quantitative methods can generate new questions and thus support historical research. It can be concluded that reducing historical complexity by forcing reality into the shape of a diffusion curve does not seem to be the preferable strategy for

  20. Quantitative analysis of charge trapping and classification of sub-gap states in MoS2 TFT by pulse I-V method

    Science.gov (United States)

    Park, Junghak; Hur, Ji-Hyun; Jeon, Sanghun

    2018-04-01

    The threshold voltage instabilities and huge hysteresis of MoS2 thin film transistors (TFTs) have raised concerns about their practical applicability in next-generation switching devices. These behaviors are associated with charge trapping, which stems from tunneling to adjacent trap sites, interfacial redox reactions, and interface and/or bulk trap states. In this report, we present a quantitative analysis of the electron charge trapping mechanism of MoS2 TFTs by the fast pulse I-V method and space charge limited current (SCLC) measurements. By adopting the fast pulse I-V method, we were able to obtain the effective mobility. In addition, the origin of the trap states was identified by disassembling the sub-gap states into interface and bulk trap states by a simple extraction analysis. These measurement methods and analyses enable not only quantitative extraction of various traps but also an understanding of the charge transport mechanism in MoS2 TFTs. The fast I-V and SCLC data obtained at various measurement temperatures and ambients show that electron transport to neighboring trap sites by tunneling is the main charge trapping mechanism in thin-MoS2 TFTs. This implies that interfacial traps account for most of the total sub-gap states while the bulk trap contribution is negligible, at approximately 0.40% and 0.26% in air and vacuum, respectively. Thus, control of the interface trap states is crucial to further improve the performance of devices with thin channels.

  1. Statistical equivalent of the classical TDT for quantitative traits and ...

    Indian Academy of Sciences (India)

    sion model to test the association for quantitative traits based on a trio design. We show that the method ... from the analyses and only one transmission is considered. Keywords. .... use the sample mean or median of Y, as an estimator of c in.

  2. [Comparison of two quantitative methods of endobronchial ultrasound real-time elastography for evaluating intrathoracic lymph nodes].

    Science.gov (United States)

    Mao, X W; Yang, J Y; Zheng, X X; Wang, L; Zhu, L; Li, Y; Xiong, H K; Sun, J Y

    2017-06-12

    Objective: To compare the clinical value of two quantitative methods of analyzing endobronchial ultrasound real-time elastography (EBUS-RTE) images for evaluating intrathoracic lymph nodes. Methods: From January 2014 to April 2014, EBUS-RTE examination was performed in patients who received EBUS-TBNA examination in Shanghai Chest Hospital. For each intrathoracic lymph node a representative EBUS-RTE image was selected, and the stiff area ratio and mean hue value of the region of interest (ROI) in each image were calculated. The final diagnosis of each lymph node was based on the pathologic/microbiologic results of EBUS-TBNA, pathologic/microbiologic results of other examinations, and clinical follow-up. Sensitivity, specificity, positive predictive value, negative predictive value and accuracy for distinguishing malignant from benign lesions were evaluated. Results: Fifty-six patients and 68 lymph nodes were enrolled in this study; 35 lymph nodes were malignant and 33 were benign. The stiff area ratio and mean hue value of benign vs. malignant lesions were 0.32±0.29 vs. 0.62±0.20 and 109.99±28.13 vs. 141.62±17.52, respectively, and the differences were statistically significant for both methods (t = -5.14). Conclusion: Both methods can be used to analyze EBUS-RTE images quantitatively and have value in differentiating benign from malignant intrathoracic lymph nodes; of the two, the stiff area ratio performs better than the mean hue value.
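
The two image metrics can be sketched directly from the ROI hue values. The threshold, scale and numbers below are hypothetical; the paper's exact definitions may differ:

```python
import numpy as np

def stiffness_metrics(hue_roi, stiff_threshold=170):
    """Compute stiff area ratio and mean hue over an elastography ROI.

    Stiff area ratio: fraction of ROI pixels at or above the 'stiff' hue
    threshold (hues on a 0-255 scale, threshold chosen for illustration).
    Mean hue: average hue value over the ROI.
    """
    hue = np.asarray(hue_roi, dtype=float)
    stiff_ratio = float(np.mean(hue >= stiff_threshold))
    return stiff_ratio, float(hue.mean())

# Toy 2x2 ROI of hue values
roi = np.array([[100, 180],
                [200, 120]])
ratio, mean_hue = stiffness_metrics(roi)
```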

  3. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    Science.gov (United States)

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for the determination of calcium in cereal using the two-increment standard-addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy, giving them hands-on experience using quantitative methods of…

  4. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    Science.gov (United States)

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). An additional fluorescent staining of all proteins immediately after their transfer to the blot membrane makes it possible to visualise simultaneously the antibody binding and the total protein profile, allowing an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting reproduces a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest applications that go far beyond.
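
The load correction amounts to dividing each lane's band signal by its total-protein stain relative to a reference lane. A sketch (function name and numbers hypothetical):

```python
def normalized_signal(band_intensity, total_protein_stain, reference_total):
    """Correct a target-band fluorescence signal for protein load.

    Divides the band intensity by the lane's total-protein stain expressed
    relative to a reference lane, so unequal loading cancels out.
    """
    load_factor = total_protein_stain / reference_total
    return band_intensity / load_factor

# Lane 2 was loaded with 20% more total protein than the reference lane,
# so its 20% higher band signal normalizes back to the same value.
ref = normalized_signal(1000.0, 5000.0, 5000.0)
corr = normalized_signal(1200.0, 6000.0, 5000.0)
```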

  5. Validation of quantitative {sup 1}H NMR method for the analysis of pharmaceutical formulations; Validacao de metodo quantitativo por RMN de {sup 1}H para analises de formulacoes farmaceuticas

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maiara da S. [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Instituto de Quimica; Colnago, Luiz Alberto, E-mail: luiz.colnago@embrapa.br [Embrapa Instrumentacao, Sao Carlos, SP (Brazil)

    2013-09-01

    The need for effective and reliable quality control of products from the pharmaceutical industry renders the analysis of their active ingredients and constituents of great importance. This study presents the theoretical basis of {sup 1}H NMR for quantitative analyses and an example of method validation according to Resolution RE N. 899 of the Brazilian National Health Surveillance Agency (ANVISA), in which paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information on the active components and excipients in a sample. (author)
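
Quantitative {sup 1}H NMR rests on the proportionality between integral area and the number of contributing nuclei. With an internal standard, the analyte mass follows from the standard relation below; the internal standard, molar masses and integrals in the example are illustrative, not those of the cited validation:

```python
def qnmr_mass(I_a, N_a, M_a, I_std, N_std, M_std, m_std, purity_std=1.0):
    """Classic qNMR relation: analyte mass from the integral ratio vs an
    internal standard.

    I = integral area, N = number of nuclei producing the signal,
    M = molar mass, m_std = weighed mass of the internal standard.
    """
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * m_std * purity_std

# Hypothetical: equimolar analyte (M = 151.16, 3H signal) and standard
# (M = 116.07, 2H signal), 10.0 mg of standard weighed in
m = qnmr_mass(I_a=3.0, N_a=3, M_a=151.16,
              I_std=2.0, N_std=2, M_std=116.07, m_std=10.0)
```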

  6. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  7. A quantitative method for zoning of protected areas and its spatial ecological implications.

    Science.gov (United States)

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. The main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
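
The Bos-style model minimizes a distance-weighted, non-reciprocal incompatibility between land uses via simulated annealing. A toy sketch of that scheme (problem size, weight matrix and cooling schedule are all invented for illustration; the real model is in FORTRAN and also supports a connectivity constraint):

```python
import math
import random

# Toy setting: 6 land units along a transect, two uses (0 = core, 1 = buffer).
# Non-reciprocal incompatibility W[a][b]: cost of use b located near use a.
W = [[0.0, 1.0],
     [3.0, 0.0]]

def cost(assign):
    """Total incompatibility, weighted by a quadratic function of distance."""
    c = 0.0
    n = len(assign)
    for i in range(n):
        for j in range(n):
            if i != j:
                c += W[assign[i]][assign[j]] / (i - j) ** 2
    return c

def anneal(n=6, steps=2000, t0=1.0, seed=1):
    """Simulated-annealing search for a low-conflict zoning."""
    rng = random.Random(seed)
    assign = [rng.randrange(2) for _ in range(n)]
    best, best_cost = assign[:], cost(assign)
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-6   # linear cooling schedule
        cand = assign[:]
        cand[rng.randrange(n)] ^= 1         # flip one unit's land use
        delta = cost(cand) - cost(assign)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            assign = cand
            if cost(assign) < best_cost:
                best, best_cost = assign[:], cost(assign)
    return best, best_cost

best, best_cost = anneal()
```

Because distance enters quadratically, incompatible uses placed side by side cost far more than the same uses placed apart, which is what pushes the search toward compact, contiguous zones.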

  8. A method to prioritize quantitative traits and individuals for sequencing in family-based studies.

    Directory of Open Access Journals (Sweden)

    Kaanan P Shah

    Full Text Available Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing, since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size and, to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants.

  9. The quantitative evaluation of 201Tl myocardial scintigraphy using reinjection method

    International Nuclear Information System (INIS)

    Naruse, Hitoshi; Itano, Midoriko; Yamamoto, Juro; Morita, Masato; Fukutake, Naoshige; Kawamoto, Hideo; Ohyanagi, Mitsumasa; Iwasaki, Tadaaki; Fukuchi, Minoru

    1993-01-01

    This study was designed to determine whether Tl-201 myocardial scintigraphy using the reinjection method would improve the rate of redistribution (RD) and, if so, whether the extent or the severity of ischemia shown by the bull's-eye view method contributes to the RD improvement. In 17 patients with ischemic heart disease, exercise Tl-201 myocardial images were acquired at 10 min (early images) and 180 min (delayed images) after intravenous injection of 74 MBq of TlCl. In addition, 37 MBq of TlCl was injected again after delayed imaging and images were acquired once more (RI images). Among the 17 patients, 7 were judged as RD(+), 8 as RD(-), and 2 as undefined. In the 8 RD(-) patients and 2 undefined patients, RD became (+) on RI images. Visual changes in extent and severity of ischemia from early to delayed images were 68±42 for RD(+) cases vs. 3±20 for RD(-) cases and 0.4±0.8 for RD(+) cases vs. 0.1±0.3 for RD(-) cases, respectively. The corresponding figures from delayed to RI images for extent and score of ischemia were 50±46 for RD(+) cases vs. 13±22 for RD(-) cases and 0.4±30.3 for RD(+) cases vs. 0.1±0.5 for RD(-) cases, respectively. For the 5 patients undergoing coronary revascularization, extent was improved in all cases, but severity was improved in only some cases. In conclusion, when RD became (+) on RI images, myocardial viability seemed to have been underestimated. Quantitative evaluation revealed that the RD improvement from early to delayed images depended on extent, and that the improvement from delayed to RI images depended on both extent and severity. In the postoperative improvement of RD, the extent of ischemia was mainly involved. RI imaging was found to compensate for the underestimation of RD, and quantitative evaluation was also useful for observing subtle changes in ischemia. (N.K.)

  10. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    Science.gov (United States)

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects; accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize unknown information aspects (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, together with a clear definition of efficacy and effectiveness (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be regarded if researchers aim to select or combine those approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Determination of γ-rays emitting radionuclides in surface water: application of a quantitative biosensing method

    International Nuclear Information System (INIS)

    Wolterbeek, H. Th.; Van der Meer, A. J. G. M.

    1995-01-01

    A quantitative biosensing method has been developed for the determination of γ-rays emitting radionuclides in surface water. The method is based on the concept that at equilibrium the specific radioactivity in the biosensor is equal to the specific radioactivity in water. The method consists of the measurement of both the radionuclide and the related stable isotope (element) in the biosensor and the determination of the element in water. This three-way analysis eliminates problems such as unpredictable biosensor behaviour, effects of water elemental composition or further abiotic parameters on accumulation levels: what remains is the generally high enrichment (bioaccumulation factor BCF) of elements and radionuclides in the biosensor material. Using water plants, the method is shown to be three to five orders of magnitude more sensitive than the direct analysis of water. (author)
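
The three-way analysis reduces to one relation: at equilibrium, the specific activity (Bq per gram of element) is the same in biosensor and water, so the water activity concentration follows from three measurements. A sketch with invented numbers:

```python
def water_activity(radionuclide_bq, element_in_sensor_g, element_in_water_g_per_l):
    """Water activity concentration from the biosensor equilibrium relation.

    Specific activity measured in the biosensor (Bq per g of the stable
    element) is assumed equal to the specific activity in the water, so
    multiplying by the element concentration in water gives Bq per litre.
    """
    specific_activity = radionuclide_bq / element_in_sensor_g  # Bq per g element
    return specific_activity * element_in_water_g_per_l        # Bq per litre water

# Hypothetical: 500 Bq and 10 mg of the element in the plant sample,
# 2 micrograms of the element per litre of water
a = water_activity(500.0, 0.010, 2.0e-6)
```

The sensitivity gain quoted in the abstract comes from the bioaccumulation factor: the plant concentrates the element far above water levels, so its activity is measurable where the water's is not.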

  12. Determination of {gamma}-rays emitting radionuclides in surface water: application of a quantitative biosensing method

    Energy Technology Data Exchange (ETDEWEB)

    Wolterbeek, H Th; Van der Meer, A. J. G. M. [Delft University of Technology, Interfaculty Reactor Institute, Mekelweg 15, 2629 JB Delft (Netherlands)

    1995-12-01

    A quantitative biosensing method has been developed for the determination of {gamma}-rays emitting radionuclides in surface water. The method is based on the concept that at equilibrium the specific radioactivity in the biosensor is equal to the specific radioactivity in water. The method consists of the measurement of both the radionuclide and the related stable isotope (element) in the biosensor and the determination of the element in water. This three-way analysis eliminates problems such as unpredictable biosensor behaviour, effects of water elemental composition or further abiotic parameters on accumulation levels: what remains is the generally high enrichment (bioaccumulation factor BCF) of elements and radionuclides in the biosensor material. Using water plants, the method is shown to be three to five orders of magnitude more sensitive than the direct analysis of water. (author)

  13. Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.

    Science.gov (United States)

    Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R

    2014-08-01

    To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive means of assessing ventilation- and perfusion-related information in the lungs, and its perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
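
Agreement statistics like those reported (mean difference, spread of differences, Bland-Altman plot) come from a Bland-Altman analysis of paired measurements. A minimal sketch with invented PBF values:

```python
import numpy as np

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between two methods."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

pbf_fd      = [150, 160, 140, 155, 145]  # hypothetical PBF-FD, mL/min/100 mL
pbf_seepage = [151, 158, 143, 150, 149]  # hypothetical PBF-SEEPAGE
bias, lo, hi = bland_altman(pbf_fd, pbf_seepage)
```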

  14. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    Science.gov (United States)

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively.

  15. Quantitative and regional evaluation methods for lung scintigraphs

    International Nuclear Information System (INIS)

    Fichter, J.

    1982-01-01

    New criteria were presented for the evaluation of perfusion lung scintigraphs, both with regard to quantitative evaluation and with regard to the choice of regions. In addition to the usual method of dividing each lung into upper, middle and lower levels and determining each region's percentage share of the total activity, the following values were established: the median of the activity distribution, and the differences in percentage counting rate and in median between corresponding regions of the right and left lung. The individual regions should describe the functional structures (lobe and segment structure). A corresponding computer program projects lobe and segment regions, in simplified form, onto the scintigraph, taking individual lung expansion into account. In a clinical study of 60 patients and 18 control persons with 99mTc-MAA and 133Xe-gas lung scintigraphs, the following results were obtained: depending on the combination of the 32 parameters available for evaluation and on the choice of regions, between 4 and 20 of the 60 patients were falsely classified as negative, and 1 to 2 of the 18 controls as falsely positive. The accuracy of the Tc scintigraph proved to be better. Altogether, using the best possible parameter combinations, comparable results were attained. (TRV) [de

  16. Quantitative analysis of active compounds in pharmaceutical preparations by use of attenuated total-reflection Fourier transform mid-infrared spectrophotometry and the internal standard method.

    Science.gov (United States)

    Sastre Toraño, J; van Hattum, S H

    2001-10-01

    A new method is presented for the quantitative analysis of active compounds in pharmaceutical preparations using Fourier transform (FT) mid-infrared (MIR) spectroscopy with an attenuated total reflection (ATR) module. Reducing the number of overlapping absorption bands, by interaction of the compound of interest with an appropriate solvent, together with the use of an internal standard (IS), makes MIR suitable for quantitative analysis. Vigabatrin, the active compound in vigabatrin 100-mg capsules, was used as a model compound for the development of the method. Vigabatrin was extracted from the capsule content with water after addition of a sodium thiosulfate IS solution. The extract was concentrated by volume reduction and applied to the FTMIR-ATR module. Concentrations of unknown samples were calculated from the ratio of the vigabatrin band area (1321-1610 cm(-1)) to the IS band area (883-1215 cm(-1)) using a calibration standard. This ratio was linear with concentration in the range of interest (90-110 mg, in twofold; n=2). The accuracy of the method in this range was 99.7-100.5% (n=5) with a variability of 0.4-1.3% (n=5). Comparison of the presented method with an HPLC assay showed similar results; the analysis of five vigabatrin 100-mg capsules resulted in a mean concentration of 102 mg with a variation of 2% with both methods.
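
With a single calibration standard, the internal-standard quantitation reduces to comparing band-area ratios. A sketch (function name and areas hypothetical):

```python
def concentration_by_is(area_analyte, area_is, cal_area_analyte, cal_area_is, cal_conc):
    """Single-point internal-standard calibration on band-area ratios.

    The analyte/IS area ratio of the unknown is compared with that of a
    calibration standard of known content; the IS corrects for variations
    in sample preparation and measurement.
    """
    ratio_sample = area_analyte / area_is
    ratio_cal = cal_area_analyte / cal_area_is
    return ratio_sample / ratio_cal * cal_conc

# Hypothetical: sample ratio 5% above the 100-mg calibration standard
c = concentration_by_is(420.0, 200.0, 400.0, 200.0, 100.0)  # mg per capsule
```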

  17. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    International Nuclear Information System (INIS)

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D.

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3-day feeding period were compared between cyclophosphamide exposures that produce lethality and non-lethal exposures. Extensive characterization of the lethality induced by cyclophosphamide exposures demonstrates that it is most likely due to starvation, not chemical toxicity
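
Consumption follows directly from the tracer counts: the volume eaten equals the fly's 14C activity (above background) divided by the labelled diet's activity per unit volume. A sketch with invented values (the original protocol's units and corrections may differ):

```python
def food_consumed_ul(fly_dpm, food_dpm_per_ul, background_dpm=0.0):
    """Food volume ingested, from 14C-leucine counts.

    fly_dpm: radioactivity measured in the fly after the feeding period.
    food_dpm_per_ul: radioactivity per microlitre of the labelled diet.
    background_dpm: counts from an unfed control fly.
    """
    return (fly_dpm - background_dpm) / food_dpm_per_ul

v = food_consumed_ul(fly_dpm=2200.0, food_dpm_per_ul=1000.0, background_dpm=200.0)
```

Chemical consumption then follows by multiplying the ingested volume by the test chemical's concentration in the diet.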

  18. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    International Nuclear Information System (INIS)

    Ryan, C.G.; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-01-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
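
The second-pass idea, each pixel treated as an admixture of end-member terms, can be sketched as a weighted sum of per-end-member DA projections. This is a toy illustration of that linear-mixing step only, not the GeoPIXE implementation; the matrices, weights, and channel counts are invented:

```python
# Illustrative sketch: each pixel's element concentrations are formed as a
# weighted sum of the results of applying each end-member's DA matrix to
# the pixel's spectrum (weights come from the first-pass phase decomposition).

def matvec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def second_pass_pixel(spectrum, da_matrices, weights):
    """Mix per-end-member DA projections with the pixel's phase weights."""
    n_elements = len(da_matrices[0])
    out = [0.0] * n_elements
    for da, w in zip(da_matrices, weights):
        proj = matvec(da, spectrum)
        out = [o + w * p for o, p in zip(out, proj)]
    return out

# Two hypothetical end-members, two elements, three spectral channels.
da_a = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
da_b = [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5]]
print(second_pass_pixel([10.0, 4.0, 2.0], [da_a, da_b], [0.75, 0.25]))  # -> [9.25, 3.75]
```

Because each per-pixel step is still a small matrix-vector product, this preserves the speed advantage of DA that the abstract emphasizes.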

  19. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  20. Qualitative and Quantitative Features Evaluation of Two Methods of Sugarcane Harvesting (with the aim of Energy and Sugar Production)

    Directory of Open Access Journals (Sweden)

    K Andekaeizadeh

    2018-03-01

    Introduction: Sugarcane is an important crop cultivated around the world for the production of sugar and energy; for this purpose, evaluation of the Sugarcane (SC) and Energycane (EC) harvesting methods is necessary. Energy is vital for economic and social development and demand for it is rising; the search for alternatives to fossil fuels has turned the international community toward liquid fuels derived from agricultural resources. According to calculations, about 47% of renewable energy in Brazil comes from sugarcane, making the country the second largest source of renewable energy. Sugarcane in Brazil provides about 17.5% of primary energy sources; materials derived from it, bagasse and ethanol, provide 4.2% and 11.2% of consumed energy, respectively. In developing countries, use of this crop is increasing in order to achieve self-sufficiency in the production of starch and sugar and thus independence in bioethanol production. Evaluation of energy consumption in production systems shows how yield can be converted into an amount of energy. Many products of sugarcane can yield bioenergy; materials such as cellulosic ethanol, biofuels and other chemicals are obtained from it. Hence, Energycane has been introduced as a new method of sugarcane harvesting. One problem of this method, however, is the high cost and high energy consumption of the harvester: harvesting accounts for 38.4 percent of total production costs in the Energycane method, compared with 5.32 percent in the Sugarcane method. In a study by Matanker et al. (2014) entitled "Power requirements and field performance in harvesting EC and SC", the power requirements of several components of a sugarcane harvester and its field capacity were examined for the Sugarcane and Energycane methods. The power consumed by the basecutter, elevator and chopper was measured in terms of Mega grams

  1. A method for the quantitative metallographic analysis of nuclear fuels (Programme QMA)

    International Nuclear Information System (INIS)

    Moreno, A.; Sari, C.

    1978-01-01

    A method is described for the quantitative analysis of features such as voids, cracks, phases, inclusions and grains distributed on random plane sections of fuel materials. An electronic image analyzer (Quantimet) attached to an MM6 Leitz microscope was used to measure the size, area, perimeter and shape of features dispersed in a matrix. The apparatus is driven by a computer which calculates the size, area and perimeter distributions, form factors and orientation of the features, as well as the inclusion content of the matrix expressed in weight per cent. A computer programme, QMA, performs the spatial correction of the measured two-dimensional sections and delivers the true distribution of feature sizes in a three-dimensional system
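
The per-feature quantities named above (area, perimeter, shape) feed standard planar statistics before any 2D-to-3D correction. A minimal sketch of such planar measures (illustrative only; the formulas are the textbook area fraction, equivalent-circle diameter, and circularity form factor 4*pi*A/P^2, not the QMA code):

```python
# Minimal sketch of planar feature statistics of the kind QMA reports:
# area fraction of the frame, mean equivalent-circle diameter, and a
# circularity form factor 4*pi*A/P^2 per feature (1.0 for a circle).
import math

def feature_stats(areas, perimeters, frame_area):
    area_fraction = sum(areas) / frame_area
    eq_diameters = [2.0 * math.sqrt(a / math.pi) for a in areas]
    form_factors = [4.0 * math.pi * a / p ** 2 for a, p in zip(areas, perimeters)]
    return area_fraction, eq_diameters, form_factors

# Two hypothetical circular pores of radius 1 and 2 in a 100-unit^2 frame.
areas = [math.pi * r * r for r in (1.0, 2.0)]
perims = [2.0 * math.pi * r for r in (1.0, 2.0)]
af, d, ff = feature_stats(areas, perims, frame_area=100.0)
print(round(af, 4), [round(x, 2) for x in d], [round(x, 2) for x in ff])
```

The spatial (stereological) correction to true 3D size distributions that QMA performs, e.g. a Saltykov-type unfolding, would operate on the histogram of these equivalent diameters.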

  2. Quantitative assessment of cerebral blood flow employing the Rutland method using N-isopropyl-(123I)-p-iodoamphetamine

    International Nuclear Information System (INIS)

    Aoki, Shigeru; Kitano, Tokio.

    1992-01-01

    The purpose of this study is to evaluate the clinical usefulness of quantitative assessment of cerebral blood flow by venous sampling, in place of arterial sampling, using N-isopropyl-(123I)-p-iodoamphetamine (IMP) and single-photon emission computed tomography. The method followed Ono's report employing the Rutland method. The mean value of total cerebral blood flow by arterial sampling was 388.0±79.4 (standard deviation) ml/min, and that by venous sampling was 448.5±110.3 ml/min. In regions of the temporal lobe, including the basal ganglia, with no abnormal neurological, electroencephalographic, or other imaging findings, the value was 44.0±3.8 ml/100g/min using arterial sampling and 49.7±6.0 ml/100g/min using venous sampling (n=22, mean age 57.6). The values in cases with poor peripheral circulation varied a great deal between the arterial and venous samplings. There was a good correlation between arterial and venous sampling in 76 patients without poor peripheral circulation. In conclusion, this method is suitable for noninvasive quantitative assessment of cerebral blood flow in patients without poor peripheral circulation. (author)
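
Rutland-style graphical analyses generally regress tissue uptake normalized by blood activity against the normalized time-integral of blood activity, and read the flow-related constant off the slope. The sketch below shows only this generic Patlak-Rutland fitting step with synthetic data; the cited work's exact formulation and inputs may differ:

```python
# Hedged sketch of a Patlak-Rutland style graphical analysis: brain uptake
# over blood activity is regressed on normalized integrated blood activity;
# the slope estimates a flow-related uptake constant. Data are synthetic.

def patlak_slope(t, brain, blood):
    # Cumulative trapezoidal integral of blood activity.
    integ, cum = [0.0], 0.0
    for i in range(1, len(t)):
        cum += 0.5 * (blood[i] + blood[i - 1]) * (t[i] - t[i - 1])
        integ.append(cum)
    x = [gi / bi for gi, bi in zip(integ, blood)]
    y = [br / bi for br, bi in zip(brain, blood)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

# Synthetic curves: constant blood activity and a true uptake slope of 0.5.
t = [0.0, 1.0, 2.0, 3.0, 4.0]
blood = [1.0] * 5
brain = [0.5 * ti for ti in t]
print(round(patlak_slope(t, brain, blood), 3))  # -> 0.5
```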

  3. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. Quantitative measurement with the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. A movement model of mapping tracers for central nervous receptors is discussed. Quantitative analysis using a steady-state movement model, measurement of dopamine receptors by the reference method, measurement of D2 receptors using 11C-Raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  4. Quantitative in situ magnetization reversal studies in Lorentz microscopy and electron holography

    International Nuclear Information System (INIS)

    Rodríguez, L.A.; Magén, C.; Snoeck, E.; Gatel, C.; Marín, L.; Serrano-Ramón, L.

    2013-01-01

    A generalized procedure for the in situ application of magnetic fields by means of the excitation of the objective lens for magnetic imaging experiments in Lorentz microscopy and electron holography is quantitatively described. A protocol for applying magnetic fields with arbitrary in-plane magnitude and orientation is presented, and a freeware script for Digital Micrograph™ is provided to assist the operation of the microscope. Moreover, a method to accurately reconstruct hysteresis loops is detailed. We show that the out-of-plane component of the magnetic field cannot always be neglected when performing quantitative measurements of the local magnetization. Several examples are shown to demonstrate the accuracy and functionality of the methods. - Highlights: • Generalized procedure for application of magnetic fields with the TEM objective lens. • Arbitrary in-plane magnetic field magnitude and orientation can be applied. • Method to accurately reconstruct hysteresis loops by electron holography. • Out-of-plane field component should be considered in quantitative measurements. • Examples to illustrate the method in Lorentz microscopy and electron holography
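
The geometric core of the technique is simple: tilting the specimen by an angle alpha in the objective-lens field B0 (directed along the optic axis) produces an in-plane component B0·sin(alpha) along the tilt direction, while B0·cos(alpha) remains out of plane, which is exactly the component the abstract warns cannot always be neglected. A sketch of that decomposition (assumed geometry, not the paper's script):

```python
# Illustrative geometry only: a specimen tilted by `tilt_deg` in the
# objective-lens field b0_mT experiences an in-plane component
# b0*sin(alpha) and an out-of-plane component b0*cos(alpha).
import math

def field_components(b0_mT, tilt_deg):
    a = math.radians(tilt_deg)
    return b0_mT * math.sin(a), b0_mT * math.cos(a)  # (in-plane, out-of-plane)

inp, outp = field_components(200.0, 30.0)
print(round(inp, 1), round(outp, 1))  # -> 100.0 173.2
```

Note that even at a large 30° tilt the residual out-of-plane field here is still larger than the in-plane component, illustrating why quantitative loop reconstruction must account for it.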

  5. Selection of Suitable DNA Extraction Methods for Genetically Modified Maize 3272, and Development and Evaluation of an Event-Specific Quantitative PCR Method for 3272.

    Science.gov (United States)

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr). The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for detection and quantification of 3272.
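
In event-specific real-time PCR methods of this kind, the conversion factor is commonly applied as: GMO amount (%) = (event copy number / endogenous-gene copy number) / Cf × 100. A sketch of that calculation (the usage pattern is assumed from common practice in this field; the copy numbers below are hypothetical, while Cf = 0.60 is the paper's ABI 7900 value):

```python
# Sketch of how a conversion factor (Cf) is typically used in event-specific
# real-time PCR quantitation. Copy numbers are hypothetical.

def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO (%) = (event copies / endogenous copies) / Cf * 100."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical copy numbers with Cf = 0.60 (ABI 7900).
print(round(gmo_percent(30.0, 10000.0, 0.60), 2))  # -> 0.5
```

The example lands exactly at the 0.5% level the authors estimate as the limit of quantitation.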

  6. Quantitative analysis of calcined fertilizers by X-ray diffraction patterns

    International Nuclear Information System (INIS)

    Cekinski, E.

    1987-01-01

    An X-ray diffraction pattern method for the quantitative analysis of phosphate fertilizers obtained by calcination of a mixture of Anitapolis phosphate concentrate and sodium carbonate is described. The method consists in plotting a calibration curve, using spinel (MgAl2O4) as internal standard, for the phases formed by calcination and synthesized in the laboratory. Tests conducted to assess the method's accuracy showed good correlation between the obtained data and the real values. (author) [pt

  7. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    Science.gov (United States)

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four-dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman's rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85) intraprocedural perfusion reductions. CONCLUSIONS The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520
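
Spearman's rank correlation, the statistic used above to relate SACE ratings to perfusion reductions, is just Pearson correlation on ranks; for untied data it has the closed form 1 - 6*Σd²/(n(n²-1)). A self-contained sketch with made-up data (no tie handling, unlike a production implementation such as `scipy.stats.spearmanr`):

```python
# Minimal Spearman rank correlation (no tie handling), illustrating the
# statistic used to compare SACE ratings with perfusion reductions.
# The data below are invented for demonstration.

def rankdata(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    ranks = [0] * len(v)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n ** 2 - 1))

sace = [1, 2, 3, 4, 5]                              # hypothetical ratings
perfusion_drop = [10.0, 18.0, 25.0, 41.0, 60.0]     # hypothetical % drops
print(spearman(sace, perfusion_drop))  # -> 1.0 (perfectly monotone)
```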

  8. Application of multivariable analysis methods to the quantitative detection of gas by tin dioxide micro-sensors; Application des methodes d'analyse multivariables a la detection quantitative de gaz par microcapteurs a base de dioxyde d'etain

    Energy Technology Data Exchange (ETDEWEB)

    Perdreau, N.

    2000-01-17

    The electric conductivity of tin dioxide depends on the temperature of the material and on the nature and environment of the surrounding gas. This work shows that treatment of the electric conductance signals of a single sensor by multivariable analysis methods allows determination of the concentrations of binary or ternary mixtures of ethanol (0-80 ppm), carbon monoxide (0-300 ppm) and methane (0-1000 ppm). Part of this study consisted of the design and implementation of an automatic test bench to acquire the electric conductance of four sensors under thermal and gaseous cycles. It also revealed some disturbing effects (humidity, ...) on the measurement. Two sensor fabrication techniques were used to obtain conductances (as a function of temperature) distinct for each gas, reproducible across sensors and sufficiently stable over time to allow exploitation of the signals by multivariable analysis methods (tin dioxide in the form of thin layers obtained by reactive evaporation or of sintered powder bars). In the last part, it was shown that quantitative determination of gases by chemometric methods is possible although the relation between the electric conductances on the one hand and the temperatures and concentrations on the other is nonlinear. Moreover, modelling with the 'Partial Least Square' method together with a pretreatment yields performance comparable to that obtained with neural networks. (O.M.)
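
The thesis uses PLS; the toy below only illustrates the underlying idea of multivariable calibration, inverting a set of sensor responses into gas concentrations, with an ordinary least-squares solve of a linear two-feature, two-gas model. The linearity, the response matrix, and the concentrations are all invented (the real SnO2 response is nonlinear, which is precisely why PLS with pretreatment, or a neural network, is needed):

```python
# Toy multivariable calibration (NOT PLS): two conductance features of one
# sensor at two temperatures, assumed linear in two gas concentrations,
# inverted by Cramer's rule. All values are invented for illustration.

def solve2(a, b):
    """Solve the 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    x0 = (b[0] * a[1][1] - a[0][1] * b[1]) / det
    x1 = (a[0][0] * b[1] - b[0] * a[1][0]) / det
    return [x0, x1]

# Assumed linear response model: conductances g = M @ [CO_ppm, CH4_ppm].
M = [[0.02, 0.001],      # feature 1 sensitivity to (CO, CH4)
     [0.005, 0.004]]     # feature 2 sensitivity to (CO, CH4)
g = [0.7, 0.55]          # measured conductances (arbitrary units)
co, ch4 = solve2(M, g)
print(round(co, 1), round(ch4, 1))  # -> 30.0 100.0
```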

  9. Quantitative prediction process and evaluation method for seafloor polymetallic sulfide resources

    Directory of Open Access Journals (Sweden)

    Mengyi Ren

    2016-03-01

    Seafloor polymetallic sulfide resources exhibit significant development potential. In 2011, China received the exploration rights for 10,000 km2 of a polymetallic sulfides area in the Southwest Indian Ocean; China will be permitted to retain only 25% of the area in 2021. However, exploration of seafloor hydrothermal sulfide deposits in China remains in its initial stage. Based on quantitative prediction theory and the exploration status of seafloor sulfides, this paper systematically proposes a quantitative prediction and evaluation process for oceanic polymetallic sulfide resources and divides it into three stages: prediction over a large area, prediction in the prospecting region, and the verification and evaluation of targets. The first two stages of the prediction process have been applied to seafloor sulfide prospecting in the Chinese contract area. The results of stage one suggest that the Chinese contract area is located in a high posterior probability area, which indicates good prospecting potential in the Indian Ocean. In stage two, the part of the Chinese contract area at 48°–52°E has the highest posterior probability value and can be selected as the reserved region for additional exploration. In stage three, numerical simulation is employed to reproduce the ore-forming process of the sulfides in order to verify the accuracy of the reserved targets obtained from the first two prediction stages. By narrowing the exploration area and gradually improving the exploration accuracy, the prediction will provide a basis for the exploration and exploitation of seafloor polymetallic sulfide resources.

  10. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions, so their throughput cannot meet the demands of high-throughput applications such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, showing that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could reach levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The amplification results, more stable than those of qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All four of these amplicons could be regarded as targets of the second-step PCR.
For the second-step amplification, three parallels have been taken for

  11. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    Directory of Open Access Journals (Sweden)

    Ahmed R

    2014-03-01

    Rafay Ahmed,1 Matthew J Oborski,2 Misun Hwang,1 Frank S Lieberman,3 James M Mountz1 (1Department of Radiology, 2Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA; 3Department of Neurology and Department of Medicine, Division of Hematology/Oncology, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA). Abstract: Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies

  12. Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure

    Directory of Open Access Journals (Sweden)

    Jaeyong Yee

    2015-01-01

    A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method of conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because any functional form, such as Gaussian, is not assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait.
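
Nonparametric entropy estimation from a continuous sample can be done with order-statistic spacings: sort the sample and average the log of the scaled gaps between observations m positions apart. The sketch below implements a simple spacings estimator in that spirit; the authors' exact m-spacing estimator and its boundary handling may differ:

```python
# A bare-bones m-spacing (Vasicek-style) entropy estimate for a continuous
# sample: H ~ mean of log((n/m) * (x_(i+m) - x_(i))) over the sorted sample.
# Illustrative only; boundary/bias corrections are omitted.
import math
import random

def m_spacing_entropy(sample, m=1):
    x = sorted(sample)
    n = len(x)
    return sum(math.log((n / m) * (x[i + m] - x[i]))
               for i in range(n - m)) / (n - m)

random.seed(0)
uniform = [random.random() for _ in range(2000)]
# Differential entropy of Uniform(0,1) is 0; the estimate should be near 0
# (with a small negative bias for finite m).
print(round(m_spacing_entropy(uniform, m=5), 2))
```

An information gain for a genotype would then be the marginal entropy of the phenotype minus the weighted average of such conditional entropies within each genotype class.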

  13. An attempt to use immunohistochemical methods for semi-quantitative determination of surfactant in bronchial secretion after hyperbaric exposures

    Directory of Open Access Journals (Sweden)

    Piotr Siermontowski

    2015-12-01

    Background: The most significant index of pulmonary oxygen toxicity is a decrease in vital capacity (VC) dependent on the duration of exposure and the partial pressure of oxygen. The only method to measure this decrease is spirometry performed directly after exposure. Objective: The aim of the study was to check whether the extent of lung damage could be assessed by quantitative determination of pulmonary surfactant in bronchial secretion. Design: Sputum samples were collected before, during and after hyperbaric air or oxygen exposures; histological preparations were prepared and stained immunohistochemically to visualize surfactant. Among the 781 samples collected, only 209 contained sputum and only 126 were included in the study. In this group, only 64 preparations could be paired for comparison. Results: The semi-quantitative method used and the statistical findings did not demonstrate any significance. Conclusions: The method suggested for assessing the extent of lung damage was found unsuitable for practical use due to difficulties in obtaining proper material; moreover, the study findings do not allow conclusions to be drawn concerning its effectiveness.

  14. System Establishment and Method Application for Quantitatively Evaluating the Green Degree of the Products in Green Public Procurement

    Directory of Open Access Journals (Sweden)

    Shengguo Xu

    2016-09-01

    Government green purchase is widely considered to be an effective means of promoting sustainable consumption. However, identifying the greener product is the biggest obstacle to government green purchase, and it has not been well solved. A quantitative evaluation method is provided to measure the green degree of different products with the same use function, with an indicator system established that includes fundamental indicators, general indicators, and leading indicators. It can clearly show a product's green extent by rating the scores of different products, which provides the government a tool to compare the green degree of different products and select the greener ones. A comprehensive evaluation case of a project purchasing 1635 desktop computers in the Tianjin government procurement center was conducted using the green degree evaluation system. The environmental performance of the products was assessed quantitatively, and the evaluation price was the bid price minus a discount (the discount rate determined by the total score attained for environmental performance); the final evaluation prices, ranked from low to high, were those of suppliers C, D, E, A, and B. The winner, supplier C, had neither the lowest bid price nor the best environmental performance, but performed well on both, and so won the project. This shows that the green degree evaluation system can help rank different products by evaluating their environmental performance (including structure and connection technology, selection of materials and marks, prolonged use, hazardous substances, energy consumption, recyclability rate, etc.) together with price, and thus helps in choosing greener products.
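
The "evaluation price = bid price minus score-dependent discount" rule described above can be sketched in a few lines. The scoring-to-discount mapping, the maximum discount rate, and all bids below are invented for illustration; the paper's actual indicator weighting is more elaborate:

```python
# Illustrative sketch of score-discounted bid evaluation (scheme assumed,
# not the paper's exact rules): each product's environmental score maps to
# a discount rate; the lowest resulting evaluation price wins.

def evaluation_price(bid_price, env_score, max_discount=0.10):
    """Discount proportional to a 0-100 environmental score."""
    return bid_price * (1.0 - max_discount * env_score / 100.0)

# Hypothetical suppliers: (bid price, environmental score).
bids = {"A": (5000, 60), "B": (5200, 90), "C": (4900, 80)}
ranked = sorted(bids, key=lambda k: evaluation_price(*bids[k]))
print(ranked[0])  # -> C
```

Like the real case, the winner here has neither the lowest bid nor the highest score, but the best combination of the two.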

  15. Quantitative analysis of patients with celiac disease by video capsule endoscopy: A deep learning method.

    Science.gov (United States)

    Zhou, Teng; Han, Guoqiang; Li, Bing Nan; Lin, Zhizhe; Ciaccio, Edward J; Green, Peter H; Qin, Jing

    2017-06-01

    Celiac disease is one of the most common diseases in the world. Capsule endoscopy is an alternative way to visualize the entire small intestine without invasiveness to the patient. It is useful for characterizing celiac disease, but hours are needed to manually analyze the retrospective data of a single patient. Computer-aided quantitative analysis by a deep learning method helps in alleviating the workload during analysis of the retrospective videos. Capsule endoscopy clips from 6 celiac disease patients and 5 controls were preprocessed for training. Frames with a large field of opaque extraluminal fluid or air bubbles were removed automatically by a pre-selection algorithm. The frames were then cropped and intensity-corrected prior to frame rotation in the proposed new method. GoogLeNet was trained with these frames. Then, capsule endoscopy clips from 5 additional celiac disease patients and 5 additional control patients were used for testing. The trained GoogLeNet was able to distinguish the frames of capsule endoscopy clips of celiac disease patients from those of controls. A quantitative measurement with evaluation of confidence was developed to assess the severity level of pathology in the subjects. Relying on the evaluation confidence, GoogLeNet achieved 100% sensitivity and specificity for the testing set. A t-test confirmed that the evaluation confidence significantly distinguishes celiac disease patients from controls. Furthermore, the evaluation confidence may also relate to the severity level of small bowel mucosal lesions. A deep convolutional neural network was thus established for quantitative measurement of the existence and degree of pathology throughout the small intestine, which may improve computer-aided clinical techniques for assessing mucosal atrophy and other etiologies in real time with videocapsule endoscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
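
The final statistical step, a t-test on per-subject evaluation confidences, can be sketched independently of the network itself. The confidences below are invented, and a Welch (unequal-variance) statistic is one reasonable choice; the paper does not specify which t-test variant was used:

```python
# Sketch of the statistical step only (data invented): a two-sample Welch
# t statistic comparing per-subject evaluation confidences of celiac
# patients vs controls. The GoogLeNet model itself is out of scope here.
from statistics import mean, variance

def welch_t(a, b):
    va, vb = variance(a), variance(b)   # sample variances
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

patients = [0.91, 0.88, 0.95, 0.90, 0.93]   # hypothetical confidences
controls = [0.12, 0.20, 0.15, 0.10, 0.18]
print(round(welch_t(patients, controls), 1))
```

A large t statistic (here well above any conventional critical value for 8 degrees of freedom) corresponds to the significant separation the authors report.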

  16. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    Science.gov (United States)

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  17. A quantitative approach to evolution of music and philosophy

    International Nuclear Information System (INIS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira Jr, Osvaldo N; Costa, Luciano da Fontoura

    2012-01-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master–apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic. (paper)
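
The bootstrap step described above, generating hundreds of artificial composers and philosophers influenced by the 7 originals, amounts to resampling the original attribute vectors with replacement and perturbing them. A minimal sketch (the noise model and all numbers are assumptions, not the authors' procedure):

```python
# Minimal bootstrap of the kind described (illustrative): resample the
# original attribute vectors with replacement to generate many "artificial"
# individuals, adding small Gaussian noise so samples are not exact copies.
import random

def bootstrap_samples(originals, n_new, noise=0.05, seed=42):
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        base = rng.choice(originals)
        out.append([v + rng.gauss(0.0, noise) for v in base])
    return out

# Toy 2-feature attribute vectors standing in for the 8-feature originals.
originals = [[0.2, 0.9], [0.8, 0.1], [0.5, 0.5]]
fake = bootstrap_samples(originals, n_new=200)
print(len(fake), len(fake[0]))  # -> 200 2
```

Indices such as opposition or counter-dialectics would then be computed on this enlarged sample to reduce small-sample bias.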

  18. Optimized slice-selective 1H NMR experiments combined with highly accurate quantitative 13C NMR using an internal reference method

    Science.gov (United States)

    Jézéquel, Tangi; Silvestre, Virginie; Dinis, Katy; Giraudeau, Patrick; Akoka, Serge

    2018-04-01

    Isotope ratio monitoring by 13C NMR spectrometry (irm-13C NMR) provides the complete 13C intramolecular position-specific composition at natural abundance. It represents a powerful tool to track the (bio)chemical pathway that has led to the synthesis of targeted molecules, since it allows Position-specific Isotope Analysis (PSIA). Due to the very small composition range of 13C natural abundance values (50‰), i.e. the range over which the isotopic composition of a given nucleus varies, irm-13C NMR requires a 1‰ accuracy and thus highly quantitative analysis by 13C NMR. Until now, the conventional strategy to determine the position-specific abundance x_i has relied on the combination of irm-MS (isotopic ratio monitoring Mass Spectrometry) and quantitative 13C NMR. However, this approach presents a serious drawback since it relies on two different techniques and requires separately measuring the signals of all the carbons of the analyzed compound, which is not always possible. To circumvent this constraint, we recently proposed a new methodology to perform 13C isotopic analysis using an internal reference method and relying on NMR only. The method combines a highly quantitative 1H NMR pulse sequence (named DWET) with a 13C isotopic NMR measurement. However, the recently published DWET sequence is unsuited for samples with short T1, which forms a serious limitation for irm-13C NMR experiments where a relaxing agent is added. In this context, we suggest two variants of DWET, called Multi-WET and Profiled-WET, developed and optimized to reach the same accuracy of 1‰ with a better immunity towards T1 variations. Their performance is evaluated on the determination of the 13C isotopic profile of vanillin. Both pulse sequences show a 1‰ accuracy with an increased robustness to pulse miscalibrations compared to the initial DWET method. This constitutes a major advance in the context of irm-13C NMR since it is now possible to perform isotopic analysis with high
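At its core, the internal-reference idea reduces to simple arithmetic on quantitative peak areas. The sketch below shows only that arithmetic; the function name, the referencing scheme, and the assumption that one position's abundance is known are illustrative, not the published DWET/Multi-WET protocol.

```python
def position_abundance(area_i, area_ref, n_i=1, n_ref=1, x_ref=0.0107):
    """Estimate the 13C abundance x_i at carbon position i from
    quantitative 13C peak areas, referenced internally to a position
    of known abundance x_ref (0.0107 is roughly the mean natural 13C
    abundance). n_i and n_ref are the numbers of equivalent carbons
    contributing to each peak. Illustrative arithmetic only."""
    return x_ref * (area_i / n_i) / (area_ref / n_ref)

# A peak 2% more intense than the reference, per carbon, implies a
# position-specific abundance about 2% above the reference value.
x_i = position_abundance(1.02, 1.0)
```

Since the whole measurement stays within one 13C spectrum, the 1‰ accuracy requirement falls entirely on the quantitative NMR step, which is the point of the DWET-type sequences.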

  19. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  20. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  1. A new method of quantitative cavitation assessment in the field of a lithotripter.

    Science.gov (United States)

    Jöchle, K; Debus, J; Lorenz, W J; Huber, P

    1996-01-01

    Transient cavitation seems to be a very important effect regarding the interaction of pulsed high-energy ultrasound with biologic tissues. Using a newly developed laser optical system we are able to determine the life-span of transient cavities (relative error less than ±5%) in the focal region of a lithotripter (Lithostar, Siemens). The laser scattering method is based on the detection of scattered laser light reflected during a bubble's life. The method requires no sensor material in the pathway of the sound field and thus avoids any interference with bubble dynamics during the measurement. Knowledge of the time of bubble decay allows conclusions to be reached about the destructive power of the cavities. By combining the results of life-span measurements with the maximum bubble radius obtained from stroboscopic photographs, we found that the measured time of bubble decay and the time predicted by Rayleigh's law differ by only about 13%, even in the case of complex bubble fields. This shows that the laser scattering method can be used to assess cavitation events quantitatively. Moreover, it will enable us to compare different medical ultrasound sources that are capable of generating cavitation.
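The comparison with Rayleigh's law rests on the classical collapse-time formula t_c = 0.915 R_max sqrt(rho / delta_p), which a reader can evaluate directly. The sketch below uses assumed water properties at roughly atmospheric driving pressure, not the paper's measured values.

```python
import math

def rayleigh_collapse_time(r_max_m, density_kg_m3=998.0, delta_p_pa=1.013e5):
    """Rayleigh collapse time of an empty spherical cavity:
    t_c = 0.915 * R_max * sqrt(rho / delta_p).
    Default values assume water at ~1 atm driving pressure."""
    return 0.915 * r_max_m * math.sqrt(density_kg_m3 / delta_p_pa)

# A bubble with R_max = 1 mm in water collapses in roughly 90 microseconds.
t_c = rayleigh_collapse_time(1e-3)
```

A measured decay time within about 13% of this prediction, as reported above, indicates near-Rayleigh collapse behavior even in complex bubble fields.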

  2. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
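A minimal numeric sketch of the sub-model idea: train regressions on overlapping, limited concentration ranges, then blend their predictions using a full-range model's first guess as the blending weight. Everything here is hypothetical — one synthetic "spectral" channel, a piecewise response mimicking matrix effects, an arbitrary blending window — and ordinary least squares stands in for the PLS used by ChemCam.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-channel "spectra": the signal's slope changes with
# concentration, mimicking the matrix effects that defeat a single model.
def signal(c):
    return np.where(c < 50, 0.8 * c, 40 + 1.6 * (c - 50)) + rng.normal(0, 0.5, c.shape)

def fit(x, y):
    # Ordinary least squares y ~ a*x + b (stand-in for PLS).
    A = np.vstack([x, np.ones_like(x)]).T
    return np.linalg.lstsq(A, y, rcond=None)[0]

def predict(coef, x):
    return coef[0] * x + coef[1]

c_train = rng.uniform(0, 100, 300)
x_train = signal(c_train)

full = fit(x_train, c_train)                              # full-range model
low = fit(x_train[c_train < 60], c_train[c_train < 60])   # sub-models trained
high = fit(x_train[c_train > 40], c_train[c_train > 40])  # on overlapping ranges

def blended(x):
    first = predict(full, x)               # full-range first guess
    w = np.clip((first - 40) / 20, 0, 1)   # blend weight across the overlap
    return (1 - w) * predict(low, x) + w * predict(high, x)

c_test = rng.uniform(0, 100, 300)
x_test = signal(c_test)
rmsep_full = np.sqrt(np.mean((predict(full, x_test) - c_test) ** 2))
rmsep_blend = np.sqrt(np.mean((blended(x_test) - c_test) ** 2))
```

On this synthetic data the blended sub-models reduce RMSEP relative to the single full-range model, which is the improvement pattern the abstract reports for the real calibration.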

  3. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    Science.gov (United States)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study, consistent with the existing literature, are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  4. VerSi. A method for the quantitative comparison of repository systems

    Energy Technology Data Exchange (ETDEWEB)

    Kaempfer, T.U.; Ruebel, A.; Resele, G. [AF-Consult Switzerland Ltd, Baden (Switzerland); Moenig, J. [GRS Braunschweig (Germany)

    2015-07-01

    Decision making and design processes for radioactive waste repositories are guided by safety goals that need to be achieved. In this context, the comparison of different disposal concepts can provide relevant support to better understand the performance of the repository systems. Such a task requires a method for a traceable comparison that is as objective as possible. We present a versatile method that allows for the comparison of different disposal concepts in potentially different host rocks. The condition for the method to work is that the repository systems are defined to a comparable level, including designed repository structures, disposal concepts, and engineered and geological barriers, which are all based on site-specific safety requirements. The method is primarily based on quantitative analyses and probabilistic model calculations regarding the long-term safety of the repository systems under consideration. The crucial evaluation criteria for the comparison are statistical key figures of indicators that characterize the radiotoxicity flux out of the so-called containment-providing rock zone (einschlusswirksamer Gebirgsbereich). The key figures account for existing uncertainties with respect to the actual site properties, the safety relevant processes, and the potential future impact of external processes on the repository system, i.e., they include scenario, process, and parameter uncertainties. The method (1) leads to an evaluation of the retention and containment capacity of the repository systems and its robustness with respect to existing uncertainties as well as to potential external influences; (2) specifies the procedures for the system analyses and the calculation of the statistical key figures as well as for the comparative interpretation of the key figures; and (3) also gives recommendations and sets benchmarks for the comparative assessment of the repository systems under consideration based on the key figures and additional qualitative

  5. Efficiency of peracetic acid in inactivating bacteria, viruses, and spores in water determined with ATP bioluminescence, quantitative PCR, and culture-based methods.

    Science.gov (United States)

    Park, Eunyoung; Lee, Cheonghoon; Bisesi, Michael; Lee, Jiyoung

    2014-03-01

    The disinfection efficiency of peracetic acid (PAA) was investigated on three microbial types using three different methods (filtration-based ATP (adenosine triphosphate) bioluminescence, quantitative polymerase chain reaction (qPCR), and a culture-based method). Fecal indicator bacteria (Enterococcus faecium), a virus indicator (male-specific (F(+)) coliphages (coliphages)), and a protozoa disinfection surrogate (Bacillus subtilis spores (spores)) were tested. The mode of action for spore disinfection was visualized using scanning electron microscopy. The results indicated that PAA concentrations of 5 ppm (contact time: 5 min), 50 ppm (10 min), and 3,000 ppm (5 min) were needed to achieve a 3-log reduction of E. faecium, coliphages, and spores, respectively. Scanning electron microscopy observation showed that PAA targets the external layers of spores. The lower reduction rates of tested microbes measured with qPCR suggest that qPCR may overestimate the surviving microbes. Collectively, PAA showed broad disinfection efficiency (susceptibility: E. faecium > coliphages > spores). For E. faecium and spores, ATP bioluminescence was substantially faster (∼5 min) than the culture-based method (>24 h) and qPCR (2-3 h). This study suggests PAA as an effective alternative to inactivate broad types of microbial contaminants in water. Together with the use of rapid detection methods, this approach can be useful for urgent situations when a timely response is needed for ensuring water quality.
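The log-reduction metric used throughout the abstract (a 3-log reduction corresponds to 99.9% inactivation) is a one-line computation:

```python
import math

def log10_reduction(n_initial, n_final):
    """Log10 reduction of viable counts; 3-log means 99.9% inactivation.
    Counts are in the same units, e.g. CFU/mL or PFU/mL."""
    return math.log10(n_initial / n_final)

# A drop from 1e6 to 1e3 CFU/mL is a 3-log reduction (illustrative counts,
# not the study's data).
reduction = log10_reduction(1e6, 1e3)
```

The same arithmetic applies whichever quantification method (culture, qPCR, ATP bioluminescence) supplies the before/after counts, which is why method-dependent counts (e.g. qPCR overestimating survivors) shift the apparent reduction.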

  6. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    Science.gov (United States)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full field flow patterns applicable in various geometries in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV), and intrusive systems such as Hot Wire/Film Anemometry. The method makes obtainable the flow patterns, and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. The method uses a computer based digitizing system attached through an imaging board to a low luminosity camera. A customized optical train allows the system to become a long distance microscope (LDM), allowing magnifications of areas of interest ranging up to 100 times. Presented in addition to the method itself, are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.

  7. Quantitative Determination of Pole Figures with a Texture Goniometer by the Reflection Method

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, Manfred

    1962-03-15

    For different slit systems of a modern texture goniometer (type Siemens), the X-ray intensity reflected from textureless plane samples has been measured as a function of the tilt angle φ and Bragg angle θ. The intensity curves obtained generally enable quantitative and almost complete pole figure determinations to be made with only one reflection recording, even for materials with high line density. Investigations on rolled uranium sheet with CuKα radiation showed that for reliable chart records up to φ ≈ 70° on reflections with an angular separation of only Δ(2θ) = 0.7°, the vertical receiving slit must be limited to at least 1 mm when using a horizontal main slit of 0.5 mm. Though in this case the intensity drop-off resulting from defocusing from the flat sample surface is considerable even at small tilt angles, a correction of intensity is possible even at large angles within an accuracy of ±5%. Moreover, different pole figures for one material can be compared quantitatively, without constant slit settings and recording conditions being necessary, if the intensity values of the contour lines are always referred to the background radiation.

  8. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of); Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Yoon-Sik, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Ho-Young, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to their high sensitivity and multiplexing capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has the disadvantage of extremely long measurement times for on-chip immunoassays, which adopt a large chip area of approximately 1 mm scale and a confocal beam spot of ca. 1 μm size. Alternative methods, such as a sampled spot scan with high confocality and a large-area scan with enlarged field of view and low confocality, have been utilized to minimize the measurement time in practice. In this study, we analyzed the two methods with respect to signal-to-noise ratio and sampling-led signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we proposed a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopted a raster scan covering a full 100 μm × 100 μm area as a proof-of-concept experiment while accumulating signals in the CCD detector for a single spectrum per frame. A single 10 s scan over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot scanning measurements, with none of the signal fluctuations attributed to the sampled spot scan. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.

  9. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    Science.gov (United States)

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of the kinetic parameters, such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured by using our method exhibited a better linear correlation with magnetothermal heating than those obtained by using a vibrating sample magnetometer and a magnetic balance. This finding indicates that this method may be more suitable for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields than the commonly used methods. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
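The force balance behind the method, F = m_total · ∇B acting on the droplet, directly relates the measured acceleration to the total moment. The sketch below shows that relation in isolation; drag and buoyancy are neglected, and the numbers are illustrative, not the paper's (the published analysis uses the full kinetic parameters: displacement, velocity, and acceleration).

```python
def total_moment(droplet_mass_kg, acceleration_m_s2, field_gradient_T_per_m):
    """Total magnetic moment (A*m^2) of a droplet from its measured
    acceleration in a gradient field, via F = m_total * dB/dz = mass * a.
    Drag and buoyancy are neglected in this simplified sketch."""
    return droplet_mass_kg * acceleration_m_s2 / field_gradient_T_per_m

# Illustrative numbers: a 1 mg droplet accelerating at 0.05 m/s^2
# in a 10 T/m gradient carries a total moment of 5e-9 A*m^2.
m_total = total_moment(1e-6, 0.05, 10.0)
```

Dividing the total moment by the nanoparticle mass in the droplet would then give the mass magnetization that is compared against VSM and magnetic-balance values.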

  10. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry.

    Science.gov (United States)

    Anguera, M Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J

    2017-01-01

    Mixed methods studies are increasingly being applied in a diversity of fields. In this paper, we discuss the growing use, and enormous potential, of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings.

  11. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2017-12-01

    Full Text Available Mixed methods studies are increasingly being applied in a diversity of fields. In this paper, we discuss the growing use—and enormous potential—of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings.

  12. SAFER, an Analysis Method of Quantitative Proteomic Data, Reveals New Interactors of the C. elegans Autophagic Protein LGG-1.

    Science.gov (United States)

    Yi, Zhou; Manil-Ségalen, Marion; Sago, Laila; Glatigny, Annie; Redeker, Virginie; Legouis, Renaud; Mucchielli-Giorgi, Marie-Hélène

    2016-05-06

    Affinity purifications followed by mass spectrometric analysis are used to identify protein-protein interactions. Because quantitative proteomic data are noisy, it is necessary to develop statistical methods to eliminate false positives and identify true partners. We present here a novel approach for filtering false interactors, named "SAFER" for mass Spectrometry data Analysis by Filtering of Experimental Replicates, which is based on the reproducibility of the replicates and the fold-change of the protein intensities between bait and control. To identify regulators or targets of autophagy, we characterized the interactors of LGG-1, a ubiquitin-like protein involved in autophagosome formation in C. elegans. LGG-1 partners were purified by affinity, analyzed by nanoLC-MS/MS, and quantified by a label-free proteomic approach based on the mass spectrometric signal intensity of peptide precursor ions. Because the selection of confident interactions depends on the method used for statistical analysis, we compared SAFER with several statistical tests and different scoring algorithms on this set of data. We show that SAFER recovers high-confidence interactors that have been ignored by the other methods and identifies new candidates involved in the autophagy process. We further validated our method on a public data set and conclude that SAFER notably improves the identification of protein interactors.
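The two SAFER criteria named above, replicate reproducibility and bait-versus-control fold-change, can be illustrated with a toy filter. This is a simplified stand-in with hypothetical thresholds, not the published SAFER scoring.

```python
import numpy as np

def replicate_filter(bait, control, min_fold=2.0, max_cv=0.5):
    """Keep proteins whose bait/control mean-intensity fold-change is high
    AND whose bait replicates are reproducible (low coefficient of
    variation). Thresholds are hypothetical placeholders."""
    bait = np.asarray(bait, dtype=float)
    control = np.asarray(control, dtype=float)
    fold = bait.mean(axis=1) / control.mean(axis=1)        # enrichment
    cv = bait.std(axis=1, ddof=1) / bait.mean(axis=1)      # reproducibility
    return (fold >= min_fold) & (cv <= max_cv)

# Rows = proteins, columns = replicate intensities (made-up numbers).
bait = [[100, 110, 90],   # enriched and reproducible -> keep
        [10, 200, 5]]     # enriched on average but irreproducible -> reject
control = [[10, 12, 11],
           [10, 9, 11]]
keep = replicate_filter(bait, control)
print(keep)  # [ True False]
```

The second protein shows how a fold-change criterion alone would pass a noisy candidate, while the reproducibility criterion rejects it, which is the combination SAFER exploits.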

  13. Use of quantitative SPECT/CT reconstruction in 99mTc-sestamibi imaging of patients with renal masses.

    Science.gov (United States)

    Jones, Krystyna M; Solnes, Lilja B; Rowe, Steven P; Gorin, Michael A; Sheikhbahaei, Sara; Fung, George; Frey, Eric C; Allaf, Mohamad E; Du, Yong; Javadi, Mehrbod S

    2018-02-01

    Technetium-99m (99mTc)-sestamibi single-photon emission computed tomography/computed tomography (SPECT/CT) has previously been shown to allow for the accurate differentiation of benign renal oncocytomas and hybrid oncocytic/chromophobe tumors (HOCTs) from other malignant renal tumor histologies, with oncocytomas/HOCTs showing high uptake and renal cell carcinoma (RCC) showing low uptake based on uptake ratios from non-quantitative single-photon emission computed tomography (SPECT) reconstructions. However, in this study, several tumors fell close to the uptake ratio cutoff, likely due to limitations in conventional SPECT/CT reconstruction methods. We hypothesized that application of quantitative SPECT/CT (QSPECT) reconstruction methods developed by our group would provide more robust separation of hot and cold lesions, serving as an imaging framework on which quantitative biomarkers can be validated for evaluation of renal masses with 99mTc-sestamibi. Single-photon emission computed tomography data were reconstructed using the clinical Flash 3D reconstruction and QSPECT methods. Two blinded readers then characterized each tumor as hot or cold. Semi-quantitative uptake ratios were calculated by dividing lesion activity by background renal activity for both Flash 3D and QSPECT reconstructions. The difference between median (mean) hot and cold tumor uptake ratios measured 0.655 (0.73) with the QSPECT method and 0.624 (0.67) with the conventional method, resulting in increased separation between hot and cold tumors. Sub-analysis of 7 lesions near the separation point showed a higher absolute difference (0.16) between QSPECT and Flash 3D mean uptake ratios compared to the remaining lesions. Our finding of improved separation between uptake ratios of hot and cold lesions using QSPECT reconstruction lays the foundation for additional quantitative SPECT techniques such as SPECT-UV in the setting of renal 99mTc-sestamibi and other SPECT/CT exams. With robust
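The semi-quantitative ratio described above is simple to compute. The cutoff in this sketch is a hypothetical placeholder, since the study's point is precisely that QSPECT widens the margin around whatever cutoff is chosen.

```python
def uptake_ratio(lesion_activity, background_renal_activity):
    """Semi-quantitative uptake ratio as described in the study:
    lesion activity divided by background renal activity."""
    return lesion_activity / background_renal_activity

def classify(ratio, cutoff=0.6):
    """Hypothetical cutoff: hot (oncocytoma/HOCT-like) vs cold (RCC-like)."""
    return "hot" if ratio >= cutoff else "cold"

print(classify(uptake_ratio(90.0, 100.0)))  # hot
print(classify(uptake_ratio(20.0, 100.0)))  # cold
```

The reported gap between hot and cold group means (0.73 vs the conventional 0.67 difference) is what determines how safely lesions near the cutoff can be called.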

  14. Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization

    Science.gov (United States)

    Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija

    2017-07-01

    Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of the technique, which can provide a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm, using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results demonstrated the potential of the evaluation methodology, indicating that measurement parameters such as the use of different lenses and different scanning backgrounds may not greatly influence the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.

  15. Development and application of a quantitative method based on LC-QqQ MS/MS for determination of steviol glycosides in Stevia leaves.

    Science.gov (United States)

    Molina-Calle, M; Sánchez de Medina, V; Delgado de la Torre, M P; Priego-Capote, F; Luque de Castro, M D

    2016-07-01

    Stevia is currently a well-known plant thanks to the presence of steviol glycosides, which are considered sweeteners from a natural source. In this research, a method based on LC-MS/MS with a triple quadrupole detector was developed for the quantitation of 8 steviol glycosides in extracts from Stevia leaves. The ionization and fragmentation parameters for selected reaction monitoring were optimized. Detection and quantitation limits ranging from 0.1 to 0.5 ng/mL and from 0.5 to 1 ng/mL, respectively, were achieved, the lowest attained so far. The steviol glycosides were quantified in extracts from leaves of seven varieties of Stevia cultivated in the laboratory, greenhouse, and field. Plants cultivated in the field presented higher concentrations of steviol glycosides than those cultivated in the greenhouse. Thus, the way of cultivation clearly influences the concentration of these compounds. The inclusion of branches together with leaves as raw material was also evaluated, showing that this inclusion modifies, either positively or negatively, the concentration of steviol glycosides. Copyright © 2016 Elsevier B.V. All rights reserved.
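Detection and quantitation limits like those reported are commonly estimated from the calibration curve in the ICH Q2 style (LOD = 3.3σ/S, LOQ = 10σ/S). The sketch below shows that generic estimate with made-up numbers; the abstract reports its limits directly and does not state which procedure was used.

```python
def lod_loq(sd_response, slope):
    """ICH Q2-style limits from a calibration curve: sd_response is the
    standard deviation of the blank/low-level response, slope the
    calibration slope (response units per ng/mL).
    Returns (LOD, LOQ) in ng/mL."""
    return 3.3 * sd_response / slope, 10.0 * sd_response / slope

# Illustrative values: sd = 0.03 signal units, slope = 0.3 units per ng/mL
lod, loq = lod_loq(0.03, 0.3)   # LOD = 0.33 ng/mL, LOQ = 1.0 ng/mL
```

With these made-up inputs the estimates land in the same order of magnitude as the reported 0.1-0.5 ng/mL (LOD) and 0.5-1 ng/mL (LOQ) ranges.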

  16. A Quantitative Method to Monitor Reactive Oxygen Species Production by Electron Paramagnetic Resonance in Physiological and Pathological Conditions

    Science.gov (United States)

    Mrakic-Sposta, Simona; Gussoni, Maristella; Montorsi, Michela; Porcelli, Simone; Vezzoli, Alessandra

    2014-01-01

    The growing interest in the role of Reactive Oxygen Species (ROS) and in the assessment of oxidative stress in health and disease clashes with the lack of consensus on reliable, applicable quantitative noninvasive methods. The study aimed at demonstrating that a recently developed microinvasive Electron Paramagnetic Resonance (EPR) method provides direct evidence of the “instantaneous” presence of ROS, returning absolute concentration levels that correlate with “a posteriori” assays of ROS-induced damage by means of biomarkers. The reliability of the choice to measure ROS production rate in human capillary blood rather than in plasma was tested (step I). A significant (P < 0.01) linear relationship was found between EPR data collected on capillary blood and those from venous blood (R² = 0.95), plasma (R² = 0.82), and erythrocytes (R² = 0.73). Then (step II) the ROS production changes of various subject categories, young versus old and healthy versus pathological at rest, were found to be significantly different (P levels ranging from 0.0001 to 0.05). The comparison of the results with antioxidant capacity and oxidative damage biomarker concentrations showed that all changes indicating increased oxidative stress are directly related to increased ROS production. Therefore, the adopted method may serve as an automated technique for routine use in clinical trials. PMID:25374651
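    The step-I validation amounts to fitting a straight line between paired EPR readings from the two sampling sites and reporting R². A minimal sketch, with synthetic paired values standing in for the study's capillary/venous measurements (units assumed):

    ```python
    # Linear-relationship check between two sampling sites, as in step I.
    # The paired ROS-production values below are synthetic illustrations.
    import numpy as np

    capillary = np.array([1.8, 2.1, 2.5, 2.9, 3.4, 3.8])  # assumed units
    venous    = np.array([1.7, 2.2, 2.4, 3.0, 3.3, 3.9])

    slope, intercept = np.polyfit(capillary, venous, 1)
    pred = slope * capillary + intercept

    # Coefficient of determination R^2 = 1 - SS_res / SS_tot.
    ss_res = np.sum((venous - pred) ** 2)
    ss_tot = np.sum((venous - venous.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    print(f"R^2 = {r2:.3f}")
    ```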

  17. The evaluation of a novel method comparing quantitative light-induced fluorescence (QLF) with spectrophotometry to assess staining and bleaching of teeth

    NARCIS (Netherlands)

    Adeyemi, A.A.; Jarad, F.D.; de Josselin de Jong, E.; Pender, N.; Higham, S.M.

    2010-01-01

    This study reports the development and evaluation of a novel method using quantitative light-induced fluorescence (QLF), which enables its use for quantifying and assessing whole tooth surface staining and tooth whitening. The method was compared with a spectrophotometer to assess reliability. Two

  18. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations were identified with deviation values greater than 1 mm. In addition, when the developed method was compared with the previously studied one, it was observed that the data obtained are very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
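    The quantity measured in a Winston-Lutz analysis is the 2-D distance, in each portal image, between the detected radiation field centre and the ball-bearing (mechanical isocentre) centre. The sketch below illustrates that calculation; the coordinates are invented, and the 1 mm tolerance follows the abstract.

    ```python
    # Winston-Lutz deviation sketch: distance between radiation field
    # centre and ball-bearing centre per gantry/couch combination.
    # Coordinates (in mm) are hypothetical examples.
    import math

    measurements = [
        ((0.2, -0.1), (0.0, 0.0)),
        ((0.7,  1.0), (0.0, 0.1)),   # this combination exceeds 1 mm
        ((-0.3, 0.4), (0.0, 0.0)),
    ]

    deviations = [
        math.hypot(fx - bx, fy - by)
        for (fx, fy), (bx, by) in measurements
    ]
    for d in deviations:
        print(f"deviation = {d:.2f} mm -> {'FAIL' if d > 1.0 else 'pass'}")
    ```

    Automating this distance computation, rather than reading it off the film by eye, is what reduces the dependence on the performer that the abstract describes.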

  19. Mixed methods research for TESOL

    CERN Document Server

    Brown, James; Farr, Fiona

    2014-01-01

    Defining and discussing the relevance of theoretical and practical issues involved in mixed methods research. Covering the basics of research methodology, this textbook shows you how to choose and combine quantitative and qualitative research methods to b

  20. Biodistribution study of [I-123] ADAM in mice brain using quantitative autoradiography

    International Nuclear Information System (INIS)

    Lin, K.J.; Yen, T.C.; Tzen, K.Y.; Ye, X.X.; Hwang, J.J.; Wey, S.P.; Ting, G.

    2002-01-01

    Aim: Autoradiography with radioluminography is a delicate method to characterize newly developed radiotracers and to apply them to pharmacological studies. Herein, we report a quantitative biodistribution study of [I-123] ADAM (2-((2-((dimethylamino)methyl)phenyl)thio)-5-iodophenylamine) in mouse brain using imaging plates. Materials and Methods: 1 mCi of [I-123] ADAM was injected into male ICR mice through the tail vein. Brains were removed at sequential time points ranging from 0.5 hr to 4 hr after injection. The whole brain was cut into 14 μm thick coronal sections using a cryotome. The sections were thaw-mounted on glass plates and apposed to an imaging plate with filter paper standards for 24 hours. Image reading was done with a Fuji FLA5000 device. Regions of interest were placed on the globus pallidus, hypothalamus, substantia nigra, raphe nuclei and cerebellum according to the stereotaxic atlas, and the PSL/mm² values were measured. The specific binding was expressed as the ratio of (target - cerebellum) to cerebellum. Results: The autoradiography study of the brain showed that [I-123] ADAM accumulated at serotonin transporter-rich sites, including the olfactory tubercle, globus pallidus, thalamic nuclei, hypothalamus, substantia nigra, interpeduncular nucleus, amygdala and raphe nuclei. The biodistribution of [I-123] ADAM in mouse brain measured by quantitative autoradiography showed high specific binding in the substantia nigra and hypothalamus, and the time-activity curve peaked at 120 min post-injection. A comparable specific binding result was obtained in the hypothalamus relative to a previous study by another group using the conventional tissue micro-dissection method (Synapse 38:403-412, 2000). However, higher specific binding was observed in certain small brain regions, including the substantia nigra and raphe nuclei, due to the improved spatial resolution of the quantitative autoradiography technique. Conclusion: Our result showed that the
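    The specific-binding ratio defined in this record, (target - cerebellum) / cerebellum, can be sketched as below. The ROI intensities in PSL/mm² are hypothetical placeholders, not values from the study.

    ```python
    # Specific binding from ROI intensities, using the cerebellum as the
    # nondisplaceable reference region. PSL/mm^2 values are invented.
    psl = {
        "globus pallidus":  8.4,
        "hypothalamus":     9.1,
        "substantia nigra": 10.2,
        "raphe nuclei":     9.8,
        "cerebellum":       2.0,  # reference (low transporter density)
    }

    reference = psl["cerebellum"]
    specific_binding = {
        region: (value - reference) / reference
        for region, value in psl.items()
        if region != "cerebellum"
    }
    for region, ratio in specific_binding.items():
        print(f"{region}: {ratio:.2f}")
    ```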