WorldWideScience

Sample records for included quantitative analysis

  1. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  2. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  3. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concept of matter and the meaning of analytical chemistry, SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  4. Portable instrumentation for quantitatively measuring radioactive surface contaminations, including 90Sr

    International Nuclear Information System (INIS)

    Brodzinski, R.L.

    1983-10-01

    In order to measure the effectiveness of decontamination efforts, a quantitative analysis of the radiocontamination is necessary, both before and after decontamination. Since it is desirable to release the decontaminated material for unrestricted use or disposal, the assay equipment must provide adequate sensitivity to measure the radioactivity at or below the release limit. In addition, the instrumentation must be capable of measuring all kinds of radiocontaminants including fission products, activation products, and transuranic materials. Finally, the survey instrumentation must be extremely versatile in order to assay the wide variety of contaminated surfaces in many environments, some of which may be extremely hostile or remote. This communication describes the development and application of portable instrumentation capable of quantitatively measuring most transuranics, activation products, and fission products, including 90Sr, on almost any contaminated surface in nearly any location.

  5. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  6. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  7. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
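
    The pooling described above starts from the 2×2 counts extracted per study. As a minimal illustration of that step only (the study itself pools per territory, reference standard and parameter with formal meta-analytic models; the counts and the simple size-weighted averaging below are assumptions for illustration):

      # Per-study sensitivity/specificity from extracted 2x2 counts, pooled by a
      # simple sample-size weighted average. Numbers are purely illustrative.
      studies = [
          # (true positives, false positives, true negatives, false negatives)
          (45, 10, 60, 8),
          (30, 6, 40, 5),
          (25, 9, 55, 7),
      ]

      def accuracy(tp, fp, tn, fn):
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          return sensitivity, specificity

      weights = [sum(s) for s in studies]          # total patients per study
      pairs = [accuracy(*s) for s in studies]

      pooled_sens = sum(w * se for w, (se, _) in zip(weights, pairs)) / sum(weights)
      pooled_spec = sum(w * sp for w, (_, sp) in zip(weights, pairs)) / sum(weights)
      print(f"pooled sensitivity {pooled_sens:.2f}, specificity {pooled_spec:.2f}")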

  8. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Ferguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  9. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  10. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  11. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
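
    As an illustration of the kind of two-wavelength quantitation the abstract describes, the sketch below thresholds two autofluorescence channels and reports area fractions. The thresholds, the classification rule and the simulated images are placeholder assumptions; the paper's actual algorithm is not reproduced here.

      # Two-channel threshold quantitation sketch on simulated images.
      import numpy as np

      rng = np.random.default_rng(42)
      ch1 = rng.random((512, 512))      # autofluorescence intensity, wavelength 1
      ch2 = rng.random((512, 512))      # autofluorescence intensity, wavelength 2

      t1, t2 = 0.55, 0.60               # assumed intensity thresholds

      myocyte_mask = (ch1 > t1) & (ch2 <= t2)      # bright in channel 1 only (illustrative rule)
      fibrosis_mask = (ch1 > t1) & (ch2 > t2)      # bright in both channels
      background = ~(myocyte_mask | fibrosis_mask)

      total = ch1.size
      print(f"myocyte area fraction:  {myocyte_mask.sum() / total:.2%}")
      print(f"fibrous area fraction:  {fibrosis_mask.sum() / total:.2%}")
      print(f"other/extracellular:    {background.sum() / total:.2%}")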

  12. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
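
    As a concrete, minimal example of the kind of bias model discussed here, the sketch below back-corrects a 2×2 table for non-differential exposure misclassification using assumed bias parameters. All counts and the sensitivity/specificity values are hypothetical; this is one simple bias model, not the authors' recommended workflow.

      # Back-correct observed exposure counts given assumed classification
      # sensitivity (se) and specificity (sp): A = (A* - (1 - sp) * N) / (se + sp - 1).
      def correct_cell(observed_exposed, total, se, sp):
          """Back-calculate the true number of exposed from an observed count."""
          return (observed_exposed - (1 - sp) * total) / (se + sp - 1)

      # observed exposed among cases and controls (hypothetical data)
      cases_exposed, cases_total = 60, 200
      controls_exposed, controls_total = 40, 200
      se, sp = 0.85, 0.95   # assumed bias parameters for exposure classification

      true_cases_exposed = correct_cell(cases_exposed, cases_total, se, sp)
      true_controls_exposed = correct_cell(controls_exposed, controls_total, se, sp)

      def odds_ratio(a, n1, b, n0):
          return (a / (n1 - a)) / (b / (n0 - b))

      print("observed OR:", round(odds_ratio(cases_exposed, cases_total,
                                             controls_exposed, controls_total), 2))
      print("bias-corrected OR:", round(odds_ratio(true_cases_exposed, cases_total,
                                                   true_controls_exposed, controls_total), 2))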

  13. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  14. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  15. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4 - 5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  16. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe3+/Fe2+ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩=0. (Auth.)
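
    The idea behind the method can be sketched as follows (notation assumed here, not quoted from the report): each site's absorption area is weighted by its recoil-free fraction, which depends on ⟨x²⟩, so areas measured at several temperatures can be extrapolated to ⟨x²⟩ = 0, where the area ratio reflects the true site populations.

      % Sketch: area A_i of site i weighted by its recoil-free fraction f_i
      \[
        A_i \;\propto\; n_i \, f_i , \qquad
        f_i = \exp\!\left(-k^2 \langle x_i^2\rangle\right),
      \]
      \[
        \ln A_i = \mathrm{const} + \ln n_i - k^2 \langle x_i^2\rangle
        \;\longrightarrow\; \mathrm{const} + \ln n_i
        \quad \text{as } \langle x_i^2\rangle \to 0 .
      \]

    At ⟨x²⟩ = 0 the ratio A1/A2 therefore gives the population ratio n1/n2 (e.g. Fe3+/Fe2+) without knowledge of the individual recoil-free fractions.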

  17. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods which can capture the characteristics of the eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.

  18. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  19. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been opened for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 subjects of PIXE in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. In particular, a ''standard-free method for quantitative analysis'' made it possible to perform analysis of infinitesimal samples, powdered samples and untreated bio samples, which could not be well analyzed quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made the process of target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  20. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies were presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  1. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  2. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and of analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; in particular, emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  3. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    Use of the Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  4. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs as they are with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the concentration of zinc were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the values of sulphur concentration for 40 samples agree well with the average value for a typical Japanese and with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method as well. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method.
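
    A minimal sketch of the internal-standard step described above: once the zinc concentration has been fixed by the standard-free method, other elements are scaled from their X-ray yields through relative sensitivity factors. All yields, sensitivities and the zinc concentration below are illustrative assumptions, not values from the report.

      # Internal-standard quantitation: C_X = C_Zn * (Y_X / S_X) / (Y_Zn / S_Zn)
      measured_yields = {"S": 1.8e4, "Fe": 2.3e3, "Zn": 1.1e3}   # net peak counts (assumed)
      sensitivity = {"S": 0.12, "Fe": 0.95, "Zn": 1.00}          # relative sensitivity factors (assumed)
      c_zn = 180.0                                               # µg/g, from the standard-free step (assumed)

      ratio_zn = measured_yields["Zn"] / sensitivity["Zn"]
      concentrations = {el: c_zn * (y / sensitivity[el]) / ratio_zn
                        for el, y in measured_yields.items()}

      for el, c in concentrations.items():
          print(f"{el}: {c:.0f} µg/g")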

  5. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  6. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  7. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated and the results were consistent with those for the pure elements

  8. Quantitative data analysis with SPSS release 8 for Windows a guide for social scientists

    CERN Document Server

    Bryman, Alan

    2002-01-01

    The latest edition of this best-selling introduction to Quantitative Data Analysis through the use of a computer package has been completely updated to accommodate the needs of users of SPSS Release 8 for Windows. Like its predecessor, it provides a non-technical approach to quantitative data analysis and a user-friendly introduction to the widely used SPSS for Windows. It assumes no previous familiarity with either statistics or computing but takes the reader step-by-step through the techniques, reinforced by exercises for further practice. Techniques explained in Quantitative Data Analysis with SPSS Release 8 for Windows include: * correlation * simple and multiple regression * multivariate analysis of variance and covariance * factor analysis The book also covers issues such as sampling, statistical significance, conceptualization and measurement and the selection of appropriate tests. For further information or to download the book's datasets, please visit the website: http://www.routledge.com/textbooks/...

  9. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
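
    Following the definition given in the abstract (the velocity imparted to a suspended cell per unit of magnetic ponderomotive force), the computation itself is a simple ratio; the sketch below assumes hypothetical tracked velocities and an assumed force per cell, not values from the paper.

      # Magnetophoretic mobility = tracked velocity / ponderomotive force (per abstract definition)
      import numpy as np

      velocities = np.array([12.0, 18.5, 9.3]) * 1e-6      # tracked cell velocities, m/s (assumed)
      force = 4.0e-13                                       # ponderomotive force per cell, N (assumed)

      mobility = velocities / force                         # m s^-1 N^-1
      print("magnetophoretic mobilities:", mobility)
      print("population mean:", mobility.mean())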

  10. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
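
    A minimal sketch of the modelling step described (Cox regression on CA19-9 plus image features, evaluated by a concordance index on held-out data), here using the open-source lifelines package on simulated data. The feature names, simulated values and the exact split are assumptions for illustration, not the study's actual variables.

      # Cox survival model on a marker plus a texture feature, scored by c-index.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(0)
      n = 161
      df = pd.DataFrame({
          "ca19_9": rng.lognormal(3.0, 1.0, n),         # preoperative serum marker (simulated)
          "texture_entropy": rng.normal(5.0, 0.7, n),   # hypothetical CT texture feature
          "os_months": rng.exponential(20.0, n),        # overall survival time (simulated)
          "event": rng.integers(0, 2, n),               # 1 = death observed, 0 = censored
      })

      train, test = df.iloc[:113], df.iloc[113:]        # roughly the 70/30 split used in the study

      cph = CoxPHFitter()
      cph.fit(train, duration_col="os_months", event_col="event")

      # Higher partial hazard means higher risk, so negate it for the c-index.
      risk = cph.predict_partial_hazard(test)
      cindex = concordance_index(test["os_months"], -risk, test["event"])
      print(f"test-set concordance index: {cindex:.2f}")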

  11. Quantitative X ray analysis system. User's manual and guide to X ray fluorescence technique

    International Nuclear Information System (INIS)

    2009-01-01

    This guide covers the trimmed and re-arranged version 3.6 of the Quantitative X ray Analysis System (QXAS) software package, which includes the most frequently used methods of quantitative analysis. QXAS is a comprehensive quantitative analysis package that has been developed by the IAEA through research and technical contracts. Additional development has also been carried out in the IAEA Laboratories in Seibersdorf, where QXAS was extensively tested. New in this version of the manual are the descriptions of the Voigt-profile peak fitting, the backscatter fundamental parameters and emission-transmission methods of chemical composition analysis, an expanded chapter on X ray fluorescence physics, and a completely revised and increased number of practical examples of the utilization of the QXAS software package. The analytical data accompanying this manual were collected in the IAEA Seibersdorf Laboratories in the years 2006/2007

  12. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity. But the dependency of human operators on I and C systems, and the dependency of I and C systems on human operators, are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. Unfortunately, it seems that we do not have appropriate methods for incorporating the interdependency between I and C systems and human operators in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and the modeling of the interdependency using conventional event tree analysis and fault tree analysis seems to be, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, the man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while being as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of the Ulchin 5 and 6 nuclear power plants located in the Republic of Korea. The latest version of the fault tree for DPPS, which was developed by the Integrated Safety Assessment team in the Korea Atomic Energy Research Institute (KAERI), consists of 64

  13. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.

  14. [Quantitative data analysis for live imaging of bone.

    Science.gov (United States)

    Seno, Shigeto

    Because bone is a hard tissue, it has long been difficult to observe the interior of bone tissue in the living state. With the progress of microscopy and fluorescent probe technology in recent years, it has become possible to observe the various activities of the cells that make up bone tissue. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection, and the development of methodologies for processing microscopic images and for data analysis has been anticipated. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline the basic image processing technology for quantitative analysis of live imaging data of bone.

  15. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their own characteristic features. The neutron diffraction method has been known to be superior to its complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for particular samples. Neutron diffraction has strong capability especially for oxides due to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. By doing this study, we hope not only to carry out one of the instrument performance tests on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
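
    For background, one widely used relation in Rietveld-based quantitative phase analysis (the Hill-Howard relation) expresses the weight fraction of each phase in terms of its refined scale factor; the report itself does not give its working formulas, so the notation below is assumed.

      % Weight fraction of phase p from refined scale factors S and cell contents ZMV
      \[
        W_p \;=\; \frac{S_p \,(Z M V)_p}{\displaystyle\sum_{i} S_i \,(Z M V)_i},
      \]
      % where Z is the number of formula units per cell, M the formula mass and
      % V the unit-cell volume of each phase.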

  16. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  17. Teaching Children How to Include the Inversion Principle in Their Reasoning about Quantitative Relations

    Science.gov (United States)

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Bell, Daniel; Barros, Rossana

    2012-01-01

    The basis of this intervention study is a distinction between numerical calculus and relational calculus. The former refers to numerical calculations and the latter to the analysis of the quantitative relations in mathematical problems. The inverse relation between addition and subtraction is relevant to both kinds of calculus, but so far research…

  18. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  19. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  20. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability of the quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
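
    The combination step mentioned above is conventionally the root sum of squares of the individual relative components identified by the ANOVA; the notation below is assumed, not quoted from the paper.

      % Combined relative uncertainty from the individual relative components
      \[
        u_{c,\mathrm{rel}} \;=\;
        \sqrt{\, u_{\mathrm{organism}}^{2} + u_{\mathrm{product}}^{2}
              + u_{\mathrm{reading}}^{2} + \cdots \,}
        \;\le\; 35\% .
      \]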

  1. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    This study investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge, measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge ... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...

  2. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. If compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  3. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes
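
    As background to the internal-standard family of methods listed above, the basic relation (symbols assumed here) is that the intensity ratio of chosen reflections of analyte phase i and added standard s is proportional to their weight-fraction ratio, with matrix absorption cancelling:

      % Internal-standard relation for quantitative XRD
      \[
        \frac{I_i}{I_s} \;=\; K_{is}\,\frac{w_i}{w_s},
      \]

    so that calibrating the constant K_is with known mixtures allows w_i to be obtained from a measured intensity ratio in the unknown sample; the matrix-flushing and reference-intensity-ratio approaches are reformulations of this same relation.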

  4. Quantitative analysis of eyes and other optical systems in linear optics.

    Science.gov (United States)

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis, although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or, if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute a vector space. Suitable transformations have been proposed, but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogeneous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
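
    For readers unfamiliar with the notation, the sketch below states the standard linear-optics background assumed by the abstract (symbols assumed here, not quoted from the paper): the transference maps the transverse position and reduced inclination of a ray at incidence to those at emergence and is symplectic, which is why transferences themselves do not form a vector space and transformed characteristics are needed for quantitative analysis.

      % Ray transference and the symplectic condition
      \[
        \begin{pmatrix} \mathbf{y}' \\ \boldsymbol{\alpha}' \end{pmatrix}
        = S \begin{pmatrix} \mathbf{y} \\ \boldsymbol{\alpha} \end{pmatrix},
        \qquad
        S = \begin{pmatrix} A & B \\ C & D \end{pmatrix},
        \qquad
        S^{\mathsf T} E\, S = E,
        \quad
        E = \begin{pmatrix} O & I \\ -I & O \end{pmatrix}.
      \]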

  5. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D. [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    This study includes three processes. First, peak centering of the X-ray line was performed after a diffraction crystal for the Xenon Lα1 line was installed. The Xe Lα1 peak was identified using a PWR spent fuel sample. Second, the standard intensity of Xe was obtained by interpolation of the Lα1 intensities from a series of elements on each side of xenon. Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to apply a matrix correction for the PWR spent fuel sample. Finally, the method and procedure for local quantitative analysis of Xenon were developed in this study.

  6. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    International Nuclear Information System (INIS)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D.

    2012-01-01

    This study includes three processes. First, peak centering of the X-ray line was performed after a diffraction crystal for the Xenon Lα1 line was installed. The Xe Lα1 peak was identified using a PWR spent fuel sample. Second, the standard intensity of Xe was obtained by interpolation of the Lα1 intensities from a series of elements on each side of xenon. Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to apply a matrix correction for the PWR spent fuel sample. Finally, the method and procedure for local quantitative analysis of Xenon were developed in this study.
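
    A minimal sketch of the interpolation step described in this record: because no Xe standard can be measured directly, the "standard intensity" for Xe (Z = 54) is interpolated from Lα1 intensities measured on elements on either side of xenon. The atomic numbers chosen and all count rates below are illustrative assumptions, not measured values.

      # Interpolate a Xe standard intensity from neighbouring-element Lα1 standards.
      import numpy as np

      z_known = np.array([52, 53, 55, 56])               # neighbouring-element standards (assumed)
      i_known = np.array([820.0, 860.0, 930.0, 965.0])   # Lα1 intensities, counts/s/nA (assumed)

      i_xe_standard = np.interp(54, z_known, i_known)    # interpolated Xe standard intensity

      # k-ratio at one radial position of the spent-fuel sample, prior to the
      # matrix correction derived from simulated depth distributions
      i_measured = 41.5                                  # counts/s/nA (assumed)
      k_ratio = i_measured / i_xe_standard
      print(f"interpolated standard intensity: {i_xe_standard:.1f}, k-ratio: {k_ratio:.3f}")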

  7. Characterising Ageing in the Human Brainstem Using Quantitative Multimodal MRI Analysis

    Directory of Open Access Journals (Sweden)

    Christian eLambert

    2013-08-01

    Ageing is ubiquitous to the human condition. The MRI correlates of healthy ageing have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI and DTI. Despite this, the reported brainstem-related changes remain sparse. This is, in part, due to the technical and methodological limitations in quantitatively assessing and statistically analysing this region. By utilising a new method of brainstem segmentation, a large cohort of 100 healthy adults was assessed in this study for the effects of ageing within the human brainstem in vivo. Using quantitative MRI (qMRI), tensor-based morphometry (TBM) and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19 and 75 years were characterised. In addition to the increased R2* in the substantia nigra corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetisation transfer (MT) and increase in proton density (PD), accounting for the previously described midbrain shrinkage. Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterised, functional age-related changes, and propose potential biophysical mechanisms. This study provides detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterised by early, pre-clinical involvement of the brainstem, such as Parkinson's and Alzheimer's diseases.

  8. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behaviour in a radiation environment. However, few improvements have been seen in the quantitative analysis of damage microstructure in recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could enable new information to be obtained about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  9. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial
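
    A minimal sketch of the evaluation described: a per-patient quantitative stress myocardial blood flow estimate is scored against the angiographic reference with a ROC curve, and the binary visual read can be scored the same way (its AUC then reduces to the average of sensitivity and specificity). The simulated data, effect sizes and thresholds below are placeholder assumptions, not the study's results.

      # Compare AUC of a continuous quantitative parameter with a binary visual read.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n = 128
      ischaemia = rng.integers(0, 2, n)                       # reference standard (simulated)

      # lower stress MBF in ischaemic patients (arbitrary units, assumed effect size)
      stress_mbf = np.where(ischaemia == 1,
                            rng.normal(1.5, 0.5, n),
                            rng.normal(2.5, 0.6, n))

      # AUC for the quantitative parameter (negate MBF so higher score = more disease)
      auc_quant = roc_auc_score(ischaemia, -stress_mbf)

      # the visual read is binary; flip a fraction of calls to mimic an imperfect reader
      visual_positive = (stress_mbf < 2.0) ^ (rng.random(n) < 0.1)
      auc_visual = roc_auc_score(ischaemia, visual_positive.astype(float))

      print(f"AUC quantitative: {auc_quant:.2f}, AUC visual (binary): {auc_visual:.2f}")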

  10. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, and 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and MaxQuant outputs are presented in the supporting information. The generated lists of proteins under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in the quantitative analysis associated with the different sampling strategies to be evaluated, and a proper number of replicates to be defined for future quantitative analyses. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  11. Quantitative data analysis in education: a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  12. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming to predict and control the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gear box installation, which demonstrates the applicability and effectiveness of the method. The analysis results can serve as a useful reference for setting key quality inspection points and optimizing key processes.

  13. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, from X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained show the usefulness of these methods, although their accuracy still needs improvement. (Author) [pt
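
    The arithmetic behind this kind of reference-intensity-ratio (RIR) quantification can be sketched in a few lines: each phase's weight fraction is taken proportional to its measured intensity divided by its RIR, normalized over the analyzed phases. The phases, intensities and RIR values below are invented for illustration and are not taken from this study.

    ```python
    # Hypothetical measured peak intensities and reference intensity ratios (I/I_corundum)
    phases = {
        # phase: (measured intensity, RIR)
        "rutile":  (1500.0, 3.4),
        "anatase": (600.0, 4.3),
        "quartz":  (900.0, 3.1),
    }

    # Weight fractions are proportional to I_i / RIR_i,
    # normalized so that the analyzed crystalline phases sum to 100 %
    raw = {name: intensity / rir for name, (intensity, rir) in phases.items()}
    total = sum(raw.values())
    fractions = {name: 100.0 * value / total for name, value in raw.items()}

    for name, wt in fractions.items():
        print(f"{name}: {wt:.1f} wt%")
    ```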

  14. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since the method provides non-contact, full-field inspection of the test object. Historically, however, its application has been limited to qualitative analysis; the current trend is to extend the method with quantitative analysis, which attempts to characterize the examined defect in detail and to accommodate a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization places demands on the quality of the optical and analysis instrumentation, yet very little attention is currently paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. Differences between the measurement systems can be attributed to these error factors. (Author)

  15. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial at different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage, an analysis technique of reduced complexity and limited accuracy can be applied, whereas subsequent design stages require more accurate techniques.

    In this work, several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, the characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  16. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis in general opens the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein also lie the conditions...

  17. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis involves complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give an account of the procedure for obtaining the best values of the parameters needed to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until the elemental concentrations corresponding to a reference material were reproduced within the confidence interval
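
    A genetic-algorithm parameter search of the kind described can be outlined compactly. The sketch below evolves two hypothetical parameters of a toy detector-efficiency model against synthetic calibration points; it is a schematic of the approach, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical efficiency model: eps(E) = a * E^-b (a toy stand-in for a real detector model)
    def efficiency(energy, a, b):
        return a * energy ** (-b)

    energies = np.linspace(2.0, 20.0, 10)          # keV, synthetic calibration points
    target = efficiency(energies, 0.8, 1.3)        # "measured" efficiencies to reproduce

    def fitness(individual):
        a, b = individual
        return -np.sum((efficiency(energies, a, b) - target) ** 2)

    # Simple generational GA: rank selection, blend crossover, Gaussian mutation
    pop = rng.uniform([0.1, 0.5], [2.0, 2.5], size=(40, 2))
    for generation in range(200):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)][-20:]               # keep the best half
        children = []
        for _ in range(20):
            p1, p2 = parents[rng.integers(0, 20, 2)]
            children.append(0.5 * (p1 + p2) + rng.normal(0, 0.05, 2))  # crossover + mutation
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print(f"best a = {best[0]:.3f}, best b = {best[1]:.3f}")
    ```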

  18. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high yielding genotypes. High values for both traits and oil content could lead to high oil yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth duration period, and those in cluster 9 combined moderate to low values for vegetative growth duration and moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  19. Use of deuteron-induced nuclear reactions for quantitative surface analysis

    International Nuclear Information System (INIS)

    Simpson, J.C.B.; Earwaker, L.G.

    1986-01-01

    A summary of the basic features of nuclear reaction analysis is given; particular emphasis is placed on quantitative light element determination using (d,p) and (d,α) reactions. The experimental apparatus is also described, with reference to the 3MV Dynamitron accelerator at the University of Birmingham Radiation Centre. Finally, a set of standard (d, p) spectra for the elements Z=3 to Z=17, using 2 MeV incident deuterons, is included together with examples of the more useful of the (d,α) spectra. (orig.)

  20. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
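
    The basic ASE test mentioned in the abstract, counting reads per allele at a heterozygous site and testing the null hypothesis of a 1:1 ratio, can be illustrated with a plain binomial test. The snippet below uses invented read counts and deliberately omits QuASAR's joint genotyping model, base-call error parameters and over-dispersion handling.

    ```python
    from scipy.stats import binomtest

    # Hypothetical read counts at heterozygous sites: (reference-allele reads, alternate-allele reads)
    sites = {
        "chr1:1_234_567": (58, 42),
        "chr2:9_876_543": (130, 70),
        "chr7:5_555_555": (21, 19),
    }

    for site, (ref, alt) in sites.items():
        result = binomtest(ref, n=ref + alt, p=0.5)   # H0: balanced 1:1 allelic expression
        print(f"{site}: ref/alt = {ref}/{alt}, p = {result.pvalue:.3g}")
    ```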

  1. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described and the crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analyses, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Similar deposits to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions

  2. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  3. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the structural lack of a strong chromophore or readily ionized functional group. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.

  4. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
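
    The comparison between the two assays rests on a paired t-test over the same samples; a minimal sketch with hypothetical γ-oryzanol contents (mg/g) follows. The values are invented and do not come from this study.

    ```python
    from scipy.stats import ttest_rel

    # Hypothetical gamma-oryzanol contents (mg/g) for the same oil samples by two methods
    densitometric = [2.31, 2.05, 1.98, 2.44, 2.12, 2.27]
    image_analysis = [2.28, 2.09, 1.95, 2.47, 2.10, 2.31]

    t_stat, p_value = ttest_rel(densitometric, image_analysis)
    print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")   # p > 0.05 -> no significant difference
    ```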

  5. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. The constructed scenario is analyzed quantitatively in terms of annual effective dose equivalent. The study follows the sequence of a performance assessment of a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The routine program module VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. Detailed data used in this module come from experimental data for Korean territory and default data supplied with the module. Where data needed for code execution are missing, values are estimated using reasonable engineering judgment

  6. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  7. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  8. A study on the quantitative evaluation for the software included in digital systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Kang, H. G.; Lee, K. Y.; Park, J. K. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    In general, probabilistic safety analysis (PSA) has been one of the most important methods used to evaluate the safety of NPPs. Because most NPPs have been built with analog I and C systems, PSA has been performed from a hardware perspective. As digital I and C systems, including software, increasingly replace analog I and C systems, the need for quantitative evaluation methods that allow PSA of such systems is also growing. Nevertheless, several factors, such as the fact that software does not age and the difficulty of estimating software failure rates owing to their non-linear behavior, make such PSA difficult to perform. In this study, in order to perform PSA including software more efficiently, test-based software reliability estimation methods are reviewed and a preliminary procedure is suggested that provides reasonable guidance for quantifying software failure rates. In addition, the activities required to enhance the applicability of the suggested procedure are discussed. 67 refs., 11 figs., 5 tabs. (Author)

  9. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard ( Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  10. Contribution of the surface contamination of uranium-materials on the quantitative analysis results by electron probe microbeam analysis

    International Nuclear Information System (INIS)

    Bonino, O.; Fournier, C.; Fucili, C.; Dugne, O.; Merlet, C.

    2000-01-01

    The analytical testing of uranium materials is necessary for quality research and development in nuclear industry applications (enrichment, safety studies, fuel, etc). Electron Probe Microbeam Analysis Wavelength Dispersive Spectrometry (EPMA-WDS) is a dependable non-destructive analytical technology. The characteristic X-ray signal is measured to identify and quantify the sample components, and the analyzed volume is about one cubic micron. The surface contamination of uranium materials modifies and contributes to the quantitative analysis results of EPMA-WDS. This contribution is not representative of the bulk. A thin oxidized layer appears in the first instants after preparation (burnishing, cleaning) as well as a carbon contamination layer, due to metallographic preparation and carbon cracking under the impact of the electron probe. Several analytical difficulties subsequently arise, including an overlapping line between the carbon Kα ray and the uranium NIV-OVI ray. Sensitivity and accuracy of the quantification of light elements like carbon and oxygen are also reduced by the presence of uranium. The aim of this study was to improve the accuracy of quantitative analysis on uranium materials by EPMA-WDS by taking account of the contribution of surface contamination. The first part of this paper is devoted to the study of the contaminated surface of the uranium materials U, UFe2 and U6Fe a few hours after preparation. These oxidation conditions are selected so as to reproduce the same contamination surfaces occurring in microprobe analytical conditions. Surface characterization techniques were SIMS and Auger spectroscopy. The contaminated surfaces are shown. They consist of successive layers: a carbon layer, an oxidized iron layer, followed by an iron depletion layer (only in UFe2 and U6Fe), and a ternary oxide layer (U-Fe-O for UFe2 and U6Fe, and UO2+x for uranium). The second part of the paper addresses the estimation of the errors in quantitative

  11. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis

    Science.gov (United States)

    Morelli, Sylvia A.; Sacchet, Matthew D.; Zaki, Jamil

    2015-01-01

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. PMID:25554428

  12. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
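
    To make the idea of a quantitative feature space concrete, the sketch below computes two of the characteristics mentioned, an n-gram distribution and a Zipf's-law exponent, for a toy protein sequence in plain Python. It illustrates the concept only and does not call the Quantiprot package itself.

    ```python
    import math
    from collections import Counter

    def ngram_counts(sequence, n=2):
        """Count overlapping n-grams in an amino-acid sequence."""
        return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

    def zipf_exponent(counts):
        """Least-squares slope of log(frequency) vs log(rank), a simple Zipf coefficient."""
        freqs = sorted(counts.values(), reverse=True)
        xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
        ys = [math.log(f) for f in freqs]
        x_mean, y_mean = sum(xs) / len(xs), sum(ys) / len(ys)
        return sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
               sum((x - x_mean) ** 2 for x in xs)

    # Toy sequence; any amino-acid string would do
    sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
    bigrams = ngram_counts(sequence, n=2)
    print("most common bigrams:", bigrams.most_common(3))
    print("Zipf exponent:", round(zipf_exponent(bigrams), 3))
    ```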

  13. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over a 3-month period at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option
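
    The 3D measurement described, a necrotic fraction computed from segmented image volumes, and its Spearman correlation with the specimen values can be sketched as below with hypothetical binary masks; this is not the authors' reconstruction software.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)

    def necrotic_fraction(head_mask, necrosis_mask):
        """Volume fraction of necrosis inside the femoral-head mask (simple voxel counting)."""
        return necrosis_mask[head_mask].sum() / head_mask.sum()

    # Hypothetical segmentations for a few patients: 3D boolean arrays (slices x rows x cols)
    mri_fractions, specimen_fractions = [], []
    for _ in range(5):
        head = rng.random((20, 64, 64)) < 0.5                                  # fake femoral-head mask
        necrosis = head & (rng.random((20, 64, 64)) < rng.uniform(0.1, 0.5))   # fake necrosis mask
        mri_fractions.append(necrotic_fraction(head, necrosis))
        specimen_fractions.append(mri_fractions[-1] + rng.normal(0, 0.02))     # fake physical measurement

    rho, p = spearmanr(mri_fractions, specimen_fractions)
    print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")
    ```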

  14. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  15. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm3, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
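
    The per-segment association between quantitative plaque measurements and the qualitative high-risk label was assessed by logistic regression; a compact sketch with simulated data and statsmodels follows. It is only a schematic of that analysis, not the ROMICAT II code, and the variable names and coefficients are invented.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 200  # hypothetical coronary segments

    # Hypothetical quantitative measurements per segment
    df = pd.DataFrame({
        "low_attenuation_volume_mm3": rng.gamma(2.0, 3.0, n),
        "remodeling_index": rng.normal(1.05, 0.15, n),
        "plaque_burden": rng.uniform(0.2, 0.8, n),
    })
    # Hypothetical qualitative high-risk label loosely driven by those measurements
    logit = (-6 + 0.1 * df.low_attenuation_volume_mm3
             + 2.5 * df.remodeling_index + 3 * df.plaque_burden)
    df["high_risk"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    model = sm.Logit(df["high_risk"], sm.add_constant(df.drop(columns="high_risk"))).fit(disp=0)
    print(np.exp(model.params))              # odds ratios per unit change
    print(model.conf_int().apply(np.exp))    # 95% confidence intervals expressed as odds ratios
    ```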

  16. Quantitative proteomic analysis of post-translational modifications of human histones

    DEFF Research Database (Denmark)

    Beck, Hans Christian; Nielsen, Eva C; Matthiesen, Rune

    2006-01-01

    , and H4 in a site-specific and dose-dependent manner. This unbiased analysis revealed that a relative increase in acetylated peptide from the histone variants H2A, H2B, and H4 was accompanied by a relative decrease of dimethylated Lys(57) from histone H2B. The dose-response results obtained...... by quantitative proteomics of histones from HDACi-treated cells were consistent with Western blot analysis of histone acetylation, cytotoxicity, and dose-dependent expression profiles of p21 and cyclin A2. This demonstrates that mass spectrometry-based quantitative proteomic analysis of post-translational...

  17. Quantitative analysis of flow processes in a sand using synchrotron-based X-ray microtomography

    DEFF Research Database (Denmark)

    Wildenschild, Dorthe; Hopmans, J.W.; Rivers, M.L.

    2005-01-01

    been of a mostly qualitative nature and no experiments have been presented in the existing literature where a truly quantitative approach to investigating the multiphase flow process has been taken, including a thorough image-processing scheme. The tomographic images presented here show, both......Pore-scale multiphase flow experiments were developed to nondestructively visualize water flow in a sample of porous material using X-ray microtomography. The samples were exposed to similar boundary conditions as in a previous investigation, which examined the effect of initial flow rate...... by qualitative comparison and quantitative analysis in the form of a nearest neighbor analysis, that the dynamic effects seen in previous experiments are likely due to the fast and preferential drainage of large pores in the sample. Once a continuous drained path has been established through the sample, further...

  18. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The generated list of proteins under the different fractionation regimes allows the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allows a proper number of replicates to be defined for future quantitative analysis.

  19. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331 nm for chloroquine. ...

  20. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Therefore chemical analysis, mostly performed by X-ray fluorescence (XRF), forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as microscopic point counting are well known. With XRD and Rietveld analysis a full quantitative analysis of cement clinkers can be performed providing detailed mineralogical information about the product. Until recently several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made an integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software such as complexity, low performance and severe numerical instability were prohibitive for automated use. The latest developments of on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction

  1. Quantitative analysis of phytosterols in edible oils using APCI liquid chromatography-tandem mass spectrometry

    Science.gov (United States)

    Mo, Shunyan; Dong, Linlin; Hurst, W. Jeffrey; van Breemen, Richard B.

    2014-01-01

    Previous methods for the quantitative analysis of phytosterols have usually used GC-MS and require elaborate sample preparation including chemical derivatization. Other common methods such as HPLC with absorbance detection do not provide information regarding the identity of the analytes. To address the need for an assay that utilizes mass selectivity while avoiding derivatization, a quantitative method based on LC-tandem mass spectrometry (LC-MS-MS) was developed and validated for the measurement of six abundant dietary phytosterols and structurally related triterpene alcohols including brassicasterol, campesterol, cycloartenol, β-sitosterol, stigmasterol, and lupeol in edible oils. Samples were saponified, extracted with hexane and then analyzed using reversed phase HPLC with positive ion atmospheric pressure chemical ionization tandem mass spectrometry and selected reaction monitoring. The utility of the LC-MS-MS method was demonstrated by analyzing 14 edible oils. All six compounds were present in at least some of the edible oils. The most abundant phytosterol in all samples was β-sitosterol, which was highest in corn oil at 4.35 ± 0.03 mg/g, followed by campesterol in canola oil at 1.84 ± 0.01 mg/g. The new LC-MS-MS method for the quantitative analysis of phytosterols provides a combination of speed, selectivity and sensitivity that exceed those of previous assays. PMID:23884629

  2. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
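
    The specific binding ratio (SBR) described, target uptake relative to a reference estimate of non-displaceable uptake taken here as the 75th percentile of whole-brain voxel intensities, reduces to a one-line formula. The sketch below assumes hypothetical voxel arrays already restricted to the relevant regions and is not the authors' SPM-based tool.

    ```python
    import numpy as np

    def specific_binding_ratio(target_voxels, reference_voxels, percentile=75):
        """SBR = (target - reference) / reference, with the reference taken as a percentile."""
        reference = np.percentile(reference_voxels, percentile)
        return (np.mean(target_voxels) - reference) / reference

    rng = np.random.default_rng(3)
    putamen = rng.normal(3.0, 0.3, 500)          # hypothetical FP-CIT counts in the putamen ROI
    whole_brain = rng.normal(1.0, 0.2, 50_000)   # hypothetical non-specific background voxels

    print(f"putamen SBR = {specific_binding_ratio(putamen, whole_brain):.2f}")
    ```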

  3. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  4. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon expert knowledge; in some cases several hours, or even days, of tedious calculations are needed, because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with a minimum of laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In an intercomparison test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of the network error

  5. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  6. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.

  7. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  8. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  9. Evaluation of breast lesions by contrast enhanced ultrasound: Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Wan Caifeng; Du Jing; Fang Hua; Li Fenghua; Wang Lin

    2012-01-01

    Objective: To evaluate and compare the diagnostic performance of qualitative, quantitative and combined analysis for characterization of breast lesions in contrast enhanced ultrasound (CEUS), with histological results used as the reference standard. Methods: Ninety-one patients with 91 breast lesions BI-RADS 3–5 at US or mammography underwent CEUS. All lesions underwent qualitative and quantitative enhancement evaluation. Receiver operating characteristic (ROC) curve analysis was performed to evaluate the diagnostic performance of different analytical method for discrimination between benign and malignant breast lesions. Results: Histopathologic analysis of the 91 lesions revealed 44 benign and 47 malignant. For qualitative analysis, benign and malignant lesions differ significantly in enhancement patterns (p z1 ), 0.768 (A z2 ) and 0.926(A z3 ) respectively. The values of A z1 and A z3 were significantly higher than that for A z2 (p = 0.024 and p = 0.008, respectively). But there was no significant difference between the values of A z1 and A z3 (p = 0.625). Conclusions: The diagnostic performance of qualitative and combined analysis was significantly higher than that for quantitative analysis. Although quantitative analysis has the potential to differentiate benign from malignant lesions, it has not yet improved the final diagnostic accuracy.

  10. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis.

    Science.gov (United States)

    Razi Naqvi, K

    2014-04-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspensions of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens' theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells.

  11. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in particular situations such as screening, in situ measurement, process monitoring and hostile environments. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element; another is to use the elemental emission intensities relative to the intensity of an internal standard element whose concentration in the specimen is already known. These methods, however, are not applicable to unknown samples. In the present work, we introduce a new approach to LIBS quantitative analysis using the Hα (656.28 nm) emission line as an external standard
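
    Using the Hα line as an external standard amounts to normalizing each analyte line by the hydrogen emission intensity before building a calibration curve. The sketch below shows that normalization and a linear calibration with invented intensities; it is an illustration of the idea only, not the authors' procedure.

    ```python
    import numpy as np

    # Hypothetical calibration standards: analyte concentration (ppm),
    # analyte line intensity, and H-alpha (656.28 nm) intensity per measurement
    concentration = np.array([10.0, 50.0, 100.0, 200.0])
    analyte_intensity = np.array([1200.0, 5600.0, 11800.0, 22900.0])
    h_alpha_intensity = np.array([9800.0, 10100.0, 9900.0, 10200.0])

    # Normalize by the H-alpha external standard, then fit a straight-line calibration
    ratio = analyte_intensity / h_alpha_intensity
    slope, intercept = np.polyfit(concentration, ratio, 1)

    # Quantify an "unknown" sample from its normalized intensity
    unknown_ratio = 0.85
    print(f"estimated concentration = {(unknown_ratio - intercept) / slope:.1f} ppm")
    ```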

  12. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
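
    The calibration-free idea described, resolving overlapping component spectra with ICA and then relating the recovered mixing coefficients to concentrations, can be outlined with scikit-learn's FastICA on synthetic spectra. The sketch below is schematic and uses invented band shapes and concentrations; it is not the published method.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(7)
    wavelengths = np.linspace(250, 400, 300)

    def band(center, width):
        return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

    # Two hypothetical pure-component spectra with strong overlap
    pure = np.vstack([band(300, 15) + 0.4 * band(330, 10),
                      band(320, 12) + 0.3 * band(280, 8)])

    # Calibration mixtures with known concentrations (arbitrary units)
    concentrations = np.array([[1.0, 0.2], [0.5, 0.8], [0.2, 1.0], [0.9, 0.9], [0.3, 0.4]])
    mixtures = concentrations @ pure + rng.normal(0, 0.005, (5, wavelengths.size))

    ica = FastICA(n_components=2, random_state=0)
    scores = ica.fit_transform(mixtures)      # per-mixture contribution of each resolved component

    # Relate ICA scores to the known concentrations (one least-squares fit per component)
    for k in range(2):
        coeffs = np.linalg.lstsq(scores, concentrations[:, k], rcond=None)[0]
        print(f"component {k}: predicted = {np.round(scores @ coeffs, 2)}")
    ```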

  13. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analysis and integration of quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  14. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    Science.gov (United States)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, the ability to quantitatively communicate this information is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphology, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
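
    Of the two outline descriptors mentioned, the Zahn and Roskies tangent-angle function is the simpler to illustrate: the outline is resampled to equal arc-length steps and the cumulative turning angle, less the linear term of a circle, becomes the shape signature. The sketch below computes it for toy polygons and is a schematic, not the study's morphometric pipeline.

    ```python
    import numpy as np

    def zahn_roskies(outline, n_points=64):
        """Tangent-angle (Z-R) shape function of a closed 2D outline, normalized against a circle."""
        # Resample the closed outline to equal arc-length spacing
        pts = np.vstack([outline, outline[:1]])
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        arc = np.concatenate([[0.0], np.cumsum(seg)])
        t = np.linspace(0.0, arc[-1], n_points, endpoint=False)
        resampled = np.column_stack([np.interp(t, arc, pts[:, 0]), np.interp(t, arc, pts[:, 1])])
        # Turning angles between successive segments, accumulated and de-trended by the circle term
        vecs = np.diff(np.vstack([resampled, resampled[:1]]), axis=0)
        angles = np.arctan2(vecs[:, 1], vecs[:, 0])
        dtheta = np.diff(np.concatenate([angles, angles[:1]]))
        turning = (dtheta + np.pi) % (2.0 * np.pi) - np.pi   # wrap turning angles to (-pi, pi]
        return np.cumsum(turning) - np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)

    # Toy "craterform" outlines: a square and a slightly elongated rectangle
    square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
    rect = np.array([[0, 0], [2, 0], [2, 1], [0, 1]], float)
    print(np.round(zahn_roskies(square, 16), 2))
    print(np.round(zahn_roskies(rect, 16), 2))
    ```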

  15. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and therefore yield a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution and allow the finest subsurface features to be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest possible estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
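
    The spectral zooming idea can be illustrated with a direct (unoptimized) evaluation of the chirp z-transform on the unit circle, which samples the spectrum on an arbitrarily fine frequency grid over a band of interest. The signal, band limits and sampling rate below are illustrative only.

```python
import numpy as np

def zoom_spectrum(x, m, f_lo, f_hi, fs):
    """Chirp z-transform evaluated on the unit circle: the spectrum of x is
    sampled at m points between f_lo and f_hi (Hz). Direct O(N*m) evaluation,
    written for clarity rather than speed."""
    n = np.arange(len(x))
    freqs = np.linspace(f_lo, f_hi, m)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
    return freqs, kernel @ x

# Illustrative "thermal response" with two nearby modulation frequencies;
# the zoomed transform samples the band of interest far more finely than
# the global FFT bin spacing would.
fs = 10.0
t = np.arange(0, 200, 1 / fs)
signal = np.cos(2 * np.pi * 0.20 * t) + 0.5 * np.cos(2 * np.pi * 0.22 * t)
freqs, spectrum = zoom_spectrum(signal, m=500, f_lo=0.15, f_hi=0.27, fs=fs)
magnitude = np.abs(spectrum)
```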

  16. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance (NMR) spectroscopy are reviewed. Specifically, the determination of moisture in liquid N2O4 used as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed, with attention paid to accuracy. The sweep rate and RF level, in addition to the other factors, must be set to optimal conditions to eliminate errors, particularly when the computation is done by machine. A higher sweep rate is preferable in view of the S/N ratio, but it should be limited to 30 Hz/s. The relative error in the measurement of peak areas is generally 1%, but for dilute samples measured with integration the error becomes smaller by about one order of magnitude. If impurities are treated carefully, the water content in N2O4 can be determined with an accuracy of about 0.002%. Comparison of peak heights is as accurate as comparison of areas when the uniformity of the magnetic field and T2 are not in question. When the chemical shift varies with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T1 with 90 deg pulses.

  17. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
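
    The relative quantitation step described here reduces to a ratio of integrated peak areas for the co-eluting light- and heavy-labelled species. A minimal sketch with synthetic extracted ion chromatograms:

```python
import numpy as np

# Synthetic extracted ion chromatograms (XICs) for one glycan carrying the
# light and the heavy isotope variant of the label; time axis in minutes.
time = np.linspace(10.0, 12.0, 600)
light_xic = 1.0e5 * np.exp(-0.5 * ((time - 11.0) / 0.05) ** 2)
heavy_xic = 0.8e5 * np.exp(-0.5 * ((time - 11.0) / 0.05) ** 2)

# Relative quantitation: ratio of integrated peak areas of the co-eluting
# light and heavy isotope clusters.
ratio = np.trapz(light_xic, time) / np.trapz(heavy_xic, time)   # ~1.25 here
```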

  18. Preliminary outcomes of the IUCR CPD round robin on quantitative phase analysis

    International Nuclear Information System (INIS)

    Madsen, I.

    1999-01-01

    Full text: The International Union of Crystallography Commission on Powder Diffraction (CPD) is currently sponsoring a round robin on quantitative phase analysis (QPA). The round robin focuses on the analysis of powder diffraction data, namely laboratory X-ray, synchrotron X-ray and neutron diffraction data, for the derivation of phase abundances. The general goals of the round robin include the following: 1. To document the methods and strategies commonly employed in quantitative phase analysis, especially those involving powder diffraction. 2. To assess (i) levels of accuracy and precision, and (ii) lower limits of detection of methods used in QPA. 3. To identify specific problem areas and develop practical solutions. 4. To formulate recommended procedures for QPA using diffraction data. 5. To create a standard set of samples for future reference. The samples used in the study consist of mixtures of major and minor components covering a wide range of analytical complexity. Initial samples are synthetic mixtures of crystallographically 'simple' materials and should present little problem to the analyst. Additional samples introduce problems such as preferred orientation, microabsorption and amorphous content to assess the degree to which these problems affect QPA. Several very complex materials have also been included in the sample suite, including a natural granodiorite, synthetic bauxite and a mixture of pharmaceutical phases. These last three samples represent a significant analytical challenge as they exhibit preferred orientation, microabsorption and grain size effects in addition to severe peak overlap. The round robin was tailored to allow variation in the level of participation, including (i) analysis of 'standard' data sets supplied by the CPD, (ii) collection and analysis of data from at least two of the samples supplied by the CPD and (iii) selection of additional samples at the discretion of the participant. At the time of writing this abstract, some 130

  19. Network analysis of quantitative proteomics on asthmatic bronchi: effects of inhaled glucocorticoid treatment

    Directory of Open Access Journals (Sweden)

    Sihlbom Carina

    2011-09-01

    Full Text Available Abstract Background Proteomic studies of respiratory disorders have the potential to identify protein biomarkers for diagnosis and disease monitoring. Utilisation of sensitive quantitative proteomic methods creates opportunities to determine individual patient proteomes. The aim of the current study was to determine if quantitative proteomics of bronchial biopsies from asthmatics can distinguish relevant biological functions and whether inhaled glucocorticoid treatment affects these functions. Methods Endobronchial biopsies were taken from untreated asthmatic patients (n = 12) and healthy controls (n = 3). Asthmatic patients were randomised to double blind treatment with either placebo or budesonide (800 μg daily) for 3 months and new biopsies were obtained. Proteins extracted from the biopsies were digested and analysed using isobaric tags for relative and absolute quantitation combined with a nanoLC-LTQ Orbitrap mass spectrometer. Spectra obtained were used to identify and quantify proteins. Pathways analysis was performed using Ingenuity Pathway Analysis to identify significant biological pathways in asthma and determine how the expression of these pathways was changed by treatment. Results More than 1800 proteins were identified and quantified in the bronchial biopsies of subjects. The pathway analysis revealed acute phase response signalling, cell-to-cell signalling and tissue development associations with proteins expressed in asthmatics compared to controls. The functions and pathways associated with placebo and budesonide treatment showed distinct differences, including the decreased association with acute phase proteins as a result of budesonide treatment compared to placebo. Conclusions Proteomic analysis of bronchial biopsy material can be used to identify and quantify proteins using highly sensitive technologies, without the need for pooling of samples from several patients. Distinct pathophysiological features of asthma can be

  20. Quantitative analysis of trivalent uranium and lanthanides in a molten chloride by absorption spectrophotometry

    International Nuclear Information System (INIS)

    Toshiyuki Fujii; Akihiro Uehara; Hajimu Yamana

    2013-01-01

    As an analytical application for pyrochemical reprocessing using molten salts, quantitative analysis of uranium and lanthanides by UV/Vis/NIR absorption spectrophotometry was performed. Electronic absorption spectra of LiCl-KCl eutectic at 773 K including trivalent uranium and eight rare earth elements (Y, La, Ce, Pr, Nd, Sm, Eu, and Gd as fission product elements) were measured in the wavenumber region of 4,500-33,000 cm-1. The composition of the solutes was simulated for a reductive extraction condition in a pyroreprocessing process for spent nuclear fuels, that is, about 2 wt% U and 0.1-2 wt% rare earth elements. Since U(III) possesses strong absorption bands due to f-d transitions, an optical quartz cell with a short light path length of 1 mm was adopted in the analysis. The quantitative analysis of trivalent U, Nd, Pr, and Sm was possible with their f-f transition intensities in the NIR region. The analytical results agree with the prepared concentrations within 2σ experimental uncertainties. (author)

  1. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  2. Role of image analysis in quantitative characterisation of nuclear fuel materials

    International Nuclear Information System (INIS)

    Dubey, J.N.; Rao, T.S.; Pandey, V.D.; Majumdar, S.

    2005-01-01

    Image analysis is one of the important techniques widely used for materials characterization. It provides quantitative estimates of the microstructural features present in a material, information that is valuable in establishing criteria for taking a fuel to high burn-up. The Radiometallurgy Division has been carrying out development and fabrication of plutonium-bearing fuels for different types of reactors, viz. Purnima, the Fast Breeder Test Reactor (FBTR), the Prototype Fast Breeder Reactor (PFBR), the Boiling Water Reactor (BWR), the Advanced Heavy Water Reactor (AHWR), the Pressurised Heavy Water Reactor (PHWR) and the KAMINI reactor. Image analysis has been carried out on microstructures of PHWR, AHWR, FBTR and KAMINI fuels. Samples were prepared following standard ASTM metallographic procedures. Digital images of the microstructures of these specimens were obtained using a CCD camera attached to the optical microscope. These images are stored on a computer and used for detection and analysis of features of interest with image analysis software. A quantitative image analysis technique has been standardised and used for determining the type of porosity and its size, shape and distribution in the above sintered oxide and carbide fuels. The technique has also been used for quantitative estimation of the different phases present in KAMINI fuel. The image analysis results are summarised and presented in this paper. (author)
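
    Below is a sketch of the kind of quantitative porosity measurement described, using scikit-image as one possible toolchain; the thresholding rule (pores darker than the matrix) and the synthetic micrograph are assumptions, not the division's actual procedure.

```python
import numpy as np
from skimage import filters, measure

def pore_statistics(micrograph):
    """Quantify porosity in a grayscale micrograph where pores appear dark.
    Returns the pore area fraction plus per-pore size and shape descriptors."""
    pores = micrograph < filters.threshold_otsu(micrograph)   # binary pore mask
    props = measure.regionprops(measure.label(pores))
    porosity = pores.mean()                                   # area fraction of pores
    sizes = [np.sqrt(4.0 * p.area / np.pi) for p in props]    # equivalent diameters
    shapes = [p.eccentricity for p in props]                  # pore elongation
    return porosity, sizes, shapes

# Synthetic image standing in for a CCD micrograph of a sintered pellet.
rng = np.random.default_rng(0)
image = rng.normal(0.8, 0.05, (512, 512))
image[100:140, 200:240] = 0.2                                 # one artificial pore
porosity, sizes, shapes = pore_statistics(image)
```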

  3. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating the photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration, a least-squares adjustment of the data obtained is applied to produce a graph, making it possible to relate the density of a dark spectral line to the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since a sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation and production of results are done by means of a computer (PC) and a computer program. The signal-conditioning circuits have the function of delivering TTL (Transistor-Transistor Logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program.
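
    The three-step procedure can be condensed into a short calculation: a least-squares emulsion calibration, a working curve for the element, and conversion of the unknown's line density into a concentration. All numbers below are illustrative.

```python
import numpy as np

# Step 1: emulsion calibration, a least-squares fit of optical density versus
# log relative exposure from a step filter (all values are illustrative).
log_exposure = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])
density = np.array([0.10, 0.32, 0.58, 0.83, 1.07, 1.30])
cal = np.polyfit(density, log_exposure, 1)              # density -> log intensity

# Step 2: working curve, log intensity versus log concentration for standards
# of the element of interest (one curve per element).
std_conc = np.array([0.01, 0.03, 0.10, 0.30, 1.00])     # wt%, hypothetical standards
std_density = np.array([0.15, 0.34, 0.62, 0.90, 1.18])
work = np.polyfit(np.polyval(cal, std_density), np.log10(std_conc), 1)

# Step 3: analytical result, converting the measured line density of the
# unknown into a concentration through both curves.
sample_density = 0.71
sample_conc = 10 ** np.polyval(work, np.polyval(cal, sample_density))
```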

  4. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components

  5. Quantitative analysis of exercise 201Tl myocardial emission CT in patients with coronary artery disease

    International Nuclear Information System (INIS)

    Okada, Mitsuhiro; Kawai, Naoki; Yamamoto, Shuhei

    1984-01-01

    The clinical usefulness of quantitative analysis of exercise thallium-201 myocardial emission computed tomography (ECT) was evaluated in coronary artery disease (CAD). The subjects consisted of 20 CAD patients and five normal controls. All CAD patients underwent coronary angiography. Tomographic thallium-201 myocardial imaging was performed with a rotating gamma camera, and long-axial and short-axial myocardial images of the left ventricle were reconstructed. The tomographic images were interpreted quantitatively using circumferential profile analysis. Based on features of regional myocardial thallium-201 kinetics, two types of abnormalities were studied: (1) diminished initial distribution (stress defect) and (2) slow washout of thallium-201, as evidenced by patients' initial thallium-201 uptake and 3-hour washout rate profiles which fell below the normal limits, respectively. Two diagnostic criteria including the stress defect and a combination of the stress defect and slow washout were used to detect coronary artery lesions of significance (>=75 % luminal narrowing). The ischemic volumes were also evaluated by quantitative analysis using thallium-201 ECT. The diagnostic accuracy of the stress defect criterion was 95 % for left anterior descending, 90 % for right, and 70 % for left circumflex coronary artery lesions. The combined criteria of the stress defect and slow washout increased detection sensitivity with a moderate loss of specificity for identifying individual coronary artery lesion. A relatively high diagnostic accuracy was obtained using the stress defect criterion for multiple vessel disease (75 %). Ischemic myocardial volume was significantly larger in triple vessel than in single vessel disease (p < 0.05) using the combined criteria. It was concluded that quantitative analysis of exercise thallium-201 myocardial ECT images proves useful for evaluating coronary artery lesions. (author)
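
    A simplified sketch of circumferential profile analysis and the two abnormality criteria (stress defect and slow washout) follows; the sector count, normal limits and synthetic images are placeholders, not the study's actual values.

```python
import numpy as np

def circumferential_profile(image, center, n_sectors=36):
    """Maximum-count circumferential profile of a short-axis slice: the peak
    pixel value within each angular sector around the LV center."""
    yy, xx = np.indices(image.shape)
    angles = (np.degrees(np.arctan2(yy - center[0], xx - center[1])) + 360) % 360
    sector = (angles // (360 / n_sectors)).astype(int)
    return np.array([image[sector == k].max() for k in range(n_sectors)])

# Synthetic stress and 3-hour delayed slices (global washout of roughly 45%).
rng = np.random.default_rng(1)
stress = rng.uniform(80, 100, (64, 64))
delayed = stress * 0.55 + rng.normal(0, 2, (64, 64))

p_stress = circumferential_profile(stress, center=(32, 32))
p_delayed = circumferential_profile(delayed, center=(32, 32))
washout = 100.0 * (p_stress - p_delayed) / p_stress      # percent washout per sector

# Abnormality criteria analogous to the study, with placeholder normal limits:
stress_defect = p_stress < 70.0    # initial uptake below the lower limit of normal
slow_washout = washout < 30.0      # washout rate below the lower limit of normal
```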

  6. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
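
    In its simplest form the ζ-factor method gives relative concentrations proportional to ζ_i·I_i, optionally corrected for absorption. The sketch below is a hedged illustration of that proportionality, with purely illustrative intensities and ζ-factors rather than the paper's values.

```python
import numpy as np

def zeta_quantify(intensities, zetas, absorption=None):
    """Relative composition from characteristic X-ray intensities I_i and
    zeta-factors: C_i proportional to zeta_i * I_i (optionally multiplied by a
    per-element absorption correction factor) and normalized to sum to 1."""
    I = np.asarray(intensities, dtype=float)
    z = np.asarray(zetas, dtype=float)
    A = np.ones_like(I) if absorption is None else np.asarray(absorption, dtype=float)
    weighted = z * I * A
    return weighted / weighted.sum()

# Purely illustrative C-K, N-K and O-K intensities, zeta-factors and absorption
# corrections for an organic thin film; not measured values.
composition = zeta_quantify(intensities=[12000, 3500, 5200],
                            zetas=[1.0, 1.4, 1.9],
                            absorption=[1.05, 1.10, 1.20])
```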

  7. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelemental effects, if any, are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
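
    A minimal example of the regression step follows: fitting empirical interelement correction coefficients from standards and applying them to an unknown. The model form and all intensities and concentrations are hypothetical stand-ins, not the paper's data.

```python
import numpy as np

# Hypothetical intensities (columns: Fe, Cr, Ni) and certified Cr contents (wt%)
# for a set of stainless steel standards.
I = np.array([[1.00, 0.52, 0.21],
              [0.95, 0.61, 0.19],
              [0.90, 0.70, 0.25],
              [0.85, 0.80, 0.30],
              [0.80, 0.90, 0.28]])
c_cr = np.array([13.1, 15.4, 17.6, 20.2, 22.5])

# One possible empirical model: the analyte intensity corrected by products with
# the matrix-element intensities, C = a0 + a1*I_Cr + a2*I_Cr*I_Fe + a3*I_Cr*I_Ni.
X = np.column_stack([np.ones_like(c_cr), I[:, 1], I[:, 1] * I[:, 0], I[:, 1] * I[:, 2]])
coeffs, *_ = np.linalg.lstsq(X, c_cr, rcond=None)

# Prediction for an unknown with measured (Fe, Cr, Ni) intensities.
fe, cr, ni = 0.88, 0.66, 0.23
c_pred = coeffs @ np.array([1.0, cr, cr * fe, cr * ni])
```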

  8. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have had catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults with a sparse distribution were identified. A selected sector was studied in a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
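
    One of the morphometric tools mentioned, the hypsometric integral, reduces to a one-line statistic of basin elevations; a sketch with a synthetic DEM patch:

```python
import numpy as np

def hypsometric_integral(elevations):
    """Hypsometric integral of a basin from its DEM elevations:
    HI = (mean - min) / (max - min). Higher values are commonly read as
    younger, less eroded relief; lower values as subdued relief."""
    z = np.asarray(elevations, dtype=float)
    return (z.mean() - z.min()) / (z.max() - z.min())

# Synthetic DEM patch (elevations in metres, illustrative only).
rng = np.random.default_rng(42)
dem = 200.0 + 600.0 * rng.beta(2.0, 5.0, size=(100, 100))
hi = hypsometric_integral(dem)      # around 0.3 for this synthetic surface
```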

  9. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  10. Critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P T; McCulloch, J [Glasgow Univ. (UK)

    1983-06-13

    Semi-quantitative analysis (e.g. optical density ratios) of [14C]2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of [14C]2-deoxyglucose autoradiograms is undertaken.

  11. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ... the residuals of a regression on centroid size produced.

  12. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  13. Transportation and quantitative analysis of socio-economic development of relations

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that measure the development of transportation and of the socio-economy and applies correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to quantify the relationship between them, so that the results can provide practical guidance for future national development planning.
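
    A minimal illustration of the correlation, regression and elasticity calculations on hypothetical annual series:

```python
import numpy as np

# Hypothetical annual series: freight turnover (billion tonne-km) and GDP
# (billion currency units); the numbers are illustrative only.
freight = np.array([120.0, 131.0, 145.0, 160.0, 178.0, 196.0, 214.0])
gdp = np.array([900.0, 980.0, 1075.0, 1180.0, 1300.0, 1420.0, 1545.0])

corr = np.corrcoef(freight, gdp)[0, 1]             # correlation analysis
slope, intercept = np.polyfit(gdp, freight, 1)     # simple linear regression

# Transport elasticity: ratio of the transport growth rate to the economic
# growth rate, year by year.
elasticity = (np.diff(freight) / freight[:-1]) / (np.diff(gdp) / gdp[:-1])
```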

  14. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Athalye, Rahul A.; Rosenberg, Michael I.; Xie, YuLong; Wang, Weimin; Hart, Philip R.; Zhang, Jian; Goel, Supriya; Mendon, Vrushali V.

    2014-09-04

    This report provides a final quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in improved energy efficiency in commercial buildings. The final analysis considered each of the 110 addenda to Standard 90.1-2010 that were included in Standard 90.1-2013. PNNL reviewed all addenda included by ASHRAE in creating Standard 90.1-2013 from Standard 90.1-2010, and considered their combined impact on a suite of prototype building models across all U.S. climate zones. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s final determination. However, out of the 110 total addenda, 30 were identified as having a measurable and quantifiable impact.

  15. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  16. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  17. Use of local noise power spectrum and wavelet analysis in quantitative image quality assurance for EPIDs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Soyoung [Department of Radiation Oncology, University Hospitals Case and Medical Center, Cleveland, Ohio 44106 (United States); Yan, Guanghua; Bassett, Philip; Samant, Sanjiv, E-mail: samant@ufl.edu [Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida 32608 (United States); Gopal, Arun [Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, Maryland 21201 (United States)

    2016-09-15

    Purpose: To investigate the use of local noise power spectrum (NPS) to characterize image noise and wavelet analysis to isolate defective pixels and inter-subpanel flat-fielding artifacts for quantitative quality assurance (QA) of electronic portal imaging devices (EPIDs). Methods: A total of 93 image sets including custom-made bar-pattern images and open exposure images were collected from four iViewGT a-Si EPID systems over three years. Global quantitative metrics such as modulation transfer function (MTF), NPS, and detective quantum efficiency (DQE) were computed for each image set. Local NPS was also calculated for individual subpanels by sampling regions of interest within each subpanel of the EPID. The 1D NPS, obtained by radially averaging the 2D NPS, was fitted to a power-law function. The r-square value of the linear regression analysis was used as a singular metric to characterize the noise properties of individual subpanels of the EPID. The sensitivity of the local NPS was first compared with the global quantitative metrics using historical image sets. It was then compared with two commonly used commercial QA systems with images collected after applying two different EPID calibration methods (single-level gain and multilevel gain). To detect isolated defective pixels and inter-subpanel flat-fielding artifacts, the Haar wavelet transform was applied to the images. Results: Global quantitative metrics including MTF, NPS, and DQE showed little change over the period of data collection. On the contrary, a strong correlation between the local NPS (r-square values) and the variation of the EPID noise condition was observed. The local NPS analysis indicated image quality improvement, with r-square values increasing from 0.80 ± 0.03 (before calibration) to 0.85 ± 0.03 (after single-level gain calibration) and to 0.96 ± 0.03 (after multilevel gain calibration), while the commercial QA systems failed to distinguish the image quality improvement between the two
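
    A sketch of the local-NPS metric described in the Methods follows: compute the 2D NPS of a flat-field ROI, radially average it, fit a power law in log-log space and report the r-square of the fit. The normalization and the synthetic ROI are assumptions made for illustration, not the paper's exact processing chain.

```python
import numpy as np

def radial_nps_rsquare(roi, pixel_pitch=0.4):
    """1D NPS of a flat-field ROI by radial averaging of the 2D NPS, followed
    by a power-law fit in log-log space; the r-square of that fit is returned
    as a single noise metric, in the spirit of the method described above."""
    roi = roi - roi.mean()
    nps2d = np.abs(np.fft.fftshift(np.fft.fft2(roi))) ** 2
    nps2d *= pixel_pitch ** 2 / roi.size                  # simple normalization
    cy, cx = roi.shape[0] // 2, roi.shape[1] // 2
    yy, xx = np.indices(roi.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)
    nbins = min(cy, cx)
    counts = np.bincount(r.ravel(), minlength=nbins)[:nbins]
    sums = np.bincount(r.ravel(), nps2d.ravel(), minlength=nbins)[:nbins]
    radial = sums / counts
    log_f, log_p = np.log(np.arange(1, nbins)), np.log(radial[1:])   # skip DC bin
    slope, intercept = np.polyfit(log_f, log_p, 1)
    residuals = log_p - (slope * log_f + intercept)
    return 1.0 - residuals.var() / log_p.var()            # r-square of the fit

# Synthetic flat-field subpanel ROI; uncorrelated noise tends to give a poor
# (low r-square) power-law fit, correlated detector noise a better one.
rng = np.random.default_rng(7)
r2 = radial_nps_rsquare(rng.normal(1000.0, 10.0, (128, 128)))
```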

  18. Calibration strategy for semi-quantitative direct gas analysis using inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Gerdes, Kirk; Carter, Kimberly E.

    2011-01-01

    A process is described by which an ICP-MS equipped with an Octopole Reaction System (ORS) is calibrated using liquid phase standards to facilitate direct analysis of gas phase samples. The instrument response to liquid phase standards is analyzed to produce empirical factors relating ion generation and transmission efficiencies to standard operating parameters. Empirical factors generated for liquid phase samples are then used to produce semi-quantitative analysis of both mixed liquid/gas samples and pure gas samples. The method developed is similar to the semi-quantitative analysis algorithms in the commercial software, which have here been expanded to include gas phase elements such as Xe and Kr. Equations for prediction of relative ionization efficiencies and isotopic transmission are developed for several combinations of plasma operating conditions, which allows adjustment of limited parameters between liquid and gas injection modes. In particular, the plasma temperature and electron density are calculated from comparison of experimental results to the predictions of the Saha equation. Comparisons between operating configurations are made to determine the robustness of the analysis to plasma conditions and instrument operating parameters. Using the methods described in this research, the elemental concentrations in a liquid standard containing 45 analytes and treated as an unknown sample were quantified accurately to ± 50% for most elements using 133Cs as a single internal reference. The method is used to predict liquid phase mercury within 12% of the actual concentration and gas phase mercury within 28% of the actual concentration. The results verify that the calibration method facilitates accurate semi-quantitative, gas phase analysis of metal species with sufficient sensitivity to quantify metal concentrations lower than 1 ppb for many metallic analytes.
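
    The Saha equation invoked for the plasma temperature and electron density can be written down compactly. The sketch below evaluates the ion-to-atom ratio for assumed ICP-like conditions; the temperatures, densities and partition-function ratio are illustrative, not the paper's fitted values.

```python
import numpy as np
from scipy.constants import e, h, k, m_e

def saha_ratio(T, n_e, ionization_energy_eV, g_ratio=2.0):
    """Ion-to-atom number density ratio from the Saha equation for a plasma at
    temperature T (K) and electron density n_e (m^-3); g_ratio approximates the
    ratio of ion to atom partition functions."""
    thermal = (2.0 * np.pi * m_e * k * T / h ** 2) ** 1.5
    return (2.0 * g_ratio / n_e) * thermal * np.exp(-ionization_energy_eV * e / (k * T))

# Assumed ICP-like conditions: T ~ 7500 K, n_e ~ 1e21 m^-3. First ionization
# energies: Hg ~ 10.44 eV, Cs ~ 3.89 eV.
ratio_hg = saha_ratio(7500.0, 1e21, 10.44)
ratio_cs = saha_ratio(7500.0, 1e21, 3.89)
degree_hg = ratio_hg / (1.0 + ratio_hg)   # fraction of Hg present as singly charged ions
```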

  19. Persistence of Low Pathogenic Influenza A Virus in Water: A Systematic Review and Quantitative Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Antonia E Dalziel

    Full Text Available Avian influenza viruses are able to persist in the environment between transmission events among their natural hosts. Quantifying the environmental factors that affect the persistence of avian influenza virus is important for our ability to predict future outbreaks and to target surveillance and control methods. We conducted a systematic review and quantitative meta-analysis of the environmental factors that affect the decay of low pathogenic avian influenza virus (LPAIV) in water. Abiotic factors affecting the persistence of LPAIV have been investigated for nearly 40 years, yet published data have been produced by only 26 quantitative studies. These studies have been conducted by a small number of principal authors (n = 17) and have investigated a narrow range of environmental conditions, all of which were based in laboratories with limited reflection of natural conditions. The use of quantitative meta-analytic techniques provided the opportunity to assess persistence across a greater range of conditions than each individual study can achieve, through the estimation of mean effect-sizes and relationships among multiple variables. Temperature was the most influential variable, for both the strength and magnitude of the effect-size. Moderator variables explained a large proportion of the heterogeneity among effect-sizes. Salinity and pH were important factors, although future work is required to broaden the range of abiotic factors examined, as well as including further diurnal variation and greater environmental realism generally. We were unable to extract a quantitative effect-size estimate for approximately half (50.4%) of the reported experimental outcomes and we strongly recommend a minimum set of quantitative reporting to be included in all studies, which will allow robust assimilation and analysis of future findings. In addition we suggest possible means of increasing the applicability of future studies to the natural

  20. Quantitative X-ray analysis of biological fluids: the microdroplet technique

    International Nuclear Information System (INIS)

    Roinel, N.

    1988-01-01

    X-ray microanalysis can be used to quantitatively determine the elemental composition of microvolumes of biological fluids. This article describes the various steps in preparation of microdroplets for analysis: The manufacturing of micropipettes, the preparation of the specimen support, the deposition of droplets on the support, shock-freezing, and lyophilization. Examples of common artifacts (incomplete rehydration prior to freezing or partial rehydration after lyophilization) are demonstrated. Analysis can be carried out either by wavelength-dispersive analysis, which is the most sensitive method, or by energy-dispersive analysis, which is more commonly available. The minimum detectable concentration is 0.05 mmol.liter-1 for 0.1-nl samples analyzed by wavelength-dispersive spectrometry and 0.5-1 mmol.liter-1 for samples analyzed by energy-dispersive spectrometry. A major problem, especially in wavelength-dispersive analysis, where high beam currents are used, is radiation damage to the specimen; in particular chloride (but also other elements) can be lost. Quantitative analysis requires the use of standard solutions with elemental concentration in the same range as those present in the specimen

  1. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta 2 C has been considered because of specific features of the diffraction patterns of its components, namely the overlapping of the most intensive reflexes of both phases. The method of a standard binary system has been used for quantitative analysis. Because of the overlapping of the intensive reflexes d(101)=2.36 A (Ta 2 C) and d(110)=2.33 A (Ta), other strong reflexes have been used for the quantitative determination of Ta 2 C and Ta: d(103)=1.404 A for tantalum subcarbide and d(211)=1.35 A for tantalum. In addition, the Ta and Ta 2 C phases have been determined quantitatively with the use of another pair of reflexes: d(102)=1.82 A for Ta 2 C and d(200)=1.65 A for tantalum. The agreement between the results obtained in the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta 2 C, it is expedient to carry out the analysis with both of the above-mentioned pairs of reflexes, located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta 2 C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the tendency of Ta 2 C to become textured during sample preparation

  2. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  3. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  4. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.

  5. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  6. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. MetaFluxNet: the management of metabolic reaction information and quantitative metabolic flux analysis.

    Science.gov (United States)

    Lee, Dong-Yup; Yun, Hongsoek; Park, Sunwon; Lee, Sang Yup

    2003-11-01

    MetaFluxNet is a program package for managing information on the metabolic reaction network and for quantitatively analyzing metabolic fluxes in an interactive and customized way. It allows users to interpret and examine metabolic behavior in response to genetic and/or environmental modifications. As a result, quantitative in silico simulations of metabolic pathways can be carried out to understand the metabolic status and to design the metabolic engineering strategies. The main features of the program include a well-developed model construction environment, user-friendly interface for metabolic flux analysis (MFA), comparative MFA of strains having different genotypes under various environmental conditions, and automated pathway layout creation. http://mbel.kaist.ac.kr/ A manual for MetaFluxNet is available as PDF file.
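
    The core calculation in metabolic flux analysis is solving the steady-state balance S·v = 0 with some fluxes fixed by measurements. A toy example follows; the network and measured rates are hypothetical and unrelated to MetaFluxNet's own model formats.

```python
import numpy as np

# Toy network (hypothetical): A -> B (v1), B -> C (v2), B -> D (v3), with
# uptake of A (v0) and excretion of C and D (v4, v5).
# Rows: balances on metabolites A, B, C, D; columns: fluxes v0..v5.
S = np.array([
    [ 1, -1,  0,  0,  0,  0],   # A
    [ 0,  1, -1, -1,  0,  0],   # B
    [ 0,  0,  1,  0, -1,  0],   # C
    [ 0,  0,  0,  1,  0, -1],   # D
], dtype=float)

# Measured exchange fluxes (illustrative units), indexed by column.
measured = {0: 10.0, 4: 6.0, 5: 4.0}

# Metabolic flux analysis: solve the steady-state balance S v = 0 with the
# measured fluxes fixed, here by least squares on the remaining free fluxes.
free = [j for j in range(S.shape[1]) if j not in measured]
b = -S[:, list(measured)] @ np.array(list(measured.values()))
v_free, *_ = np.linalg.lstsq(S[:, free], b, rcond=None)

fluxes = np.zeros(S.shape[1])
fluxes[list(measured)] = list(measured.values())
fluxes[free] = v_free            # v1 = 10, v2 = 6, v3 = 4 for this toy network
```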

  8. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    Science.gov (United States)

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of whole breast, the software calculated BPE using following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group showing significant difference (p = .009 for minimal vs. mild, p quantitative BPE (r = 0.63, p Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
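
    The voxel-wise BPE definition quoted in the abstract translates directly into code; the volumes and mask below are synthetic stand-ins for the pre- and post-contrast series.

```python
import numpy as np

def bpe_map(pre, post, breast_mask):
    """Voxel-wise background parenchymal enhancement, following the formula in
    the abstract: BPE = (SI_post - SI_pre) / SI_pre * 100 (%)."""
    pre = pre.astype(float)
    bpe = np.zeros_like(pre)
    valid = breast_mask & (pre > 0)
    bpe[valid] = (post[valid] - pre[valid]) / pre[valid] * 100.0
    return bpe, float(bpe[valid].mean())

# Synthetic stand-ins for the baseline and 90-second post-contrast volumes.
rng = np.random.default_rng(3)
pre = rng.uniform(200.0, 400.0, (32, 64, 64))
post = pre * rng.uniform(1.2, 1.8, pre.shape)        # 20-80% enhancement
mask = np.ones(pre.shape, dtype=bool)                # whole-volume mask for brevity
bpe, mean_bpe = bpe_map(pre, post, mask)             # mean BPE around 50% here
```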

  9. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
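
    For a fast-exchange chemical-shift titration, KD is typically obtained by fitting a 1:1 binding isotherm to the observed shift changes. The sketch below uses scipy's curve_fit with hypothetical titration data; the concentrations, shifts and fixed protein concentration are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def shift_1to1(L_total, Kd, dmax, P_total=0.10):
    """Observed chemical-shift change for a 1:1 interaction in fast exchange:
    dmax times the bound fraction of protein. Concentrations in mM, shifts in
    ppm; P_total is the (fixed) protein concentration."""
    b = P_total + L_total + Kd
    bound = (b - np.sqrt(b ** 2 - 4.0 * P_total * L_total)) / (2.0 * P_total)
    return dmax * bound

# Hypothetical titration data (ligand concentration vs. observed shift change).
L = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])                    # mM
d_obs = np.array([0.0, 0.015, 0.027, 0.046, 0.070, 0.092, 0.109, 0.119])   # ppm

(Kd_fit, dmax_fit), _ = curve_fit(shift_1to1, L, d_obs, p0=[0.5, 0.12])
# Kd_fit is ~0.3 mM and dmax_fit ~0.13 ppm for this synthetic data set.
```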

  10. Towards quantitative laser-induced breakdown spectroscopy analysis of soil samples

    International Nuclear Information System (INIS)

    Bousquet, B.; Sirven, J.-B.; Canioni, L.

    2007-01-01

    A quantitative analysis of chromium in soil samples is presented. Different emission lines related to chromium are studied in order to select the best one for quantitative features. Important matrix effects are demonstrated from one soil to the other, preventing any prediction of concentration in different soils on the basis of a univariate calibration curve. Finally, a classification of the LIBS data based on a series of Principal Component Analyses (PCA) is applied to a reduced dataset of selected spectral lines related to the major chemical elements in the soils. LIBS data of heterogeneous soils appear to be widely dispersed, which leads to a reconsideration of the sampling step in the analysis process
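
    A minimal sketch of the classification step: PCA applied to intensities of a few selected emission lines from two soils. The line intensities are synthetic; the point is only to show the projection onto principal components.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data set: intensities of a few selected emission lines of major
# elements (columns) for repeated LIBS shots on two different soils (rows).
rng = np.random.default_rng(5)
soil_a = rng.normal([1.0, 0.4, 0.7, 0.2], 0.05, size=(50, 4))
soil_b = rng.normal([0.6, 0.8, 0.5, 0.4], 0.05, size=(50, 4))
X = np.vstack([soil_a, soil_b])

# PCA on the reduced set of spectral lines; well-separated clusters of scores
# reflect matrix differences between the soils.
scores = PCA(n_components=2).fit_transform(X)
labels = np.array(["soil_a"] * 50 + ["soil_b"] * 50)
```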

  11. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  12. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s, strongly motivating emergency preparedness in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims to support and provide orientation for developing emergency planning from the data obtained in a Quantitative Risk Analysis elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis so that they can be used in emergency plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analyses for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data derived from the Quantitative Risk Analysis to develop emergency plans. (author)

  13. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  14. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
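
    One common quantitative parameter built from a dose/distance-to-agreement criterion is the gamma index. The simplified 1D sketch below illustrates such a metric under stated assumptions; it is not necessarily the exact parameter implemented in the study.

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, positions, dd=0.03, dta=3.0):
    """Simplified 1D gamma analysis: for each reference point, the minimum of
    sqrt((dose difference / dd)^2 + (distance / dta)^2) over the evaluated
    profile. gamma <= 1 means the point meets the (e.g. 3%/3 mm) criterion."""
    gammas = []
    for x_r, d_r in zip(positions, dose_ref):
        dose_term = (dose_eval - d_r) / (dd * dose_ref.max())
        dist_term = (positions - x_r) / dta
        gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
    return np.array(gammas)

# Hypothetical calculated and measured (film) profiles across a field edge;
# the measurement is shifted by about 1 mm, well within a 3%/3 mm criterion.
x = np.linspace(-50.0, 50.0, 201)                           # mm
calc = 100.0 / (1.0 + np.exp((np.abs(x) - 40.0) / 3.0))     # calculated dose (%)
meas = 100.0 / (1.0 + np.exp((np.abs(x) - 41.0) / 3.0))     # measured dose (%)
gamma = gamma_index_1d(calc, meas, x)
pass_rate = 100.0 * np.mean(gamma <= 1.0)
```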

  16. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  17. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). Zernike moments, region-based shape features of a grayscale image with inherent invariance properties, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R²) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
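
    A hedged sketch of the approach described above: compute a low-order Zernike moment of a grayscale rendering of the 3D spectrum and regress analyte concentration on that moment. The moment order, the synthetic "spectra" and the simple least-squares fit are assumptions for illustration only, not the authors' stepwise-regression procedure.

```python
import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    """Magnitude of the Zernike moment A_nm of a grayscale image
    mapped onto the unit disk (n >= |m|, n - |m| even)."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    # map the pixel grid onto the unit disk
    xc = (2 * x - nx + 1) / (nx - 1)
    yc = (2 * y - ny + 1) / (ny - 1)
    rho = np.hypot(xc, yc)
    theta = np.arctan2(yc, xc)
    mask = rho <= 1.0
    # radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for k in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k)
                * factorial((n + abs(m)) // 2 - k)
                * factorial((n - abs(m)) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    V = R * np.exp(1j * m * theta)
    A = (n + 1) / np.pi * np.sum(img[mask] * np.conj(V[mask]))
    return abs(A)

# toy calibration: "3D spectra" rendered as images whose intensity scales
# with analyte concentration, plus noise; fit a linear model moment -> conc.
rng = np.random.default_rng(1)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
yy, xx = np.mgrid[0:64, 0:64]
peak = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 200.0)
moments = [zernike_moment(c * peak + rng.normal(0, 0.01, peak.shape), 4, 0)
           for c in conc]
slope, intercept = np.polyfit(moments, conc, 1)
print(f"predicted concentration for sample 3: {slope * moments[2] + intercept:.2f}")
```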

  18. Single particle transfer for quantitative analysis with total-reflection X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Esaka, Fumitaka; Esaka, Konomi T.; Magara, Masaaki; Sakurai, Satoshi; Usuda, Shigekazu; Watanabe, Kazuo

    2006-01-01

    The technique of single particle transfer was applied to quantitative analysis with total-reflection X-ray fluorescence (TXRF) spectrometry. The technique was evaluated by performing quantitative analysis of individual Cu particles with diameters between 3.9 and 13.2 μm. The direct quantitative analysis of the Cu particle transferred onto a Si carrier gave a discrepancy between measured and calculated Cu amounts due to the absorption effects of incident and fluorescent X-rays within the particle. After correction for the absorption effects, the Cu amounts in individual particles could be determined with a deviation within 10.5%. When the Cu particles were dissolved with HNO3 solution prior to the TXRF analysis, the deviation improved to within 3.8%. In this case, no correction for the absorption effects was needed for quantification.

  19. Analysis of high accuracy, quantitative proteomics data in the MaxQB database.

    Science.gov (United States)

    Schaab, Christoph; Geiger, Tamar; Stoehr, Gabriele; Cox, Juergen; Mann, Matthias

    2012-03-01

    MS-based proteomics generates rapidly increasing amounts of precise and quantitative information. Analysis of individual proteomic experiments has made great strides, but the crucial ability to compare and store information across different proteome measurements still presents many challenges. For example, it has been difficult to avoid contamination of databases with low quality peptide identifications, to control for the inflation in false positive identifications when combining data sets, and to integrate quantitative data. Although, for example, the contamination with low quality identifications has been addressed by joint analysis of deposited raw data in some public repositories, we reasoned that there should be a role for a database specifically designed for high resolution and quantitative data. Here we describe a novel database termed MaxQB that stores and displays collections of large proteomics projects and allows joint analysis and comparison. We demonstrate the analysis tools of MaxQB using proteome data of 11 different human cell lines and 28 mouse tissues. The database-wide false discovery rate is controlled by adjusting the project specific cutoff scores for the combined data sets. The 11 cell line proteomes together identify proteins expressed from more than half of all human genes. For each protein of interest, expression levels estimated by label-free quantification can be visualized across the cell lines. Similarly, the expression rank order and estimated amount of each protein within each proteome are plotted. We used MaxQB to calculate the signal reproducibility of the detected peptides for the same proteins across different proteomes. Spearman rank correlation between peptide intensity and detection probability of identified proteins was greater than 0.8 for 64% of the proteome, whereas a minority of proteins have negative correlation. This information can be used to pinpoint false protein identifications, independently of peptide database

  20. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    A method for quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a given time after exercise to those immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial images, correcting for Tl-201 washout from the myocardium under the assumption that the washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in a region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in a region of interest in serial images after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
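
    The three indices defined above are simple count ratios, so a minimal sketch is easy to write down. The functions and the ROI count values below are illustrative assumptions; in particular, whether and how the redistribution factor is corrected for global washout follows my reading of the abstract rather than the authors' exact implementation.

```python
def washout_factor(counts_delayed, counts_initial):
    """Ratio of myocardial counts at a delayed time point to the counts
    immediately after exercise."""
    return counts_delayed / counts_initial

def vitality_index(roi_uptake, max_uptake):
    """Tl-201 uptake in a region of interest relative to the maximum uptake."""
    return roi_uptake / max_uptake

def redistribution_factor(roi_delayed, roi_initial, washout=1.0):
    """Redistribution in an ROI on delayed images relative to the initial
    image.  Dividing by the global washout factor (an assumption on my part)
    corrects for uniform Tl-201 washout, as the abstract suggests."""
    return (roi_delayed / washout) / roi_initial

# illustrative ROI counts (arbitrary units)
w = washout_factor(counts_delayed=800.0, counts_initial=1000.0)
print(f"washout factor        : {w:.2f}")
print(f"vitality index        : {vitality_index(620.0, 1000.0):.2f}")
print(f"redistribution factor : {redistribution_factor(560.0, 620.0, washout=w):.2f}")
```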

  1. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from the analyses performed. The human factor refers to the human interaction with the plant equipment and the working environment, taking into account human capabilities and limits. Among the qualitative methods for human factor analysis, we consider concepts and structural methods for classifying information connected with the human factor; emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are also considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  2. Quantitative analysis of biological responses to low dose-rate γ-radiation, including dose, irradiation time, and dose-rate

    International Nuclear Information System (INIS)

    Magae, J.; Furukawa, C.; Kawakami, Y.; Hoshi, Y.; Ogata, H.

    2003-01-01

    Because biological responses to radiation are complex processes dependent on irradiation time as well as total dose, it is necessary to consider dose, dose-rate and irradiation time simultaneously to predict the risk of low dose-rate irradiation. In this study, we analyzed the quantitative relationship among dose, irradiation time and dose-rate using chromosomal breakage and proliferation inhibition of human cells. For evaluation of chromosome breakage we assessed micronuclei induced by radiation. U2OS cells, a human osteosarcoma cell line, were exposed to gamma rays in an irradiation room bearing 50,000 Ci of 60Co. After the irradiation, they were cultured for 24 h in the presence of cytochalasin B to block cytokinesis; cytoplasm and nucleus were stained with DAPI and propidium iodide, and the number of binuclear cells bearing micronuclei was determined by fluorescence microscopy. For proliferation inhibition, cells were cultured for 48 h after the irradiation and [3H]thymidine was pulsed for 4 h before harvesting. Dose-rate in the irradiation room was measured with a photoluminescence dosimeter. While irradiation times of less than 24 h did not affect the dose-response curves for either biological response, the curves were remarkably attenuated as exposure time increased to more than 7 days. These biological responses were dependent on dose-rate rather than dose when cells were irradiated for 30 days. Moreover, the percentage of micronucleus-forming cells cultured continuously for more than 60 days at a constant dose-rate gradually decreased in spite of the accumulating total dose. These results suggest that biological responses at low dose-rate are remarkably affected by exposure time, that they are dependent on dose-rate rather than total dose in the case of long-term irradiation, and that cells become resistant to radiation after continuous irradiation for 2 months. It is necessary to include the effects of irradiation time and dose-rate to evaluate risk

  3. Testicular dysgenesis syndrome and the estrogen hypothesis: a quantitative meta-analysis.

    Science.gov (United States)

    Martin, Olwenn V; Shialis, Tassos; Lester, John N; Scrimshaw, Mark D; Boobis, Alan R; Voulvoulis, Nikolaos

    2008-02-01

    Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-alpha-mediated mode of action was specifically explored. We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population.
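
    For readers unfamiliar with how a quantitative summary estimate of this kind is pooled, the sketch below shows a standard inverse-variance fixed-effect combination of study-level risk ratios. The study estimates are entirely hypothetical, and the fixed-effect model is only one of several the authors could have used (their subset analyses point to heterogeneity, which would favour a random-effects model).

```python
import numpy as np

def pooled_risk_ratio(rr, ci_low, ci_high):
    """Inverse-variance fixed-effect pooling of study risk ratios,
    with standard errors recovered from 95% confidence intervals."""
    log_rr = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# entirely hypothetical study-level risk ratios and 95% CIs
rr = np.array([1.8, 2.4, 1.5, 2.1])
lo = np.array([1.1, 1.3, 0.9, 1.2])
hi = np.array([3.0, 4.4, 2.6, 3.7])
summary, ci = pooled_risk_ratio(rr, lo, hi)
print(f"summary RR = {summary:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```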

  4. Quantitative analysis of untreated oil samples in in-air PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Goto, S.; Takahashi, C.; Saitoh, Y.

    2010-01-01

    A method for quantitative analysis of oil samples by in-air PIXE has been developed on the basis of a standard-free approach. The components of the continuous X-rays originating from air and the backing film can be exactly subtracted using a blank spectrum after normalization by the yields of Ar K-α X-rays. The method was developed using nine oil samples including standard oils, and its accuracy was confirmed by comparing the results with those obtained by the internal-standard method. Validity of the method for practical samples was confirmed for various kinds of oils such as engine, machine and cooking oils. It was found that the method is effective for various kinds of oils whichever element is designated as the index element. (author)
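
    A minimal sketch of the background-subtraction step described above: the blank spectrum is scaled by the ratio of Ar K-α yields and subtracted channel by channel from the sample spectrum. The toy spectra and the Ar yields are invented for illustration; the actual standard-free quantification in the paper involves further steps not shown here.

```python
import numpy as np

def subtract_blank(sample_spectrum, blank_spectrum,
                   sample_ar_counts, blank_ar_counts):
    """Subtract the continuous background (air + backing film) from an
    in-air PIXE spectrum after normalising the blank spectrum by the
    Ar K-alpha yields.  Inputs are per-channel count arrays."""
    scale = sample_ar_counts / blank_ar_counts
    corrected = sample_spectrum - scale * blank_spectrum
    return np.clip(corrected, 0, None)   # no negative counts

# toy spectra: 1024 channels, same continuum shape, different Ar yields
channels = np.arange(1024)
continuum = 500.0 * np.exp(-channels / 300.0)
blank = continuum.copy()
sample = 1.2 * continuum + 300.0 * np.exp(-(channels - 400) ** 2 / 50.0)
net = subtract_blank(sample, blank, sample_ar_counts=1.2e4, blank_ar_counts=1.0e4)
print(f"net peak counts near channel 400: {net[380:420].sum():.0f}")
```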

  5. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current (PEC) testing to furnish more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defects. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  6. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  7. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    Science.gov (United States)

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    The aim was to evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Receiver operating characteristic curves were plotted and the areas under them (AUROC) used to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The AUROC of the qualitative analysis showed the best diagnostic performance (0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.
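
    The AUROC values quoted above can be computed without plotting any curve by using the rank-sum identity, as in the sketch below. The shear wave speed distributions are simulated stand-ins, not the study data.

```python
import numpy as np

def auroc(values, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a randomly chosen malignant case has a higher
    value than a randomly chosen benign case (ties count one half)."""
    values, labels = np.asarray(values, float), np.asarray(labels, bool)
    pos, neg = values[labels], values[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# hypothetical maximum shear wave speed (m/s) for benign (0) and malignant (1) lesions
rng = np.random.default_rng(7)
sws_max = np.concatenate([rng.normal(3.5, 1.0, 120), rng.normal(6.5, 1.8, 100)])
label = np.concatenate([np.zeros(120), np.ones(100)]).astype(bool)
print(f"AUROC(SWS-max) = {auroc(sws_max, label):.3f}")
```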

  8. Quantitative schemes in energy dispersive X-ray fluorescence implemented in AXIL

    International Nuclear Information System (INIS)

    Tchantchane, A.; Benamar, M.A.; Tobbeche, S.

    1995-01-01

    EDXRF (energy dispersive X-ray fluorescence) has long been used for quantitative analysis of many types of samples, including environmental samples. The software package AXIL (Analysis of X-ray spectra by Iterative Least squares) is extensively used for the analysis and quantification of X-ray spectra. It includes several quantitative schemes for evaluating element concentrations. We present the general theory behind each scheme implemented in the software package and the performance of each of these quantitative schemes. We have also investigated their performance relative to the uncertainties in the experimental parameters and the sample description.

  9. Clinical significance of quantitative analysis of thyroid peroxidase antibody (TPOAb) with chemiluminescence enzyme immunoassay

    International Nuclear Information System (INIS)

    Zhu Cuiying; Wang Qing; Huang Gang

    2004-01-01

    Objective: The only laboratory methods for diagnosing autoimmune thyroid diseases used to be serum TGA and TMA detection. More recently, quantitative analysis of TPOAb has been introduced. To assess the relative sensitivity of these tests, positive rates detected with the respective tests were compared. Methods: Serum TGA, TMA (with RIA) and TPOAb (with chemiluminescence enzyme immunoassay) were simultaneously detected in 998 cases of thyroid diseases (hyperthyroidism 307, Hashimoto's disease 193, simple goiter 498). As a complementary measure, fine needle aspiration cytology was obtained in a number of cases, including all the patients with Hashimoto's disease. Results: The positive detection rate of TPOAb in the three groups of patients (hyperthyroidism, Hashimoto's, simple goiter) was 81.76%, 96.89% and 42.97%, respectively. With TMA, the positive rate was only 54.72%, 65.80% and 22.09%, respectively. About one third more cases would be detected with the newer method. Conclusion: For the laboratory detection of autoimmune thyroid diseases, quantitative analysis of TPOAb is much more sensitive than conventional TMA detection. (authors)

  10. MCM-2 and Ki-67 as proliferation markers in renal cell carcinoma: A quantitative and semi-quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation that identifies a greater proliferating fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Fifty (n=50) cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the relationship of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 correlated significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they
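
    As a rough illustration of the quantitative analysis described above, the sketch below computes labeling indices from nuclei counts and applies the same two tests (Kruskal-Wallis against grade, correlation between markers). All counts and grades are simulated, and scipy stands in for the SPSS procedures used in the study.

```python
import numpy as np
from scipy import stats

def labeling_index(positive_nuclei, total_nuclei):
    """Percentage of immunopositive nuclei among all counted nuclei."""
    return 100.0 * positive_nuclei / total_nuclei

# hypothetical per-case nuclei counts for Ki-67 and MCM-2 in 50 cases
rng = np.random.default_rng(3)
total = rng.integers(800, 1200, size=50)
ki67_li = labeling_index(rng.binomial(total, 0.13), total)
mcm2_li = labeling_index(rng.binomial(total, 0.24), total)

r, p = stats.pearsonr(ki67_li, mcm2_li)              # agreement between markers
grade = rng.integers(1, 5, size=50)                   # simulated Fuhrman grades 1-4
h, p_kw = stats.kruskal(*[mcm2_li[grade == g] for g in range(1, 5)])
print(f"Pearson r = {r:.2f} (p = {p:.3g}); Kruskal-Wallis p = {p_kw:.3g}")
```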

  11. Qualitative and quantitative temporal analysis of licit and illicit drugs in wastewater in Australia using liquid chromatography coupled to mass spectrometry.

    Science.gov (United States)

    Bade, Richard; White, Jason M; Gerber, Cobus

    2018-01-01

    The combination of qualitative and quantitative bimonthly analysis of pharmaceuticals and illicit drugs using liquid chromatography coupled to mass spectrometry is presented. A liquid chromatography-quadrupole time of flight instrument equipped with Sequential Window Acquisition of all THeoretical fragment-ion spectra (SWATH) was used to qualitatively screen 346 compounds in influent wastewater from two wastewater treatment plants in South Australia over a 14-month period. A total of 100 compounds were confirmed and/or detected using this strategy, with 61 confirmed in all samples including antidepressants (amitriptyline, dothiepin, doxepin), antipsychotics (amisulpride, clozapine), illicit drugs (cocaine, methamphetamine, amphetamine, 3,4-methylenedioxymethamphetamine (MDMA)), and known drug adulterants (lidocaine and tetramisole). A subset of these compounds was also included in a quantitative method, analyzed on a liquid chromatography-triple quadrupole mass spectrometer. The use of illicit stimulants (methamphetamine) showed a clear decrease, levels of opioid analgesics (morphine and methadone) remained relatively stable, while the use of new psychoactive substances (methylenedioxypyrovalerone (MDPV) and Alpha PVP) varied with no visible trend. This work demonstrates the value that high-frequency sampling combined with quantitative and qualitative analysis can deliver. Graphical abstract Temporal analysis of licit and illicit drugs in South Australia.

  12. Qualitative and quantitative analysis of environmental samples by laser-induced breakdown spectrometry

    International Nuclear Information System (INIS)

    Zorov, N B; Popov, A M; Zaytsev, S M; Labutin, T A

    2015-01-01

    The key achievements in the determination of trace amounts of components in environmental samples (soils, ores, natural waters, etc.) by laser-induced breakdown spectrometry are considered. Unique capabilities of this method make it suitable for rapid analysis of metals and alloys, glasses, polymers, objects of cultural heritage, archaeological and various environmental samples. The key advantages of the method that account for its high efficiency are demonstrated, in particular, a small amount of analyzed material, the absence of sample preparation, the possibility of local and remote analysis of either one or several elements. The use of chemometrics in laser-induced breakdown spectrometry for qualitative sample classification is described in detail. Various approaches to improving the figures of merit of quantitative analysis of environmental samples are discussed. The achieved limits of detection for most elements in geochemical samples are critically evaluated. The bibliography includes 302 references

  13. A critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    International Nuclear Information System (INIS)

    Kelly, P.T.; McCulloch, J.

    1983-01-01

    Semi-quantitative analysis (e.g. optical density ratios) of [14C]2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of [14C]2-deoxyglucose autoradiograms is undertaken. (Auth.)

  14. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  15. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  16. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  17. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    Science.gov (United States)

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using the X-ray diffraction pattern was applied to the quantitative analysis of a binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that the X-ray diffraction pattern of the INH-MAN system consists of diffraction intensities from the respective crystals, the observed diffraction intensities were fitted to an analytic expression based on X-ray diffraction theory and separated into the two intensities from the INH and MAN crystals by a nonlinear least-squares procedure. After separation, the INH contents were determined using the optimized normalization constants for INH and MAN. A correction parameter including all the factors that are beyond experimental control was required for quantitative analysis without a calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases in the range of 10-90% (w/w) INH content. Further, certain characteristics of the crystals in the tablets, such as preferred orientation, crystallite size, and lattice disorder, were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for the quantitative phase analysis and characterization of crystals in tablets and powders using X-ray diffraction patterns. Copyright 2010 Elsevier B.V. All rights reserved.
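
    The sketch below mimics the core of the pattern-fitting idea: model the observed tablet pattern as a weighted sum of two pure-phase reference patterns plus a background, and recover the weights by nonlinear least squares. The Gaussian reference "patterns" are invented, and the preferred-orientation, crystallite-size and normalization-constant terms used in the actual procedure are deliberately omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_peaks(two_theta, positions, widths, heights):
    """Toy diffraction pattern: a sum of Gaussian peaks."""
    y = np.zeros_like(two_theta)
    for p, w, h in zip(positions, widths, heights):
        y += h * np.exp(-(two_theta - p) ** 2 / (2 * w ** 2))
    return y

# reference (pure-phase) patterns standing in for INH and MAN
tt = np.linspace(5, 40, 1400)
ref_a = gaussian_peaks(tt, [12.0, 19.5, 27.0], [0.15] * 3, [1.0, 0.6, 0.4])
ref_b = gaussian_peaks(tt, [10.2, 21.0, 30.5], [0.15] * 3, [0.8, 1.0, 0.5])

def mixture(two_theta, scale_a, scale_b, background):
    """Tablet pattern modelled as a weighted sum of the two pure-phase
    patterns plus a flat background.  Converting the fitted scale factors
    into true weight fractions would require the normalization constants
    mentioned in the abstract."""
    return scale_a * ref_a + scale_b * ref_b + background

# simulate a 30:70 (a:b) tablet pattern with noise, then fit it
rng = np.random.default_rng(5)
observed = mixture(tt, 0.3, 0.7, 0.05) + rng.normal(0, 0.01, tt.size)
(sa, sb, bg), _ = curve_fit(mixture, tt, observed, p0=[0.5, 0.5, 0.0])
print(f"estimated phase-a scale fraction: {sa / (sa + sb):.2f}")
```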

  18. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure; the IP is then scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd) to give a digital autoradiographic image. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)

  19. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    The aims of this study are to use quantitative analysis of prostate-specific antigen (PSA) in seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader) and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided more accurate and sensitive results. More importantly, individualized schemes based on quantitative PSA results of samples can be developed to improve the quality and procedural efficiency of forensic seminal inspection of samples prior to DNA analysis.

  20. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37°C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA-containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  2. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  3. QTest: Quantitative Testing of Theories of Binary Choice

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  4. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  5. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is most convenient to apply to high-Tc superconductor surfaces because this procedure does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a proposition is made to use for this purpose a modification of the relative sensitivity factor approach accounting for the matrix and the instrumental effects. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. Slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature
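
    The uncorrected relative-sensitivity-factor scheme that the abstract takes as its starting point reduces to a one-line normalisation, sketched below. The peak areas and sensitivity factors are illustrative values only, and the matrix and instrumental corrections that the authors propose are not included.

```python
def atomic_fractions(intensities, sensitivity_factors):
    """First-order quantitative XPS: the atomic fraction of element i is
    (I_i / S_i) / sum_j (I_j / S_j).  Matrix and instrumental corrections
    are omitted in this sketch."""
    normalised = {el: intensities[el] / sensitivity_factors[el] for el in intensities}
    total = sum(normalised.values())
    return {el: v / total for el, v in normalised.items()}

# hypothetical peak areas and relative sensitivity factors (illustrative only)
peak_areas = {"Bi4f": 9.0e4, "Sr3d": 2.6e4, "Ca2p": 1.1e4, "Cu2p": 7.5e4, "O1s": 5.0e4}
rsf = {"Bi4f": 9.1, "Sr3d": 1.8, "Ca2p": 1.6, "Cu2p": 4.3, "O1s": 0.7}
for element, fraction in atomic_fractions(peak_areas, rsf).items():
    print(f"{element}: {fraction:.2%}")
```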

  6. Design database for quantitative trait loci (QTL) data warehouse, data mining, and meta-analysis.

    Science.gov (United States)

    Hu, Zhi-Liang; Reecy, James M; Wu, Xiao-Lin

    2012-01-01

    A database can be used to warehouse quantitative trait loci (QTL) data from multiple sources for comparison, genomic data mining, and meta-analysis. A robust database design involves sound data structure logistics, meaningful data transformations, normalization, and proper user interface designs. This chapter starts with a brief review of relational database basics and concentrates on issues associated with curation of QTL data into a relational database, with emphasis on the principles of data normalization and structure optimization. In addition, some simple examples of QTL data mining and meta-analysis are included. These examples are provided to help readers better understand the potential and importance of sound database design.
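
    As a toy example of the normalisation principles discussed above, the sketch below factors traits and linkage maps into their own tables so that each QTL row carries only foreign keys and its own measurements. The table and column names, and the inserted values, are illustrative; they are not the schema of any actual QTL database.

```python
import sqlite3

# a minimal, normalised layout: traits and maps live in their own tables,
# and each QTL row references them by foreign key
schema = """
CREATE TABLE trait (
    trait_id   INTEGER PRIMARY KEY,
    name       TEXT NOT NULL UNIQUE
);
CREATE TABLE linkage_map (
    map_id     INTEGER PRIMARY KEY,
    species    TEXT NOT NULL,
    map_name   TEXT NOT NULL
);
CREATE TABLE qtl (
    qtl_id     INTEGER PRIMARY KEY,
    trait_id   INTEGER NOT NULL REFERENCES trait(trait_id),
    map_id     INTEGER NOT NULL REFERENCES linkage_map(map_id),
    chromosome TEXT NOT NULL,
    peak_cm    REAL,
    lod_score  REAL,
    source_pubmed_id TEXT
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute("INSERT INTO trait (name) VALUES ('backfat thickness')")
conn.execute("INSERT INTO linkage_map (species, map_name) VALUES ('pig', 'example map')")
conn.execute("INSERT INTO qtl (trait_id, map_id, chromosome, peak_cm, lod_score) "
             "VALUES (1, 1, '7', 58.2, 4.1)")
print(conn.execute("SELECT name, chromosome, lod_score FROM qtl "
                   "JOIN trait USING (trait_id)").fetchall())
```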

  7. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  8. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  9. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    Science.gov (United States)

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm2 and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm2). Based on visual inspections of "markup" images, the CD3 quantitation algorithms produced adequate accuracy. Additionally, the CD3 quantitation algorithms correlated with each other and with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to …). CD3 quantitation with open source algorithms thus presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
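
    The sketch below imitates, in simplified form, the two custom measures described above: a thresholded positive-pixel percentage and a CD3+ object density per mm². The threshold, the microns-per-pixel value and the synthetic staining image are assumptions; the published algorithms were implemented in ImageJ rather than Python.

```python
import numpy as np
from scipy import ndimage

def cd3_metrics(dab_channel, threshold, mpp=0.5):
    """Toy versions of the two custom measures:
    (1) percentage of pixels above a CD3 staining threshold and
    (2) density of CD3+ objects per mm^2 via connected-component labelling.
    `mpp` is microns per pixel (illustrative value)."""
    positive = dab_channel > threshold
    percent_positive = 100.0 * positive.mean()
    labels, n_objects = ndimage.label(positive)
    area_mm2 = dab_channel.size * (mpp / 1000.0) ** 2
    return percent_positive, n_objects / area_mm2

# synthetic staining image: low background plus a few stained "cells"
rng = np.random.default_rng(11)
img = rng.normal(0.1, 0.02, size=(512, 512))
for _ in range(40):
    cy, cx = rng.integers(10, 500, size=2)
    img[cy - 3:cy + 3, cx - 3:cx + 3] += 0.6
pct, density = cd3_metrics(img, threshold=0.4)
print(f"CD3+ pixels: {pct:.2f}%  |  CD3+ objects: {density:.0f} per mm^2")
```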

  10. Quantitative analysis of elastography images in the detection of breast cancer

    International Nuclear Information System (INIS)

    Landoni, V.; Francione, V.; Marzi, S.; Pasciuti, K.; Ferrante, F.; Saracca, E.; Pedrini, M.; Strigari, L.; Crecco, M.; Di Nallo, A.

    2012-01-01

    Purpose: The aim of this study was to develop a quantitative method for breast cancer diagnosis based on elastosonography images in order to reduce, whenever possible, unnecessary biopsies. The proposed method was validated by correlating the results of quantitative analysis with the diagnosis assessed by histopathologic exam. Material and methods: 109 images of breast lesions (50 benign and 59 malignant) were acquired with the traditional B-mode technique and with the elastographic modality. Images in Digital Imaging and Communications in Medicine (DICOM) format were exported into software, written in Visual Basic, developed especially for this study. The lesion was contoured, and the mean grey value and softness inside the region of interest (ROI) were calculated. The correlations between variables were investigated and receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic accuracy of the proposed method. Pathologic results were used as the reference standard. Results: Both the mean grey value and the softness inside the ROI were statistically different on the t-test for the two populations of lesions (i.e., benign versus malignant): p < 0.0001. The area under the curve (AUC) was 0.924 (0.834–0.973) and 0.917 (0.826–0.970) for the mean grey value and the softness, respectively. Conclusions: Quantitative elastosonography is a promising ultrasound technique for the detection of breast cancer, but large prospective trials are necessary to determine whether quantitative analysis of images can help to overcome some pitfalls of the method.

  11. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    Science.gov (United States)

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic. To determine an ideal method for stone analysis suitable for use in a clinical setting. After routine and a detailed metabolic workup of all patients of urolithiasis, stone samples of 50 patients of urolithiasis satisfying the entry criteria were subjected to the Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. Calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered in 35 (71%) followed by calcium phosphate, carbonate apatite, magnesium ammonium hexahydrate and xanthine stones. Fourier transform infrared spectroscopy allows an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a computerized large reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities. This may prevent and or delay stone recurrence.

  12. Quantitative proteomic analysis of ibuprofen-degrading Patulibacter sp. strain I11

    DEFF Research Database (Denmark)

    Almeida, Barbara; Kjeldal, Henrik; Lolas, Ihab Bishara Yousef

    2013-01-01

    Ibuprofen is the third most consumed pharmaceutical drug in the world. Several isolates have been shown to degrade ibuprofen, but very little is known about the biochemistry of this process. This study investigates the degradation of ibuprofen by Patulibacter sp. strain I11 by quantitative proteomics: the proteome was identified and quantified by gel-based shotgun proteomics. In total, 251 unique proteins were quantitated using this approach. Biological process and pathway analysis indicated a number of proteins that were up-regulated in response to active degradation of ibuprofen, some of which are known to be involved in the degradation of aromatic compounds. Data analysis revealed that several of these proteins are likely involved in ibuprofen degradation by Patulibacter sp. strain I11.

  13. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percentages in milk are high-priority criteria for economic purposes and selection programs in dairy cattle.

  14. A new quantitative analysis on nitriding kinetics in the oxidized Zry-4 at 900-1200 °C

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sanggi [ACT Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    Two major roles of nitrogen in the degradation of zirconium-based cladding were identified: mechanical degradation of the cladding and additional chemical heat release. It has long been known that accelerated oxidation can occur in air due to nitrogen. In addition, significant uptake of nitrogen can also occur. The nitriding of pre-oxidized zirconium-based alloys leads to microporous and less coherent oxide scales. This paper aims to quantitatively investigate the nitriding mechanism and kinetics by proposing a new methodology that couples mass balance analysis with optical microscope image processing analysis. The new quantitative analysis methodology is described in chapter 2 and the investigation of the nitriding kinetics is performed in chapter 3. The experimental details were reported previously; only qualitative analysis was performed there, and hence quantitative analysis is performed in this paper. The nitriding kinetics and mechanism were quantitatively analyzed by the newly proposed analysis methods: mass balance analysis and optical microscope image processing analysis. Using these combined methods, the mass gain curves and the optical micrographs were analyzed in detail, and the mechanisms of the accelerated, stabilized and saturated nitriding behaviors were clarified. This paper has two distinctive achievements: 1) the development of effective quantitative analysis methods using only the two main results of the oxidation tests, with no detailed analytical sample measurements (e.g. TEM, EPMA) required, which can reduce the cost and effort of post-test investigation considerably; and 2) the first identification of the nitriding behaviors and their accurate quantitative analysis. Based on these quantitative analysis results on the nitriding kinetics, the new findings will contribute significantly to the understanding of the air oxidation behaviors and model

  15. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Abstract Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment (American Heart Association) AHA model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim to improve the detection of ischaemic heart disease. The key contribution of this work is a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential of increasing the accuracy of perfusion analysis. Quantitative perfusion analysis diagnostic accuracy evaluation with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  16. Quantitative and qualitative analysis of semantic verbal fluency in patients with temporal lobe epilepsy.

    Science.gov (United States)

    Jaimes-Bautista, A G; Rodríguez-Camacho, M; Martínez-Juárez, I E; Rodríguez-Agudelo, Y

    2017-08-29

    Patients with temporal lobe epilepsy (TLE) perform poorly on semantic verbal fluency (SVF) tasks. Completing these tasks successfully involves multiple cognitive processes simultaneously. Therefore, quantitative analysis of SVF (number of correct words in one minute), conducted in most studies, has been found to be insufficient to identify cognitive dysfunction underlying SVF difficulties in TLE. To determine whether a sample of patients with TLE had SVF difficulties compared with a control group (CG), and to identify the cognitive components associated with SVF difficulties using quantitative and qualitative analysis. SVF was evaluated in 25 patients with TLE and 24 healthy controls; the semantic verbal fluency test included 5 semantic categories: animals, fruits, occupations, countries, and verbs. All 5 categories were analysed quantitatively (number of correct words per minute and interval of execution: 0-15, 16-30, 31-45, and 46-60 seconds); the categories animals and fruits were also analysed qualitatively (clusters, cluster size, switches, perseverations, and intrusions). Patients generated fewer words for all categories and intervals and fewer clusters and switches for animals and fruits than the CG (P<.05); there were no differences in cluster size or in the number of intrusions and perseverations (P>.05). Our results suggest an association between SVF difficulties in TLE and difficulty activating semantic networks, impaired strategic search, and poor cognitive flexibility. Attention, inhibition, and working memory are preserved in these patients. Copyright © 2017 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  17. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As both methods were found to be equivalent, they can therefore be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  18. Qualitative and Quantitative Analysis of the Major Constituents in Chinese Medical Preparation Lianhua-Qingwen Capsule by UPLC-DAD-QTOF-MS

    Directory of Open Access Journals (Sweden)

    Weina Jia

    2015-01-01

    Full Text Available Lianhua-Qingwen capsule (LQC) is a commonly used Chinese medical preparation to treat viral influenza and especially played a very important role in the fight against severe acute respiratory syndrome (SARS) in 2002-2003 in China. In this paper, a rapid ultraperformance liquid chromatography coupled with diode-array detector and quadrupole time-of-flight mass spectrometry (UPLC-DAD-QTOF-MS) method was established for qualitative and quantitative analysis of the major constituents of LQC. A total of 61 compounds including flavonoids, phenylpropanoids, anthraquinones, triterpenoids, iridoids, and other types of compounds were unambiguously or tentatively identified by comparing the retention times and accurate mass measurement with reference compounds or literature data. Among them, twelve representative compounds were further quantified as chemical markers in quantitative analysis, including salidroside, chlorogenic acid, forsythoside E, cryptochlorogenic acid, amygdalin, sweroside, hyperin, rutin, forsythoside A, phillyrin, rhein, and glycyrrhizic acid. The UPLC-DAD method was evaluated with linearity, limit of detection (LOD, limit of quantification (LOQ, precision, stability, repeatability, and recovery tests. The results showed that the developed quantitative method was linear, sensitive, and precise for the quality control of LQC.

  19. Optimal climate policy is a utopia. From quantitative to qualitative cost-benefit analysis

    International Nuclear Information System (INIS)

    Van den Bergh, Jeroen C.J.M.

    2004-01-01

    The dominance of quantitative cost-benefit analysis (CBA) and optimality concepts in the economic analysis of climate policy is criticised. Among other things, it is argued to be based on a misplaced interpretation of policy for a complex climate-economy system as being analogous to individual inter-temporal welfare optimisation. The transfer of quantitative CBA and optimality concepts reflects an overly ambitious approach that does more harm than good. An alternative approach is to focus attention on extreme events, structural change and complexity. It is argued that a qualitative rather than a quantitative CBA that takes account of these aspects can support the adoption of a minimax regret approach or precautionary principle in climate policy. This means: implement stringent GHG reduction policies as soon as possible

  20. Method of quantitative analysis of superconducting metal-conducting composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    A technique for the quantitative analysis of superconducting metal-containing composite materials, in particular SnO2-InSn, WO3-InW and ZnO-InZn, has been developed. The method of determining the metal content in a composite is based on the dependence of the superconducting transition temperature on alloy composition. The sensitivity of temperature determination is 0.02 K; the error of analysis for the InSn system is 0.5%

  1. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and

  2. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD), and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a ''bullseye'' functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum: lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well-defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets

  3. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of each diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), by measuring the height of the fundamental peak (A1) in the Fourier spectrum, and by calculating the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff value for SDx was 0.11 (AUC: 0.982). These methods provide quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
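
    As a rough illustration of the quantitative half of such an approach, the sketch below (not code from the study; the variable names, sampling rate, and test signal are assumptions of this sketch) derives Poincaré-style coordinate spreads and the fundamental Fourier peak from a respiratory amplitude trace:

```python
import numpy as np

def respiratory_stability_indices(amplitude, fs=25.0):
    """Illustrative stability indices for a respiratory trace sampled at fs Hz.

    Returns the standard deviations of the phase-space coordinates
    (position x and velocity v, a Poincare-style summary) and the height and
    frequency of the fundamental Fourier peak. The names and the sampling
    rate are assumptions of this sketch, not values from the study.
    """
    x = np.asarray(amplitude, dtype=float)
    v = np.gradient(x, 1.0 / fs)           # velocity coordinate of the phase space

    sd_x = np.std(x)                        # spread along the amplitude axis
    sd_v = np.std(v)                        # spread along the velocity axis

    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    a1 = spectrum[1:].max()                 # fundamental peak: largest line above DC
    f1 = freqs[1:][spectrum[1:].argmax()]
    return sd_x, sd_v, a1, f1

# Example: a fairly regular breathing pattern (0.25 Hz) with mild noise
t = np.arange(0, 60, 1 / 25.0)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
print(respiratory_stability_indices(trace))
```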

  4. Total protein analysis as a reliable loading control for quantitative fluorescent Western blotting.

    Directory of Open Access Journals (Sweden)

    Samantha L Eaton

    Full Text Available Western blotting has been a key technique for determining the relative expression of proteins within complex biological samples since the first publications in 1979. Recent developments in sensitive fluorescent labels, with truly quantifiable linear ranges and greater limits of detection, have allowed biologists to probe tissue specific pathways and processes with higher resolution than ever before. However, the application of quantitative Western blotting (QWB to a range of healthy tissues and those from degenerative models has highlighted a problem with significant consequences for quantitative protein analysis: how can researchers conduct comparative expression analyses when many of the commonly used reference proteins (e.g. loading controls are differentially expressed? Here we demonstrate that common controls, including actin and tubulin, are differentially expressed in tissues from a wide range of animal models of neurodegeneration. We highlight the prevalence of such alterations through examination of published "-omics" data, and demonstrate similar responses in sensitive QWB experiments. For example, QWB analysis of spinal cord from a murine model of Spinal Muscular Atrophy using an Odyssey scanner revealed that beta-actin expression was decreased by 19.3±2% compared to healthy littermate controls. Thus, normalising QWB data to β-actin in these circumstances could result in 'skewing' of all data by ∼20%. We further demonstrate that differential expression of commonly used loading controls was not restricted to the nervous system, but was also detectable across multiple tissues, including bone, fat and internal organs. Moreover, expression of these "control" proteins was not consistent between different portions of the same tissue, highlighting the importance of careful and consistent tissue sampling for QWB experiments. Finally, having illustrated the problem of selecting appropriate single protein loading controls, we demonstrate

  5. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of a proteomics experiment. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  6. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    International Nuclear Information System (INIS)

    Fernandez-Ruiz, R.; Garcia-Heras, M.

    2008-01-01

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies

  7. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-Ruiz, R. [Servicio Interdepartamental de Investigacion, Facultad de Ciencias, Universidad Autonoma de Madrid, Modulo C-9, Laboratorio de TXRF, Crta. Colmenar, Km 15, Cantoblanco, E-28049, Madrid (Spain)], E-mail: ramon.fernandez@uam.es; Garcia-Heras, M. [Grupo de Arqueometria de Vidrios y Materiales Ceramicos, Instituto de Historia, Centro de Ciencias Humanas y Sociales, CSIC, C/ Albasanz, 26-28, 28037 Madrid (Spain)

    2008-09-15

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  8. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, stripping of overlapping peaks, external radiation, and spectral aberration. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research
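
    For context, the Hall (peak-to-continuum) approach that underlies this kind of thin-section quantitation is commonly summarised in standard textbook notation (not the authors' own symbols) as:

```latex
\frac{C_{x}^{\mathrm{spec}}}{C_{x}^{\mathrm{std}}}
  \;=\;
\frac{\left( I_{x}/I_{W} \right)_{\mathrm{spec}}}{\left( I_{x}/I_{W} \right)_{\mathrm{std}}}
```

    where C_x is the mass fraction of element x, I_x the characteristic peak intensity, and I_W the bremsstrahlung continuum intensity in a line-free window, measured for the specimen and for a standard of known composition.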

  9. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with the quantitative X-ray fractographic investigation of fatigue fractures of sharply notched samples tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic instability zones that restrain and control crack growth. In the fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect is singled out, analogous to the zone in which the crack growth rate is controlled by microshifting mechanisms. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. The possibility of determining one of the threshold values of cyclic fracture toughness from the results of fatigue testing and fractography of notched sample fractures is also shown. A correlation has been found between the size of the h_s crack-effect zone in the notched sample, the material yield limit δ, and the cyclic fracture toughness characteristic K_s. Such a correlation widens the possibilities for quantitative diagnostics of fractures by the methods of X-ray fractography

  10. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial one. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
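
    A minimal sketch of how differential evolution can drive wavelength selection, assuming a calibration matrix X (spectra x wavelengths) and reference concentrations y; the PLS model, the sparsity penalty, and the DE settings are illustrative choices, not the parameters of the cited method:

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def de_wavelength_selection(X, y, sparsity_penalty=0.001, seed=0):
    """Select informative wavelengths with differential evolution (illustrative).

    Each candidate vector holds one value per wavelength; values above 0.5
    switch that wavelength on. The objective is the cross-validated RMSE of a
    small PLS model on the selected wavelengths, lightly penalised by the
    number of wavelengths kept.
    """
    n_wl = X.shape[1]

    def objective(weights):
        mask = weights > 0.5
        if mask.sum() < 2:                 # need at least a couple of variables
            return 1e6
        pls = PLSRegression(n_components=min(2, int(mask.sum())))
        scores = cross_val_score(pls, X[:, mask], y, cv=3,
                                 scoring="neg_root_mean_squared_error")
        return -scores.mean() + sparsity_penalty * mask.sum()

    result = differential_evolution(objective, bounds=[(0, 1)] * n_wl,
                                    maxiter=30, popsize=10, seed=seed,
                                    polish=False)
    return result.x > 0.5                  # boolean mask of selected wavelengths
```

    Thresholding the continuous DE vector at 0.5 is only one of several possible encodings for variable selection; it is used here because it keeps the sketch short.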

  11. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  12. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used to analyze the binary and quantitative traits, respectively. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because the two analyses are correlated, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
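
    The combination step described here can be sketched as follows; the permutation-based moment matching is one plausible way to obtain the Gamma null parameters and is an assumption of this illustration, not necessarily the authors' estimator:

```python
import numpy as np
from scipy import stats

def fisher_combined_p(p_binary, p_quant, null_T=None):
    """Combine two (possibly correlated) P-values with Fisher's method.

    T = -2 (log p1 + log p2). Under independence T ~ chi-square with 4 df;
    when the two tests are correlated, the null distribution of T is
    approximated here by a Gamma distribution fitted by moment matching to
    permutation replicates of T supplied in null_T. Illustrative sketch only.
    """
    T = -2.0 * (np.log(p_binary) + np.log(p_quant))
    if null_T is None:
        return stats.chi2.sf(T, df=4)          # independence fallback
    m, v = np.mean(null_T), np.var(null_T)
    shape, scale = m * m / v, v / m             # Gamma moment matching
    return stats.gamma.sf(T, a=shape, scale=scale)
```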

  13. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    Science.gov (United States)

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749±86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions. When the optimal visual grade was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of

  14. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-01-01

    -friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface...... such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical...... displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics...

  15. Limitations for qualitative and quantitative neutron activation analysis using reactor neutrons

    International Nuclear Information System (INIS)

    El-Abbady, W.H.; El-Tanahy, Z.H.; El-Hagg, A.A.; Hassan, A.M.

    1999-01-01

    In this work, the most important limitations for qualitative and quantitative analysis using reactor neutrons for activation are reviewed. Each limitation is discussed using different examples of activated samples. Photopeak estimation, interference from nuclear reactions, and neutron flux measurements are taken into consideration. Solutions for high-accuracy evaluation in neutron activation analysis applications are given. (author)

  16. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated with the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by the stress extent polar map was 95% in single vessel disease and 100% in double and triple vessel disease; overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  17. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample
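
    The energy-channel relation mentioned at the end is, in its simplest form, a low-order polynomial fitted to calibration peaks; a small sketch with made-up channel positions and illustrative gamma-line energies (not values from the report):

```python
import numpy as np

# Hypothetical calibration peaks from a standard sample: channel positions and
# known gamma-ray energies in keV; both columns are illustrative values only.
channels = np.array([244.0, 689.0, 1558.0, 1928.0])
energies = np.array([121.8, 344.3, 778.9, 964.1])

# A first-degree fit is often enough; degree 2 or 3 can absorb detector non-linearity.
coeffs = np.polyfit(channels, energies, deg=1)
energy_of = np.poly1d(coeffs)

print(energy_of(900.0))   # energy assigned to an unknown peak found at channel 900
```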

  18. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    A quantitative method of X-ray diffraction phase analysis of building materials, using an internal standard, is presented. The errors involved in determining the content of particular phases are also given. (author)
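
    The internal standard method referred to here normally rests on a linear relation between intensity ratios and weight-fraction ratios; in standard notation (not reproduced from the paper):

```latex
\frac{I_{i}}{I_{s}} \;=\; k \,\frac{w_{i}}{w_{s}}
```

    where I_i and I_s are the integrated intensities of an analyte reflection and an internal-standard reflection, w_i and w_s the corresponding weight fractions, and k a constant determined from calibration mixtures of known composition; with w_s fixed by the amount of added standard, w_i follows directly from the measured intensity ratio.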

  19. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting insight into the women's perspective. In this mixed-methods study, we used qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean of 2 children, 82% with at least some college education, and 56% with a previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to the organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  20. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixed samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In this overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the use of partially coherent light and multi-wavelength approaches is discussed. Finally, the potential of digital holographic microscopy for quantitative cell imaging is illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights into dynamic cell biology, with applications in cancer research and for drug and toxicity testing.

  1. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    Science.gov (United States)

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  2. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  3. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    Science.gov (United States)

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectral lines follow approximately the same trend as the temperature changes; therefore, signal-to-background ratio (S/B) measurement and regression analysis can compensate for the spectral line intensity changes caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method could improve the stability and accuracy of quantitative analysis by LIBS; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. The data fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectrum, etc., and provides a data processing reference for real-time online LIBS quantitative analysis.
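
    A compact sketch of the two ingredients named here, the signal-to-background feature and an SVM regression calibration; the window indices, kernel and hyperparameters are placeholders of this sketch, not values from the paper:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def signal_to_background(spectrum, peak_idx, bg_idx):
    """Signal-to-background ratio of one emission line.

    peak_idx and bg_idx are index windows for the characteristic line and a
    nearby line-free region; both windows are assumptions of this sketch.
    """
    return spectrum[peak_idx].mean() / spectrum[bg_idx].mean()

def fit_sb_calibration(sb_values, concentrations):
    """Fit an SVM regression from S/B features to known concentrations."""
    model = make_pipeline(StandardScaler(),
                          SVR(kernel="rbf", C=10.0, epsilon=0.01))
    model.fit(np.asarray(sb_values).reshape(-1, 1), concentrations)
    return model
```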

  4. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  5. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strongly dielectric ceramic with piezoelectric and pyroelectric properties, and it is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put into practical use. The characteristics can be set by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far have their respective demerits; therefore, the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, which has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate, and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  6. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai; Tian, Yuan; Liu, Tao; Thomas, Stefani N.; Chen, Li; Schnaubelt, Michael; Boja, Emily; Hiltket, Tara; Kinsinger, Christopher; Rodriguez, Henry; Davies, Sherri; Li, Shunqiang; Snider, Jacqueline E.; Erdmann-Gilmore, Petra; Tabb, David L.; Townsend, Reid; Ellis, Matthew; Rodland, Karin D.; Smith, Richard D.; Carr, Steven A.; Zhang, Zhen; Chan, Daniel W.; Zhang, Hui

    2017-09-21

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  7. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% of Fe2O3. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easier to prepare. Pressed samples of 0.150 g/cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be calculated conveniently by formula; in an unknown sample, however, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content in these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of elements in any sample by the X-ray fluorescence technique
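
    One common way of writing the correction described here (a sketch in standard notation, not necessarily the author's exact formulation) is:

```latex
A \;=\; \frac{1 - e^{-\chi \rho d}}{\chi \rho d},
\qquad
\chi \rho d \;=\; -\ln\!\left( \frac{I_{T}}{I_{0}} \right)
```

    where χ is the total mass absorption coefficient for the incident and fluorescent radiation, ρd the mass per unit area of the pressed sample, and I_T and I_0 the intensities of a target line measured through the sample and without it in the emission-transmission experiment; dividing the measured FeKα intensity by A gives the corrected intensity.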

  8. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code, designated UNRAC (UNReliability Analysis Code), calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as a point estimate and/or in time-dependent form; (3) the quantitative importance of each component involved; and (4) an error bound on the top event unavailability. UNRAC can analyze fault trees with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code was benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall, it is demonstrated that UNRAC is an efficient, easy-to-use code with the advantage of being able to do a complete fault tree analysis with a single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered
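
    To illustrate the kind of quantification such a code performs, the sketch below evaluates the familiar min-cut upper bound on top-event unavailability from a list of minimal cut sets; it is a generic textbook formula, not the UNRAC algorithm itself, and the example data are invented:

```python
from functools import reduce

def cut_set_unavailability(cut_set, q):
    """Unavailability of one minimal cut set: product of its component unavailabilities."""
    return reduce(lambda acc, comp: acc * q[comp], cut_set, 1.0)

def top_event_unavailability(cut_sets, q):
    """Min-cut upper bound on the top-event unavailability.

    cut_sets: iterable of component-name tuples (the minimal cut sets)
    q:        dict mapping component names to unavailabilities
    """
    prod = 1.0
    for cs in cut_sets:
        prod *= 1.0 - cut_set_unavailability(cs, q)
    return 1.0 - prod

# Example with two cut sets {A, B} and {C}
q = {"A": 1e-3, "B": 2e-3, "C": 5e-5}
print(top_event_unavailability([("A", "B"), ("C",)], q))
```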

  9. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    Science.gov (United States)

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
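
    One of the quantification strategies such tools commonly implement is relative quantification by the delta-delta-Ct method; a minimal sketch with invented Ct values and the simplifying assumption of perfect, equal amplification efficiencies:

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control,
                        efficiency=2.0):
    """Relative quantification by the delta-delta-Ct method (illustrative).

    Assumes equal (and here perfect, efficiency = 2) amplification efficiency
    for the target and reference assays.
    """
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return efficiency ** (-dd_ct)

# Invented T values: roughly a 4-fold up-regulation relative to the control
print(relative_expression(24.1, 18.0, 26.3, 18.2))
```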

  10. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  11. Visualization and quantitative analysis of the CSF pulsatile flow with cine MR phase imaging

    International Nuclear Information System (INIS)

    Katayama, Shinji; Itoh, Takahiko; Kinugasa, Kazushi; Asari, Shoji; Nishimoto, Akira; Tsuchida, Shohei; Ono, Atsushi; Ikezaki, Yoshikazu; Yoshitome, Eiji.

    1991-01-01

    The visualization and the quantitative analysis of the CSF pulsatile flow were performed on ten healthy volunteers with cine MR phase imaging, a combination of the phase-contrast technique and the cardiac-gating technique. The velocities appropriate for the visualization and the quantitative analysis of the CSF pulsatile flow were from 6.0 cm/sec to 15.0 cm/sec. The applicability of this method for quantitative analysis was proven with a steady-flow phantom. Phase images clearly demonstrated a to-and-fro motion of the CSF flow in the anterior subarachnoid space and in the posterior subarachnoid space. The flow pattern of CSF in healthy volunteers depends on the cardiac cycle. In the anterior subarachnoid space, the cephalic CSF flow continued until a 70-msec delay after the R-wave of the ECG and then reversed to caudal. At 130-190 msec, the caudal CSF flow reached its maximum velocity; thereafter it reversed again to cephalic. The same alternation appeared later in the cycle, but with decreased amplitude: the cephalic flow peaked at 370-430 msec, while the caudal flow peaked at 490-550 msec. The flow pattern of the CSF flow in the posterior subarachnoid space was almost identical to that in the anterior subarachnoid space. Cine MR phase imaging is thus useful for the visualization and the quantitative analysis of the CSF pulsatile flow. (author)
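
    For orientation, the velocity encoding that such phase imaging relies on follows the standard phase-contrast relation; the formulation below is generic and is not quoted from the paper:

```latex
v \;=\; v_{\mathrm{enc}} \,\frac{\Delta\varphi}{\pi},
\qquad
Q \;=\; \sum_{j \in \mathrm{ROI}} v_{j}\, A_{\mathrm{pixel}}
```

    where Δφ is the measured phase shift in a pixel (between -π and π), v_enc the chosen velocity-encoding limit (here on the order of 6-15 cm/sec), and Q the volumetric CSF flow obtained by summing pixel velocities over the region of interest.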

  12. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    Science.gov (United States)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned with a spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V=0.88 and RMSEV=1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes of wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.

  13. Quantitative analysis of food and feed samples with droplet digital PCR.

    Directory of Open Access Journals (Sweden)

    Dany Morisset

    Full Text Available In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.
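
    The absolute quantification behind ddPCR is a Poisson correction of the positive-droplet fraction; the sketch below uses invented droplet counts and an assumed typical droplet volume simply to show the arithmetic:

```python
import math

def copies_per_microlitre(n_positive, n_total, droplet_volume_nl=0.85):
    """Absolute target concentration from droplet counts (Poisson correction).

    lambda = -ln(1 - p) is the mean number of copies per droplet, where p is
    the fraction of positive droplets; dividing by the droplet volume gives
    copies per microlitre. The droplet volume is an assumed typical value.
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / (droplet_volume_nl * 1e-3)   # convert nl to ul

# GM content expressed as the ratio of transgene to reference-gene copies
mon810 = copies_per_microlitre(1200, 15000)
hmg = copies_per_microlitre(14000, 15000)
print(mon810 / hmg)
```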

  14. Quantitative evaluation of SIMS spectra including spectrum interpretation and Saha-Eggert correction

    International Nuclear Information System (INIS)

    Steiger, W.; Ruedenauer, F.G.

    1978-01-01

    A spectrum identification program is described that uses a computer algorithm relying solely on the natural isotopic abundances for the identification of elemental, molecular and cluster ions. The thermodynamic approach to the quantitative interpretation of SIMS spectra, through the use of the Saha-Eggert equation, is discussed, and a computer program is outlined. (U.K.)
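
    For reference, the Saha-Eggert ionisation equation underlying this thermodynamic (local thermal equilibrium) approach is usually written as follows; the notation is the standard one and is not taken from the paper:

```latex
\frac{n^{+} n_{e}}{n^{0}}
  \;=\;
\frac{2\,Z^{+}(T)}{Z^{0}(T)}
\left( \frac{2\pi m_{e} k T}{h^{2}} \right)^{3/2}
\exp\!\left( -\frac{E_{i}}{kT} \right)
```

    where n^+, n^0 and n_e are the densities of singly charged ions, neutrals and electrons, Z^+ and Z^0 the corresponding partition functions, E_i the ionisation energy, and T the plasma temperature.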

  15. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    ... are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes ... but also when wading through a flooded area. The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused ...

  16. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    Science.gov (United States)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input and processing than output level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach and has the potential to depict textbook profiles in a more reliable way, complementing the commonly employed qualitative procedures. Implications suggest that further work in this line is needed on calibrating the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, the next step of research may concern the analysis of science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate a possible progression.

  17. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  18. Qualitative and quantitative analysis of anthraquinones in rhubarbs by high performance liquid chromatography with diode array detector and mass spectrometry.

    Science.gov (United States)

    Wei, Shao-yin; Yao, Wen-xin; Ji, Wen-yuan; Wei, Jia-qi; Peng, Shi-qi

    2013-12-01

    Rhubarb is well known in traditional Chinese medicines (TCMs), mainly for its effective purgative activity. Anthraquinones, including anthraquinone derivatives and their glycosides, are thought to be the major active components in rhubarb. To improve the quality control of rhubarb, we studied the extraction method and performed qualitative and quantitative analyses of the widely used rhubarbs, Rheum tanguticum Maxim. ex Balf. and Rheum palmatum L., by HPLC-photodiode array detection (HPLC-DAD) and HPLC-mass spectrometry (HPLC-MS) on a Waters SymmetryShield RP18 column (250 mm × 4.6 mm i.d., 5 μm). The total amount of five anthraquinones was used as the evaluation criterion, and a standardized characteristic fingerprint of rhubarb was provided. The quantitative analysis supported the traditional practice of using these two species of rhubarb interchangeably. With modern extraction methods, the amount of the five anthraquinones in Rheum tanguticum Maxim. ex Balf. is higher than that in Rheum palmatum L. Among the extraction methods examined, ultrasonication with 70% methanol for 30 min is the most promising. For HPLC analysis, a mobile phase of methanol and 0.1% phosphoric acid in water delivered with a gradient program, with detection wavelengths of 280 nm for fingerprint analysis and 254 nm for quantitative analysis, proved suitable. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Examination of quantitative accuracy of PIXE analysis for atmospheric aerosol particle samples. PIXE analysis of NIST air particulate on filter media

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Sera, Koichiro

    2005-01-01

    In order to confirm the accuracy of direct PIXE analysis of atmospheric aerosol particles collected on a polycarbonate membrane filter, we carried out PIXE analysis of a National Institute of Standards and Technology (NIST, USA) air particulate on filter media standard (SRM 2783). For the 16 elements with NIST certified values determined by PIXE analysis - Na, Mg, Al, Si, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn and Pb - the quantitative values were 80-110% of the certified values, except for Na, Al, Si and Ni. The quantitative values of Na, Al and Si were 140-170% of the certified values, all high, while Ni was 64%. One possible reason why the quantitative values of Na, Al and Si were higher than the NIST certified values could be a difference in the X-ray spectrum analysis method used. (author)
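
    The recovery figures quoted above are simple ratios of measured to certified values; the short sketch below illustrates the calculation in Python with purely hypothetical element values rather than the actual SRM 2783 certificate data, applying the 80-110% window mentioned in the abstract as an acceptance flag.

      # Recovery of PIXE results against certified reference values (hypothetical numbers).
      certified = {"S": 1050.0, "Fe": 26500.0, "Zn": 1790.0, "Na": 1860.0}   # ng per filter, illustrative only
      measured = {"S": 1010.0, "Fe": 25100.0, "Zn": 1850.0, "Na": 2700.0}    # PIXE results, illustrative only

      for element, cert in certified.items():
          recovery = 100.0 * measured[element] / cert
          flag = "ok" if 80.0 <= recovery <= 110.0 else "check"
          print(f"{element:2s}: {recovery:5.1f}% of certified value ({flag})")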

  20. ANSI/ASHRAE/IESNA Standard 90.1-2007 Final Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Richman, Eric E.; Winiarski, David W.

    2011-05-01

    The United States (U.S.) Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2007 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2004. The final analysis considered each of the 44 addenda to ANSI/ASHRAE/IESNA Standard 90.1-2004 that were included in ANSI/ASHRAE/IESNA Standard 90.1-2007. All 44 addenda processed by ASHRAE in the creation of Standard 90.1-2007 from Standard 90.1-2004 were reviewed by DOE, and their combined impact on a suite of 15 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, out of the 44 addenda, 9 were preliminarily determined to have a measurable and quantifiable impact.

  1. Novel quantitative autophagy analysis by organelle flow cytometry after cell sonication.

    Directory of Open Access Journals (Sweden)

    Michael Degtyarev

    Full Text Available Autophagy is a dynamic process of bulk degradation of cellular proteins and organelles in lysosomes. Current methods of autophagy measurement include microscopy-based counting of autophagic vacuoles (AVs) in cells. We have developed a novel method to quantitatively analyze individual AVs using flow cytometry. This method, OFACS (organelle flow after cell sonication), takes advantage of efficient cell disruption with a brief sonication, generating cell homogenates with fluorescently labeled AVs that retain their integrity as confirmed with light and electron microscopy analysis. These AVs could be detected directly in the sonicated cell homogenates on a flow cytometer as a distinct population of expected organelle size on a cytometry plot. Treatment of cells with inhibitors of autophagic flux, such as chloroquine or lysosomal protease inhibitors, increased the number of particles in this population under autophagy-inducing conditions, while inhibition of autophagy induction with 3-methyladenine or knockdown of ATG proteins prevented this accumulation. This assay can be easily performed in a high-throughput format and opens up previously unexplored avenues for autophagy analysis.

  2. Quantitative analysis of glycated albumin in serum based on ATR-FTIR spectrum combined with SiPLS and SVM.

    Science.gov (United States)

    Li, Yuanpeng; Li, Fucui; Yang, Xinhao; Guo, Liu; Huang, Furong; Chen, Zhenqiang; Chen, Xingdan; Zheng, Shifu

    2018-08-05

    A rapid quantitative analysis model for determining the glycated albumin (GA) content, based on attenuated total reflectance (ATR)-Fourier transform infrared spectroscopy (FTIR) combined with linear SiPLS and nonlinear SVM, has been developed. First, the reference GA content in human serum was determined by an enzymatic method, and the ATR-FTIR spectra of serum samples from a health-examination population were acquired. Spectral data from the whole mid-infrared region (4000-600 cm-1) and from GA's characteristic region (1800-800 cm-1) were used for quantitative analysis. Second, several preprocessing steps, including first derivative, second derivative, variable standardization and spectral normalization, were performed. Last, quantitative regression models were established using SiPLS and SVM respectively. The SiPLS modeling results are as follows: root mean square error of cross-validation (RMSECV) = 0.523 g/L, calibration coefficient (Rc) = 0.937, root mean square error of prediction (RMSEP) = 0.787 g/L, and prediction coefficient (Rp) = 0.938. The SVM modeling results are as follows: RMSECV = 0.0048 g/L, Rc = 0.998, RMSEP = 0.442 g/L, and Rp = 0.916. The results indicated that model performance improved significantly after preprocessing and optimization of the characteristic regions, and that the modeling performance of nonlinear SVM was considerably better than that of linear SiPLS. Hence, the quantitative analysis model for GA in human serum based on ATR-FTIR combined with SiPLS and SVM is effective. It requires no sample pretreatment, is simple to operate and time-efficient, and provides a rapid and accurate method for GA content determination. Copyright © 2018 Elsevier B.V. All rights reserved.
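
    As a rough illustration of the modelling strategy described above (a linear PLS-type calibration versus a nonlinear SVM regression on preprocessed spectra), the sketch below uses scikit-learn on synthetic data. It substitutes ordinary PLS for interval-synergy PLS (SiPLS), and every array, parameter value and error figure is a placeholder rather than anything taken from the study.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.svm import SVR
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 300))                                    # synthetic "spectra": 60 samples x 300 wavenumbers
      y = X[:, 40] - 0.5 * X[:, 120] + rng.normal(scale=0.1, size=60)   # synthetic GA content

      # Linear model (plain PLS stands in for SiPLS here)
      pls = PLSRegression(n_components=5)
      y_pls = cross_val_predict(pls, X, y, cv=10).ravel()

      # Nonlinear model (epsilon-SVR with RBF kernel)
      svr = SVR(kernel="rbf", C=10.0, epsilon=0.01)
      y_svr = cross_val_predict(svr, X, y, cv=10)

      for name, pred in [("PLS", y_pls), ("SVR", y_svr)]:
          rmsecv = mean_squared_error(y, pred) ** 0.5
          print(f"{name}: RMSECV = {rmsecv:.3f}")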

  3. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and is widely applied to the study of catalysts. In this article, we show that a quantitative approach based upon the fundamental expression of the XPS signal has enabled us to obtain a consistent set of response factors for the elements of the periodic table. Careful preliminary work was necessary to determine precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables routine quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for this type of analysis. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the degree of molybdenum sulphidation reliably and reproducibly. The use of this method is illustrated by two examples in which XPS provided information sufficiently accurate and quantitative to help explain the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
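
    The quantitative approach described, converting measured peak areas into surface composition through element-specific response (sensitivity) factors, reduces in its simplest homogeneous-sample form to normalising area-to-factor ratios. The sketch below shows that textbook calculation with invented peak areas and factors; it is not IFP's calibrated response-factor set, and the S/Mo ratio at the end is only one possible way to follow the molybdenum sulphidation degree.

      # First-order XPS quantification: atomic fraction x_i = (I_i / S_i) / sum_j (I_j / S_j)
      # Peak areas and relative sensitivity factors below are illustrative only.
      peaks = {
          "Mo 3d": {"area": 152000.0, "rsf": 9.5},
          "S 2p":  {"area": 41000.0,  "rsf": 1.7},
          "Al 2p": {"area": 98000.0,  "rsf": 0.54},
          "O 1s":  {"area": 520000.0, "rsf": 2.9},
      }

      ratios = {name: p["area"] / p["rsf"] for name, p in peaks.items()}
      total = sum(ratios.values())
      for name, r in ratios.items():
          print(f"{name}: {100.0 * r / total:5.1f} at.%")

      # The sulphidation degree of molybdenum could then be followed, e.g., via the S/Mo atomic ratio.
      print("S/Mo atomic ratio:", round(ratios["S 2p"] / ratios["Mo 3d"], 2))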

  4. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    van Willigen, J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the “nonwetting” properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The

  5. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  6. Complex pedigree analysis to detect quantitative trait loci in dairy cattle

    NARCIS (Netherlands)

    Bink, M.C.A.M.

    1998-01-01

    In dairy cattle, many quantitative traits of economic importance show phenotypic variation. For breeding purposes the analysis of this phenotypic variation and uncovering the contribution of genetic factors is very important. Usually, the individual gene effects contributing to the

  7. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    International Nuclear Information System (INIS)

    Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.

    2014-01-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and

  8. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    Energy Technology Data Exchange (ETDEWEB)

    Pořízka, P. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Demidov, A. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Kaiser, J. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Keivanian, J. [Institute for Mining, Technical University Clausthal, Erzstraße 18, 38678 Clausthal-Zellerfeld (Germany); Gornushkin, I. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Panne, U. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Chemistry Department, Humboldt Univerisät zu Berlin, Brook-Taylor-Straße 2, D-12489 Berlin (Germany); Riedel, J., E-mail: jens.riedel@bam.de [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany)

    2014-11-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and
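
    The workflow advocated here, classifying each spectrum into its matrix class from the whole spectrum before building a matrix-matched calibration for Cu, can be sketched with scikit-learn as below. The spectra, the use of k-means on the PCA scores, the cluster count and the calibration relation are all synthetic placeholders, not the study's data or exact procedure.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      # Synthetic stand-in for LIBS spectra of ore samples from two different matrices
      spectra = np.vstack([rng.normal(0.0, 1.0, (15, 500)),
                           rng.normal(0.6, 1.0, (12, 500))])
      cu_line_intensity = spectra[:, 250] + rng.normal(scale=0.05, size=27)   # "Cu I" line area
      cu_reference = 2.0 * cu_line_intensity + rng.normal(scale=0.1, size=27) # reference Cu content

      # 1) Classify matrices from the whole spectrum
      scores = PCA(n_components=3).fit_transform(spectra)
      matrix_label = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

      # 2) Matrix-matched calibration: one line-intensity vs. concentration fit per cluster
      for label in np.unique(matrix_label):
          idx = matrix_label == label
          x = cu_line_intensity[idx].reshape(-1, 1)
          fit = LinearRegression().fit(x, cu_reference[idx])
          r2 = fit.score(x, cu_reference[idx])
          print(f"cluster {label}: slope = {fit.coef_[0]:.2f}, R^2 = {r2:.3f}")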

  9. Public and patient involvement in quantitative health research: A statistical perspective.

    Science.gov (United States)

    Hannigan, Ailish

    2018-06-19

    The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.

  10. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    International Nuclear Information System (INIS)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr; Pontuschka, W M; Mamani, J B; Costa-Filho, A J; Vieira, E D

    2008-01-01

    The aim of this work is to provide a quantitative method for analysis of the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC 133), and thus to express the antigenic labeling evidence for the stem cells CD 133+. The FMR efficiency and sensitivity were proven adequate for detecting and quantifying the low amounts of iron content in the CD 133+ cells (~6.16 x 10^5 pg in the volume of 2 μl containing 4.5 x 10^11 SPION). The quantitative method led to the result of 1.70 x 10^-13 mol of Fe (9.5 pg), or 7.0 x 10^6 nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of iron oxide-labeled cells in order to ensure that the nanoparticles coupled to the antibodies are indeed tied to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of this method. The quantitative analysis by means of FMR is necessary for determining the signal intensity for the study of molecular imaging by means of magnetic resonance imaging (MRI)
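
    The figures reported above are internally consistent, and the conversion from moles of iron per cell to picograms and to a particle count is simple arithmetic. The sketch below reproduces it; the iron mass per individual nanoparticle is a derived, illustrative quantity rather than a value stated in the abstract.

      # Converting the reported iron content per cell into mass and an implied per-particle mass.
      M_FE = 55.845                # g/mol, molar mass of iron

      mol_fe_per_cell = 1.70e-13   # mol (reported)
      mass_fe_per_cell_pg = mol_fe_per_cell * M_FE * 1e12
      print(f"Fe per cell: {mass_fe_per_cell_pg:.1f} pg")                 # ~9.5 pg, as reported

      particles_per_cell = 7.0e6   # reported
      fe_per_particle_g = mass_fe_per_cell_pg * 1e-12 / particles_per_cell
      print(f"Implied Fe per nanoparticle: {fe_per_particle_g:.2e} g")    # ~1.4e-18 g (derived, not reported)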

  11. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    Science.gov (United States)

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and

  12. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  13. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    Science.gov (United States)

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology

  14. Quantitative twoplex glycan analysis using 12C6 and 13C6 stable isotope 2-aminobenzoic acid labelling and capillary electrophoresis mass spectrometry.

    Science.gov (United States)

    Váradi, Csaba; Mittermayr, Stefan; Millán-Martín, Silvia; Bones, Jonathan

    2016-12-01

    Capillary electrophoresis (CE) offers excellent efficiency and orthogonality to liquid chromatographic (LC) separations for oligosaccharide structural analysis. Combination of CE with high resolution mass spectrometry (MS) for glycan analysis remains a challenging task due to the MS incompatibility of background electrolyte buffers and additives commonly used in offline CE separations. Here, a novel method is presented for the analysis of 2-aminobenzoic acid (2-AA) labelled glycans by capillary electrophoresis coupled to mass spectrometry (CE-MS). To ensure maximum resolution and excellent precision without the requirement for excessive analysis times, CE separation conditions including the concentration and pH of the background electrolyte, the effect of applied pressure on the capillary inlet and the capillary length were evaluated. Using readily available 12C6 and 13C6 stable isotopologues of 2-AA, the developed method can be applied for quantitative glycan profiling in a twoplex manner based on the generation of extracted ion electropherograms (EIE) for 12C6 'light' and 13C6 'heavy' 2-AA labelled glycan isotope clusters. The twoplex quantitative CE-MS glycan analysis platform is ideally suited for comparability assessment of biopharmaceuticals, such as monoclonal antibodies, for differential glycomic analysis of clinical material for potential biomarker discovery or for quantitative microheterogeneity analysis of different glycosylation sites within a glycoprotein. Additionally, due to the low injection volume requirements of CE, subsequent LC-MS analysis of the same sample can be performed facilitating the use of orthogonal separation techniques for structural elucidation or verification of quantitative performance.
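
    In a twoplex experiment of this kind, relative quantification ultimately comes down to integrating the extracted ion electropherogram (EIE) of each glycan in the light and heavy channels and taking the ratio of the areas. The sketch below shows that final step on synthetic EIE traces; all peak shapes, migration times and integration windows are invented for illustration.

      import numpy as np

      def eie_peak_area(time, intensity, t_start, t_end):
          """Integrate an extracted ion electropherogram between two migration times."""
          window = (time >= t_start) & (time <= t_end)
          return np.trapz(intensity[window], time[window])

      # Synthetic light (12C6) and heavy (13C6) 2-AA channels for one co-migrating glycan
      time = np.linspace(0, 20, 2000)                         # minutes
      light = 1.0e5 * np.exp(-((time - 12.3) / 0.08) ** 2)    # sample A
      heavy = 1.4e5 * np.exp(-((time - 12.3) / 0.08) ** 2)    # sample B

      area_light = eie_peak_area(time, light, 12.0, 12.6)
      area_heavy = eie_peak_area(time, heavy, 12.0, 12.6)
      print(f"light/heavy ratio: {area_light / area_heavy:.2f}")   # relative abundance of the glycan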

  15. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  16. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied, and the method and its limitations were compared with the atomic absorption method. The boiling time required to decompose excess hydrogen peroxide and to reduce tellurium from the hexavalent to the tetravalent state was examined; the decomposition was faster in alkaline solution, where constant absorbance was reached after 30 minutes, compared with 40 minutes in acid solution. A procedure for analyzing samples containing less than 5 ppm tellurium was also studied. The experiments showed that samples with very small amounts of tellurium can be analyzed if the sample solution is divided into one-gram portions and each portion is concentrated by extraction, because it is difficult to treat several grams of sample at one time. The method is also suitable for the quantitative analysis of selenium. It showed good recovery of added tellurium and reproducibility within a relative error of 5%. Comparison of the calibration curve of the tellurium(IV) standard solution reacted with bismuthiol-2 and the calibration curve obtained after extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results of the bismuthiol-2 method and the atomic absorption method agreed well for the same samples. (Iwakiri, K.)

  17. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress that amalgamates numerous stress conditions, resulting in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19), under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, our analysis predicted alkaline and high-pressure stress as the optimal preconditioning toward drying. Taken together, we believe the presented strategy may serve as a prototypic example of the utility of quantitative-mass-spectrometry-based proteomics for studying bacterial physiology.

  18. Global analysis of the yeast osmotic stress response by quantitative proteomics

    DEFF Research Database (Denmark)

    Soufi, Boumediene; Kelstrup, C.D.; Stoehr, G.

    2009-01-01

    a comprehensive, quantitative, and time-resolved analysis using high-resolution mass spectrometry of phospho-proteome and proteome changes in response to osmotic stress in yeast. We identified 5534 unique phosphopeptide variants and 3383 yeast proteins. More than 15% of the detected phosphorylation site status...... changed more than two-fold within 5 minutes of treatment. Many of the corresponding phosphoproteins are involved in the early response to environmental stress. Surprisingly, we find that 158 regulated phosphorylation sites are potential substrates of basophilic kinases as opposed to the classical proline......-directed MAP kinase network implicated in stress response mechanisms such as p38 and HOG pathways. Proteome changes reveal an increase in abundance of more than one hundred proteins after 20 min of salt stress. Many of these are involved in the cellular response to increased osmolarity, which include proteins...

  19. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    Science.gov (United States)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method for quantitatively analyzing the concentration of elements in a sample under local thermodynamic equilibrium conditions without the use of matrix-matched calibration standards. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. The TiO2 powder sample was mixed with polyvinyl alcohol and formed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The spectrum of the plasma was recorded using a spectrophotometer and then compared with NIST spectral line data to determine energy levels and other parameters. The plasma temperature obtained from the Boltzmann plot is 8127.29 K and the calculated electron density is 2.49 × 10^16 cm^-3. Finally, the concentration of Ti in the TiO2 sample obtained in this study is 97%, which is in close agreement with the sample certificate.
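
    The plasma temperature quoted above comes from a Boltzmann plot: for optically thin lines of one species, ln(I*lambda/(gA)) plotted against the upper-level energy is linear with slope -1/(kB*T). The sketch below fits such a plot from a small table of line data; the intensities and line constants are placeholders standing in for Ti I data from the NIST database, not the study's measurements.

      import numpy as np

      K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

      # Columns: wavelength (nm), measured intensity (a.u.), g_k * A_ki (1/s), E_upper (eV)
      # Values are placeholders standing in for Ti I line data.
      lines = np.array([
          [453.3, 1250.0, 3.9e8, 3.58],
          [498.2, 2100.0, 5.3e8, 3.34],
          [521.0, 1650.0, 2.8e8, 3.20],
          [365.4,  480.0, 6.1e8, 4.30],
      ])
      wl, intensity, gA, e_upper = lines.T

      y = np.log(intensity * wl / gA)            # ln(I*lambda / gA)
      slope, intercept = np.polyfit(e_upper, y, 1)
      temperature = -1.0 / (slope * K_B_EV)      # slope = -1/(kB*T)
      print(f"Boltzmann-plot temperature: {temperature:.0f} K")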

  20. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent-time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability of normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: Tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  1. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  2. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Graphical abstract: -- Highlights: •Quantitative image analysis shows potential to monitor activated sludge systems. •Staining techniques increase the potential for detection of operational problems. •Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  3. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  4. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  5. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and the aluminum sheet. The diffraction patterns of the samples were measured by neutron diffraction and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)
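
    In Rietveld-based quantitative phase analysis, the refined scale factor of each phase is usually converted to a weight fraction through the Hill and Howard relation W_i = S_i(ZMV)_i / sum_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass and V the unit-cell volume. The sketch below applies this relation to a hypothetical zirconium/aluminium refinement; the scale factors are invented, while Z, M and V are approximate literature values.

      # Weight fractions from Rietveld scale factors: W_i = S_i*(Z*M*V)_i / sum_j S_j*(Z*M*V)_j
      phases = {
          #            scale S (refined, hypothetical)  Z    M (g/mol)    V (A^3, approx.)
          "Zr (hcp)": {"S": 3.2e-4,                     "Z": 2, "M": 91.224, "V": 46.6},
          "Al (fcc)": {"S": 8.9e-4,                     "Z": 4, "M": 26.982, "V": 66.4},
      }

      szmv = {name: p["S"] * p["Z"] * p["M"] * p["V"] for name, p in phases.items()}
      total = sum(szmv.values())
      for name, value in szmv.items():
          print(f"{name}: {100.0 * value / total:5.1f} wt%")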

  6. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analysed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT) and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here capture this relationship in approximate form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports the conclusion that each location has contributed uniformly to the development of the automotive component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology and marketing, or of net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
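
    The nonparametric tests used in this study are available in SciPy; the sketch below shows how a Kruskal-Wallis test across the three location clusters and a Friedman test across related cost measures would be run. All figures are invented placeholders, not survey results.

      from scipy import stats

      # Hypothetical net-profit figures from firms in the three industrial estates
      ambattur        = [12.1, 9.8, 15.3, 11.0, 13.4]
      thirumalizai    = [10.5, 14.2, 12.9, 9.9, 11.8]
      thirumudivakkam = [13.0, 10.7, 12.2, 14.8, 9.5]

      h_stat, p_kw = stats.kruskal(ambattur, thirumalizai, thirumudivakkam)
      print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")   # p > 0.05 -> no location effect

      # Hypothetical cost scores for the same five firms on three related measures
      production     = [3.1, 2.8, 3.4, 2.9, 3.2]
      infrastructure = [3.0, 2.9, 3.3, 3.0, 3.1]
      marketing      = [3.2, 2.7, 3.5, 2.8, 3.3]

      chi2, p_fr = stats.friedmanchisquare(production, infrastructure, marketing)
      print(f"Friedman: chi2 = {chi2:.2f}, p = {p_fr:.3f}")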

  7. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    Science.gov (United States)

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in

  8. Three-way methods for the analysis of qualitative and quantitative two-way data.

    NARCIS (Netherlands)

    Kiers, Hendrik Albert Lambertus

    1989-01-01

    A problem often occurring in exploratory data analysis is how to summarize large numbers of variables in terms of a smaller number of dimensions. When the variables are quantitative, one may resort to Principal Components Analysis (PCA). When qualitative (categorical) variables are involved, one may

  9. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate—in a first phase—an instrument for quantitative measurement and to understand—in a second phase—clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support, making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  10. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were portal film, the Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with offset values greater than 1 mm were identified. In addition, when the developed method was compared with the previously studied one, the data obtained were very close, with a maximum percentage deviation of 32.5%, demonstrating its efficacy in reducing dependence on the operator. Conclusion: The results show that the method is reproducible and practical, which is one of the fundamental factors for its implementation. (author)
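
    The quantity extracted from each portal film in a Winston-Lutz analysis is the displacement between the centre of the radiation field and the projected mechanical isocentre marked by the ball bearing. The sketch below computes that deviation from already-digitised centre coordinates; the coordinates, pixel size and gantry/couch labels are invented for illustration and do not reproduce the image-processing steps of the method described.

      import math

      def isocenter_deviation(field_center, ball_center, pixel_size_mm=0.5):
          """2-D distance (mm) between radiation field centre and ball-bearing centre."""
          dx = (field_center[0] - ball_center[0]) * pixel_size_mm
          dy = (field_center[1] - ball_center[1]) * pixel_size_mm
          return math.hypot(dx, dy)

      # Hypothetical centre coordinates (pixels) for four gantry/couch combinations
      films = {
          "G0-C0":   {"field": (512.4, 510.8), "ball": (511.6, 511.9)},
          "G90-C0":  {"field": (513.9, 509.2), "ball": (511.5, 511.6)},
          "G180-C0": {"field": (511.1, 512.3), "ball": (511.8, 511.7)},
          "G270-C0": {"field": (510.2, 513.0), "ball": (511.7, 511.5)},
      }

      for name, centres in films.items():
          dev = isocenter_deviation(centres["field"], centres["ball"])
          print(f"{name}: deviation = {dev:.2f} mm {'(>1 mm!)' if dev > 1.0 else ''}")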

  11. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  12. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    Science.gov (United States)

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.
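
    Two of the indices used in this study, the myocardial perfusion reserve (MPR, the ratio of stress to rest perfusion) and the endocardial-to-epicardial ratio, are simple ratios once voxel-wise flow estimates are available. The sketch below computes them per segment from hypothetical flow values; the deconvolution step itself is not reproduced, and the abnormality cut-off is illustrative rather than the study's criterion.

      # Hypothetical per-segment perfusion estimates (ml/g/min) after voxel-wise deconvolution
      segments = {
          #                  stress  rest  stress_endo  stress_epi
          "basal anterior": (3.4,    1.0,  3.2,         3.6),
          "mid inferior":   (1.3,    0.9,  1.0,         1.6),   # blunted stress response
          "apical lateral": (2.9,    1.1,  2.8,         3.0),
      }

      for name, (stress, rest, endo, epi) in segments.items():
          mpr = stress / rest               # myocardial perfusion reserve
          endo_epi = endo / epi             # transmural gradient at stress (rest assumed uniform)
          flag = "abnormal" if mpr < 2.0 else "normal"   # illustrative cut-off, not the study's
          print(f"{name:15s} MPR = {mpr:.2f}, endo/epi = {endo_epi:.2f} -> {flag}")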

  13. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    Science.gov (United States)

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
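
    Steps 3-6 of the framework amount to scoring each study for quality and relevance on the predefined 0-4 sheets, combining the scores within each line of evidence and weighting the LoEs to obtain an overall strength of evidence. The small sketch below shows one plausible aggregation (averaging quality-times-relevance products and a weighted sum); the aggregation rule, LoE names, weights and scores are assumptions for illustration, not the scheme prescribed by the authors.

      # One possible aggregation of QWoE scores (0-4 scales); weights and scores are illustrative only.
      lines_of_evidence = {
          "in vitro 'omics": {"weight": 0.3, "studies": [(3, 2), (4, 3), (2, 2)]},  # (quality, relevance)
          "in vivo apical":  {"weight": 0.5, "studies": [(3, 4), (2, 3)]},
          "epidemiology":    {"weight": 0.2, "studies": [(1, 2)]},
      }

      overall = 0.0
      for name, loe in lines_of_evidence.items():
          # strength of a LoE: mean of quality*relevance, rescaled back to a 0-4 range
          strengths = [q * r / 4.0 for q, r in loe["studies"]]
          loe_strength = sum(strengths) / len(strengths)
          overall += loe["weight"] * loe_strength
          print(f"{name:18s} strength = {loe_strength:.2f} / 4")

      print(f"Overall strength of evidence: {overall:.2f} / 4")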

  14. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    Science.gov (United States)

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
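
    The agreement statistics used in this study are straightforward to reproduce: SciPy provides Kendall's tau for pairwise comparison of rankings, and Kendall's W (coefficient of concordance) can be computed directly from the rank matrix. The rankings below are invented for illustration.

      import numpy as np
      from scipy import stats

      # Hypothetical rankings of 8 transillumination images by image analysis and by three reviewers
      image_analysis = np.array([1, 2, 3, 4, 5, 6, 7, 8])
      reviewers = np.array([
          [1, 2, 4, 3, 5, 6, 7, 8],
          [2, 1, 3, 4, 5, 7, 6, 8],
          [1, 3, 2, 4, 6, 5, 7, 8],
      ])

      # Pairwise agreement of each reviewer with the quantitative ordering (Kendall's tau)
      for i, ranks in enumerate(reviewers, start=1):
          tau, _ = stats.kendalltau(image_analysis, ranks)
          print(f"reviewer {i} vs image analysis: tau = {tau:.2f}")

      # Inter-rater concordance (Kendall's W) over m reviewers and n images, no ties
      m, n = reviewers.shape
      rank_sums = reviewers.sum(axis=0)
      s = ((rank_sums - rank_sums.mean()) ** 2).sum()
      w = 12.0 * s / (m ** 2 * (n ** 3 - n))
      print(f"Kendall's W across reviewers: {w:.2f}")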

  15. Quantitative security analysis for programs with low input and noisy output

    NARCIS (Netherlands)

    Ngo, Minh Tri; Huisman, Marieke

    Classical quantitative information flow analysis often considers a system as an information-theoretic channel, where private data are the only inputs and public data are the outputs. However, for systems where an attacker is able to influence the initial values of public data, these should also be

  16. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation on the potentials of neutron diffraction to characterize archaeological bronze artifacts. The preliminary feasibility of phase and structural analysis was demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The Bragg peak widths that resulted were strictly dependent on the working treatment, thus providing an important analytical element to investigate ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some peculiar perspective of the neutron diffraction diagnostics in archeometric applications. (orig.)

  17. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr [Instituto Israelita de Ensino e Pesquisa Albert Einstein, IIEPAE, Sao Paulo 05651-901 (Brazil); Pontuschka, W M; Mamani, J B [Instituto de Fisica, Universidade de Sao Paulo, Sao Paulo 05315-970 (Brazil); Costa-Filho, A J; Vieira, E D [Instituto de Fisica de Sao Carlos, Universidade de Sao Paulo, Sao Carlos 13560-970 (Brazil)], E-mail: lgamarra@einstein.br

    2008-05-21

    The aim of this work is to provide a quantitative method for analysis of the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC133), thus providing evidence of antigenic labeling of CD133+ stem cells. The FMR efficiency and sensitivity proved adequate for detecting and quantifying the low iron content in the CD133+ cells (~6.16 × 10⁵ pg in a volume of 2 µl containing 4.5 × 10¹¹ SPION). The quantitative method yielded 1.70 × 10⁻¹³ mol of Fe (9.5 pg), or 7.0 × 10⁶ nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of the iron oxide-labeled cells, to ensure that the nanoparticles coupled to the antibodies are indeed bound to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of this method. The quantitative analysis by means of FMR is necessary for determining the signal intensity for the study of molecular imaging by magnetic resonance imaging (MRI)
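
    A quick cross-check of the reported numbers (values taken from the abstract; the molar mass of Fe and Avogadro's number are standard constants, and since the Fe content per nanoparticle is not stated, the per-particle figure is only implied):

```python
# Cross-check of the reported iron quantification. Values are from the
# abstract; molar mass of Fe and Avogadro's number are standard constants.
N_A = 6.022e23               # 1/mol
M_FE = 55.845                # g/mol
mol_fe_per_cell = 1.70e-13   # reported

pg_fe_per_cell = mol_fe_per_cell * M_FE * 1e12    # grams -> picograms
print(f"Fe per cell: {pg_fe_per_cell:.1f} pg")    # ~9.5 pg, as reported

# The Fe content per SPION is not stated, so the per-particle figure is only
# implied by the reported ~7.0e6 nanoparticles per cell.
particles_per_cell = 7.0e6
fe_atoms_per_particle = mol_fe_per_cell * N_A / particles_per_cell
print(f"implied Fe atoms per nanoparticle: {fe_atoms_per_particle:.2e}")
```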

  18. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  19. Quantitative analysis of rat Ig (sub)classes binding to cell surface antigens

    International Nuclear Information System (INIS)

    Nilsson, R.; Brodin, T.; Sjoegren, H.-O.

    1982-01-01

    An indirect 125 I-labeled protein A assay for detection of cell surface-bound rat immunoglobulins is presented. The assay is quantitative and rapid and detects as little as 1 ng of cell surface-bound Ig. It discriminates between antibodies belonging to different IgG subclasses, IgM and IgA. The authors describe the production and specificity control of the reagents used and show that the test can be used for quantitative analysis. A large number of sera from untreated rats are tested to evaluate the frequency of falsely positive responses and variation due to age, sex and strain of rat. With this test it is relatively easy to quantitate the binding of classes and subclasses of rat immunoglobulins in a small volume (6 μl) of untreated serum. (Auth.)

  20. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  1. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain
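
    The agreement reported between the two modalities amounts to a Pearson correlation on paired global strain values; a minimal sketch (the strain values below are invented, not the study data) is:

```python
# Hedged sketch: Pearson correlation between paired global strain values from
# the two modalities. The numbers are invented, not the study data.
from scipy.stats import pearsonr

echo_strain = [-18.2, -12.5, -9.8, -15.1, -20.3, -7.4]   # % global longitudinal strain
cct_strain = [-17.5, -13.0, -10.4, -14.2, -19.8, -8.1]
r, p = pearsonr(echo_strain, cct_strain)
print(f"r = {r:.2f}, p = {p:.3g}")
```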

  2. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Villot-Kadri, M.; Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Michel, K.; Bruyère, D.; Laperche, V. [BRGM, Service Métrologie, Monitoring et Analyse, 3 avenue Claude Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux1.fr [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France)

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited for non-linear and multivariate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input to the network is described in detail and, finally, prediction errors lower than 20% for aluminum, calcium, copper and iron demonstrate the good performance of artificial neural networks for on-site quantitative LIBS analysis of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not suitable. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.
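
    A minimal sketch of the kind of multivariate calibration described, with a small feed-forward network mapping selected LIBS line intensities to an element concentration. The synthetic data, network size and preprocessing are assumptions; the paper's actual architecture and spectra are not reproduced here.

```python
# Hedged sketch: neural-network calibration for LIBS-style data.
# All data are synthetic; layer size and scaling are illustrative choices.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((60, 8))                     # 8 selected line intensities, 60 spectra
y = 2.0 * X[:, 0] + 0.5 * X[:, 3] ** 2      # synthetic "concentration" target
y += 0.05 * rng.standard_normal(60)         # measurement noise

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

pred = model.predict(scaler.transform(X))
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.3f}")
```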

  3. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Villot-Kadri, M.; Ismaël, A.; Gallou, G.; Michel, K.; Bruyère, D.; Laperche, V.; Canioni, L.; Bousquet, B.

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited for non-linear and multivariate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input to the network is described in detail and, finally, prediction errors lower than 20% for aluminum, calcium, copper and iron demonstrate the good performance of artificial neural networks for on-site quantitative LIBS analysis of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not suitable. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced

  4. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent related to the failure reports and work order sheets at the reactor sites together with SKI's ''Safety Related Occurrences''. In general there has been an improvement compared to previous years. (Auth.)

  5. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  6. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  7. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
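
    The dimensionality-reduction step mentioned above (t-SNE on a couple of dozen biophysical phenotypes per cell) can be sketched as follows; the feature matrix here is random placeholder data, not QPM measurements.

```python
# Hedged sketch: t-SNE embedding of per-cell biophysical phenotype vectors.
# The 24 features below are random placeholders for the amplitude/phase-derived
# phenotypes described in the abstract.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
phenotypes = rng.standard_normal((500, 24))     # 500 cells x 24 phenotypes
embedding = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(phenotypes)
print(embedding.shape)                          # (500, 2) map for visual inspection
```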

  8. Attenuated total internal reflection Fourier transform infrared spectroscopy: a quantitative approach for kidney stone analysis.

    Science.gov (United States)

    Gulley-Stahl, Heather J; Haas, Jennifer A; Schmidt, Katherine A; Evan, Andrew P; Sommer, André J

    2009-07-01

    The impact of kidney stone disease is significant worldwide, yet methods for quantifying stone components remain limited. A new approach requiring minimal sample preparation for the quantitative analysis of kidney stone components has been investigated utilizing attenuated total internal reflection Fourier transform infrared spectroscopy (ATR-FT-IR). Calcium oxalate monohydrate (COM) and hydroxylapatite (HAP), two of the most common constituents of urinary stones, were used for quantitative analysis. Calibration curves were constructed using integrated band intensities of four infrared absorptions versus concentration (weight %). The correlation coefficients of the calibration curves range from 0.997 to 0.93. The limits of detection range from 0.07 +/- 0.02% COM/HAP where COM is the analyte and HAP is the matrix, to 0.26 +/- 0.07% HAP/COM where HAP is the analyte and COM is the matrix. This study shows that linear calibration curves can be generated for the quantitative analysis of stone mixtures provided the system is well understood especially with respect to particle size.
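
    A minimal sketch of building a calibration curve from integrated band intensities and estimating a detection limit; the concentrations, band areas and the 3-sigma convention below are illustrative assumptions, not the study's data.

```python
# Hedged sketch: linear calibration of integrated IR band area vs. weight %
# of one stone component in the other, plus a 3-sigma style detection limit.
import numpy as np

conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0])          # wt% COM in HAP (invented)
band_area = np.array([0.02, 0.55, 1.08, 2.61, 5.15])   # integrated absorbance (invented)

slope, intercept = np.polyfit(conc, band_area, 1)
residuals = band_area - (slope * conc + intercept)
s_res = residuals.std(ddof=2)          # residual SD used as a noise proxy
lod = 3 * s_res / slope
r = np.corrcoef(conc, band_area)[0, 1]
print(f"slope = {slope:.3f} per wt%, r = {r:.3f}, LOD(3s) = {lod:.2f} wt%")
```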

  9. Quantitative Risk Analysis of a Pervaporation Process for Concentrating Hydrogen Peroxide

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Ho Jin; Yoon, Ik Keun [Korea Gas Corporation, Ansan (Korea, Republic of); Choi, Soo Hyoung [Chonbuk National University, Jeonju (Korea, Republic of)

    2014-12-15

    Quantitative risk analysis has been performed for a pervaporation process for production of high test peroxide. Potential main accidents are explosion and fire caused by a decomposition reaction. As the target process has a laboratory scale, the consequence is considered to belong to Category 3. An event tree has been developed as a model for occurrence of a decomposition reaction in the target process. The probability functions of the accident causes have been established based on the frequency data of similar events. Using the constructed model, the failure rate has been calculated. The result indicates that additional safety devices are required in order to achieve an acceptable risk level, i.e. an accident frequency less than 10⁻⁴/yr. Therefore, a layer of protection analysis has been applied. As a result, it is suggested to introduce inherently safer design to avoid catalytic reaction, a safety instrumented function to prevent overheating, and a relief system that prevents explosion even if a decomposition reaction occurs. The proposed method is expected to contribute to developing safety management systems for various chemical processes including concentration of hydrogen peroxide.
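
    The arithmetic behind comparing an event-tree outcome frequency with the 10⁻⁴/yr target, and the effect of adding independent protection layers, can be sketched as follows; the initiating-event frequency and branch probabilities are invented for illustration, not taken from the study.

```python
# Hedged sketch of event-tree frequency arithmetic and a LOPA-style check.
# All numerical values are illustrative assumptions.
initiating_event = 1e-2       # /yr, e.g. overheating of the H2O2 concentrate
p_decomposition = 0.1         # probability overheating triggers decomposition
p_safeguards_fail = 0.5       # probability existing safeguards fail on demand

accident_freq = initiating_event * p_decomposition * p_safeguards_fail
target = 1e-4                 # acceptable accident frequency, /yr
verdict = "acceptable" if accident_freq < target else "additional protection layers needed"
print(f"accident frequency: {accident_freq:.1e} /yr -> {verdict}")

# Each added independent protection layer multiplies in its probability of
# failure on demand (PFD).
for pfd in (1e-1, 1e-2):
    mitigated = accident_freq * pfd
    print(f"with extra layer (PFD={pfd:.0e}): {mitigated:.1e} /yr")
```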

  10. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow 29 Si NMR is also presented. 10 references, 4 figures, 3 tables

  11. Evaluation of the extent of ground-glass opacity on high-resolution CT in patients with interstitial pneumonia associated with systemic sclerosis: Comparison between quantitative and qualitative analysis

    International Nuclear Information System (INIS)

    Yabuuchi, H.; Matsuo, Y.; Tsukamoto, H.; Horiuchi, T.; Sunami, S.; Kamitani, T.; Jinnouchi, M.; Nagao, M.; Akashi, K.; Honda, H.

    2014-01-01

    Aim: To verify whether quantitative analysis of the extent of ground-glass opacity (GGO) on high-resolution computed tomography (HRCT) could show a stronger correlation with the therapeutic response of interstitial pneumonia (IP) associated with systemic sclerosis (SSc) compared with qualitative analysis. Materials and methods: Seventeen patients with IP associated with SSc received autologous peripheral blood stem cell transplantation (auto-PBSCT) and were followed up using HRCT and pulmonary function tests. Two thoracic radiologists assessed the extent of GGO on HRCT using a workstation. Therapeutic effect was assessed using the change of vital capacity (VC) and diffusing capacity of the lung for carbon monoxide (DLco) before and 12 months after PBSCT. Interobserver agreement was assessed using Spearman's rank correlation coefficient and the Bland–Altman method. Correlation with the therapeutic response between quantitative and qualitative analysis was assessed with Pearson's correlation coefficients. Results: Spearman's rank correlation coefficient showed good agreement, but Bland–Altman plots showed that proportional error could be suspected. Quantitative analysis showed stronger correlation than the qualitative analysis based on the relationships between the change in extent of GGO and VC, and change in extent of GGO and DLco. Conclusion: Quantitative analysis of the change in extent of GGO showed stronger correlation with the therapeutic response of IP with SSc after auto-PBSCT than with the qualitative analysis. - Highlights: • Quantitative analysis of GGO in IP showed strong correlation with therapeutic effect. • Qualitative analysis might be limited by interobserver variance. • Other parameters including reticular opacities remain in a future investigation

  12. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By application of Fourier's analysis, criteria are to be attained in addition for evaluation of the volume curve as a whole. Thus the entire information contained in the volume curve is completely described in a Fourier spectrum. Resynthesis after Fourier transformation seems to be an ideal method of smoothing because of its convergence in the minimum quadratic error for the type of function concerned. (orig./MG) [de
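
    The idea of describing the ventricular volume curve by a few Fourier harmonics and resynthesising a smoothed curve can be sketched in a few lines; the synthetic 32-frame curve and the choice of three harmonics are assumptions for illustration.

```python
# Hedged sketch: Fourier smoothing of a ventricular time-activity curve by
# keeping the first few harmonics and resynthesising. The curve is synthetic.
import numpy as np

frames = 32
t = np.arange(frames)
rng = np.random.default_rng(2)
volume = 100 - 35 * np.sin(np.pi * t / frames) ** 2 + rng.normal(0, 2, frames)

spectrum = np.fft.rfft(volume)
n_harmonics = 3                    # keep DC plus the first three harmonics
spectrum[n_harmonics + 1:] = 0
smoothed = np.fft.irfft(spectrum, n=frames)

ejection_fraction = (smoothed.max() - smoothed.min()) / smoothed.max()
print(f"ejection fraction from the resynthesised curve: {ejection_fraction:.2f}")
```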

  13. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards

  14. Quantitative analysis of thorium-containing materials using an Industrial XRF analyzer

    International Nuclear Information System (INIS)

    Hasikova, J.; Titov, V.; Sokolov, A.

    2014-01-01

    Thorium (Th) as a nuclear fuel is clean and safe and offers significant advantages over uranium. The technology for several types of thorium reactors is proven but still must be developed on a commercial scale. If thorium nuclear reactors are commercialized, thorium raw materials will be in demand. Mining and processing companies producing Th and rare earth elements will then require prompt and reliable methods and instrumentation for quantitative on-line analysis of Th. The potential applicability of the CON-X series X-ray fluorescence conveyor analyzer is discussed for quantitative or semi-quantitative on-line measurement of Th in several types of Th-bearing materials. A laboratory study of several minerals (zircon sands and limestone as unconventional Th resources; monazite concentrate as Th-associated resources; and uranium ore residues after extraction as a waste product) was performed, and the analyzer was tested for on-line quantitative measurement of Th contents along with other major and minor components. The Th concentration range in zircon sand is 50-350 ppm; the detection limit at this level is estimated at 25-50 ppm in 5-minute measurements, depending on the type of material. An on-site test of the CON-X analyzer for continuous analysis of thorium traces along with other elements in zircon sand showed that the accuracy of Th measurements is within 20% relative. When the Th content is higher than 1%, as in monazite ore concentrate (5-8% ThO2), the accuracy of Th determination is within 1% relative. Although a preliminary on-site test is recommended in order to assess system feasibility at a large scale, the results show that the CON-X series industrial conveyor XRF analyzer can be used effectively for analytical control of mining and processing streams of Th-bearing materials. (author)

  15. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of the face. The full source code of the developed application is provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Quantitative Analysis of Tetramethylenedisulfotetramine ("Tetramine") Spiked into Beverages by Liquid Chromatography Tandem Mass Spectrometry with Validation by Gas Chromatography Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Owens, J; Hok, S; Alcaraz, A; Koester, C

    2008-11-13

    Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD₅₀ = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water and cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.

  17. Quantitative Analysis of Tetramethylenedisulfotetramine ('Tetramine') Spiked into Beverages by Liquid Chromatography Tandem Mass Spectrometry with Validation by Gas Chromatography Mass Spectrometry

    International Nuclear Information System (INIS)

    Owens, J.; Hok, S.; Alcaraz, A.; Koester, C.

    2008-01-01

    Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD₅₀ = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water and cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.

  18. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  19. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  20. Mapping Protein-Protein Interactions by Quantitative Proteomics

    DEFF Research Database (Denmark)

    Dengjel, Joern; Kratchmarova, Irina; Blagoev, Blagoy

    2010-01-01

    Proteins exert their function inside a cell generally in multiprotein complexes. These complexes are highly dynamic structures changing their composition over time and cell state. The same protein may thereby fulfill different functions depending on its binding partners. Quantitative mass spectrometry (MS)-based proteomics in combination with affinity purification protocols has become the method of choice to map and track the dynamic changes in protein-protein interactions, including the ones occurring during cellular signaling events. Different quantitative MS strategies have been used to characterize protein interaction networks. In this chapter we describe in detail the use of stable isotope labeling by amino acids in cell culture (SILAC) for the quantitative analysis of stimulus-dependent dynamic protein interactions.

  1. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  2. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed at 12-25 kDa mass range with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  3. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, both passing through the origin and fitted by the least squares method.
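
    The quantification step reduces to fitting two straight lines through the origin and taking the ratio of their slopes; a minimal sketch with invented intensity data follows (the zero-intercept least-squares slope is x·y / x·x).

```python
# Hedged sketch: ratio-of-slopes quantification with invented intensities.
import numpy as np

def slope_through_origin(x, y):
    """Least-squares slope of a line forced through the origin."""
    return float(x @ y) / float(x @ x)

# x: internal-standard peak intensity, y: analyte-phase peak intensity
x_analysis = np.array([1.0, 2.0, 3.0, 4.0])
y_analysis = np.array([0.42, 0.81, 1.25, 1.63])     # unknown mixture (analysis line)
x_reference = np.array([1.0, 2.0, 3.0, 4.0])
y_reference = np.array([1.05, 2.02, 2.98, 4.01])    # pure-phase standard (reference line)

weight_fraction = slope_through_origin(x_analysis, y_analysis) / \
                  slope_through_origin(x_reference, y_reference)
print(f"estimated weight fraction of the phase: {weight_fraction:.2f}")
```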

  4. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market for cham-cham, a traditional Indian dairy product, is expected in the near future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancidity and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.
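
    The PCA step, reducing a samples-by-attributes matrix of sensory scores to a few principal components and reporting the variance they explain, can be sketched as below; the score matrix is random placeholder data, not the cham-cham panel data.

```python
# Hedged sketch: PCA of a sensory-score matrix (samples x attributes).
# The data are random placeholders, not the cham-cham panel scores.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
scores = rng.normal(6.0, 1.5, size=(20, 10))   # 20 market samples x 10 QDA attributes

pca = PCA(n_components=4)
factor_scores = pca.fit_transform(scores)       # sample coordinates on the PCs
print("explained variance per PC:", np.round(pca.explained_variance_ratio_, 2))
print("cumulative:", round(float(pca.explained_variance_ratio_.sum()), 2))
```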

  5. Intravenous streptokinase therapy in acute myocardial infarction: Assessment of therapy effects by quantitative 201Tl myocardial imaging (including SPECT) and radionuclide ventriculography

    International Nuclear Information System (INIS)

    Koehn, H.; Bialonczyk, C.; Mostbeck, A.; Frohner, K.; Unger, G.; Steinbach, K.

    1984-01-01

    To evaluate a potential beneficial effect of systemic streptokinase therapy in acute myocardial infarction, 36 patients treated with streptokinase intravenously were assessed by radionuclide ventriculography and quantitative 201 Tl myocardial imaging (including SPECT) in comparison with 18 conventionally treated patients. Patients after thrombolysis had significantly higher EF, PFR, and PER as well as fewer wall motion abnormalities compared with controls. These differences were also observed in the subset of patients with anterior wall infarction (AMI), but not in patients with inferior wall infarction (IMI). Quantitative 201 Tl imaging demonstrated significantly smaller percent myocardial defects and fewer pathological stress segments in patients with thrombolysis compared with controls. The same differences were also found in both AMI and IMI patients. Our data suggest a favorable effect of intravenous streptokinase on recovery of left ventricular function and myocardial salvage. Radionuclide ventriculography and quantitative 201 Tl myocardial imaging seem to be reliable tools for objective assessment of therapy effects. (orig.)

  6. Quantitative analysis of Esophageal Transit of Radionuclide in Patients with Dermatomyositis-Polymyositis

    International Nuclear Information System (INIS)

    Chung, June Key; Lee, Myung Chul; Koh, Chang Soon; Lee, Myung Hae

    1989-01-01

    Esophageal transit of radionuclide was quantitatively analyzed in 29 patients with dermatomyositis-polymyositis. Fourteen patients (48.3%) showed retention of tracer in the oropharynx. The mean percent retention in the oropharynx was 15.5 ± 16.6%. Esophageal dysfunction was found in 19 patients (65.5%); among them, 4 showed mild, 12 moderate and 3 severe esophageal dysfunction. Dysphagia was found in 11 patients (37.9%) and was closely related to percent oropharyngeal retention. Quantitative analysis of esophageal transit of radionuclide appears to be a useful technique for the evaluation of dysphagia in patients with dermatomyositis-polymyositis.

  7. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  8. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules maintained orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  9. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom's Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but the available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite finding limited or no evidence for design principles. We describe the inherent challenges of large-N comparative analysis in coding complex and dynamically changing common pool resource systems for the presence or absence of design principles and in determining "success". Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of the absent design principles explained the inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  10. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromatography of amino acids by a spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. To date, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods that depend on the use of relative spot-area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity: no complicated equipment or procedures are necessary

  11. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  12. Lateral femoral notch depth is not associated with increased rotatory instability in ACL-injured knees: a quantitative pivot shift analysis.

    Science.gov (United States)

    Kanakamedala, Ajay C; Burnham, Jeremy M; Pfeiffer, Thomas R; Herbst, Elmar; Kowalczuk, Marcin; Popchak, Adam; Irrgang, James; Fu, Freddie H; Musahl, Volker

    2018-05-01

    A deep lateral femoral notch (LFN) on lateral radiographs is indicative of ACL injury. Prior studies have suggested that a deep LFN may also be a sign of persistent rotatory instability and a concomitant lateral meniscus tear. Therefore, the purpose of this study was to evaluate the relationship between LFN depth and both quantitative measures of rotatory knee instability and the incidence of lateral meniscus tears. It was hypothesized that greater LFN depth would be correlated with increased rotatory instability, quantified by lateral compartment translation and tibial acceleration during a quantitative pivot shift test, and incidence of lateral meniscus tears. ACL-injured patients enrolled in a prospective ACL registry from 2014 to 2016 were analyzed. To limit confounders, patients were only included if they had primary ACL tears, no concurrent ligamentous or bony injuries requiring operative treatment, and no previous knee injuries or surgeries to either knee. Eighty-four patients were included in the final analysis. A standardized quantitative pivot shift test was performed pre-operatively under anesthesia in both knees, and rotatory instability, specifically lateral compartment translation and tibial acceleration, was quantified using tablet image analysis software and accelerometer sensors. Standard lateral radiographs and sagittal magnetic resonance images (MRI) of the injured knee were evaluated for LFN depth. There were no significant correlations between LFN depth on either imaging modality and ipsilateral lateral compartment translation or tibial acceleration during a quantitative pivot shift test, or side-to-side differences in these measurements. Patients with lateral meniscus tears were found to have significantly greater LFN depths than those without on conventional radiographs and MRI (1.0 vs. 0.6 mm, p < 0.05). Thus, LFN depth was not associated with quantitative measures of rotatory instability. Concomitant lateral meniscus injury was associated with significantly greater LFN depth. Based on

  13. Quantitative iTRAQ secretome analysis of Aspergillus niger reveals novel hydrolytic enzymes.

    Science.gov (United States)

    Adav, Sunil S; Li, An A; Manavalan, Arulmani; Punt, Peter; Sze, Siu Kwan

    2010-08-06

    The natural lifestyle of Aspergillus niger has made it an effective secretor of hydrolytic proteins, a trait that becomes critical when this species is exploited as a host for the commercial secretion of heterologous proteins. The protein secretion profiles of A. niger and its mutant at different pH were explored using an iTRAQ-based quantitative proteomics approach coupled with liquid chromatography-tandem mass spectrometry (LC-MS/MS). This study characterized 102 highly confident unique proteins in the secretome with zero false discovery rate based on a decoy strategy. The iTRAQ technique identified and relatively quantified many hydrolyzing enzymes, such as cellulases, hemicellulases, glycoside hydrolases, proteases and peroxidases, as well as protein-translocating transporter proteins, during fermentation. These enzymes have potential application in lignocellulosic biomass hydrolysis for biofuel production; for example, the cellulolytic and hemicellulolytic enzymes glucan 1,4-alpha-glucosidase, alpha-glucosidase C, endoglucanase, alpha-L-arabinofuranosidase, beta-mannosidase and glycosyl hydrolase, proteases such as tripeptidyl-peptidase and aspergillopepsin, and other enzymes including cytochrome c oxidase and glucose oxidase were highly expressed in the secretomes of A. niger and its mutant. In addition, specific enzyme production can be stimulated by controlling the pH of the culture medium. Our results provide a comprehensive secretory protein profile of A. niger and its regulation at different pH, and demonstrate the potential of iTRAQ-based quantitative proteomics for microbial secretome analysis.

  14. Total quantitative recording of elemental maps and spectra with a scanning microprobe

    International Nuclear Information System (INIS)

    Legge, G.J.F.; Hammond, I.

    1979-01-01

    A system of data recording and analysis has been developed by means of which simultaneously all data from a scanning instrument such as a microprobe can be quantitatively recorded and permanently stored, including spectral outputs from several detectors. Only one scanning operation is required on the specimen. Analysis is then performed on the stored data, which contain quantitative information on distributions of all elements and spectra of all regions

  15. Physical aspects of quantitative particles analysis by X-ray fluorescence and electron microprobe techniques

    International Nuclear Information System (INIS)

    Markowicz, A.

    1986-01-01

    The aim of this work is to present both physical fundamentals and recent advances in quantitative particle analysis by X-ray fluorescence (XRF) and electron microprobe (EPXMA) techniques. A method of correction for the particle-size effect in XRF analysis is described and theoretically evaluated. New atomic number and absorption correction procedures in EPXMA of individual particles are proposed. The applicability of these two correction methods is evaluated for a wide range of elemental composition, X-ray energy and sample thickness. Also, a theoretical model for the composition and thickness dependence of the Bremsstrahlung background generated in multielement bulk specimens, as well as in thin films and particles, is presented and experimentally evaluated. Finally, the limitations and further possible improvements in quantitative particle analysis by XRF and EPXMA are discussed. 109 refs. (author)

  16. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    CERN Document Server

    Vekemans, B; Somogyi, A; Drakopoulos, M; Kempenaers, L; Simionovici, A; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, the major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem.
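
    The spectrum deconvolution step described above is, in essence, a non-linear least-squares fit of characteristic lines. As a rough illustration only (not the AXIL code itself), the following Python sketch fits a single Gaussian photopeak on a linear background to a simulated spectrum and extracts the net line intensity; the peak position, width and background model are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_model(E, area, E0, sigma, b0, b1):
            # Gaussian photopeak on a linear background
            gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((E - E0) / sigma) ** 2)
            return gauss + b0 + b1 * E

        # Simulated spectrum around a hypothetical 6.40 keV line
        E = np.linspace(5.9, 6.9, 200)
        rng = np.random.default_rng(0)
        counts = peak_model(E, 5000.0, 6.40, 0.06, 50.0, -2.0) + rng.normal(0.0, 10.0, E.size)

        p0 = [1000.0, 6.4, 0.05, 10.0, 0.0]            # initial guesses
        popt, _ = curve_fit(peak_model, E, counts, p0=p0)
        print(f"net line intensity: {popt[0]:.0f} counts")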

  17. The Study on the Quantitative Analysis in LPG Tank's Fire and Explosion

    Energy Technology Data Exchange (ETDEWEB)

    Bae, S.J.; Kim, B.J. [Department of chemical Engineering, Soongsil University, Seoul (Korea)

    1999-04-01

    Fires and explosions at chemical plants damage not only the plants themselves but also people in or near the accident spot and the neighborhood of the plant. For that reason, chemical process safety management has become important. One of the safety management methods is 'quantitative analysis', which is used to reduce and prevent accidents. The results of a quantitative analysis can be used to arrange the equipment, evaluate the minimum safety distance, and prepare the safety equipment. In this study we developed a computer program that makes it easy to perform quantitative analysis of such accidents. The output of the computer program is the magnitude of fire (pool fire and fireball) and explosion (UVCE and BLEVE) effects. We used thermal radiation as the measure of fire magnitude and overpressure as the measure of explosion magnitude. In the case of BLEVE, the flight distance of fragments can be evaluated. Probit analysis was also performed in every case. As a case study, the Buchun LPG explosion accident in Korea was analysed with the program developed. The simulation results showed that the permissible distance was 800 m, and probit analysis showed that the 1st degree burn, 2nd degree burn, and death distances were 450, 280, and 260 m, respectively. The simulation results showed good agreement with the results from the SAFER program developed by DuPont. 13 refs., 4 figs., 2 tabs.
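
    Probit analysis of the kind mentioned above maps a physical dose (e.g. a thermal-radiation dose or a peak overpressure) onto a probability of a given harm level through a relation of the form Pr = a + b·ln(V), with the probability obtained from the standard normal distribution as Φ(Pr − 5). A minimal sketch follows; the coefficients and the dose value are purely illustrative and are not taken from the cited program.

        from math import log
        from scipy.stats import norm

        def probit_probability(dose, a, b):
            """Probability of harm from a probit equation Pr = a + b*ln(dose)."""
            pr = a + b * log(dose)
            return norm.cdf(pr - 5.0)      # a probit value of 5 corresponds to 50% probability

        # Illustrative coefficients and dose only (arbitrary units)
        print(f"P(harm) = {probit_probability(dose=2.0e3, a=-14.9, b=2.56):.2f}")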

  18. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a hypothetical severe accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used with the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix based on uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  19. Quantitative Motion Analysis of Tai Chi Chuan: The Upper Extremity Movement

    Directory of Open Access Journals (Sweden)

    Tsung-Jung Ho

    2018-01-01

    Full Text Available The quantitative and reproducible analysis of the standard body movements in Tai Chi Chuan (TCC) was performed in this study. We aimed to provide a reference of the upper extremities for standardizing TCC practice. Microsoft Kinect was used to record the motion during the practice of TCC. The preparation form and eight essential forms of TCC performed by an instructor and 101 practitioners were analyzed in this study. The instructor completed an entire TCC practice cycle and repeated the cycle 12 times. An entire cycle of TCC was performed by the practitioners and images were recorded for statistical analysis. The performance of the instructor showed high similarity (Pearson correlation coefficient r = 0.71~0.84) to the first practice cycle. Among the 9 forms, the lay form had the highest similarity (r_mean = 0.90) and the push form had the lowest similarity (r_mean = 0.52). For the practitioners, the ward off form (r_mean = 0.51) and the roll back form (r_mean = 0.45) had the highest similarity, with moderate correlation. We used Microsoft Kinect to record the spatial coordinates of the upper extremity joints during the practice of TCC and used the data to perform quantitative and qualitative analysis of the joint positions and elbow joint angle.
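
    The similarity values quoted above are Pearson correlation coefficients between time-aligned joint trajectories recorded by the depth sensor. As a minimal sketch (assuming two equal-length coordinate series for the same joint, which is an assumption of this example rather than a detail given in the record), the coefficient can be computed as follows.

        import numpy as np

        def trajectory_similarity(reference, trial):
            """Pearson r between two time-aligned 1-D joint-coordinate series."""
            reference = np.asarray(reference, dtype=float)
            trial = np.asarray(trial, dtype=float)
            return np.corrcoef(reference, trial)[0, 1]

        # Hypothetical elbow-angle series (degrees) for instructor and practitioner
        instructor = [92, 95, 101, 110, 118, 121, 119, 112]
        practitioner = [90, 96, 100, 108, 115, 120, 121, 115]
        print(f"r = {trajectory_similarity(instructor, practitioner):.2f}")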

  20. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address the basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  1. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions were carefully investigated for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
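
    In the AHP step referred to above, priority weights for the framework criteria are derived from a pairwise comparison matrix, typically as its principal eigenvector, and the consistency of the judgements is checked through a consistency index. A minimal sketch under these standard definitions is given below; the comparison values and criteria names are invented for illustration and are not taken from the paper.

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights = principal eigenvector of a pairwise comparison matrix."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
            return w, ci

        # Hypothetical comparison of three criteria: usability, integration, extensibility
        A = [[1.0, 3.0, 5.0],
             [1/3, 1.0, 2.0],
             [1/5, 1/2, 1.0]]
        weights, ci = ahp_weights(A)
        print("weights:", np.round(weights, 3), "CI:", round(ci, 3))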

  2. A quantitative impact analysis of sensor failures on human operator's decision making in nuclear power plants

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    2004-01-01

    In emergency or accident situations in nuclear power plants, human operators play important roles in generating appropriate control signals to mitigate the accident situation. In human reliability analysis (HRA) in the framework of probabilistic safety assessment (PSA), the failure probabilities of such appropriate actions are estimated and used for the safety analysis of nuclear power plants. Even though understanding the status of the plant is basically a process of information seeking and processing by human operators, conventional HRA methods such as THERP, HCR, and ASEP do not pay much attention to the possibility of wrong information being provided to human operators. In this paper, a quantitative impact analysis of providing wrong information to human operators due to instrument faults or sensor failures is performed. The quantitative impact analysis is based on a quantitative situation assessment model. By comparing the situation in which sensor failures are present with the situation in which they are absent, the impact of sensor failures can be evaluated quantitatively. It is concluded that the impact of sensor failures is quite significant at the initial stages, but that the impact is gradually reduced as human operators make more and more observations. Even though the impact analysis is highly dependent on the situation assessment model, it is expected that conclusions based on other situation assessment models will be consistent with the conclusion made in this paper. (author)
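
    The conclusion above, that the impact of a failed sensor is largest at first and fades as observations accumulate, can be illustrated with a simple Bayesian situation-assessment sketch. This is not the model used in the paper; it merely assumes two plant states, sensors that indicate the true state with a fixed probability, and a sequence of readings in which only the first comes from the failed sensor.

        def update(prior, reading, p_correct):
            """Bayesian update of P(accident) from one binary sensor reading."""
            like_acc = p_correct if reading == 1 else 1.0 - p_correct
            like_nor = 1.0 - p_correct if reading == 1 else p_correct
            return like_acc * prior / (like_acc * prior + like_nor * (1.0 - prior))

        # True state: accident. The failed sensor reads 0; the healthy sensors read 1.
        readings = [0] + [1] * 9
        p = 0.5                      # non-informative prior on 'accident'
        for r in readings:
            p = update(p, r, p_correct=0.9)
            print(f"P(accident) = {p:.3f}")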

  3. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  4. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
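
    For reference, the normal modes measured in the experiments above follow from the standard two-mass, three-spring model (masses m, outer springs k, coupling spring k_c); the symbols here are the usual textbook ones rather than notation taken from the paper. In LaTeX form:

        \omega_s = \sqrt{\frac{k}{m}}, \qquad \omega_a = \sqrt{\frac{k + 2 k_c}{m}},

        x_1(t) = A_s \cos(\omega_s t) + A_a \cos(\omega_a t), \qquad x_2(t) = A_s \cos(\omega_s t) - A_a \cos(\omega_a t),

    so that a general coupled oscillation, such as the one recorded in the third experiment, is a superposition of the symmetric and antisymmetric modes.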

  5. Comparison of Variable Number Tandem Repeat and Short Tandem Repeat Genetic Markers for Qualitative and Quantitative Chimerism Analysis Post Allogeneic Stem Cell Transplantation

    International Nuclear Information System (INIS)

    Mossallam, G.I.; Smith, A.G.; Mcfarland, C.

    2005-01-01

    Analysis of donor chimerism has become a routine procedure for the documentation of engraftment after allogeneic hematopoietic stem cell transplantation. Quantitative analysis of chimerism kinetics has been shown to predict graft failure or relapse. In this study, we compared the use of variable number tandem repeats (VNTR) and short tandem repeats (STR) as polymorphic genetic markers in chimerism analysis. This study included qualitative and quantitative assessment of both techniques to assess informative yield and sensitivity. Patients and Methods: We analyzed 206 samples representing 40 transplant recipients and their HLA identical sibling donors. A panel of six VNTR loci, 15 STR loci and 1 sex chromosome locus was used. Amplified VNTR products were visualized in an ethidium bromide stained gel. STR loci were amplified using fluorescent primers, and the products were analyzed by capillary electrophoresis. VNTR and STR analysis gave comparable qualitative results in the majority of cases. The incidence of mixed chimerism (Me) by STR analysis was 45% compared to 32% in cases evaluated by VNTR analysis. STR markers were more informative; several informative loci could be identified in all patients. Unique alleles for both patient and donor could be identified in all patients by STR versus 32/40 by VNTR analysis. The STR markers were also more sensitive in the detection of chimerism. The size of VNTR alleles and differences between the size of donor and recipient VNTR alleles affected the sensitivity of detection. With both techniques, quantitative assessment of chimerism showed some discrepancies between the estimated and the calculated percentage of donor DNA. Discordance between the two estimates was observed in 8/19 patients with Me. However, sequential monitoring of the relative band intensity of VNTR alleles offered some insight into the direction of change in engraftment over time. The higher yield of informative loci with STR and the automated measurement of
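
    Quantitative chimerism estimates of the kind compared above are commonly derived from the signal of informative (donor- or recipient-unique) alleles; a standard formula, written here in generic LaTeX notation rather than the authors' own, is

        \%\,\mathrm{donor\ chimerism} = \frac{A_{\mathrm{donor}}}{A_{\mathrm{donor}} + A_{\mathrm{recipient}}} \times 100,

    where A denotes the peak area (STR, capillary electrophoresis) or band intensity (VNTR, gel) of an informative allele, ideally averaged over several informative loci.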

  6. Quantitative Analysis of Human Pluripotency and Neural Specification by In-Depth (Phospho)Proteomic Profiling

    Directory of Open Access Journals (Sweden)

    Ilyas Singec

    2016-09-01

    Full Text Available Controlled differentiation of human embryonic stem cells (hESCs) can be utilized for precise analysis of cell type identities during early development. We established a highly efficient neural induction strategy and an improved analytical platform, and determined proteomic and phosphoproteomic profiles of hESCs and their specified multipotent neural stem cell derivatives (hNSCs). This quantitative dataset (nearly 13,000 proteins and 60,000 phosphorylation sites) provides unique molecular insights into pluripotency and neural lineage entry. Systems-level comparative analysis of proteins (e.g., transcription factors, epigenetic regulators, kinase families), phosphorylation sites, and numerous biological pathways allowed the identification of distinct signatures in pluripotent and multipotent cells. Furthermore, as predicted by the dataset, we functionally validated an autocrine/paracrine mechanism by demonstrating that the secreted protein midkine is a regulator of neural specification. This resource is freely available to the scientific community, including a searchable website, PluriProt.

  7. The Quantitative Analysis of a team game performance made by men basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of a team game performance made by men's basketball teams at the Olympic Games 2008. Aims: To find the reasons for the successes and failures of teams in the Olympic Games play-off using quantitative (numerical) observation of selected game statistics. Method: The thesis was made on the basis of a quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained described the most essential events for a team's win or loss. Keywords: basketball, team...

  8. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-sectional OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. However, the role of the numerous exogenous factors considered seems quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it

  9. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    International Nuclear Information System (INIS)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi; Somerville, Robert A.; Kitamoto, Tetsuyuki; Mohri, Shirou

    2013-01-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products

  10. Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao; Murayama, Yuichi; Yokoyama, Takashi [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan); Somerville, Robert A. [The Roslin Institute and Royal (Dick) School of Veterinary Studies, Roslin, Midlothian, EH25 9PS (United Kingdom); Kitamoto, Tetsuyuki [Division of CJD Science and Technology, Department of Prion Research, Center for Translational and Advanced Animal Research on Human Diseases, Tohoku University Graduate School of Medicine, 2-1 Seiryo, Aoba, Sendai 980-8575 (Japan); Mohri, Shirou, E-mail: shirou@affrc.go.jp [Prion Disease Research Center, National Institute of Animal Health, 3-1-5 Kannondai, Tsukuba, Ibaraki 305-0856 (Japan)

    2013-03-01

    Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. - Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products.

  11. Quantitative analysis of iodine in thyroidin. I. Methods of ''dry'' and ''wet'' mineralization

    International Nuclear Information System (INIS)

    Listov, S.A.; Arzamastsev, A.P.

    1986-01-01

    Comparative investigations of the quantitative determination of iodine in thyroidin using different modifications of ''dry'' and ''wet'' mineralization show that, when these methods are used, the difficulties due to the characteristic features of the object of investigation itself and of the mineralization method as a whole must be taken into account. The studies show that the most suitable method for the analysis of thyroidin is ''dry'' mineralization with potassium carbonate. A procedure is proposed for the quantitative determination of iodine in thyroidin

  12. Quantitative evaluation of fluctuation error in X-ray diffraction profiles with fractal analysis

    International Nuclear Information System (INIS)

    Kurose, Masashi; Hirose, Yukio; Sasaki, Toshihiko; Yoshioka, Yasuo.

    1995-01-01

    A method of fractal analysis was applied to diffraction profiles for their quantitative evaluation. The fractal dimension was analyzed using both the box-counting method and the FFT method. The relationship between the fractal dimension and the measurement criteria in X-ray diffraction analysis was discussed using diffraction data obtained under various measurement conditions. It was concluded that fractal analysis is effective for the quantitative evaluation of diffraction data. The box-counting method is suitable for the evaluation of a whole profile, and the FFT method for that of a fundamental profile. The desirable range of measurement conditions is 1.0≤D≤1.2, where D is the fractal dimension. The appropriate measurement range becomes 0.01≤Sw/HVB≤0.03, where Sw is the step width and HVB is the half-value breadth. Stresses with higher precision were obtained from measurements under these new criteria. (author)
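
    A minimal box-counting sketch for a one-dimensional diffraction profile is shown below: the profile is rasterised onto a square binary grid, the number of occupied boxes N(s) is counted for several box sizes s, and the fractal dimension D is estimated from the slope of log N(s) versus log(1/s). The grid size, box sizes and the synthetic test profile are illustrative choices, not those of the cited study.

        import numpy as np

        def box_counting_dimension(y, grid=512, sizes=(2, 4, 8, 16, 32, 64)):
            """Estimate the fractal dimension of a 1-D profile by box counting."""
            y = np.asarray(y, dtype=float)
            x = np.linspace(0.0, 1.0, y.size)
            span = y.max() - y.min()
            yn = (y - y.min()) / (span if span > 0 else 1.0)
            # rasterise the curve onto a grid x grid binary image
            img = np.zeros((grid, grid), dtype=bool)
            rows = np.minimum((yn * (grid - 1)).astype(int), grid - 1)
            cols = np.minimum((x * (grid - 1)).astype(int), grid - 1)
            img[rows, cols] = True
            counts = [img.reshape(grid // s, s, grid // s, s).any(axis=(1, 3)).sum() for s in sizes]
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
            return slope

        x = np.linspace(-3, 3, 2000)
        profile = np.exp(-x**2) + 0.02 * np.random.default_rng(1).normal(size=x.size)
        print(f"D = {box_counting_dimension(profile):.2f}")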

  13. Severity of pulmonary emphysema and lung cancer: analysis using quantitative lobar emphysema scoring.

    Science.gov (United States)

    Bae, Kyungsoo; Jeon, Kyung Nyeo; Lee, Seung Jun; Kim, Ho Cheol; Ha, Ji Young; Park, Sung Eun; Baek, Hye Jin; Choi, Bo Hwa; Cho, Soo Buem; Moon, Jin Il

    2016-11-01

    The aim of this study was to determine the relationship between the lobar severity of emphysema and lung cancer using automated lobe segmentation and emphysema quantification methods. This study included 78 patients (74 males and 4 females; mean age of 72 years) with the following conditions: pathologically proven lung cancer, and chest computed tomographic (CT) scans available for lobe segmentation and quantitative scoring of emphysema. The relationship between emphysema and lung cancer was analyzed using quantitative emphysema scoring of each pulmonary lobe. The most common location of cancer was the left upper lobe (LUL) (n = 28), followed by the right upper lobe (RUL) (n = 27), left lower lobe (LLL) (n = 13), right lower lobe (RLL) (n = 9), and right middle lobe (RML) (n = 1). The emphysema ratio was highest in the LUL, followed by the RUL, LLL, RML, and RLL. Multivariate logistic regression analysis revealed that upper lobes (odds ratio: 1.77; 95% confidence interval: 1.01-3.11, P = 0.048) and lobes with an emphysema ratio ranked 1st or 2nd (odds ratio: 2.48; 95% confidence interval: 1.48-4.15) were significantly associated with lung cancer. In emphysema patients, lung cancer therefore has a tendency to develop in lobes with more severe emphysema.

  14. Quantitative analysis of pulmonary perfusion using time-resolved parallel 3D MRI - initial results

    International Nuclear Information System (INIS)

    Fink, C.; Buhmann, R.; Plathow, C.; Puderbach, M.; Kauczor, H.U.; Risse, F.; Ley, S.; Meyer, F.J.

    2004-01-01

    Purpose: to assess the use of time-resolved parallel 3D MRI for a quantitative analysis of pulmonary perfusion in patients with cardiopulmonary disease. Materials and methods: eight patients with pulmonary embolism or pulmonary hypertension were examined with a time-resolved 3D gradient echo pulse sequence with parallel imaging techniques (FLASH 3D, TE/TR: 0.8/1.9 ms; flip angle: 40°; GRAPPA). A quantitative perfusion analysis based on indicator dilution theory was performed using dedicated software. Results: patients with pulmonary embolism or chronic thromboembolic pulmonary hypertension revealed characteristic wedge-shaped perfusion defects at perfusion MRI. They were characterized by a decreased pulmonary blood flow (PBF) and pulmonary blood volume (PBV) and an increased mean transit time (MTT). Patients with primary pulmonary hypertension or Eisenmenger syndrome showed a more homogeneous perfusion pattern. The mean MTT of all patients was 3.3 - 4.7 s. The mean PBF and PBV showed a broader interindividual variation (PBF: 104-322 ml/100 ml/min; PBV: 8 - 21 ml/100 ml). Conclusion: time-resolved parallel 3D MRI allows at least a semi-quantitative assessment of lung perfusion. Future studies will have to assess the clinical value of this quantitative information for the diagnosis and management of cardiopulmonary disease. (orig.) [de
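
    The perfusion parameters quoted above are linked by the central volume principle of indicator dilution theory. In the usual notation (not specific to this paper), with C_a(t) the arterial input function and C(t) the tissue concentration-time curve, the relations in LaTeX form are

        C(t) = \mathrm{PBF}\,(C_a \otimes R)(t), \qquad \mathrm{PBV} = \frac{\int C(t)\,dt}{\int C_a(t)\,dt}, \qquad \mathrm{MTT} = \frac{\mathrm{PBV}}{\mathrm{PBF}},

    where R(t) is the tissue residue function obtained by deconvolution of the measured curves.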

  15. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  16. Quantitative analysis by microchip capillary electrophoresis – current limitations and problem-solving strategies

    NARCIS (Netherlands)

    Revermann, T.; Götz, S.; Künnemeyer, Jens; Karst, U.

    2008-01-01

    Obstacles and possible solutions for the application of microchip capillary electrophoresis in quantitative analysis are described and critically discussed. Differences between the phenomena occurring during conventional capillary electrophoresis and microchip-based capillary electrophoresis are

  17. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available Even though the term collocation is rather common in English grammar, it is not a well-known or commonly used term in textbooks and scientific papers written in the Serbian language. Collocating is usually defined as the natural co-occurrence of two (or more) words, which usually appear next to one another even though they can be separated in the text, while collocations are defined as words with natural semantic and/or syntactic relations joined together in a sentence. Collocations are naturally used in all English written texts, including scientific texts and papers. Using two textbooks for English for Specific Purposes (ESP) for intermediate students' courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between lexical and grammatical collocations in the ESP texts and the reasons for their presence. There is an overview of the most used subtypes of lexical collocations as well. Furthermore, applying basic corpus analysis based on quantitative methods, the paper presents the number of open, restricted and bound collocations in the ESP texts, trying to draw conclusions on their frequency and hence on the modes for their learning. There is also a section related to the number and usage of scientific collocations, both common scientific and narrow-professional ones. The conclusion is that the number of collocations present in the selected two textbooks calls for further analysis of these lexical connections, as well as for new modes of teaching and presenting them to students learning English.

  18. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  19. Quantitative analysis of Moessbauer backscatter spectra from multilayer films

    International Nuclear Information System (INIS)

    Bainbridge, J.

    1975-01-01

    The quantitative interpretation of Moessbauer backscatter spectra with particular reference to internal conversion electrons has been treated assuming that electron attenuation in a surface film can be satisfactorily described by a simple exponential law. The theory of Krakowski and Miller has been extended to include multi-layer samples, and a relation between the Moessbauer spectrum area and an individual layer thickness derived. As an example, numerical results are obtained for a duplex oxide film grown on pure iron. (Auth.)

  20. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and in the development of optimal microstructure design and fabrication. Three-dimensional microstructure characterization in particular holds great promise for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment...

  1. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
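
    As a minimal sketch of the two factor-analysis calibration methods compared above, the snippet below builds a PLS and a PCR model from a spectral matrix X (samples × channels) and a concentration vector y. The data are simulated stand-ins, not the glass-film or blood spectra from the study, and the number of latent variables is an arbitrary illustrative choice.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 200))                                    # 40 samples x 200 spectral channels
        y = X @ rng.normal(size=200) + rng.normal(scale=0.1, size=40)     # simulated concentrations

        pls = PLSRegression(n_components=5).fit(X, y)                                  # partial least squares
        pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, y)         # principal component regression

        print("PLS R^2:", round(pls.score(X, y), 3))
        print("PCR R^2:", round(pcr.score(X, y), 3))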

  2. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  3. Functional characterization and quantitative expression analysis of two GnRH-related peptide receptors in the mosquito, Aedes aegypti.

    Science.gov (United States)

    Oryan, Alireza; Wahedi, Azizia; Paluzzi, Jean-Paul V

    2018-03-04

    To cope with stressful events such as flight, organisms have evolved various regulatory mechanisms, often involving control by endocrine-derived factors. In insects, two stress-related factors include the gonadotropin-releasing hormone-related peptides adipokinetic hormone (AKH) and corazonin (CRZ). AKH is a pleiotropic hormone best known as a substrate liberator of proteins, lipids, and carbohydrates. Although a universal function has not yet been elucidated, CRZ has been shown to have roles in pigmentation and ecdysis or to act as a cardiostimulatory factor. While both these neuropeptides and their respective receptors (AKHR and CRZR) have been characterized in several organisms, details on their specific roles within the disease vector, Aedes aegypti, remain largely unexplored. Here, we obtained three A. aegypti AKHR transcript variants and further identified the A. aegypti CRZR. Receptor expression using a heterologous functional assay revealed that these receptors exhibit a highly specific response for their native ligands. Developmental quantitative expression analysis of CRZR revealed enrichment during the pupal and adult stages. In adults, quantitative spatial expression analysis revealed CRZR transcript in a variety of organs including head, thoracic ganglia, primary reproductive organs (ovary and testis), as well as male carcass. This suggests that CRZ may play a role in ecdysis, and neuronal expression of CRZR indicates a possible role for CRZ within the nervous system. Quantitative developmental expression analysis of AKHR identified significant transcript enrichment in early adult stages. AKHR transcript was observed in the head, thoracic ganglia, accessory reproductive tissues and the carcass of adult females, while it was detected in the abdominal ganglia and enriched significantly in the carcass of adult males, which supports the known function of AKH in energy metabolism. Collectively, given the enrichment of CRZR and AKHR in the primary and

  4. A Content Analysis of Quantitative Research in Journal of Marital and Family Therapy: A 10-Year Review.

    Science.gov (United States)

    Parker, Elizabeth O; Chang, Jennifer; Thomas, Volker

    2016-01-01

    We examined the trends of quantitative research over the past 10 years in the Journal of Marital and Family Therapy (JMFT). Specifically, within the JMFT, we investigated the types and trends of research design and statistical analysis within the quantitative research that was published in JMFT from 2005 to 2014. We found that while the number of peer-reviewed articles has increased over time, the percentage of quantitative research has remained constant. We discussed the types and trends of statistical analysis and the implications for clinical work and training programs in the field of marriage and family therapy. © 2016 American Association for Marriage and Family Therapy.

  5. Quantitative analysis by laser-induced breakdown spectroscopy based on generalized curves of growth

    Energy Technology Data Exchange (ETDEWEB)

    Aragón, C., E-mail: carlos.aragon@unavarra.es; Aguilera, J.A.

    2015-08-01

    A method for quantitative elemental analysis by laser-induced breakdown spectroscopy (LIBS) is proposed. The method (Cσ-LIBS) is based on Cσ graphs, generalized curves of growth which allow including several lines of various elements at different concentrations. A so-called homogeneous double (HD) model of the laser-induced plasma is used, defined by an integration over a single region of the radiative transfer equation, combined with a separate treatment for neutral atoms (z = 0) and singly-charged ions (z = 1) in Cσ graphs and characteristic parameters. The procedure includes a criterion, based on a model limit, for eliminating data which, due to a high line intensity or concentration, are not well described by the HD model. An initial procedure provides a set of parameters (βA)^z, (ηNl)^z, T^z and N_e^z (z = 0, 1) which characterize the plasma and the LIBS system. After characterization, two different analytical procedures, resulting in relative and absolute concentrations, may be applied. To test the method, fused glass samples prepared from certified slags and pure compounds are analyzed. We determine concentrations of Ca, Mn, Mg, V, Ti, Si and Al relative to Fe in three samples prepared from slags, and absolute concentrations of Fe, Ca and Mn in three samples prepared from Fe2O3, CaCO3 and Mn2O3. The accuracy obtained is 3.2% on average for relative concentrations and 9.2% for absolute concentrations. - Highlights: • Method for quantitative analysis by LIBS, based on Cσ graphs • Conventional calibration is replaced with characterization of the LIBS system. • All elements are determined from measurement of one or two Cσ graphs. • The method is tested with fused glass disks prepared from slags and pure compounds. • Accurate results for relative (3.2%) and absolute concentrations (9.2%)

  6. Simple preparation of plant epidermal tissue for laser microdissection and downstream quantitative proteome and carbohydrate analysis

    Directory of Open Access Journals (Sweden)

    Christian eFalter

    2015-03-01

    Full Text Available The outwardly directed cell wall and associated plasma membrane of epidermal cells represent the first layers of plant defense against intruding pathogens. Cell wall modifications and the formation of defense structures at sites of attempted pathogen penetration are decisive for plant defense. A precise isolation of these stress-induced structures would allow a specific analysis of regulatory mechanisms and cell wall adaptation. However, methods for large-scale epidermal tissue preparation from the model plant Arabidopsis thaliana, which would allow proteome and cell wall analysis of complete, laser-microdissected epidermal defense structures, have not been provided. We developed the adhesive tape – liquid cover glass technique for simple leaf epidermis preparation from A. thaliana, which is also applicable on grass leaves. This method is compatible with subsequent staining techniques to visualize stress-related cell wall structures, which were precisely isolated from the epidermal tissue layer by laser microdissection coupled to laser pressure catapulting. We successfully demonstrated that these specific epidermal tissue samples could be used for quantitative downstream proteome and cell wall analysis. The development of the adhesive tape – liquid cover glass technique for simple leaf epidermis preparation and its compatibility with laser microdissection and downstream quantitative analysis open new possibilities in the precise examination of stress- and pathogen-related cell wall structures in epidermal cells. Because the developed tissue processing is also applicable to grass leaves, well-established model pathosystems that include the interaction with powdery mildews can be studied to determine principal regulatory mechanisms in plant-microbe interactions, with potential outreach into crop breeding.

  7. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    Science.gov (United States)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA driven quantitative terahertz spectroscopic analysis in this paper. The devised simulation results show agreement with the previous experimental results, which suggested that the proposed technique has potential applications for terahertz spectral identifications of drug mixture components. The results show agreement with the results obtained using other experimental and numerical techniques.

  8. Quantitative analysis with energy dispersive X-ray fluorescence analyser

    International Nuclear Information System (INIS)

    Kataria, S.K.; Kapoor, S.S.; Lal, M.; Rao, B.V.N.

    1977-01-01

    Quantitative analysis of samples using radioisotope excited energy dispersive x-ray fluorescence system is described. The complete set-up is built around a locally made Si(Li) detector x-ray spectrometer with an energy resolution of 220 eV at 5.94 KeV. The photopeaks observed in the x-ray fluorescence spectra are fitted with a Gaussian function and the intensities of the characteristic x-ray lines are extracted, which in turn are used for calculating the elemental concentrations. The results for a few typical cases are presented. (author)

  9. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    International Nuclear Information System (INIS)

    Vekemans, B.; Vincze, L.; Somogyi, A.; Drakopoulos, M.; Kempenaers, L.; Simionovici, A.; Adams, F.

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem

  10. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Vekemans, B. E-mail: vekemans@uia.ua.ac.be; Vincze, L.; Somogyi, A.; Drakopoulos, M.; Kempenaers, L.; Simionovici, A.; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem.

  11. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  12. Sampling of illicit drugs for quantitative analysis--part II. Study of particle size and its influence on mass reduction.

    Science.gov (United States)

    Bovens, M; Csesztregi, T; Franc, A; Nagy, J; Dujourdy, L

    2014-01-01

    The basic goal in sampling for the quantitative analysis of illicit drugs is to maintain the average concentration of the drug in the material from its original seized state (the primary sample) all the way through to the analytical sample, where the effect of particle size is most critical. The size of the largest particles of different authentic illicit drug materials, in their original state and after homogenisation, using manual or mechanical procedures, was measured using a microscope with a camera attachment. The comminution methods employed included pestle and mortar (manual) and various ball and knife mills (mechanical). The drugs investigated were amphetamine, heroin, cocaine and herbal cannabis. It was shown that comminution of illicit drug materials using these techniques reduces the nominal particle size from approximately 600 μm down to between 200 and 300 μm. It was demonstrated that the choice of 1 g increments for the primary samples of powdered drugs and cannabis resin, which were used in the heterogeneity part of our study (Part I), was correct for the routine quantitative analysis of illicit seized drugs. For herbal cannabis we found that the appropriate increment size was larger. Based on the results of this study we can generally state that: An analytical sample weight of between 20 and 35 mg of an illicit powdered drug, with an assumed purity of 5% or higher, would be considered appropriate and would generate an RSD_sampling in the same region as the RSD_analysis for a typical quantitative method of analysis for the most common, powdered, illicit drugs. For herbal cannabis, with an assumed purity of 1% THC (tetrahydrocannabinol) or higher, an analytical sample weight of approximately 200 mg would be appropriate. In Part III we will pull together our homogeneity studies and particle size investigations and use them to devise sampling plans and sample preparations suitable for the quantitative instrumental analysis of the most common illicit

  13. Quantitative analysis of Chiari-like malformation and syringomyelia in the Griffon Bruxellois dog.

    Directory of Open Access Journals (Sweden)

    Susan P Knowler

    Full Text Available This study aimed to develop a system of quantitative analysis of canine Chiari-like malformation and syringomyelia on variable quality MRI. We made a series of measurements from magnetic resonance DICOM images from Griffon Bruxellois dogs with and without Chiari-like malformation and syringomyelia and identified several significant variables. We found that in the Griffon Bruxellois dog, Chiari-like malformation is characterized by an apparent shortening of the entire cranial base and possibly by increased proximity of the atlas to the occiput. As a compensatory change, there appears to be an increased height of the rostral cranial cavity with lengthening of the dorsal cranial vault and considerable reorganization of the brain parenchyma including ventral deviation of the olfactory bulbs and rostral invagination of the cerebellum under the occipital lobes.

  14. A meta-analysis including dose-response relationship between night shift work and the risk of colorectal cancer.

    Science.gov (United States)

    Wang, Xiao; Ji, Alin; Zhu, Yi; Liang, Zhen; Wu, Jian; Li, Shiqi; Meng, Shuai; Zheng, Xiangyi; Xie, Liping

    2015-09-22

    A meta-analysis was conducted to quantitatively evaluate the correlation between night shift work and the risk of colorectal cancer. We searched for publications up to March 2015 using PubMed, Web of Science, Cochrane Library, EMBASE and the Chinese National Knowledge Infrastructure databases, and the references of the retrieved articles and relevant reviews were also checked. ORs and 95% CIs were used to assess the degree of correlation between night shift work and risk of colorectal cancer via fixed- or random-effect models. A dose-response meta-analysis was performed as well. The pooled OR estimates of the included studies illustrated that night shift work was correlated with an increased risk of colorectal cancer (OR = 1.318, 95% CI 1.121-1.551). No evidence of publication bias was detected. In the dose-response analysis, the rate of colorectal cancer increased by 11% for every 5-year increase in night shift work (OR = 1.11, 95% CI 1.03-1.20). In conclusion, this meta-analysis indicated that night shift work was associated with an increased risk of colorectal cancer. Further research should be conducted to confirm our findings and clarify the potential biological mechanisms.
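
    A pooled odds ratio of the kind reported above is typically obtained by inverse-variance weighting of the study-level log odds ratios; the fixed-effect form is sketched below. The study values in the example are invented for illustration and are not the studies included in this meta-analysis.

        import numpy as np

        def pooled_or_fixed(ors, ci_lows, ci_highs):
            """Fixed-effect inverse-variance pooling of odds ratios given their 95% CIs."""
            log_or = np.log(ors)
            se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)   # SE recovered from the CI width
            w = 1.0 / se**2
            pooled = np.sum(w * log_or) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
            return np.exp(pooled), ci

        # Hypothetical study-level results
        print(pooled_or_fixed(ors=[1.2, 1.5, 1.1], ci_lows=[0.9, 1.1, 0.8], ci_highs=[1.6, 2.0, 1.5]))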

  15. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and general public safety can be affected by the operation of a gas pipeline. If the individual or societal risks are considered intolerable compared with international standards, mitigation measures are recommended to bring the risk associated with the operation down to levels compatible with the best practices in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence and requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This calculation methodology is centered on defining the frequencies of occurrence of events according to databases representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possibilities of interference with the gas pipeline under study. For each interference, a typical curve of ignition probability as a function of distance to the pipe is developed. (author)

  16. Pilot study of quantitative analysis of background enhancement on breast MR images: association with menstrual cycle and mammographic breast density.

    Science.gov (United States)

    Scaranelo, Anabel M; Carrillo, Maria Claudia; Fleming, Rachel; Jacks, Lindsay M; Kulkarni, Supriya R; Crystal, Pavel

    2013-06-01

    To perform semiautomated quantitative analysis of the background enhancement (BE) in a cohort of patients with newly diagnosed breast cancer and to correlate it with mammographic breast density and menstrual cycle. Informed consent was waived after the research ethics board approved this study. Results of 177 consecutive preoperative breast magnetic resonance (MR) examinations performed from February to December 2009 were reviewed; 147 female patients (median age, 48 years; range, 26-86 years) were included. Ordinal values of BE and breast density were described by two independent readers using the Breast Imaging Reporting and Data System lexicon. The BE coefficient (BEC) was calculated as (SI2 · 100/SI1) - 100, where SI is signal intensity, SI2 is the SI measured in the largest anteroposterior dimension in the axial plane 1 minute after contrast agent injection, and SI1 is the SI before contrast agent injection. The BEC was used for the quantitative analysis of BE. Menstrual cycle status was based on the last menstrual period. The Wilcoxon rank-sum or Kruskal-Wallis test was used to compare quantitative assessment groups. Cohen weighted κ was used to evaluate agreement. Of 147 patients, 68 (46%) were premenopausal and 79 (54%) were postmenopausal. The quantitative BEC was associated with menstrual status (BEC in premenopausal women, 31.48 ± 20.68 [standard deviation]; BEC in postmenopausal women, 25.65 ± 16.74; P = .02). The percentage of overall BE was higher when the MR imaging was performed in women in the inadequate phase of the cycle, and premenopausal women showed higher quantitative BE than postmenopausal women. No association was found between BE and breast density.
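
    The BEC defined above is simply the percentage signal increase between the pre- and post-contrast acquisitions. A minimal sketch of the stated formula, with hypothetical signal intensities:

        def background_enhancement_coefficient(si_pre, si_post):
            """BEC = (SI2 * 100 / SI1) - 100, i.e. the percent signal increase after contrast."""
            return si_post * 100.0 / si_pre - 100.0

        # Hypothetical signal intensities (arbitrary units): SI1 = 250, SI2 = 330
        print(background_enhancement_coefficient(si_pre=250.0, si_post=330.0))   # -> 32.0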

  17. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    Science.gov (United States)

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant effects for the data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
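
    As a rough sketch of the nonoverlap idea underlying Tau-U (omitting the baseline-trend correction that distinguishes full Tau-U, and using invented scores rather than the study's writing measures):

```python
def tau_nonoverlap(baseline, intervention):
    """Basic nonoverlap component of Tau: compare every baseline point with every
    intervention point and return (improvements - deteriorations) / total pairs.
    This sketch deliberately omits the baseline-trend correction of Tau-U proper."""
    pos = neg = 0
    for a in baseline:
        for b in intervention:
            if b > a:
                pos += 1
            elif b < a:
                neg += 1
    n_pairs = len(baseline) * len(intervention)
    return (pos - neg) / n_pairs

# Hypothetical writing scores: 5 baseline probes followed by 6 intervention probes.
print(tau_nonoverlap([2, 3, 2, 4, 3], [4, 5, 5, 6, 7, 6]))
```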

  18. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, or for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging since the analysis provides three-dimensional imaging of the microstructures examined.

  19. The Digital Image Processing And Quantitative Analysis In Microscopic Image Characterization

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2000-01-01

    Although many electron microscopes produce digital images, not all of them are equipped with a supporting unit to process and analyse image data quantitatively. Generally, the analysis of an image has to be made visually and the measurements are performed manually. The development of mathematical methods for geometric analysis and pattern recognition allows automatic microscopic image analysis with a computer. Image processing programs can be used for the analysis of image texture and periodic structure by application of the Fourier transform. With the development of composite materials, Fourier analysis in the frequency domain has become important for measuring crystallographic orientation. Periodic structure analysis and crystal orientation are the key to understanding many material properties such as mechanical strength, stress, heat conductivity, resistance, capacitance and other electric and magnetic properties of materials. This paper presents the application of digital image processing to microscopic image characterization and analysis.
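
    As an illustration of the kind of Fourier-based periodicity measurement described above (not the author's code), a minimal NumPy sketch locates the dominant spatial frequency of a synthetic micrograph.

```python
import numpy as np

def dominant_periodicity(image):
    """Locate the strongest non-DC peak in the 2D power spectrum of a micrograph.
    Returns the spatial frequency vector (cycles per pixel) of that peak, from which
    a period and an orientation of the repeating structure can be derived."""
    f = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(f) ** 2
    cy, cx = np.array(power.shape) // 2
    power[cy, cx] = 0.0                      # suppress the DC component
    iy, ix = np.unravel_index(np.argmax(power), power.shape)
    fy = (iy - cy) / image.shape[0]
    fx = (ix - cx) / image.shape[1]
    return fy, fx

# Synthetic example: a sinusoidal "lattice" with a period of 16 pixels along x.
y, x = np.mgrid[0:256, 0:256]
img = np.sin(2 * np.pi * x / 16.0)
print(dominant_periodicity(img))             # fx should be close to +/-1/16
```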

  20. HPTLC Hyphenated with FTIR: Principles, Instrumentation and Qualitative Analysis and Quantitation

    Science.gov (United States)

    Cimpoiu, Claudia

    In recent years, much effort has been devoted to the coupling of high-performance thin-layer chromatography (HPTLC) with spectrometric methods because of the robustness and simplicity of HPTLC and the need for detection techniques that provide identification and determination of sample constituents. IR is one of the spectroscopic methods that have been coupled with HPTLC. IR spectroscopy has a high potential for the elucidation of molecular structures, and the characteristic absorption bands can be used for compound-specific detection. The coupled HPTLC-FTIR method has been widely used in modern laboratories for qualitative and quantitative analysis. The potential of this method is demonstrated by its application in different fields of analysis such as drug analysis, forensic analysis, food analysis, environmental analysis, biological analysis, etc. The hyphenated HPTLC-FTIR technique will be developed further in the future with the aim of taking full advantage of this method.

  1. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
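
    A minimal sketch of the log-space correction idea described above, assuming the mammogram and the phantom image are available as intensity arrays; the sign convention and normalization depend on whether images encode transmission or attenuation, so the actual method may differ in detail.

```python
import numpy as np

def correct_field_nonuniformity(mammogram, phantom, eps=1e-6):
    """Log-space flat-field correction in the spirit described above: because the
    phantom has constant attenuation, its normalized log image captures only the
    heel/inverse-square/filter nonuniformity, which is then removed from the log
    of the mammogram."""
    log_mam = np.log(mammogram + eps)
    log_pha = np.log(phantom + eps)
    flat_field = log_pha - log_pha.mean()     # normalized nonuniformity pattern
    return log_mam - flat_field               # corrected mammogram in log space

# Toy example: a uniform object imaged under a 10% left-to-right intensity ramp.
ramp = np.linspace(0.9, 1.1, 100)[None, :] * np.ones((100, 100))
corrected = correct_field_nonuniformity(1000.0 * ramp, 500.0 * ramp)
print(corrected.std())                        # ~0: the nonuniformity is removed
```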

  2. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044), "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
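
    The clustering step can be illustrated with SciPy's hierarchical clustering on hypothetical centroid coordinates; the study's own landmarks, registration transformation and morphological variables are not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical tumor centroids already transformed into a common coordinate space
# (x, y, z in millimetres); the real study derived these from landmark registration.
centroids = np.array([
    [ 0.0, -2.0, 1.0], [ 1.5, -1.0, 0.5], [ 0.5, -2.5, 1.2],   # a "central" group
    [18.0,  4.0, 3.0], [17.0,  5.5, 2.5], [19.0,  3.5, 3.2],   # a "lateral" group
])

# Agglomerative (Ward) clustering of centroid locations, cut into two groups,
# mirroring the two-cluster solution reported in the abstract.
labels = fcluster(linkage(centroids, method="ward"), t=2, criterion="maxclust")
print(labels)        # e.g. [1 1 1 2 2 2]
```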

  3. Quantitative diagnosis of bladder cancer by morphometric analysis of HE images

    Science.gov (United States)

    Wu, Binlin; Nebylitsa, Samantha V.; Mukherjee, Sushmita; Jain, Manu

    2015-02-01

    In clinical practice, histopathological analysis of biopsied tissue is the main method for bladder cancer diagnosis and prognosis. The diagnosis is performed by a pathologist based on the morphological features in the image of a hematoxylin and eosin (HE) stained tissue sample. This manuscript proposes algorithms to perform morphometric analysis on the HE images, quantify the features in the images, and discriminate bladder cancers with different grades, i.e. high grade and low grade. The nuclei are separated from the background and other types of cells such as red blood cells (RBCs) and immune cells using manual outlining, color deconvolution and image segmentation. A mask of nuclei is generated for each image for quantitative morphometric analysis. The features of the nuclei in the mask image, including size, shape, orientation, and their spatial distributions, are measured. To quantify local clustering and alignment of nuclei, we propose a 1-nearest-neighbor (1-NN) algorithm which measures nearest neighbor distance and nearest neighbor parallelism. The global distributions of the features are measured using statistics of the proposed parameters. A linear support vector machine (SVM) algorithm is used to classify the high grade and low grade bladder cancers. The results show that using a particular group of nuclei, such as large ones, and combining multiple parameters can achieve better discrimination. This study shows that the proposed approach can potentially help expedite pathological diagnosis by triaging potentially suspicious biopsies.
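
    A simplified sketch of the 1-NN distance and parallelism measures described above, using invented nucleus centroids and orientations rather than segmented HE images:

```python
import numpy as np

def nearest_neighbor_stats(centers, angles):
    """For each nucleus, find its nearest neighbour and report (a) the distance and
    (b) the absolute orientation difference, simple proxies for the local clustering
    and alignment parameters described above."""
    centers = np.asarray(centers, dtype=float)
    angles = np.asarray(angles, dtype=float)
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # exclude self-matches
    nn = d.argmin(axis=1)
    nn_dist = d[np.arange(len(centers)), nn]
    nn_parallelism = np.abs(angles - angles[nn])
    return nn_dist, nn_parallelism

# Hypothetical nuclei: (x, y) centroids and major-axis orientations in degrees.
pts = [(0, 0), (1, 0.5), (10, 10), (10.5, 9.5)]
ang = [10, 15, 80, 85]
print(nearest_neighbor_stats(pts, ang))
```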

  4. Hand Fatigue Analysis Using Quantitative Evaluation of Variability in Drawing Patterns

    Directory of Open Access Journals (Sweden)

    mohamadali Sanjari

    2015-02-01

    Background & aim: Muscle fatigue is defined as the reduced power-generation capacity of a muscle or muscle group after activity, and it can lead to a variety of lesions. The purpose of the present study was to perform fatigue analysis by quantitative evaluation of drawing patterns. Methods: The present cross-sectional study was conducted on 37 healthy volunteers (6 men and 31 women) aged 18-30 years. Before and immediately after a fatigue protocol, quantitative assessment of hand drawing skills was performed by drawing repeated, overlapping, concentric circles. The test was conducted in three sessions with an interval of 48-72 hours. Drawing was recorded by a digital tablet. Data were statistically analyzed using the paired t-test and repeated-measures ANOVA. Results: In the analysis of the drawing time series, at the 100% fatigue level the standard deviation along the x axis (SDx), the standard deviations of velocity on the x and y axes (SDVx and SDVy), and the standard deviation of the resultant velocity vector (SDVR) showed significant differences after fatigue (P<0.05). In the comparison of variables across the three fatigue levels, SDx showed a significant difference (P<0.05). Conclusions: Full fatigue differed significantly from the other fatigue levels and contributed to significant variability in the drawing parameters. The method used in the present study also recognized fatigue in high-frequency motion.

  5. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  6. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Background: Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results: We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions: The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods.

  7. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods.
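
    The BW-ratio described in both records can be sketched directly; the example below uses synthetic aligned spectra rather than the published dataset, and omits the bootstrap inference step.

```python
import numpy as np

def bw_ratio(groups):
    """Per-data-point ratio of between-group to within-group sum of squares for a
    list of aligned spectra groups (each an n_spectra x n_points array), analogous
    to the BW-ratio described above."""
    grand_mean = np.vstack(groups).mean(axis=0)
    between = sum(len(g) * (g.mean(axis=0) - grand_mean) ** 2 for g in groups)
    within = sum(((g - g.mean(axis=0)) ** 2).sum(axis=0) for g in groups)
    return between / (within + 1e-12)          # small constant guards against /0

# Two hypothetical groups of aligned spectra; the peak near index 50 differs between groups.
rng = np.random.default_rng(0)
x = np.arange(200)
base = np.exp(-0.5 * ((x - 50) / 3.0) ** 2)
g1 = 1.0 * base + 0.05 * rng.standard_normal((8, 200))
g2 = 1.6 * base + 0.05 * rng.standard_normal((8, 200))
ratio = bw_ratio([g1, g2])
print(int(ratio.argmax()))                     # ~50, where the groups differ most
```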

  8. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)

  9. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  10. Economic analysis of light brown apple moth using GIS and quantitative modeling

    Science.gov (United States)

    Glenn Fowler; Lynn Garrett; Alison Neeley; Roger Magarey; Dan Borchert; Brian. Spears

    2011-01-01

    We conducted an economic analysis of the light brown apple moth (LBAM), Epiphyas postvittana (Walker), whose presence in California has resulted in a regulatory program. Our objective was to quantitatively characterize the economic costs to apple, grape, orange, and pear crops that would result from LBAM's introduction into the continental...

  11. Meta- and statistical analysis of single-case intervention research data: quantitative gifts and a wish list.

    Science.gov (United States)

    Kratochwill, Thomas R; Levin, Joel R

    2014-04-01

    In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  12. Rapid Determination of Lymphogranuloma Venereum Serovars of Chlamydia trachomatis by Quantitative High-Resolution Melt Analysis (HRMA)

    Science.gov (United States)

    Stevens, Matthew P.; Garland, Suzanne M.; Zaia, Angelo M.; Tabrizi, Sepehr N.

    2012-01-01

    A quantitative high-resolution melt analysis assay was developed to differentiate lymphogranuloma venereum-causing serovars of Chlamydia trachomatis (L1 to L3) from other C. trachomatis serovars (D to K). The detection limit of this assay is approximately 10 copies per reaction, comparable to the limits of other quantitative-PCR-based methods. PMID:22933594

  13. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for the non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choices for a radioisotope neutron source are 252Cf and 241Am-Be. In this study, 252Cf with a neutron flux of 6.3×10^6 n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained by using an in-house built radioisotopic neutron spectrometric system equipped with a 3He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ~0.947 g/cc and area of 40 cm×25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered a better performance in the quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of the powder process to determine the content of any type of molecules containing hydrogen nuclei.
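
    A minimal sketch of multivariate calibration with partial least squares, using scikit-learn and a synthetic stand-in for the pulse-height spectra; the actual spectrometer data, channel count and model settings of the study are not reproduced here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for pulse-height spectra: each row is a spectrum (channels),
# y is the property to predict (e.g. hydrogen content or block thickness).
rng = np.random.default_rng(1)
n_samples, n_channels = 40, 256
thickness = rng.uniform(1.0, 10.0, n_samples)
channels = np.linspace(0.0, 1.0, n_channels)
spectra = (thickness[:, None] * np.exp(-3.0 * channels)[None, :]
           + 0.05 * rng.standard_normal((n_samples, n_channels)))

pls = PLSRegression(n_components=3)            # latent variables; tuned by CV in practice
scores = cross_val_score(pls, spectra, thickness, cv=5, scoring="r2")
print(scores.mean())                           # the multivariate model captures the trend
```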

  14. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
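
    For illustration, an m-sequence transmit code can be generated with a linear-feedback shift register; the register length and taps below are generic textbook choices, not the coding parameters analyzed in the paper.

```python
def m_sequence(taps, n_bits):
    """Generate a maximal-length sequence (m-sequence) of period 2**n_bits - 1 with a
    Fibonacci linear-feedback shift register. `taps` are the feedback stage positions
    (1-indexed); e.g. taps=(5, 3) with n_bits=5 corresponds to x^5 + x^3 + 1."""
    state = [1] * n_bits                       # any non-zero seed works
    seq = []
    for _ in range(2 ** n_bits - 1):
        seq.append(state[-1])                  # output taken from the last stage
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]        # shift, feedback enters stage 1
    return seq

# A 31-chip m-sequence mapped to a +/-1 transmit waveform, as used for coded sources.
chips = [2 * b - 1 for b in m_sequence((5, 3), 5)]
print(len(chips), sum(chips))                  # 31 chips; the sequence is nearly balanced
```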

  15. An artificial neural network approach to laser-induced breakdown spectroscopy quantitative analysis

    International Nuclear Information System (INIS)

    D’Andrea, Eleonora; Pagnotta, Stefano; Grifoni, Emanuela; Lorenzetti, Giulia; Legnaioli, Stefano; Palleschi, Vincenzo; Lazzerini, Beatrice

    2014-01-01

    The usual approach to laser-induced breakdown spectroscopy (LIBS) quantitative analysis is based on the use of calibration curves, suitably built using appropriate reference standards. More recently, statistical methods relying on the principles of artificial neural networks (ANN) are increasingly used. However, ANN analysis is often used as a ‘black box’ system and the peculiarities of the LIBS spectra are not exploited fully. An a priori exploration of the raw data contained in the LIBS spectra, carried out by a neural network to learn which areas of the spectrum are significant for a subsequent neural network delegated to the calibration, is able to throw light upon important information initially unknown, although already contained within the spectrum. This communication will demonstrate that an approach based on neural networks specially tailored for dealing with LIBS spectra would provide a viable, fast and robust method for LIBS quantitative analysis. This would allow the use of a relatively limited number of reference samples for the training of the network, compared with current approaches, and provide a fully automatable approach for the analysis of a large number of samples. - Highlights: • A methodological approach to neural network analysis of LIBS spectra is proposed. • The architecture of the network and the number of inputs are optimized. • The method is tested on bronze samples already analyzed using a calibration-free LIBS approach. • The results are validated, compared and discussed
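
    As a generic illustration of ANN-based calibration (not the network architecture, inputs or spectra of the study above), a small scikit-learn regressor can be trained on synthetic "spectra" whose line intensities scale with concentration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in for LIBS spectra: intensities at 64 selected spectral channels,
# with the target being the concentration of one element in the reference standards.
rng = np.random.default_rng(2)
n_standards, n_channels = 30, 64
conc = rng.uniform(0.0, 10.0, n_standards)
lines = np.zeros(n_channels)
lines[[5, 17, 40]] = 1.0                      # channels carrying the analyte emission lines
spectra = conc[:, None] * lines[None, :] + 0.1 * rng.standard_normal((n_standards, n_channels))

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(spectra, conc)
print(net.predict(spectra[:3]), conc[:3])     # the network recovers the calibration trend
```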

  16. Quantitative analysis of contrast-enhanced ultrasonography of the bowel wall can predict disease activity in inflammatory bowel disease

    Energy Technology Data Exchange (ETDEWEB)

    Romanini, Laura, E-mail: laura.romanini@libero.it [Department of Radiology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Passamonti, Matteo, E-mail: matteopassamonti@gmail.com [Department of Radiology-AO Provincia di Lodi, Via Fissiraga, 15, 26900 Lodi (Italy); Navarria, Mario, E-mail: navarria.mario@tiscali.it [Department of Radiology-ASL Vallecamonica-Sebino, Via Manzoni 142, 25040 Esine, BS (Italy); Lanzarotto, Francesco, E-mail: francesco.lanzarotto@spedalicivili.brescia.it [Department of Gastroenterology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Villanacci, Vincenzo, E-mail: villanac@alice.it [Department of Pathology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Grazioli, Luigi, E-mail: radiologia1@spedalicivili.brescia.it [Department of Radiology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Calliada, Fabrizio, E-mail: fabrizio.calliada@gmail.com [Department of Radiology, University of Pavia, Viale Camillo Golgi 19, 27100 Pavia (Italy); Maroldi, Roberto, E-mail: rmaroldi@gmail.com [Department of Radiology, University of Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy)

    2014-08-15

    Purpose: To evaluate the accuracy of quantitative analysis of bowel wall enhancement in inflammatory bowel disease (IBD) with contrast enhanced ultrasound (CEUS) by comparing the results with vascular density in a biopsy sample from the same area of the intestinal tract, and to determine the usefulness of this analysis for the prediction of disease activity. Materials and methods: This prospective study was approved by our institute's ethics committee and all patients gave written informed consent. We enrolled 33 consecutive adult patients undergoing colonoscopy and biopsy for IBD. All patients underwent CEUS and the results were quantitatively analyzed. Vessel count per high-power field on biopsy specimens was compared with colonoscopy, baseline ultrasonography, and CEUS findings, and with analysis of peak intensity, time to peak, regional blood volume, mean transit time, and regional blood flow. Results in patients with high and low vascular density were compared using Fisher's test, t-test, Pearson's correlation test, and receiver operating characteristic curve (ROC) analysis. Cutoff values were determined using ROC analysis, and sensitivity and specificity were calculated. Results: High vascular density (>265 vessels per field) on histological examination was significantly correlated with active disease on colonoscopy, baseline ultrasonography, and CEUS (p < .0001). Quantitative analysis showed a higher enhancement peak, a shorter time to peak enhancement, a higher regional blood flow and regional blood volume in patients with high vascular density than in those with low vascular density. Cutoff values to distinguish between active and inactive disease were identified for peak enhancement (>40.5%), and regional blood flow (>54.8 ml/min). Conclusion: Quantitative analysis of CEUS data correlates with disease activity as determined by vascular density. Quantitative parameters of CEUS can be used to predict active disease with high sensitivity and specificity.

  17. Quantitative analysis of contrast-enhanced ultrasonography of the bowel wall can predict disease activity in inflammatory bowel disease

    International Nuclear Information System (INIS)

    Romanini, Laura; Passamonti, Matteo; Navarria, Mario; Lanzarotto, Francesco; Villanacci, Vincenzo; Grazioli, Luigi; Calliada, Fabrizio; Maroldi, Roberto

    2014-01-01

    Purpose: To evaluate the accuracy of quantitative analysis of bowel wall enhancement in inflammatory bowel disease (IBD) with contrast enhanced ultrasound (CEUS) by comparing the results with vascular density in a biopsy sample from the same area of the intestinal tract, and to determine the usefulness of this analysis for the prediction of disease activity. Materials and methods: This prospective study was approved by our institute's ethics committee and all patients gave written informed consent. We enrolled 33 consecutive adult patients undergoing colonoscopy and biopsy for IBD. All patients underwent CEUS and the results were quantitatively analyzed. Vessel count per high-power field on biopsy specimens was compared with colonoscopy, baseline ultrasonography, and CEUS findings, and with analysis of peak intensity, time to peak, regional blood volume, mean transit time, and regional blood flow. Results in patients with high and low vascular density were compared using Fisher's test, t-test, Pearson's correlation test, and receiver operating characteristic curve (ROC) analysis. Cutoff values were determined using ROC analysis, and sensitivity and specificity were calculated. Results: High vascular density (>265 vessels per field) on histological examination was significantly correlated with active disease on colonoscopy, baseline ultrasonography, and CEUS (p < .0001). Quantitative analysis showed a higher enhancement peak, a shorter time to peak enhancement, a higher regional blood flow and regional blood volume in patients with high vascular density than in those with low vascular density. Cutoff values to distinguish between active and inactive disease were identified for peak enhancement (>40.5%), and regional blood flow (>54.8 ml/min). Conclusion: Quantitative analysis of CEUS data correlates with disease activity as determined by vascular density. Quantitative parameters of CEUS can be used to predict active disease with high sensitivity and specificity.

  18. Quantitative analysis of surface characteristics and morphology in Death Valley, California using AIRSAR data

    Science.gov (United States)

    Kierein-Young, K. S.; Kruse, F. A.; Lefkoff, A. B.

    1992-01-01

    The Jet Propulsion Laboratory Airborne Synthetic Aperture Radar (JPL-AIRSAR) is used to collect full polarimetric measurements at P-, L-, and C-bands. These data are analyzed using the radar analysis and visualization environment (RAVEN). The AIRSAR data are calibrated using in-scene corner reflectors to allow for quantitative analysis of the radar backscatter. RAVEN is used to extract surface characteristics. Inversion models are used to calculate quantitative surface roughness values and fractal dimensions. These values are used to generate synthetic surface plots that represent the small-scale surface structure of areas in Death Valley. These procedures are applied to a playa, smooth salt-pan, and alluvial fan surfaces in Death Valley. Field measurements of surface roughness are used to verify the accuracy.

  19. Theory of quantitative trend analysis and its application to the South African elections

    CSIR Research Space (South Africa)

    Greben, JM

    2006-02-28

    In this paper the author discusses a quantitative theory of trend analysis. Often trends are based on qualitative considerations and subjective assumptions. In the current approach the author makes use of extensive data bases to optimise the so...

  20. Quantitative trace element analysis of individual fly ash particles by means of X-ray microfluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Vincze, L.; Somogyi, A.; Osan, J.; Vekemans, B.; Torok, S.; Janssens, K.; Adams, F. [Universitaire of Instelling Antwerp, Wilrijk (Belgium). Dept. of Chemistry

    2002-07-01

    A new quantification procedure was developed for the evaluation of X-ray microfluorescence (XRF) data sets obtained from individual particles, based on iterative Monte Carlo (MC) simulation. Combined with the high sensitivity of synchrotron radiation-induced XRF spectroscopy, the method was used to obtain quantitative information down to trace-level concentrations from micrometer-sized particulate matter. The detailed XRF simulation model was validated by comparison of calculated and experimental XRF spectra obtained for glass microsphere standards, resulting in uncertainties in the range of 3-10% for the calculated elemental sensitivities. The simulation model was applied for the quantitative analysis of X-ray tube and synchrotron radiation-induced scanning micro-XRF spectra of individual coal and wood fly ash particles originating from different Hungarian power plants. By measuring the same particles by both methods the major, minor, and trace element compositions of the particles were determined. The uncertainty of the MC based quantitative analysis scheme is estimated to be in the range of 5-30%.

  1. Stochastic resonance is applied to quantitative analysis for weak chromatographic signal of glyburide in plasma

    International Nuclear Information System (INIS)

    Zhang Wei; Xiang Bingren; Wu Yanwei; Shang Erxin

    2005-01-01

    Based on the theory of stochastic resonance, a new method was developed for the quantitative analysis of the weak chromatographic signal of glyburide in plasma, which is embedded in the noise background; the signal-to-noise ratio (SNR) of HPLC-UV is thereby enhanced remarkably. This method lowers the quantification limit to 1 ng/ml, which is the same as HPLC-MS, and makes it possible to detect accurately by HPLC-UV weak signals that were previously unsuitable for this technique. The results showed good recovery and a linear range from 1 to 50 ng/ml of glyburide in plasma, and the method can be used for the quantitative analysis of glyburide.

  2. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2001-04-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to speculate its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity at each region of interest set to encompass the entire kidney and the dilated pelvis was used as an index of quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)

  3. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu

    2001-01-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to speculate its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity at each region of interest set to encompass the entire kidney and the dilated pelvis was used as an index of quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
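
    A simplified sketch of estimating the drainage half-time T 1/2 from a washout curve by a mono-exponential fit; clinical renography software may use different curve models, time windows and region-of-interest handling.

```python
import numpy as np

def drainage_half_time(times_min, counts):
    """Estimate the post-diuretic drainage half-time T1/2 by fitting a single
    exponential (a linear fit in log space) to the washout portion of a renogram
    time-activity curve. Returns T1/2 in the same units as `times_min`."""
    t = np.asarray(times_min, dtype=float)
    c = np.asarray(counts, dtype=float)
    slope, _ = np.polyfit(t, np.log(c), 1)      # log(counts) ~ slope*t + intercept
    return np.log(2.0) / -slope                 # half-time of the fitted exponential

# Hypothetical washout curve sampled every 2 minutes with a true T1/2 of 10 minutes.
t = np.arange(0, 20, 2.0)
activity = 1000.0 * np.exp(-np.log(2.0) / 10.0 * t)
print(drainage_half_time(t, activity))          # ~10.0
```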

  4. A quantitative spatiotemporal analysis of microglia morphology during ischemic stroke and reperfusion

    Directory of Open Access Journals (Sweden)

    Morrison Helena W

    2013-01-01

    Background: Microglia cells continuously survey the healthy brain in a ramified morphology and, in response to injury, undergo progressive morphological and functional changes that encompass microglia activation. Although ideally positioned for immediate response to ischemic stroke (IS) and reperfusion, their progressive morphological transformation into activated cells has not been quantified. In addition, it is not well understood if diverse microglia morphologies correlate to diverse microglia functions. As such, the dichotomous nature of these cells continues to confound our understanding of microglia-mediated injury after IS and reperfusion. The purpose of this study was to quantitatively characterize the spatiotemporal pattern of microglia morphology during the evolution of cerebral injury after IS and reperfusion. Methods: Male C57Bl/6 mice were subjected to focal cerebral ischemia and periods of reperfusion (0, 8 and 24 h). The microglia process length/cell and number of endpoints/cell were quantified from immunofluorescent confocal images of brain regions using a skeleton analysis method developed for this study. Live cell morphology and process activity were measured from movies acquired in acute brain slices from GFP-CX3CR1 transgenic mice after IS and 24-h reperfusion. Regional CD11b and iNOS expressions were measured from confocal images and Western blot, respectively, to assess microglia proinflammatory function. Results: Quantitative analysis reveals a significant spatiotemporal relationship between microglia morphology and evolving cerebral injury in the ipsilateral hemisphere after IS and reperfusion. Microglia were both hyper- and de-ramified in striatal and cortical brain regions (respectively) after 60 min of focal cerebral ischemia. However, a de-ramified morphology was prominent when ischemia was coupled to reperfusion. Live microglia were de-ramified, and, in addition, process activity was severely blunted proximal to
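
    The skeleton-based metrics described above (process length per cell, endpoints per cell) can be sketched with scikit-image and SciPy; the example below uses a toy binary mask rather than confocal data, and the study's own pipeline may differ.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def skeleton_metrics(binary_mask, n_cells):
    """Summarise process morphology from a binary mask: skeletonize the processes,
    then report total skeleton length (pixel count) per cell and the number of
    skeleton endpoints (skeleton pixels with exactly one skeleton neighbour) per cell."""
    skel = skeletonize(binary_mask.astype(bool))
    neighbours = convolve(skel.astype(int), np.ones((3, 3), int), mode="constant") - skel
    endpoints = np.logical_and(skel, neighbours == 1).sum()
    return skel.sum() / n_cells, endpoints / n_cells

# Tiny synthetic "cell": a plus-shaped set of processes gives four endpoints.
mask = np.zeros((21, 21), bool)
mask[10, 3:18] = True
mask[3:18, 10] = True
print(skeleton_metrics(mask, n_cells=1))
```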

  5. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established before in an extensive work on thick and intermediate samples were employed. For these samples, the excitation functions of nuclear reactions, induced by the incident protons on target's light elements, were used as input for a code that evaluates the gamma-ray yield integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with sodium values, PIXE results related to chlorine are also presented, giving support to the reliability of this PIGE method for thin film analysis

  6. Simultaneous quantitative analysis of main components in linderae reflexae radix with one single marker.

    Science.gov (United States)

    Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing

    2016-05-08

    To establish a quantitative analysis of multi-components by the single marker (QAMS) method for quality evaluation and to validate its feasibility by the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes to evaluate the quality by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results between the external standard method and QAMS on different HPLC systems. The results showed that no significant differences were found in the quantitative results for the four components of Linderae Reflexae Radix determined by the external standard method and QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined against the single marker pinosylvin. The fingerprints were determined on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
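
    A generic sketch of the single-marker idea: a relative correction factor is derived once from mixed standards and then used to quantify the other analytes from the marker's calibration alone. The peak areas and concentrations below are invented, and the exact formulation and units used in the paper may differ.

```python
def relative_correction_factor(a_marker, c_marker, a_analyte, c_analyte):
    """Relative correction factor f = (A_s/C_s) / (A_i/C_i), determined once from
    standards containing both the marker (s) and the analyte (i)."""
    return (a_marker / c_marker) / (a_analyte / c_analyte)

def quantify_with_single_marker(a_analyte_sample, f, a_marker_std, c_marker_std):
    """Quantify an analyte in a sample using only the marker's calibration:
    C_i = f * A_i / (A_s_std / C_s_std)."""
    return f * a_analyte_sample / (a_marker_std / c_marker_std)

# Hypothetical peak areas (mAU*s) and concentrations (mg/ml), for illustration only.
f_analyte = relative_correction_factor(a_marker=1200.0, c_marker=0.10,
                                       a_analyte=900.0, c_analyte=0.10)
print(quantify_with_single_marker(a_analyte_sample=450.0, f=f_analyte,
                                  a_marker_std=1200.0, c_marker_std=0.10))   # 0.05 mg/ml
```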

  7. Quantitative Analysis of Adulterations in Oat Flour by FT-NIR Spectroscopy, Incomplete Unbalanced Randomized Block Design, and Partial Least Squares

    Directory of Open Access Journals (Sweden)

    Ning Wang

    2014-01-01

    This paper developed a rapid and nondestructive method for quantitative analysis of a cheaper adulterant (wheat flour) in oat flour by NIR spectroscopy and chemometrics. Reflectance FT-NIR spectra in the range of 4000 to 12000 cm−1 of 300 oat flour objects adulterated with wheat flour were measured. The doping levels of wheat flour ranged from 5% to 50% (w/w). To ensure the generalization performance of the method, both the oat and the wheat flour samples were collected from different producing areas, and an incomplete unbalanced randomized block (IURB) design was performed to include the significant variations that may be encountered in future samples. Partial least squares regression (PLSR) was used to develop calibration models for predicting the levels of wheat flour. Different preprocessing methods including smoothing, taking the second-order derivative (D2), and standard normal variate (SNV) transformation were investigated to improve the model accuracy of PLS. The root mean squared error of Monte Carlo cross-validation (RMSEMCCV) and root mean squared error of prediction (RMSEP) were 1.921 and 1.975 (%, w/w) by D2-PLS, respectively. The results indicate that NIR and chemometrics can provide a rapid method for quantitative analysis of wheat flour in oat flour.
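
    Two of the preprocessing steps mentioned above, SNV and a second derivative, are easy to sketch in NumPy; the spectra below are synthetic, and in practice the derivative is usually computed with Savitzky-Golay smoothing rather than plain finite differences.

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum (row) individually,
    a common way to reduce multiplicative scatter effects before PLS modelling."""
    x = np.asarray(spectra, dtype=float)
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

def second_derivative(spectra):
    """Simple finite-difference second derivative along the wavenumber axis
    (the 'D2' preprocessing mentioned above)."""
    return np.diff(np.asarray(spectra, dtype=float), n=2, axis=1)

# Toy spectra: two rows with different offsets and scales collapse onto one curve after SNV.
wn = np.linspace(4000, 12000, 500)
peak = np.exp(-0.5 * ((wn - 8000) / 300.0) ** 2)
spectra = np.vstack([1.0 + 0.5 * peak, 3.0 + 1.5 * peak])
print(np.allclose(snv(spectra)[0], snv(spectra)[1]))   # True
```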

  8. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been used widely and approved in some industries. However, there are limitations when we use them for complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore, an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The method proposed in this paper combines qualitative and quantitative evidence and performs an inference like experts do. Furthermore, it does the work in a formal and quantitative way, unlike human experts, by the benefits of Bayesian Nets (BNs).

  9. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    Science.gov (United States)

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher sensitivity.
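
    For reference, sensitivity and specificity follow directly from the 2×2 contingency counts; the counts below are illustrative only and are not the study's actual table.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts: a technique that flags 21 of 29 true CR-BSI cases
# (sensitivity ~72%) and correctly rules out 530 of 555 non-cases (specificity ~95%).
print(sensitivity_specificity(tp=21, fn=8, tn=530, fp=25))
```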

  10. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. In order to demonstrate the effectiveness of the methods, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are used for building the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly as a practical spectral analysis tool.

  11. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    Science.gov (United States)

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle

  12. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    Science.gov (United States)

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identification of the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide a quantitative description of tissue biochemical composition.
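
    The record does not spell out the unmixing algorithm, so the following is only a generic stand-in: nonnegative matrix factorization with an elbow rule on the reconstruction error to pick the number of end-members, applied to an assumed decay matrix `Y` (pixels x time bins).

```python
# Generic stand-in for blind linear unmixing (not the authors' algorithm):
# scan the number of end-members with NMF and keep the "elbow" model.
import numpy as np
from sklearn.decomposition import NMF

def unmix(Y, max_components=6):
    """Y: nonnegative decay matrix, shape (pixels, time bins)."""
    errors, models = [], []
    for k in range(1, max_components + 1):
        nmf = NMF(n_components=k, init="nndsvda", max_iter=1000, random_state=0)
        abundances = nmf.fit_transform(Y)             # per-pixel contributions
        errors.append(nmf.reconstruction_err_)
        models.append((abundances, nmf.components_))  # rows = end-member decays
    errors = np.asarray(errors)
    rel_gain = (errors[:-1] - errors[1:]) / errors[:-1]
    # First k whose relative improvement drops below 5%, else the maximum tried.
    k_best = int(np.argmax(rel_gain < 0.05)) + 1 if np.any(rel_gain < 0.05) else max_components
    abundances, endmembers = models[k_best - 1]
    abundances = abundances / (abundances.sum(axis=1, keepdims=True) + 1e-12)
    return k_best, abundances, endmembers
```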

  13. An improved UPLC-MS/MS platform for quantitative analysis of glycerophosphoinositol in mammalian cells.

    Directory of Open Access Journals (Sweden)

    Laura Grauso

    Full Text Available The glycerophosphoinositols constitute a class of biologically active lipid-derived mediators whose intracellular levels are modulated during physiological and pathological cell processes. Comprehensive assessment of the role of these compounds expands beyond the cellular biology of lipids and requires rapid and unambiguous measurement in cells and tissues. Here we describe a sensitive and simple liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for quantitative analysis of the most abundant of these phosphoinositide derivatives in mammalian cells, glycerophosphoinositol (GroPIns). The method was developed in mouse Raw 264.7 macrophages with a limit of quantitation of 3 ng/ml. Validation on the same cell line showed excellent response in terms of linear dynamic range (from 3 to 3,000 ng/ml), intra-day and inter-day precision (coefficient of variation ≤ 7.10%) and accuracy (between 98.1 and 109.0%) in the range 10-320 ng/ml. As proof of concept, a simplified analytical platform based on this method and external calibration was also tested on four stimulated and unstimulated cell lines, including Raw 264.7 macrophages, Jurkat T-cells, A375MM melanoma cells and rat basophilic leukemia RBL-2H3 cells. The results indicate a wide variation in GroPIns levels among different cell lines and stimulation conditions, although the measurements were always in line with the literature. No significant matrix effects were observed, indicating that the proposed method can be of general use for similar determinations in cells of different origin.
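
    As a small worked illustration of the external-calibration arithmetic mentioned above (with made-up numbers, not the paper's data), the sketch below fits a straight calibration line, back-calculates quality-control samples, and reports accuracy and coefficient of variation in the same style as the reported validation figures.

```python
# Minimal sketch of external calibration: fit a line to standards, then
# back-calculate QC samples and report accuracy and precision.
# Concentrations are in ng/ml; all numbers are illustrative placeholders.
import numpy as np

std_conc = np.array([3, 10, 30, 100, 300, 1000, 3000], dtype=float)   # calibration levels
std_area = np.array([0.9, 3.1, 9.2, 31.0, 92.0, 310.0, 905.0])        # measured peak areas

slope, intercept = np.polyfit(std_conc, std_area, 1)                  # linear calibration

def back_calculate(area):
    return (area - intercept) / slope

qc_nominal = np.array([10.0, 80.0, 320.0])
qc_areas = np.array([[3.0, 3.2, 3.1], [24.5, 25.0, 24.2], [99.0, 98.0, 101.0]])

for nominal, areas in zip(qc_nominal, qc_areas):
    conc = back_calculate(areas)
    accuracy = 100.0 * conc.mean() / nominal                # % of nominal
    cv = 100.0 * conc.std(ddof=1) / conc.mean()             # coefficient of variation
    print(f"QC {nominal:>6.1f} ng/ml: accuracy {accuracy:5.1f} %, CV {cv:4.2f} %")
```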

  14. Quantitative image analysis for investigating cell-matrix interactions

    Science.gov (United States)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
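
    A rough sketch of the displacement-tracking idea behind digital volume correlation follows; it is not the authors' implementation, and `ref`/`def_` are assumed 3D confocal stacks. The displacement of a subvolume is taken as the offset of the peak of its 3D cross-correlation with the deformed stack.

```python
# Rough sketch of digital volume correlation for one subvolume: locate the
# peak of the 3D cross-correlation between a reference subvolume and the
# deformed image stack. Inputs are assumed 3D numpy arrays.
import numpy as np
from scipy.signal import fftconvolve

def subvolume_displacement(ref, def_, center, half=16):
    """Return the integer-voxel displacement of one subvolume."""
    z, y, x = center
    sl = np.s_[z - half:z + half, y - half:y + half, x - half:x + half]
    template = ref[sl] - ref[sl].mean()
    search = def_ - def_.mean()
    # Cross-correlation = convolution with the flipped template.
    corr = fftconvolve(search, template[::-1, ::-1, ::-1], mode="same")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return np.array(peak) - np.array(center)      # displacement in voxels
```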

  15. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    Science.gov (United States)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the difficulty of extracting quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cosθ⟩/⟨cos³θ⟩, and comparison between the PNA method and the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively extract crucial information from the SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expression with the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendixes. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectrum of the vapour/neat liquid interfaces in different polarizations. Using the polarization analysis in SFG-VS, polarization selection rules or
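
    Assuming the PNA orientational parameter is defined as D = ⟨cosθ⟩/⟨cos³θ⟩ (as reconstructed above), a short worked example for a delta-function orientation distribution reads:

```latex
% Assumed definition of the orientational parameter in the PNA analysis.
\[
  D \;=\; \frac{\langle \cos\theta \rangle}{\langle \cos^{3}\theta \rangle}
  \quad\xrightarrow{\;f(\theta)=\delta(\theta-\theta_{0})\;}\quad
  \frac{\cos\theta_{0}}{\cos^{3}\theta_{0}} \;=\; \sec^{2}\theta_{0},
\]
% e.g. a fixed tilt of theta_0 = 45 degrees gives D = 2; broader orientation
% distributions shift D away from this delta-function value.
```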

  16. Balancing the Quantitative and Qualitative Aspects of Social Network Analysis to Study Complex Social Systems

    OpenAIRE

    Schipper, Danny; Spekkink, Wouter

    2015-01-01

    Social Network Analysis (SNA) can be used to investigate complex social systems. SNA is typically applied as a quantitative method, which has important limitations. First, quantitative methods are capable of capturing the form of relationships (e.g. strength and frequency), but they are less suitable for capturing the content of relationships (e.g. interests and motivations). Second, while complex social systems are highly dynamic, the representations that SNA creates of such systems are ofte...

  17. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    NARCIS (Netherlands)

    Bosschaart, Nienke; van Leeuwen, Ton; Aalders, Maurice C.G.; Faber, Dirk

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of

  18. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    Science.gov (United States)

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  19. A Quantitative Analysis of Pulsed Signals Emitted by Wild Bottlenose Dolphins.

    Directory of Open Access Journals (Sweden)

    Ana Rita Luís

    Full Text Available Common bottlenose dolphins (Tursiops truncatus) produce a wide variety of vocal emissions for communication and echolocation, of which the pulsed repertoire has been the most difficult to categorize. Packets of high-repetition, broadband pulses are still largely reported under the general designation of burst-pulses, and traditional attempts to classify these emissions rely mainly on their aural characteristics and on graphical aspects of spectrograms. Here, we present a quantitative analysis of pulsed signals emitted by wild bottlenose dolphins in the Sado estuary, Portugal (2011-2014), and test the reliability of a traditional classification approach. Acoustic parameters (minimum frequency, maximum frequency, peak frequency, duration, repetition rate and inter-click interval) were extracted from 930 pulsed signals, previously categorized using a traditional approach. Discriminant function analysis revealed a high reliability of the traditional classification approach (93.5% of pulsed signals were consistently assigned to their aurally based categories). According to the discriminant function analysis (Wilk's Λ = 0.11, F3,2.41 = 282.75, P < 0.001), repetition rate is the feature that best enables the discrimination of different pulsed signals (structure coefficient = 0.98). Classification using hierarchical cluster analysis led to a similar categorization pattern: two main signal types with distinct magnitudes of repetition rate were clustered into five groups. The pulsed signals described here present significant differences in their time-frequency features, especially repetition rate (P < 0.001), inter-click interval (P < 0.001) and duration (P < 0.001). We document the occurrence of a distinct signal type, short burst-pulses, and highlight the existence of a diverse repertoire of pulsed vocalizations emitted in graded sequences. The use of quantitative analysis of pulsed signals is essential to improve classifications and to better assess the
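
    For readers who want to reproduce the style of analysis (not the study's exact statistics package or data), the sketch below runs a linear discriminant analysis on the six acoustic parameters and an agglomerative (Ward) clustering as the unsupervised cross-check; `X` and `labels` are assumed inputs.

```python
# Illustrative sketch (assumed inputs, common Python libraries in place of
# the study's statistics package): discriminant analysis of the acoustic
# parameters, plus Ward clustering as an unsupervised cross-check.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from scipy.cluster.hierarchy import linkage, fcluster

def classify_and_cluster(X, labels, n_clusters=5):
    lda = LinearDiscriminantAnalysis()
    # Cross-validated agreement with the aurally based categories
    # (cf. the 93.5% consistency reported above).
    agreement = cross_val_score(lda, X, labels, cv=5).mean()
    Z = linkage(X, method="ward")
    clusters = fcluster(Z, t=n_clusters, criterion="maxclust")
    return agreement, clusters
```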

  20. Quantitative analysis of Si1-xGex alloy films by SIMS and XPS depth profiling using a reference material

    Science.gov (United States)

    Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong

    2018-02-01

    Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means to convert the intensities to compositions instead of the relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, the results obtained with a cesium ion beam were very quantitative. The quantitative analysis results by SIMS using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI method and the TNC method.
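
    A minimal sketch of the ICF arithmetic described above: calibrate per-element factors on a reference film of known composition, then convert measured intensities to atomic fractions. The numbers are illustrative assumptions, not the paper's values.

```python
# Minimal sketch of the intensity-to-composition conversion factor (ICF):
# ICF_i = I_i(reference) / x_i(reference); unknown compositions follow by
# dividing measured intensities by the ICFs and normalizing.
import numpy as np

def calibrate_icf(ref_intensities, ref_fractions):
    return np.asarray(ref_intensities, float) / np.asarray(ref_fractions, float)

def to_atomic_fractions(intensities, icf):
    raw = np.asarray(intensities, float) / icf
    return raw / raw.sum()                      # normalize so fractions sum to 1

# Example: a hypothetical reference Si0.7Ge0.3 film and an unknown Si(1-x)Ge(x) film.
icf = calibrate_icf(ref_intensities=[7.2e4, 4.5e4], ref_fractions=[0.70, 0.30])
print(to_atomic_fractions([6.0e4, 6.1e4], icf))   # -> approximate [x_Si, x_Ge]
```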

  1. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Ling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); Smuts, Jonathan; Walsh, Phillip [VUV Analytics, Inc., Cedar Park, TX (United States); Qiu, Changling [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); McNair, Harold M. [Department of Chemistry, Virginia Tech, Blacksburg, VA (United States); Schug, Kevin A., E-mail: kschug@uta.edu [Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States)

    2017-02-08

    The vacuum ultraviolet detector (VUV) is a new non-destructive mass sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal

  2. Pseudo-absolute quantitative analysis using gas chromatography – Vacuum ultraviolet spectroscopy – A tutorial

    International Nuclear Information System (INIS)

    Bai, Ling; Smuts, Jonathan; Walsh, Phillip; Qiu, Changling; McNair, Harold M.; Schug, Kevin A.

    2017-01-01

    The vacuum ultraviolet detector (VUV) is a new non-destructive mass sensitive detector for gas chromatography that continuously and rapidly collects full wavelength range absorption between 120 and 240 nm. In addition to conventional methods of quantification (internal and external standard), gas chromatography - vacuum ultraviolet spectroscopy has the potential for pseudo-absolute quantification of analytes based on pre-recorded cross sections (well-defined absorptivity across the 120–240 nm wavelength range recorded by the detector) without the need for traditional calibration. The pseudo-absolute method was used in this research to experimentally evaluate the sources of sample loss and gain associated with sample introduction into a typical gas chromatograph. Standard samples of benzene and natural gas were used to assess precision and accuracy for the analysis of liquid and gaseous samples, respectively, based on the amount of analyte loaded on-column. Results indicate that injection volume, split ratio, and sampling times for splitless analysis can all contribute to inaccurate, yet precise sample introduction. For instance, an autosampler can very reproducibly inject a designated volume, but there are significant systematic errors (here, a consistently larger volume than that designated) in the actual volume introduced. The pseudo-absolute quantification capability of the vacuum ultraviolet detector provides a new means for carrying out system performance checks and potentially for solving challenging quantitative analytical problems. For practical purposes, an internal standardized approach to normalize systematic errors can be used to perform quantitative analysis with the pseudo-absolute method. - Highlights: • Gas chromatography diagnostics and quantification using VUV detector. • Absorption cross-sections for molecules enable pseudo-absolute quantitation. • Injection diagnostics reveal systematic errors in hardware settings. • Internal

  3. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    Science.gov (United States)

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining the outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments, such as the 6-minute walk test and the motor function measure, and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated (p < 0.001). Quantitative muscle MRI values correlated significantly with all clinical assessments, with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion, and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    Science.gov (United States)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high-resolution, non-destructive inspection of technical surfaces and minimally invasive, label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.

  5. Quantitative and qualitative analysis of the expert and non-expert opinion in fire risk in buildings

    International Nuclear Information System (INIS)

    Hanea, D.M.; Jagtman, H.M.; Alphen, L.L.M.M. van; Ale, B.J.M.

    2010-01-01

    The expert judgment procedure is a method often used in risk assessment of complex systems or processes to fill in quantitative data. Although it has proved to be a very reliable source of information when no other data are available, the choice of experts is always questioned. When the available data are limited, the seed questions cover the domains of expertise only partially, which may cause problems. Expertise is then assessed not on the full object of study but only on those topics for which seed questions can be formulated. The commonly used quantitative analysis of an expert judgment exercise is combined here with a qualitative analysis. The latter adds insight into the relation between the assessors' field and statistical knowledge and their performance in an expert judgment. In addition, the qualitative analysis identifies different types of seed questions. Three groups of assessors with different levels of statistical and domain knowledge are studied. The quantitative analysis shows no differences between field experts and non-experts, and no differences between assessors with and without advanced statistical knowledge. The qualitative analysis supports these findings. In addition, it is found that technical questions in particular are answered with larger intervals. Caution is required when using seed questions for which the real value can be calculated, which was the case for one of the seed questions.

  6. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information in drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) administered with CAT were then analyzed. The quantitative MSI analysis results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure the difference between a measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Criminal Justice Systems in Europe. A cross-national quantitative analysis

    OpenAIRE

    Becerra-Muñoz, Jose; García-España, Elisa; Aguilar Conde, Araceli

    2013-01-01

    In recent years, the Crime Observatory of the University of Malaga has analysed police records on criminal activity, carried out several crime victim surveys in Spain, and worked on a detailed analysis of the prison system and its connection to prison policy. This year's report focuses on the Criminal Justice System, one of the big official data providers, gathering, organizing and interpreting a great deal of quantitative data from 2000 to 2011. Such longitudinal scrutiny of...

  9. Improved noninvasive assessment of coronary artery disease by quantitative analysis of regional stress myocardial distribution and washout of thallium-201

    International Nuclear Information System (INIS)

    Maddahi, J.; Garcia, E.V.; Berman, D.S.; Waxman, A.; Swan, H.J.C.; Forrester, J.

    1981-01-01

    Visual interpretation of stress-redistribution thallium-201 (201Tl) scintigrams is subject to observer variability and is suboptimal for evaluation of the extent of coronary artery disease (CAD). An objective, computerized technique has been developed that quantitatively expresses the relative space-time myocardial distribution of 201Tl. Multiple-view, maximum-count circumferential profiles for stress myocardial distribution of 201Tl and segmental percent washout were analyzed in a pilot group of 31 normal subjects and 20 patients with CAD to develop quantitative criteria for abnormality. Subsequently, quantitative analysis was applied prospectively to a group of 22 normal subjects and 45 CAD patients and compared with visual interpretation of scintigrams for detection and evaluation of CAD. The sensitivity and specificity of the quantitative technique (93% and 91%, respectively) were not significantly different from those of the visual method (91% and 86%). The quantitative analysis significantly (p < 0.05) improved the sensitivity of 201Tl imaging over the visual method in the left anterior descending artery (from 56% to 80%), left circumflex artery (from 34% to 63%) and right coronary artery (from 65% to 94%) without significant loss of specificity. Using quantitative analysis, sensitivity for detection of diseased vessels did not diminish as the number of vessels involved increased, as it did with visual interpretation. In patients with one-vessel disease, 86% of the lesions were detected by both techniques; however, in patients with three-vessel disease, quantitative analysis detected 83% of the lesions, while the sensitivity was only 53% for the visual method. Seventy percent of the coronary arteries with moderate

  10. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    Science.gov (United States)

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  11. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    Science.gov (United States)

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood-activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and purgative biopotency and blood-activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there were great differences in chemical characterization and certain differences in purgative biopotency and blood-activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P < 0.05). The combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  12. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    Science.gov (United States)

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
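
    The PLSR comparison described above can be sketched as follows (a simplified illustration, not the authors' exact preprocessing or cross-validation scheme); `spectra`, `wavenumbers` and `y` (mixture compositions) are assumed inputs, and the wavenumber limits are placeholders.

```python
# Sketch of comparing spectral regions with partial least squares regression:
# fit a PLS model on one region and report RMSEP on a held-out split.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def rmsep_for_region(spectra, wavenumbers, y, lo, hi, n_components=3):
    region = (wavenumbers >= lo) & (wavenumbers <= hi)
    X_train, X_test, y_train, y_test = train_test_split(
        spectra[:, region], y, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=n_components).fit(X_train, y_train)
    y_pred = pls.predict(X_test)
    return np.sqrt(mean_squared_error(y_test, y_pred))

# e.g. low-frequency (approx. 10-200 cm^-1) vs. mid-frequency (200-1800 cm^-1):
# rmsep_low = rmsep_for_region(spectra, wavenumbers, y, 10, 200)
# rmsep_mid = rmsep_for_region(spectra, wavenumbers, y, 200, 1800)
```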

  13. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  14. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Wu

    Full Text Available ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  15. Quantitative risk analysis offshore-Human and organizational factors

    International Nuclear Information System (INIS)

    Espen Skogdalen, Jon; Vinnem, Jan Erik

    2011-01-01

    Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models and at the fact that the QRAs do not include human and organizational factors (HOF-factors). Norwegian and UK offshore legislation and guidelines require that the HOF-factors be included in the QRAs. A study of 15 QRAs shows that the factors are included to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF-factors at all. Relevant research projects have been conducted to fulfill the requirements of Level 3 analyses. At this level, there is a systematic collection of data related to HOF. The methods are systematic and documented, and the QRAs are adjusted. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include the HOF-factors in the model, describe them, and explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and to speed up this development.

  16. 40 CFR 60.1125 - What must I include in my siting analysis?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false What must I include in my siting... § 60.1125 What must I include in my siting analysis? (a) Include an analysis of how your municipal...) Vegetation. (b) Include an analysis of alternatives for controlling air pollution that minimize potential...

  17. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  18. Particle induced X-ray emission for quantitative trace-element analysis using the Eindhoven cyclotron

    International Nuclear Information System (INIS)

    Kivits, H.

    1980-01-01

    Development of a multi-elemental trace analysis technique using PIXE (Particle Induced X-ray Emission) was started almost five years ago at the Eindhoven University of Technology, in the Cyclotron Applications Group of the Physics Department. The aim of the work presented is to improve the quantitative aspects of trace-element analysis with PIXE, as well as its versatility, speed and simplicity. (Auth.)

  19. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n = 13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  20. Advantages of a Dynamic RGGG Method in Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2009-01-01

    Various studies have been conducted to analyze dynamic interactions among components and process variables in nuclear power plants, which cannot be handled by static reliability analysis methods such as conventional fault tree and event tree techniques. A dynamic reliability graph with general gates (RGGG) method was proposed for intuitive modeling of dynamic systems, and it enables one to easily analyze huge and complex systems. In this paper, the advantages of the dynamic RGGG method are assessed through two stages: system modeling and quantitative analysis. A software tool for the dynamic RGGG method is then introduced, and an application to a real dynamic system is presented.

  1. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for images and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open source library that implements conversion of data stored in commonly used research formats into the standard DICOM representation. The dcmqi source code is distributed under a BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 AACR.

  2. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve for NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated, and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
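
    A present-day sketch of the same Lorentzian least-squares idea, using SciPy in place of the original program, might look like the following; the two-component model, the moving-average window and the starting guesses are all assumptions.

```python
# Sketch of Lorentzian least-squares fitting for a trace component that
# overlaps a major peak: smooth with a moving average, fit a sum of two
# Lorentzians plus a baseline, and report the relative peak areas.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, gamma):
    return amp * gamma**2 / ((x - x0)**2 + gamma**2)

def two_component_model(x, a1, x1, g1, a2, x2, g2, baseline):
    return lorentzian(x, a1, x1, g1) + lorentzian(x, a2, x2, g2) + baseline

def moving_average(y, window=5):
    return np.convolve(y, np.ones(window) / window, mode="same")

def fit_trace_component(ppm, signal, p0):
    smoothed = moving_average(signal)
    params, _ = curve_fit(two_component_model, ppm, smoothed, p0=p0)
    a1, x1, g1, a2, x2, g2, _ = params
    # Relative abundance from integrated Lorentzian areas (area = pi * amp * gamma).
    area1, area2 = np.pi * a1 * g1, np.pi * a2 * g2
    return area1 / (area1 + area2), params
```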

  3. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (a more systematic increase caused by overlap in the spectrum), and stutters (peaks located four basepairs before the true peak). We present filtering techniques for all three technical artifacts based on statistical analysis of data from controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.

  4. A Quantitative Acetylomic Analysis of Early Seed Development in Rice (Oryza sativa L.).

    Science.gov (United States)

    Wang, Yifeng; Hou, Yuxuan; Qiu, Jiehua; Li, Zhiyong; Zhao, Juan; Tong, Xiaohong; Zhang, Jian

    2017-06-27

    PKA (protein lysine acetylation) is a critical post-translational modification that regulates various developmental processes, including seed development. However, the acetylation events and dynamics on a proteomic scale in this process remain largely unknown, especially in rice early seed development. We report the first quantitative acetylproteomic study focused on rice early seed development, employing a mass spectral-based (MS-based), label-free approach. A total of 1817 acetylsites on 1688 acetylpeptides from 972 acetylproteins were identified in pistils and seeds at three and seven days after pollination, including 268 acetylproteins differentially acetylated among the three stages. Motif-X analysis revealed six significantly enriched motifs, such as (DxkK), (kH) and (kY), around the acetylsites of the identified rice seed acetylproteins. Differentially acetylated proteins among the three stages included adenosine diphosphate (ADP)-glucose pyrophosphorylases (AGPs), PDIL1-1 (protein disulfide isomerase like 1-1), hexokinases, the pyruvate dehydrogenase complex (PDC) and numerous other regulators that are extensively involved in the starch and sucrose metabolism, glycolysis/gluconeogenesis, tricarboxylic acid (TCA) cycle and photosynthesis pathways during early seed development. This study greatly expanded the rice acetylome dataset and shed novel insight into the regulatory roles of PKA in rice early seed development.

  5. Excitation wavelength selection for quantitative analysis of carotenoids in tomatoes using Raman spectroscopy.

    Science.gov (United States)

    Hara, Risa; Ishigaki, Mika; Kitahama, Yasutaka; Ozaki, Yukihiro; Genkawa, Takuma

    2018-08-30

    The difference in Raman spectra for different excitation wavelengths (532 nm, 785 nm, and 1064 nm) was investigated to identify an appropriate wavelength for the quantitative analysis of carotenoids in tomatoes. For the 532 nm-excited Raman spectra, the intensity of the peak assigned to the carotenoid has no correlation with carotenoid concentration, and the peak shift reflects carotenoid composition changing from lycopene to β-carotene and lutein. Thus, 532 nm-excited Raman spectra are useful for the qualitative analysis of carotenoids. For the 785 nm- and 1064 nm-excited Raman spectra, the peak intensity of the carotenoid showed good correlation with carotenoid concentration; thus, regression models for carotenoid concentration were developed using these Raman spectra and partial least squares regression. A regression model designed using the 785 nm-excited Raman spectra showed a better result than the 532 nm- and 1064 nm-excited Raman spectra. Therefore, it can be concluded that 785 nm is the most suitable excitation wavelength for the quantitative analysis of carotenoid concentration in tomatoes. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in, for instance, finance developed in parallel with its use in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
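
    As an illustration of the likelihood-of-occurrence modelling and the Hosmer-Lemeshow check named above (not the study's actual code or data), a minimal sketch is:

```python
# Minimal sketch (assumed data): logistic regression of the likelihood of
# risk occurrence plus a Hosmer-Lemeshow goodness-of-fit statistic.
import pandas as pd
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression

def hosmer_lemeshow(y_true, y_prob, groups=10):
    df = pd.DataFrame({"y": y_true, "p": y_prob})
    df["decile"] = pd.qcut(df["p"], groups, labels=False, duplicates="drop")
    obs = df.groupby("decile")["y"].sum()
    exp = df.groupby("decile")["p"].sum()
    n = df.groupby("decile")["y"].count()
    hl = (((obs - exp) ** 2) / (exp * (1 - exp / n))).sum()
    p_value = 1 - chi2.cdf(hl, df=len(obs) - 2)
    return hl, p_value            # a non-significant p indicates a good fit

# X: respondent/risk features, y: 1 if the risk was judged likely to occur
# model = LogisticRegression(max_iter=1000).fit(X, y)
# print(hosmer_lemeshow(y, model.predict_proba(X)[:, 1]))
```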

  7. Quantitative analysis of ground penetrating radar data in the Mu Us Sandland

    Science.gov (United States)

    Fu, Tianyang; Tan, Lihua; Wu, Yongqiu; Wen, Yanglei; Li, Dawei; Duan, Jinlong

    2018-06-01

    Ground penetrating radar (GPR), which can reveal the sedimentary structure and development process of dunes, is widely used to evaluate aeolian landforms. The interpretations for GPR profiles are mostly based on qualitative descriptions of geometric features of the radar reflections. This research quantitatively analyzed the waveform parameter characteristics of different radar units by extracting the amplitude and time interval parameters of GPR data in the Mu Us Sandland in China, and then identified and interpreted different sedimentary structures. The results showed that different types of radar units had specific waveform parameter characteristics. The main waveform parameter characteristics of sand dune radar facies and sandstone radar facies included low amplitudes and wide ranges of time intervals, ranging from 0 to 0.25 and 4 to 33 ns respectively, and the mean amplitudes changed gradually with time intervals. The amplitude distribution curves of various sand dune radar facies were similar as unimodal distributions. The radar surfaces showed high amplitudes with time intervals concentrated in high-value areas, ranging from 0.08 to 0.61 and 9 to 34 ns respectively, and the mean amplitudes changed drastically with time intervals. The amplitude and time interval values of lacustrine radar facies were between that of sand dune radar facies and radar surfaces, ranging from 0.08 to 0.29 and 11 to 30 ns respectively, and the mean amplitude and time interval curve was approximately trapezoidal. The quantitative extraction and analysis of GPR reflections could help distinguish various radar units and provide evidence for identifying sedimentary structure in aeolian landforms.
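
    A simple sketch of the waveform-parameter extraction described above: take the envelope of a single GPR trace, then read off peak amplitudes and inter-peak time intervals. The trace, sample interval and threshold are assumed inputs.

```python
# Sketch of amplitude / time-interval extraction from one GPR trace using
# the signal envelope; `trace` and `dt_ns` (sample interval in ns) are
# assumed inputs and the threshold is an illustrative placeholder.
import numpy as np
from scipy.signal import hilbert, find_peaks

def waveform_parameters(trace, dt_ns, min_amplitude=0.05):
    envelope = np.abs(hilbert(trace))                 # instantaneous amplitude
    peaks, props = find_peaks(envelope, height=min_amplitude)
    amplitudes = props["peak_heights"]
    intervals_ns = np.diff(peaks) * dt_ns             # time between reflections
    return amplitudes, intervals_ns

# Mean amplitude vs. time interval, as used above to separate radar facies:
# amps, gaps = waveform_parameters(trace, dt_ns=0.2)
# print(amps.mean(), gaps.mean() if gaps.size else None)
```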

  8. Simultaneous HPLC quantitative analysis of mangostin derivatives in Tetragonula pagdeni propolis extracts

    Directory of Open Access Journals (Sweden)

    Sumet Kongkiatpaiboon

    2016-04-01

    Full Text Available Propolis has been used as an indigenous medicine for curing numerous maladies. One of particular ethnopharmacological use is stingless bee propolis from Tetragonula pagdeni. A simultaneous high-performance liquid chromatography (HPLC) method was developed and validated to determine the contents of the bioactive compounds 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin. HPLC analysis was effectively performed using a Hypersil BDS C18 column, with gradient elution of methanol–0.2% formic acid at a flow rate of 1 ml/min, at 25 °C, and detection at 245 nm. Parameters for the validation included accuracy, precision, linearity, and limits of quantitation and detection. The developed HPLC technique was precise, with a relative standard deviation lower than 2%. The recovery values of 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin in the extracts were 99.98%, 99.97%, 98.98% and 99.19%, respectively. The average contents of these compounds in the propolis extracts collected from different seasons were 0.127%, 1.008%, 0.323% and 2.703% (w/w), respectively. The developed HPLC technique is suitable and practical for the simultaneous analysis of these mangostin derivatives in T. pagdeni propolis and would provide valuable guidance for the standardization of its pharmaceutical products.

  9. Quantitative analysis of abused drugs in physiological fluids by gas chromatography/chemical ionization mass spectrometry

    International Nuclear Information System (INIS)

    Foltz, R.L.

    1978-01-01

    Methods have been developed for quantitative analysis of commonly abused drugs in physiological fluids using gas chromatography/chemical ionization mass spectrometry. The methods are being evaluated in volunteer analytical and toxicological laboratories, and analytical manuals describing the methods are being prepared. The specific drugs and metabolites included in this program are: Δ9-tetrahydrocannabinol, methadone, phencyclidine, methaqualone, morphine, amphetamine, methamphetamine, mescaline, 2,5-dimethoxy-4-methylamphetamine, cocaine, benzoylecgonine, diazepam, and N-desmethyldiazepam. The current analytical methods utilize relatively conventional instrumentation and procedures, and are capable of measuring drug concentrations as low as 1 ng/ml. Various newer techniques, such as sample clean-up by high-performance liquid chromatography, separation by glass capillary chromatography, and ionization by negative-ion chemical ionization, are being investigated with respect to their potential for achieving higher sensitivity and specificity, as well as their ability to facilitate simultaneous analysis of more than one drug and metabolite. (Auth.)

  10. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    Science.gov (United States)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients receiving or not receiving bevacizumab-based chemotherapy, using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that, using all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
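
    For the Cox proportional hazards step mentioned above, a minimal sketch using the lifelines package follows; the column names and feature list are assumptions, and `df` is a user-supplied per-patient table.

```python
# Minimal sketch: Cox proportional hazards model relating adiposity features
# to progression-free survival. All column names are hypothetical.
from lifelines import CoxPHFitter

feature_cols = ["sfa_cm2", "vfa_cm2", "vfa_to_sfa_ratio"]   # example adiposity features

def fit_cox(df):
    cph = CoxPHFitter()
    cph.fit(df[feature_cols + ["pfs_months", "progressed"]],
            duration_col="pfs_months", event_col="progressed")
    cph.print_summary()           # hazard ratios and p-values per feature
    return cph

# Fit separately for patients with and without maintenance bevacizumab:
# fit_cox(df[df["bevacizumab"] == 1])
# fit_cox(df[df["bevacizumab"] == 0])
```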

  11. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  12. Verbal Rehearsal and Memory in Children with Closed Head Injury: A Quantitative and Qualitative Analysis.

    Science.gov (United States)

    Harris, Jessica R.

    1996-01-01

    Nine closed head injured (CHI) children (mean age 11 years) with post-onset intervals of 7 months to 8 years were given an overt free recall task. Quantitative analysis suggested inefficient passive rehearsal strategy by severely injured subjects. Qualitative analysis revealed differences between CHI children and controls in rehearsal strategies,…

  13. On the quantitative X-ray phase analysis of R-Co alloys

    International Nuclear Information System (INIS)

    Lyubushkin, V.A.; Lyubushkina, L.M.; Vetoshkin, I.D.

    1982-01-01

    Using the method of quantitative X-ray phase analysis, two-phase (RCo 5 -R 2 Co 17 ) alloys Sm-Co and Pr-Co have been studied. The investigations are made using a DRON-2.0 diffractometer with filtered FeKα radiation. Calibration diagrams for model binary mixtures are constructed, and their use is recommended for express evaluation of the amount of the phase being determined. The suggested technique has been tested

  14. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  15. Quantitative MRI analysis of the brain after twenty-two years of neuromyelitis optica indicates focal tissue damage

    DEFF Research Database (Denmark)

    Aradi, Mihaly; Koszegi, Edit; Orsi, Gergely

    2013-01-01

    BACKGROUND: The long-term effect of neuromyelitis optica (NMO) on the brain is not well established. METHODS: After 22 years of NMO, a patient's brain was examined by quantitative T1- and T2-weighted mono- and biexponential diffusion and proton spectroscopy. It was compared to 3 cases with short-term NMO... In such abnormal NAWM regions, biexponential diffusion analysis and quantitative spectroscopy indicated extracellular edema and axonal loss, respectively. Repeated analysis 6 months later identified the same alterations. Such patchy alterations were not detectable in the NAWM of the 3 cases with short-term NMO...

  16. Quantitative analysis of stress thallium-201 studies: comparison of SPET and planar imaging in the detection of CAD

    International Nuclear Information System (INIS)

    Ziada, G.; Hayat, N.; Abdel-Dayem, H.M.; Hassan, I.

    1986-01-01

    The value of thallium-201 tomographic sections in the detection of coronary artery disease is illustrated by comparing visual interpretation (VTS) and quantitative analysis (QTS) with visual planar study (VPS) and quantitative analysis of planar study (QPS), referring to coronary angiography (CA) as the standard technique. It is concluded that visual assessment of single photon emission tomography (VTS) is more valuable than all other techniques (VPS, QPS and QTS) for detecting and localizing coronary artery disease. (UK)

  17. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
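
    Two normalization schemes often discussed in this context are constant-sum (total-intensity) normalization and probabilistic quotient normalization (PQN). The sketch below is a generic illustration on a synthetic intensity matrix, not a procedure prescribed by the review.

    ```python
    import numpy as np

    # Synthetic matrix: rows = samples, columns = metabolite feature intensities.
    rng = np.random.default_rng(1)
    X = rng.lognormal(mean=2.0, sigma=0.5, size=(6, 100))
    X[3] *= 2.5  # simulate one sample with a larger total amount

    def total_sum_normalize(x):
        """Scale each sample so its feature intensities sum to 1."""
        return x / x.sum(axis=1, keepdims=True)

    def pqn_normalize(x):
        """Probabilistic quotient normalization against the median reference spectrum."""
        ref = np.median(x, axis=0)              # reference spectrum (per feature)
        quotients = x / ref                     # per-feature quotients for each sample
        factors = np.median(quotients, axis=1)  # most probable dilution factor per sample
        return x / factors[:, None]

    X_sum = total_sum_normalize(X)
    X_pqn = pqn_normalize(X)
    print(X.sum(axis=1).round(1))      # unequal totals before normalization
    print(X_sum.sum(axis=1).round(3))  # equal totals after constant-sum scaling
    ```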

  18. Theme trends and knowledge structure on choroidal neovascularization: a quantitative and co-word analysis.

    Science.gov (United States)

    Zhao, Fangkun; Shi, Bei; Liu, Ruixin; Zhou, Wenkai; Shi, Dong; Zhang, Jinsong

    2018-04-03

    The distribution pattern and knowledge structure of choroidal neovascularization (CNV) were surveyed based on the literature in PubMed. Published scientific papers about CNV were retrieved from Jan 1st, 2012 to May 31st, 2017. Extracted MeSH terms were analyzed quantitatively by using Bibliographic Item Co-Occurrence Matrix Builder (BICOMB) and high-frequency MeSH terms were identified. Hierarchical cluster analysis was conducted by SPSS 19.0 according to the MeSH term-source article matrix. A high-frequency MeSH term co-occurrence matrix was constructed to support strategic diagram and social network analysis (SNA). According to the search strategy, altogether 2366 papers were included, and the number of annual papers changed slightly from Jan 1st, 2012 to May 31st, 2017. Among all the extracted MeSH terms, 44 high-frequency MeSH terms were identified and hotspots were clustered into 6 categories. In the strategic diagram, clinical drug therapy, pathology and diagnosis related research on CNV was well developed. In contrast, the metabolism, etiology, complications, prevention and control of CNV in animal models, and genetics related research on CNV were relatively immature, which offers potential research space for future study. As for the SNA result, the position status of each component was described by the centrality values. The studies on CNV are relatively divergent and the 6 research categories concluded from this study could reflect the publication trends on CNV to some extent. By providing a quantitative bibliometric study across a 5-year span, it could help to depict an overall command of the latest topics and provide some hints for researchers when launching new projects.
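
    The co-word step can be illustrated as follows: build a term-term co-occurrence matrix from per-paper MeSH term lists and cluster the terms hierarchically. The study used BICOMB and SPSS; scipy is substituted here purely for illustration, and the term lists are invented.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical per-paper MeSH term sets (real terms come from PubMed records).
    papers = [
        {"choroidal neovascularization", "ranibizumab", "drug therapy"},
        {"choroidal neovascularization", "bevacizumab", "drug therapy"},
        {"choroidal neovascularization", "disease models, animal", "etiology"},
        {"macular degeneration", "ranibizumab", "drug therapy"},
    ]
    terms = sorted(set().union(*papers))
    index = {t: i for i, t in enumerate(terms)}

    # Co-occurrence matrix: number of papers in which two terms appear together.
    C = np.zeros((len(terms), len(terms)))
    for p in papers:
        for a in p:
            for b in p:
                if a != b:
                    C[index[a], index[b]] += 1

    # Hierarchical clustering of terms on their co-occurrence profiles.
    Z = linkage(C, method="ward")
    labels = fcluster(Z, t=2, criterion="maxclust")
    for term, label in zip(terms, labels):
        print(label, term)
    ```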

  19. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    Science.gov (United States)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
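
    The record does not spell out the conversion formula. The sketch below assumes a linear DN-to-decibel scaling relative to the Muhleman law (the scale and offset constants are placeholders) and uses commonly quoted Muhleman coefficients; both assumptions should be checked against the Magellan data documentation before any real use.

    ```python
    import numpy as np

    def muhleman(theta_rad, k1=0.0118, k2=0.111):
        """Muhleman scattering law; k1 and k2 are commonly quoted values and
        should be verified against the Magellan documentation."""
        c, s = np.cos(theta_rad), np.sin(theta_rad)
        return k1 * c / (s + k2 * c) ** 3

    def dn_to_sigma0(dn, incidence_deg, scale=0.2, offset=-20.0):
        """Convert a Magellan image DN to a backscatter coefficient.

        Assumption: DN encodes backscatter in dB *relative to the Muhleman law*
        via sigma_rel_dB = scale * DN + offset; scale and offset are placeholders.
        """
        sigma_rel = 10.0 ** ((scale * dn + offset) / 10.0)  # dB -> linear ratio
        theta = np.radians(incidence_deg)
        return sigma_rel * muhleman(theta)                  # absolute sigma0

    print(dn_to_sigma0(dn=110, incidence_deg=35.0))
    ```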

  20. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  1. Quantitative Analysis of Microbes in Water Tank of G.A. Siwabessy Reactor

    International Nuclear Information System (INIS)

    Itjeu Karliana; Diah Dwiana Lestiani

    2003-01-01

    The quality of water in the reactor system plays an important role because it can indirectly affect both its function as a coolant and the operation of the reactor. A study of microbes was carried out to detect their presence in the reactor water tank, and a quantitative analysis of the microbes was also performed as a continuation of the previous study. The samples were taken from the end side of the GA Siwabessy reactor tank, inoculated on TSA (Trypticase Soy Agar) medium, and incubated at 30-35 °C for 4 days. The results of the experiment reconfirm the presence of bacteria and the absence of yeast. The quantitative analysis by the TPC (total plate count) method shows that the bacterial population doubles within 24 hours. (author)

  2. Quantitative genotoxicity assays for analysis of medicinal plants: A systematic review.

    Science.gov (United States)

    Sponchiado, Graziela; Adam, Mônica Lucia; Silva, Caroline Dadalt; Soley, Bruna Silva; de Mello-Sampayo, Cristina; Cabrini, Daniela Almeida; Correr, Cassyano Januário; Otuki, Michel Fleith

    2016-02-03

    Medicinal plants are known to contain numerous biologically active compounds, and although they have proven pharmacological properties, they can cause harm, including DNA damage. Review the literature to evaluate the genotoxicity risk of medicinal plants, explore the genotoxicity assays most used and compare these to the current legal requirements. A quantitative systematic review of the literature, using the keywords "medicinal plants", "genotoxicity" and "mutagenicity", was undertaken to identify the types of assays most used to assess genotoxicity, and to evaluate the genotoxicity potential of medicinal plant extracts. The database searches retrieved 2289 records, 458 of which met the inclusion criteria. Evaluation of the selected articles showed a total of 24 different assays used for an assessment of medicinal plant extract genotoxicity. More than a quarter of those studies (28.4%) reported positive results for genotoxicity. This review demonstrates that a range of genotoxicity assay methods are used to evaluate the genotoxicity potential of medicinal plant extracts. The most used methods are those recommended by regulatory agencies. However, based on the current findings, in order to conduct a thorough study concerning the possible genotoxic effects of a medicinal plant, we indicate that it is important always to include bacterial and mammalian tests, with at least one in vivo assay. Also, these tests should be capable of detecting outcomes that include mutation induction, clastogenic and aneugenic effects, and structural chromosome abnormalities. In addition, the considerable rate of positive results detected in this analysis further supports the relevance of assessing the genotoxicity potential of medicinal plants. Copyright © 2016. Published by Elsevier Ireland Ltd.

  3. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    Science.gov (United States)

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  4. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection resulting in each peptide having a different detection probability. Lu et al. (2007 described a modified spectral counting technique, Absolute Protein Expression (APEX, which improves on basic spectral counting methods by including a correction factor for each protein (called Oi value that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi. This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
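
    As summarized from Lu et al. (2007) in the abstract, an APEX abundance is essentially the observed spectral count corrected by the protein's expected count per molecule (Oi), normalized across proteins and scaled to a total protein amount. A minimal sketch of that calculation (not the Java tool itself; the total-molecules constant and all counts are placeholders):

    ```python
    # Minimal sketch of the APEX abundance calculation described in the abstract.
    # spectral_counts: observed MS/MS spectral counts per protein
    # o_values: expected detected tryptic peptides per protein molecule (Oi)
    # total_molecules: assumed total protein molecules per cell (C), a placeholder

    def apex_abundances(spectral_counts, o_values, total_molecules=2e6):
        corrected = {p: spectral_counts[p] / o_values[p] for p in spectral_counts}
        norm = sum(corrected.values())
        return {p: total_molecules * v / norm for p, v in corrected.items()}

    counts = {"proteinA": 120, "proteinB": 30, "proteinC": 30}
    oi = {"proteinA": 12.0, "proteinB": 3.0, "proteinC": 10.0}
    print(apex_abundances(counts, oi))
    ```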

  5. Simultaneous quantitative analysis of metabolites using ion-pair liquid chromatography-electrospray ionization mass spectrometry

    NARCIS (Netherlands)

    Coulier, L.; Bas, R.; Jespersen, S.; Verheij, E.; Werf, M.J. van der; Hankemeier, T.

    2006-01-01

    We have developed an analytical method, consisting of ion-pair liquid chromatography coupled to electrospray ionization mass spectrometry (IP-LC-ESI-MS), for the simultaneous quantitative analysis of several key classes of polar metabolites, like nucleotides, coenzyme A esters, sugar nucleotides,

  6. Addressing the Challenges Related to Transforming Qualitative Into Quantitative Data in Qualitative Comparative Analysis

    NARCIS (Netherlands)

    Block, de Debora; Vis, Barbara

    2018-01-01

    The use of qualitative data has so far received relatively little attention in methodological discussions on qualitative comparative analysis (QCA). This article addresses this lacuna by discussing the challenges researchers face when transforming qualitative data into quantitative data in QCA. By

  7. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    Science.gov (United States)

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system was demonstrated for qualitative and quantitative analysis of genetically modified organisms: the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis were determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, for three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulated samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of the standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dyes. The results showed a good coherence between the two methods. This approach demonstrated the possibility for accurate qualitative and quantitative detection of GM plants in a single run.

  8. Relating interesting quantitative time series patterns with text events and text features

    Science.gov (United States)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other

  9. Quantitative analysis of occluded gases in uranium dioxide pellets by the mass spectrometry technique

    International Nuclear Information System (INIS)

    Vega Bustillos, J.O.W.; Rodrigues, C.; Iyer, S.S.

    1981-05-01

    A quantitative analysis of different components of occluded gases except water in uranium dioxide pellets is attempted here. A high temperature vacuum extraction system is employed for the liberation and the determination of total volume of the occluded gases. A mass spectrometric technique is employed for the qualitative and quantitative analysis of these gases. The UO 2 pellets are placed in a graphite crucible and are subjected to varying temperatures (1000 °C - 1700 °C). The liberated gases are dehydrated and transferred to a measuring unit consisting essentially of a Toepler pump and a McLeod gauge. In this system the total volume of the gases liberated at N.T.P. is determined with a sensitivity of 0.002 cm 3 /g of UO 2 . An aliquot of the liberated gas is introduced into a quadrupole mass spectrometer (VGA-100 Varian Corp.) for the determination of the different components of the gas. On the basis of the analysis suggestions are made for the possible sources of these gas components. (Author) [pt

  10. Quantitative analysis of semivolatile organic compounds in selected fractions of air sample extracts by GC/MI-IR spectrometry

    International Nuclear Information System (INIS)

    Childers, J.W.; Wilson, N.K.; Barbour, R.K.

    1990-01-01

    The authors are currently investigating the capabilities of gas chromatography/matrix isolation infrared (GC/MI-IR) spectrometry for the determination of semivolatile organic compounds (SVOCs) in environmental air sample extracts. Their efforts are focused on the determination of SVOCs such as alkylbenzene positional isomers, which are difficult to separate chromatographically and to distinguish by conventional electron-impact ionization GC/mass spectrometry. They have performed a series of systematic experiments to identify sources of error in quantitative GC/MI-IR analyses. These experiments were designed to distinguish between errors due to instrument design or performance and errors that arise from some characteristic inherent to the GC/MI-IR technique, such as matrix effects. They have investigated repeatability as a function of several aspects of GC/MI-IR spectrometry, including sample injection, spectral acquisition, cryogenic disk movement, and matrix deposition. The precision, linearity, dynamic range, and detection limits of a commercial GC/MI-IR system for target SVOCs were determined and compared to those obtained with the system's flame ionization detector. The use of deuterated internal standards in the quantitative GC/MI-IR analysis of selected fractions of ambient air sample extracts will be demonstrated. They will also discuss the current limitations of the technique in quantitative analyses and suggest improvements for future consideration

  11. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author deduces an equation relating the coupler frequency deviation Δf to the coupling coefficient β, instead of only giving the adjustment direction in the coupler-matching process. According to this equation, automatic measurement and quantitative display are realized on a measuring system. It contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  12. Quantitative strain analysis of surfaces and interfaces using extremely asymmetric x-ray diffraction

    International Nuclear Information System (INIS)

    Akimoto, Koichi; Emoto, Takashi

    2010-01-01

    Strain can reduce carrier mobility and the reliability of electronic devices and affect the growth mode of thin films and the stability of nanometer-scale crystals. To control lattice strain, a technique for measuring the minute lattice strain at surfaces and interfaces is needed. Recently, an extremely asymmetric x-ray diffraction method has been developed for this purpose. By employing Darwin's dynamical x-ray diffraction theory, quantitative evaluation of strain at surfaces and interfaces becomes possible. In this paper, we review our quantitative strain analysis studies on native SiO 2 /Si interfaces, reconstructed Si surfaces, Ni/Si(111)-H interfaces, sputtered III-V compound semiconductor surfaces, high-k/Si interfaces, and Au ion-implanted Si. (topical review)

  13. Quantitative quenching evaluation and direct intracellular metabolite analysis in Penicillium chrysogenum.

    Science.gov (United States)

    Meinert, Sabine; Rapp, Sina; Schmitz, Katja; Noack, Stephan; Kornfeld, Georg; Hardiman, Timo

    2013-07-01

    Sustained progress in metabolic engineering methodologies has stimulated new efforts toward optimizing fungal production strains such as through metabolite analysis of Penicillium chrysogenum industrial-scale processes. Accurate intracellular metabolite quantification requires sampling procedures that rapidly stop metabolism (quenching) and avoid metabolite loss via the cell membrane (leakage). When sampling protocols are validated, the quenching efficiency is generally not quantitatively assessed. For fungal metabolomics, quantitative biomass separation using centrifugation is a further challenge. In this study, P. chrysogenum intracellular metabolites were quantified directly from biomass extracts using automated sampling and fast filtration. A master/slave bioreactor concept was applied to provide industrial production conditions. Metabolic activity during sampling was monitored by 13C tracing. Enzyme activities were efficiently stopped and metabolite leakage was absent. This work provides a reliable method for P. chrysogenum metabolomics and will be an essential base for metabolic engineering of industrial processes. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Groping for Quantitative Digital 3-D Image Analysis: An Approach to Quantitative Fluorescence In Situ Hybridization in Thick Tissue Sections of Prostate Carcinoma

    Directory of Open Access Journals (Sweden)

    Karsten Rodenacker

    1997-01-01

    Full Text Available In molecular pathology, numerical chromosome aberrations have been found to be decisive for the prognosis of malignancy in tumours. The existence of such aberrations can be detected by interphase fluorescence in situ hybridization (FISH). The gain or loss of certain base sequences in the desoxyribonucleic acid (DNA) can be estimated by counting the number of FISH signals per cell nucleus. The quantitative evaluation of such events is a necessary condition for a prospective use in diagnostic pathology. To avoid occlusions of signals, the cell nucleus has to be analyzed in three dimensions. Confocal laser scanning microscopy is the means to obtain series of optical thin sections from fluorescence stained or marked material to fulfill the conditions mentioned above. A graphical user interface (GUI) to a software package for display, inspection, count and (semi-automatic) analysis of 3-D images for pathologists is outlined, including the underlying methods of 3-D image interaction and segmentation developed. The preparative methods are briefly described. Main emphasis is given to the methodical questions of computer-aided analysis of large 3-D image data sets for pathologists. Several automated analysis steps can be performed for segmentation and succeeding quantification. However, tumour material is, in contrast to isolated or cultured cells, a difficult material even for visual inspection. For the present, a fully automated digital image analysis of 3-D data is not in sight. A semi-automatic segmentation method is thus presented here.

  15. Implantation of the method of quantitative analysis by proton induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects for the implementation of the method of quantitative analysis by proton induced X-ray spectroscopy are discussed. The calibration of the system was made by determining a response coefficient for selected elements, both by irradiating known amounts of these elements and by using theoretical and experimental parameters. The results obtained by these two methods agree within 5% for the analysed elements. A computer based technique of spectrum decomposition was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (Author) [pt

  16. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    Science.gov (United States)

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  17. New tools for comparing microscopy images : Quantitative analysis of cell types in Bacillus subtilis

    NARCIS (Netherlands)

    van Gestel, Jordi; Vlamakis, Hera; Kolter, Roberto

    2015-01-01

    Fluorescence microscopy is a method commonly used to examine individual differences between bacterial cells, yet many studies still lack a quantitative analysis of fluorescence microscopy data. Here we introduce some simple tools that microbiologists can use to analyze and compare their microscopy

  18. Qualitative and Quantitative Analysis of Andrographis paniculata by Rapid Resolution Liquid Chromatography/Time-of-Flight Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Jian-Fei Qin

    2013-09-01

    Full Text Available A rapid resolution liquid chromatography/time-of-flight tandem mass spectrometry (RRLC-TOF/MS) method was developed for qualitative and quantitative analysis of the major chemical constituents in Andrographis paniculata. Fifteen compounds, including flavonoids and diterpenoid lactones, were unambiguously or tentatively identified in 10 min by comparing their retention times and accurate masses with standards or literature data. The characteristic fragmentation patterns of flavonoids and diterpenoid lactones were summarized, and the structures of the unknown compounds were predicted. Andrographolide, dehydroandrographolide and neoandrographolide were further quantified as marker substances. It was found that the calibration curves for all analytes showed good linearity (R2 > 0.9995) within the test ranges. The overall limits of detection (LODs) and limits of quantification (LOQs) were 0.02 μg/mL to 0.06 μg/mL and 0.06 μg/mL to 0.2 μg/mL, respectively. The relative standard deviations (RSDs) for intra- and inter-day precisions were below 3.3% and 4.2%, respectively. The mean recovery rates ranged from 96.7% to 104.5% with relative standard deviations (RSDs) less than 2.72%. It is concluded that RRLC-TOF/MS is powerful and practical in qualitative and quantitative analysis of complex plant samples due to time savings, sensitivity, precision, accuracy and lower solvent consumption.

  19. Quantitative and comparative visualization applied to cosmological simulations

    International Nuclear Information System (INIS)

    Ahrens, James; Heitmann, Katrin; Habib, Salman; Ankeny, Lee; McCormick, Patrick; Inman, Jeff; Armstrong, Ryan; Ma, Kwan-Liu

    2006-01-01

    Cosmological simulations follow the formation of nonlinear structure in dark and luminous matter. The associated simulation volumes and dynamic range are very large, making visualization both a necessary and challenging aspect of the analysis of these datasets. Our goal is to understand sources of inconsistency between different simulation codes that are started from the same initial conditions. Quantitative visualization supports the definition and reasoning about analytically defined features of interest. Comparative visualization supports the ability to visually study, side by side, multiple related visualizations of these simulations. For instance, a scientist can visually distinguish that there are fewer halos (localized lumps of tracer particles) in low-density regions for one simulation code out of a collection. This qualitative result will enable the scientist to develop a hypothesis, such as loss of halos in low-density regions due to limited resolution, to explain the inconsistency between the different simulations. Quantitative support then allows one to confirm or reject the hypothesis. If the hypothesis is rejected, this step may lead to new insights and a new hypothesis, not available from the purely qualitative analysis. We will present methods to significantly improve the scientific analysis process by incorporating quantitative analysis as the driver for visualization. Aspects of this work are included as part of two visualization tools, ParaView, an open-source large data visualization tool, and Scout, an analysis-language based, hardware-accelerated visualization tool

  20. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.

  1. Quantitative analysis of Al-Si alloy using calibration free laser induced breakdown spectroscopy (CF-LIBS)

    Science.gov (United States)

    Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali

    2017-06-01

    The quantitative analysis of the standard aluminum-silicon alloy has been performed using calibration free laser induced breakdown spectroscopy (CF-LIBS). The plasma was produced using the fundamental harmonic (1064 nm) of the Nd:YAG laser and the emission spectra were recorded at 3.5 μs detector gate delay. The qualitative analysis of the emission spectra confirms the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background subtracted and self-absorption corrected emission spectra were used for the estimation of the plasma temperature as 10 100 ± 300 K. The plasma temperature and self-absorption corrected emission lines of each element have been used for the determination of the concentration of each species present in the alloy. The use of corrected emission intensities and accurate evaluation of the plasma temperature yield reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentration.
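
    The CF-LIBS workflow sketched in the record, a Boltzmann-plot temperature from line intensities followed by relative concentrations from the per-species intercepts, can be illustrated as follows. The line energies, intensities, gA products, partition functions, and intercepts below are invented placeholders, not values from the study.

    ```python
    import numpy as np

    K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

    def boltzmann_temperature(upper_energy_ev, intensity, g_a):
        """Slope of ln(I / gA) versus upper-level energy gives -1/(kB*T)."""
        y = np.log(intensity / g_a)
        slope, intercept = np.polyfit(upper_energy_ev, y, 1)
        return -1.0 / (K_B_EV * slope), intercept

    # Hypothetical line data for one species (energies in eV, arbitrary intensities).
    e_k = np.array([3.14, 4.02, 4.83, 5.25])
    intens = np.array([1200.0, 295.0, 69.0, 28.0])
    g_a = np.array([2.2e8, 1.5e8, 9.0e7, 6.0e7])

    temperature, intercept = boltzmann_temperature(e_k, intens, g_a)
    print(f"plasma temperature ~ {temperature:.0f} K")

    def relative_concentrations(intercepts, partition_functions):
        """CF-LIBS closure: C_s proportional to U_s(T) * exp(intercept_s), normalized to 1."""
        raw = {s: partition_functions[s] * np.exp(q) for s, q in intercepts.items()}
        total = sum(raw.values())
        return {s: v / total for s, v in raw.items()}

    # Second species intercept and both partition functions are purely illustrative.
    print(relative_concentrations({"Al I": intercept, "Si I": intercept - 1.2},
                                  {"Al I": 5.8, "Si I": 9.5}))
    ```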

  2. US detection and classification of hepatic disease: Comparison of quantitative algorithms with clinical readings

    International Nuclear Information System (INIS)

    Insana, M.F.; Garra, B.S.; Shawker, T.H.; Wagner, R.F.; Bradford, M.; Russell, M.A.

    1986-01-01

    A method of quantitative digital analysis of US B-scans is used to differentiate between normal and diseased liver in vivo. The tissue signature is based on five measured parameters: four describe the tissue structure and scattering properties, the fifth is the US attenuation. The patient groups studied included 31 healthy subjects, 97 patients with chronic active hepatitis, 62 with Gaucher disease, and 10 with lymphomas. Receiver operating characteristic curve analysis was used to compare the diagnostic performance of the quantitative method with the clinical reading of trained observers. The quantitative method showed greater diagnostic capability for detecting and classifying diffuse and some focal disease

  3. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Full Text Available Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product’s packaging contained no information about these concentrations.

  4. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  5. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk.

    Science.gov (United States)

    Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan

    2013-01-04

    The detection and quantification of different pesticides in the goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; whereas, methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in the goat milk using their known physicochemical characteristics including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w) in relation to the characteristics such as pH, % fat, specific gravity and refractive index of goat milk. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index for the studied pesticides were higher in goat milk. This suggests that these are better determinants for pesticide residue prediction in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, risk analysis was also conducted, which showed that daily intake levels of pesticide residues including cyhalothrin, chlorpyrifos and cypermethrin in the present study are 2.68, 5.19 and 2.71 times higher, respectively, in the goat milk. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.

  6. Quantitative Structure Activity Relationship and Risk Analysis of Some Pesticides in the Goat milk

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2013-01-01

    Full Text Available The detection and quantification of different pesticides in the goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; whereas, methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in the goat milk using their known physicochemical characteristics including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w) in relation to the characteristics such as pH, % fat, specific gravity and refractive index of goat milk. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index for the studied pesticides were higher in goat milk. This suggests that these are better determinants for pesticide residue prediction in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, risk analysis was also conducted, which showed that daily intake levels of pesticide residues including cyhalothrin, chlorpyrifos and cypermethrin in the present study are 2.68, 5.19 and 2.71 times higher, respectively, in the goat milk. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.
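
    The risk-analysis step in the two records above compares an estimated daily intake from milk with a provisional tolerable daily intake. A minimal sketch of that comparison follows; the milk consumption, body weight, and PTDI figures are placeholders, not the values used in the study.

    ```python
    def intake_ratio(residue_ppm, milk_intake_kg=0.5, body_weight_kg=60.0,
                     ptdi_mg_per_kg=0.01):
        """Ratio of estimated daily intake to a provisional tolerable daily intake.

        residue_ppm ~ mg of pesticide per kg of milk; all other figures are
        illustrative placeholders, not the study's values.
        """
        edi = residue_ppm * milk_intake_kg / body_weight_kg  # mg/kg body weight/day
        return edi / ptdi_mg_per_kg

    # Mean residue levels (ppm) reported in the abstract.
    residues = {"cyhalothrin": 0.34, "endosulfan": 0.063,
                "chlorpyrifos": 0.034, "cypermethrin": 0.092}
    for pesticide, level in residues.items():
        print(pesticide, round(intake_ratio(level), 2))
    ```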

  7. Quantitative structural analysis of lignin by diffuse reflectance fourier transform infrared spectrometry

    International Nuclear Information System (INIS)

    Schultz, T.P.; Glasser, W.G.

    1986-01-01

    Empirical quantitative relationships were established between infrared (IR) spectral information and several structural features in lignins as determined by conventional methods. The structural composition of average phenylpropane (C 9 ) units which significantly correlated (0.01 level) with IR peak intensities included methoxy content, aromatic hydrogen content, phenolic hydroxy content, guaiacyl/syringyl ratio, and ''hydrolysis'' and ''condensation'' ratios

  8. Scanning fluorescent microscopy is an alternative for quantitative fluorescent cell analysis.

    Science.gov (United States)

    Varga, Viktor Sebestyén; Bocsi, József; Sipos, Ferenc; Csendes, Gábor; Tulassay, Zsolt; Molnár, Béla

    2004-07-01

    Fluorescent measurements on cells are performed today with FCM and laser scanning cytometry. The scientific community dealing with quantitative cell analysis would benefit from the development of a new digital multichannel and virtual microscopy based scanning fluorescent microscopy technology and from its evaluation on routine standardized fluorescent beads and clinical specimens. We applied a commercial motorized fluorescent microscope system. The scanning was done at 20 x (0.5 NA) magnification, on three channels (Rhodamine, FITC, Hoechst). The SFM (scanning fluorescent microscopy) software included the following features: scanning area, exposure time, and channel definition, autofocused scanning, densitometric and morphometric cellular feature determination, gating on scatterplots and frequency histograms, and preparation of galleries of the gated cells. For the calibration and standardization Immuno-Brite beads were used. With application of shading compensation, the CV of fluorescence of the beads decreased from 24.3% to 3.9%. Standard JPEG image compression until 1:150 resulted in no significant change. The change of focus influenced the CV significantly only after +/-5 microm error. SFM is a valuable method for the evaluation of fluorescently labeled cells. Copyright 2004 Wiley-Liss, Inc.

  9. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    Science.gov (United States)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images. It was done using tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains information on both the spatial content of the tests and their spectral content. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retina neural activity charts taking into account cone sensitivity functions, and processed the charts in order to find the visibility of latent symbols in color deficiency plates using a cross-correlation technique. In this way a quantitative measure is found for each diagnostic plate for three different color deficiency carrier types - protanopes, deuteranopes and tritanopes. Multispectral color analysis allows determination of the CIE xyz color coordinates of pseudoisochromatic plate design elements and statistical analysis of these data to compare the color quality of available color deficiency test books.

  11. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the time of cooling and of the irradiation time. The variation of the ratio ( 144 Ce + 144 Pr activity)/ 137 Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141 Ce, 144 Ce + 144 Pr, 103 Ru, 106 Ru + 106 Rh, 137 Cs, 95 Zr + 95 Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of the least squares was used for the quantitative analysis of the afore-mentioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr

  13. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography

    International Nuclear Information System (INIS)

    Turmezei, Tom D.; Treece, Graham M.; Gee, Andrew H.; Fotiadou, Anastasia F.; Poole, Kenneth E.S.

    2016-01-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K and L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K and L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. (orig.)

  14. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties with applications ranging from fundamental physics to applied engineering research. Neutron radiography or tomography on the other hand, are useful tools in the non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to any quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This together with the unprecedented intensities available at spallation sources and improvements in computing power allow for a re-assessment of the transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fraction. Spatial and time resolution (for dynamic experiment) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition

  15. Visualisation and quantitative analysis of the rodent malaria liver stage by real time imaging.

    NARCIS (Netherlands)

    Ploemen, I.H.J.; Prudencio, M.; Douradinha, B.G.; Ramesar, J.; Fonager, J.; Gemert, G.J.A. van; Luty, A.J.F.; Hermsen, C.C.; Sauerwein, R.W.; Baptista, F.G.; Mota, M.M.; Waters, A.P.; Que, I.; Lowik, C.W.G.M.; Khan, S.M.; Janse, C.J.; Franke-Fayard, B.

    2009-01-01

    The quantitative analysis of Plasmodium development in the liver in laboratory animals and in cultured cells is hampered by low parasite infection rates and the complicated methods required to monitor intracellular development. As a consequence, this important phase of the parasite's life cycle has been

  16. Exploring Valid Reference Genes for Quantitative Real-time PCR Analysis in Plutella xylostella (Lepidoptera: Plutellidae)

    Science.gov (United States)

    Fu, Wei; Xie, Wen; Zhang, Zhuo; Wang, Shaoli; Wu, Qingjun; Liu, Yong; Zhou, Xiaomao; Zhou, Xuguo; Zhang, Youjun

    2013-01-01

    Abstract: Quantitative real-time PCR (qRT-PCR), a primary tool in gene expression analysis, requires an appropriate normalization strategy to control for variation among samples. The best option is to compare the mRNA level of a target gene with that of reference gene(s) whose expression level is stable across various experimental conditions. In this study, expression profiles of eight candidate reference genes from the diamondback moth, Plutella xylostella, were evaluated under diverse experimental conditions. RefFinder, a web-based analysis tool, integrates four major computational programs including geNorm, Normfinder, BestKeeper, and the comparative ΔCt method to comprehensively rank the tested candidate genes. Elongation factor 1 (EF1) was the most suited reference gene for the biotic factors (development stage, tissue, and strain). In contrast, although appropriate reference gene(s) do exist for several abiotic factors (temperature, photoperiod, insecticide, and mechanical injury), we were not able to identify a single universal reference gene. Nevertheless, a suite of candidate reference genes were specifically recommended for selected experimental conditions. Our finding is the first step toward establishing a standardized qRT-PCR analysis of this agriculturally important insect pest. PMID:23983612
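
    One of the four approaches integrated by RefFinder, the comparative ΔCt method, ranks candidate reference genes by the mean standard deviation of their pairwise ΔCt series across samples (lower means more stable). A minimal sketch follows; apart from EF1, which the record names, the gene names and all Ct values are invented.

    ```python
    import itertools
    import numpy as np

    # Hypothetical Ct values: one array of sample measurements per candidate gene.
    ct = {
        "EF1":   np.array([18.1, 18.3, 18.0, 18.2, 18.4]),
        "ACTB":  np.array([20.5, 21.9, 20.1, 22.3, 20.8]),
        "GAPDH": np.array([19.2, 19.6, 19.1, 19.9, 19.4]),
    }

    def delta_ct_stability(ct_values):
        """Mean SD of pairwise delta-Ct series; lower means more stable expression."""
        genes = list(ct_values)
        scores = {g: [] for g in genes}
        for a, b in itertools.combinations(genes, 2):
            sd = np.std(ct_values[a] - ct_values[b], ddof=1)
            scores[a].append(sd)
            scores[b].append(sd)
        return {g: float(np.mean(s)) for g, s in scores.items()}

    for gene, score in sorted(delta_ct_stability(ct).items(), key=lambda kv: kv[1]):
        print(f"{gene}: mean pairwise SD = {score:.2f}")
    ```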

  17. The incidence of Grey Literature in online databases : a quantitative analysis

    OpenAIRE

    Luzi, Daniela (CNR-ISRDS); GreyNet, Grey Literature Network Service

    1994-01-01

    This study aims to verify the diffusion and distribution of Grey Literature (GL) documents in commercially available online databases. It has been undertaken due to the growing importance of GL in the field of information and documentation, on the one hand, and the increasing supply of online databases, on the other hand. The work is divided into two parts. The first provides the results of a previous quantitative analysis of databases containing GL documents. Using a top-down methodology, i....

  18. Effects of formic acid hydrolysis on the quantitative analysis of radiation-induced DNA base damage products assayed by gas chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Swarts, S.G.; Smith, G.S.; Miao, L.; Wheeler, K.T.

    1996-01-01

    Gas chromatography/mass spectrometry (GC/MS-SIM) is an excellent technique for performing both qualitative and quantitative analysis of DNA base damage products that are formed by exposure to ionizing radiation or by the interaction of intracellular DNA with activated oxygen species. This technique commonly uses a hot formic acid hydrolysis step to degrade the DNA to individual free bases. However, due to the harsh nature of this degradation procedure, the quantitation of DNA base damage products may be adversely affected. Consequently, we examined the effects of various formic acid hydrolysis procedures on the quantitation of a number of DNA base damage products and identified several factors that can influence this quantitation. These factors included (1) the inherent acid stabilities of both the lesions and the internal standards; (2) the hydrolysis temperature; (3) the source and grade of the formic acid; and (4) the sample mass during hydrolysis. Our data also suggested that the N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) derivatization efficiency can be adversely affected, presumably by trace contaminants either in the formic acid or from the acid-activated surface of the glass derivatization vials. Where adverse effects were noted, modifications were explored in an attempt to improve the quantitation of these DNA lesions. Although experimental steps could be taken to minimize the influence of these factors on the quantitation of some base damage products, no single procedure solved the quantitation problem for all base lesions. However, a significant improvement in the quantitation was achieved if the relative molecular response factor (RMRF) values for these lesions were generated with authentic DNA base damage products that had been treated exactly like the experimental samples. (orig.)
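
    For readers unfamiliar with response-factor quantitation, the relationship generally used in internal-standard GC/MS work can be sketched as follows (a standard form, not a formula quoted from this paper), where A denotes the integrated SIM peak area and n the molar amount of the damaged base (lesion) or internal standard (IS):

      \mathrm{RMRF} \;=\; \frac{A_{\text{lesion}} / n_{\text{lesion}}}{A_{\text{IS}} / n_{\text{IS}}},
      \qquad
      n_{\text{lesion}}^{\text{sample}} \;=\; \frac{A_{\text{lesion}}}{A_{\text{IS}}} \cdot \frac{n_{\text{IS}}}{\mathrm{RMRF}}

    On this reading, deriving the RMRF from authentic damage products carried through the same hydrolysis and derivatization as the samples lets acid-stability and derivatization losses cancel, which matches the improvement reported above.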

  19. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    Directory of Open Access Journals (Sweden)

    Jamshed Haneef

    2013-10-01

    Liquid chromatography tandem mass spectrometry (LC–MS/MS) is an important hyphenated technique for quantitative analysis of drugs in biological fluids. Because of its high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies and metabolite identification in plasma and urine. This manuscript gives a comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisition modes (SIM, MRM) for quantitative analysis of glucocorticoids and stimulants. This review is not meant to be exhaustive but rather to provide a general overview for detection and confirmation of target drugs using LC–MS/MS, and is thus useful in doping analysis, toxicological studies as well as in pharmaceutical analysis. Keywords: LC–MS/MS, Ionization techniques, Glucocorticoids, Stimulants, Hyphenated techniques, Biological fluid

  20. Quantitative CT analysis of pulmonary ground-glass opacity nodules for distinguishing invasive adenocarcinoma from non-invasive or minimally invasive adenocarcinoma: the added value of using iodine mapping.

    Science.gov (United States)

    Son, Ji Ye; Lee, Ho Yun; Kim, Jae-Hun; Han, Joungho; Jeong, Ji Yun; Lee, Kyung Soo; Kwon, O Jung; Shim, Young Mog

    2016-01-01

    To determine whether quantitative analysis of iodine-enhanced images generated from dual-energy CT (DECT) has added value in distinguishing invasive adenocarcinoma from non-invasive or minimally invasive adenocarcinoma (MIA) presenting as ground-glass nodules (GGN). Thirty-four patients with 39 GGNs were enrolled in this prospective study and underwent DECT followed by complete tumour resection. Various quantitative imaging parameters were assessed, including virtual non-contrast (VNC) imaging and iodine-enhanced imaging. Of all 39 GGNs, four were adenocarcinoma in situ (AIS) (10 %), nine were MIA (23 %), and 26 were invasive adenocarcinoma (67 %). When assessing only VNC imaging, multivariate analysis revealed that mass, uniformity, and size-zone variability were independent predictors of invasive adenocarcinoma (odds ratio [OR] = 19.92, P = 0.02; OR = 0.70, P = 0.01; OR = 16.16, P = 0.04, respectively). After assessing iodine-enhanced imaging with VNC imaging, both mass on the VNC imaging and uniformity on the iodine-enhanced imaging were independent predictors of invasive adenocarcinoma (OR = 5.51, P = 0.04 and OR = 0.67, P < 0.05, respectively), and diagnostic performance improved relative to VNC imaging alone, from 0.888 to 0.959 (P = 0.029). Quantitative analysis using iodine-enhanced imaging metrics in addition to VNC imaging metrics generated from DECT has added value in distinguishing invasive adenocarcinoma from AIS or MIA. Quantitative analysis using DECT was used to distinguish invasive adenocarcinoma. Tumour mass and uniformity were independent predictors of invasive adenocarcinoma. Diagnostic performance was improved after adding iodine parameters to VNC parameters.

  1. Quantitative analysis of red wine tannins using Fourier-transform mid-infrared spectrometry.

    Science.gov (United States)

    Fernandez, Katherina; Agosin, Eduardo

    2007-09-05

    Tannin content and composition are critical quality components of red wines. No spectroscopic method assessing these phenols in wine has been described so far. We report here a new method using Fourier transform mid-infrared (FT-MIR) spectroscopy and chemometric techniques for the quantitative analysis of red wine tannins. Calibration models were developed using protein precipitation and phloroglucinolysis as analytical reference methods. After spectra preprocessing, six different predictive partial least-squares (PLS) models were evaluated, including the use of interval selection procedures such as iPLS and CSMWPLS. PLS regression with full-range (650-4000 cm⁻¹), second derivative of the spectra and phloroglucinolysis as the reference method gave the most accurate determination for tannin concentration (RMSEC = 2.6%, RMSEP = 9.4%, r = 0.995). The prediction of the mean degree of polymerization (mDP) of the tannins also gave a reasonable prediction (RMSEC = 6.7%, RMSEP = 10.3%, r = 0.958). These results represent the first step in the development of a spectroscopic methodology for the quantification of several phenolic compounds that are critical for wine quality.
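
    A minimal sketch of the chemometric step described above, assuming synthetic data in place of the authors' FT-MIR spectra and phloroglucinolysis reference values: second-derivative preprocessing followed by full-range PLS regression, with RMSEC and RMSEP reported for calibration and prediction sets. The library choices (scipy, scikit-learn) and all numbers are illustrative assumptions, not the published pipeline.

      import numpy as np
      from scipy.signal import savgol_filter
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 800))        # stand-in for absorbance spectra over the 650-4000 cm-1 range
      y = rng.uniform(0.5, 3.0, size=60)    # stand-in for tannin concentration reference values (g/L)

      # Second-derivative preprocessing of each spectrum
      X_d2 = savgol_filter(X, window_length=15, polyorder=2, deriv=2, axis=1)
      X_cal, X_val, y_cal, y_val = train_test_split(X_d2, y, test_size=0.3, random_state=0)

      pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
      rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
      rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
      print(f"RMSEC = {rmsec:.3f}   RMSEP = {rmsep:.3f}")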

  2. Quantitative, multiplexed workflow for deep analysis of human blood plasma and biomarker discovery by mass spectrometry.

    Science.gov (United States)

    Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A

    2017-08-01

    Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a low process replicate coefficient of variation (CV), and 4,500 or more proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tag (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.

  3. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can readily secure traceability to the SI unit system, and discussions about its accuracy and uncertainty have also begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method readily avoids contamination of the sample and allows the sample to be recovered, there are many reported cases related to the quantitative analysis of biologically relevant samples and highly scarce natural products for which NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to adopt this analytical method as an official method in various countries around the world. In Japan, the method is listed in the Pharmacopoeia and the Japanese Standard of Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)
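
    The internal-reference relationship underlying most of the quantitative NMR work surveyed can be written as below (a standard textbook form, not a formula quoted from this review), where I is the integrated signal area, N the number of nuclei contributing to that signal, M the molar mass, m the weighed mass, and P the purity of the analyte (x) and the internal reference (ref):

      P_{x} \;=\; \frac{I_{x}}{I_{\text{ref}}} \cdot \frac{N_{\text{ref}}}{N_{x}} \cdot
      \frac{M_{x}}{M_{\text{ref}}} \cdot \frac{m_{\text{ref}}}{m_{x}} \cdot P_{\text{ref}}

    Traceability to the SI follows because the result depends only on weighed masses, known molar masses and a reference material of certified purity.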

  4. Quantitative EDXS: Influence of geometry on a four detector system

    International Nuclear Information System (INIS)

    Kraxner, Johanna; Schäfer, Margit; Röschel, Otto; Kothleitner, Gerald; Haberfehlner, Georg; Paller, Manuel; Grogger, Werner

    2017-01-01

    The influence of geometry on quantitative energy-dispersive X-ray spectrometry (EDXS) analysis is determined for a ChemiSTEM system (Super-X) in combination with a low-background double-tilt specimen holder. For the first time, a combination of experimental measurements and simulations is used to determine the positions of the individual detectors of a Super-X system. These positions allow us to calculate the detectors' solid angles and estimate the amount of detector shadowing and its influence on quantitative EDXS analysis, including absorption correction using the ζ-factor method. Shadowing by both the brass portions and the beryllium specimen carrier of the holder severely affects the quantification of low to medium atomic number elements. A multi-detector system is discussed in terms of the practical consequences of the described effects, and a quantitative evaluation of a fayalite sample is demonstrated. Corrections and suggestions for minimizing systematic errors are discussed to improve quantitative methods for a multi-detector system. - Highlights: • Geometrical issues for EDXS quantification on a Super-X system. • Realistic model of a specimen holder using X-ray computed tomography. • Determination of the exact detector positions of a Super-X system. • Influence of detector shadowing and Be specimen carrier on quantitative EDXS.

  5. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient, two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components depends mainly on correlation at very small or very large separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
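
    A small illustration of the Haralick-type co-occurrence features named above, using scikit-image; the random array stands in for a manually delineated lobe, and the three distances and four angles below are placeholders rather than the study's 52 separation vectors.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      # Stand-in for a delineated thyroid lobe, quantised to 64 grey levels
      roi = np.random.default_rng(1).integers(0, 64, size=(128, 128)).astype(np.uint8)

      glcm = graycomatrix(roi,
                          distances=[1, 2, 4],
                          angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                          levels=64, symmetric=True, normed=True)

      # Average each co-occurrence feature over all separation vectors
      for feature in ("contrast", "correlation", "energy", "homogeneity"):
          print(feature, graycoprops(glcm, feature).mean())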

  6. New Approaches to Transport Project Assessment: Reference Scenario Forecasting and Quantitative Risk Analysis

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2010-01-01

    however has proved that the point estimates derived from such analyses are embedded with a large degree of uncertainty. Thus, a new scheme was proposed in terms of applying quantitative risk analysis (QRA) and Monte Carlo simulation in order to represent the uncertainties within the cost-benefit analysis.... Additionally, the handling of uncertainties is supplemented by making use of the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits (user demands, i.e. travel time savings) and underestimating investment costs....
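
    A toy Monte Carlo sketch of the kind of quantitative risk analysis described: the cost and benefit point estimates are replaced by distributions (with costs uplifted to reflect optimism bias) and the spread of the resulting benefit-cost ratio is summarised. The distributions and figures are invented for illustration and are not taken from the thesis.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      base_cost = 1_000.0      # point estimate of investment cost (arbitrary monetary units)
      base_benefit = 1_400.0   # point estimate of discounted benefits

      # Costs uplifted and skewed upwards (optimism bias); benefits spread around the estimate
      cost = base_cost * rng.lognormal(mean=np.log(1.2), sigma=0.25, size=n)
      benefit = base_benefit * rng.triangular(0.6, 1.0, 1.2, size=n)

      bcr = benefit / cost
      print(f"P(BCR > 1) = {np.mean(bcr > 1):.2f}")
      print("5th/50th/95th percentile BCR:", np.percentile(bcr, [5, 50, 95]).round(2))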

  7. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  8. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    Science.gov (United States)

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely
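
    The sampling idea can be caricatured as follows (a sketch under assumed forms, not the authors' implementation): each bait-prey pair is taken to be observed with a probability that grows with its spectral count, binary interaction matrices are resampled from those probabilities, an existing scoring method is applied to every resampled matrix, and the scores are averaged over the ensemble.

      import numpy as np

      def ensemble_scores(spectral_counts, score_fn, n_samples=200, k=2.0, rng=None):
          """spectral_counts: bait x prey array of counts; score_fn maps a binary
          interaction matrix to an array of per-pair scores of the same shape."""
          rng = rng or np.random.default_rng(0)
          p_obs = 1.0 - np.exp(-spectral_counts / k)      # assumed count-to-confidence model
          acc = np.zeros_like(p_obs, dtype=float)
          for _ in range(n_samples):
              binary = rng.random(p_obs.shape) < p_obs    # one resampled experimental outcome
              acc += score_fn(binary)
          return acc / n_samples                          # scores aggregated over the ensemble

      # Trivial scoring function (the binary matrix itself), used only to show the
      # aggregation; a real analysis would plug in an existing binary PPI scoring method.
      counts = np.array([[5.0, 0.0, 1.0],
                         [4.0, 2.0, 0.0],
                         [0.0, 3.0, 6.0]])
      print(ensemble_scores(counts, lambda b: b.astype(float)))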

  9. An XRD technique for quantitative phase analysis of Al-U-Zr alloy

    International Nuclear Information System (INIS)

    Khan, K.B.; Kulkarni, N.K.; Jain, G.C.

    2003-01-01

    In several nuclear research reactors all over the world, Al-U alloy is used as fuel. To stabilise the less brittle phase UAl3 in the Al-U alloy, a small amount of Zr (1 to 3 wt%) is added. A rapid, non-destructive and simple X-ray diffraction technique has been developed for quantitative phase analysis of the Al-U-Zr alloy system containing UAl4, UAl3 and Al. (author)

  10. Quantitative analysis of phases by x-ray diffraction and thermogravimetry in Cuban phosphorite ores

    International Nuclear Information System (INIS)

    Casanova Gomez, Abdel; Martinez Montalvo, Asor; Cilano Campos, Guillermo; Arostegui Aguirre, Miladys; Ferreiro Fernandez, Adalyz; Alonso Perez, Jose A.

    2016-01-01

    Phase analysis is performed by the instrumental techniques X-ray diffraction and thermal analysis on two groups of samples of Cuban phosphorus-bearing minerals, candidates for reference materials. To this end, structural refinement of the diffraction pattern in the form of profile fitting is applied, using the FullProf program of Juan Rodriguez-Carvajal. This analysis is the first step in developing the standard specification of these resources and classifying them as phosphate rock and/or phosphorite according to their mass content. The statistical evaluation of the uncertainty of the quantitative analysis (standard deviation) was carried out on ten replicate samples of phosphate rock and eight of phosphorite from the Trinidad de Guedes field. The qualitative phase analysis showed the following phase composition: carbonate fluoroapatite (CFA), calcite, quartz and halloysite (present only in the clayey granular phosphorite ore, FGA). By the method of fitting the powder diffraction profile, the quantitative phase composition reported for the FGA sample is: 87(2)% CFA, 4(1)% calcite, 1% quartz, and 8(3)% halloysite. For the granular limestone ore (FGC), the following contents were obtained: 87(3)% calcite, 8(3)% CFA and 5(1)% quartz. The obtained values are corroborated by thermogravimetric analysis (TG) through calculation of the mass content of the thermally active phases (calcite and CFA) in the range 27-1000 °C, confirming the validity of the XRD results. (Author)

  11. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and with gamma attenuation factors calculated using MCNP-5. Both relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
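
    One plausible reading of the relative normalization described, with R the net prompt gamma count rate of the element of interest, φ the thermal neutron flux measured with the He-3 detector and f the MCNP-5 calculated gamma attenuation factor for the given sample geometry, is

      C_{\text{sample}} \;=\; C_{\text{standard}} \;\times\;
      \frac{R_{\text{sample}} \,/\, \big(\varphi_{\text{sample}}\, f_{\text{sample}}\big)}
           {R_{\text{standard}} \,/\, \big(\varphi_{\text{standard}}\, f_{\text{standard}}\big)}

    so that differences in neutron flux and in gamma self-attenuation between the large liquid sample and the calibration standard cancel out.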

  12. Meta-Analysis of Results from Quantitative Trait Loci Mapping Studies on Pig Chromosome 4

    NARCIS (Netherlands)

    Moraes Silva, De K.M.; Bastiaansen, J.W.M.; Knol, E.F.; Merks, J.W.M.; Lopes, P.S.; Guimaraes, R.M.; Arendonk, van J.A.M.

    2011-01-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this

  13. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the relevant quantitative psychological research on personality carried out by the 'Prognoz' Laboratory and in Taiwan, and presents the primary results of research on the psychological personality assessment of Chinese nuclear power plant (NPP) operators, based on the MMPI survey. The main contents of quantitative psychological research on personality in Chinese NPPs are also presented, and the need to carry out psychological selection and training in the nuclear industry is emphasized.

  14. A Simple Linear Regression Method for Quantitative Trait Loci Linkage Analysis With Censored Observations

    OpenAIRE

    Anderson, Carl A.; McRae, Allan F.; Visscher, Peter M.

    2006-01-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using...

  15. Quantitative imaging of protein targets in the human brain with PET

    International Nuclear Information System (INIS)

    Gunn, Roger N; Slifstein, Mark; Searle, Graham E; Price, Julie C

    2015-01-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins: G-protein-coupled receptors, membrane transporters, ligand-gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in the radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data, including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes, and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data and the associated blood and anatomical MRI data as inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross-sectional studies, classification studies, intervention studies and longitudinal studies. Finally, we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data, including motion artifacts

  16. Quantitative imaging of protein targets in the human brain with PET

    Science.gov (United States)

    Gunn, Roger N.; Slifstein, Mark; Searle, Graham E.; Price, Julie C.

    2015-11-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins: G-protein-coupled receptors, membrane transporters, ligand-gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in the radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data, including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes, and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data and the associated blood and anatomical MRI data as inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross-sectional studies, classification studies, intervention studies and longitudinal studies. Finally, we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data, including motion artifacts

  17. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in an elastic recovery test, and a normalized elastic recovery factor was defined in terms of the measured deflections. It was shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method.

  18. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    Science.gov (United States)

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    Science.gov (United States)

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with the efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. © 2013 Elsevier Ltd. All rights reserved.
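
    A back-of-envelope illustration of what the reported catchment efficiencies imply for a single storm below the roughly 20 mm/h threshold; the roof area and rainfall depth are arbitrary values chosen only for the example.

      # Compare harvestable runoff and retained volume for a green roof versus a concrete roof
      roof_area_m2 = 100.0
      rain_mm = 15.0                       # below the ~20 mm/h threshold noted above
      efficiency_green = 0.48              # mid-range of the reported 0.44-0.52
      efficiency_concrete = 0.9

      rain_m3 = roof_area_m2 * rain_mm / 1000.0
      for name, eff in [("extensive green roof", efficiency_green),
                        ("concrete roof", efficiency_concrete)]:
          runoff = rain_m3 * eff
          retained = rain_m3 - runoff
          print(f"{name}: runoff {runoff:.2f} m3, retained {retained:.2f} m3")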

  20. Quantitative analysis of 39 polybrominated diphenyl ethers by isotope dilution GC/low-resolution MS.

    Science.gov (United States)

    Ackerman, Luke K; Wilson, Glenn R; Simonich, Staci L

    2005-04-01

    A GC/low-resolution MS method for the quantitative isotope dilution analysis of 39 mono- to heptabrominated diphenyl ethers was developed. The effects of two different ionization sources, electron impact (EI) and electron capture negative ionization (ECNI), and the effects of their parameters on production of high-mass fragment ions [M - xH - yBr]⁻ specific to PBDEs were investigated. Electron energy, emission current, source temperature, ECNI system pressure, and choice of ECNI reagent gases were optimized. Previously unidentified enhancement of PBDE high-mass fragment ion [M - xH - yBr]⁻ abundance was achieved. Electron energy had the largest impact on PBDE high-mass fragment ion abundance for both the ECNI and EI sources. By monitoring high-mass fragment ions of PBDEs under optimized ECNI source conditions, quantitative isotope dilution analysis of 39 PBDEs was conducted using nine ¹³C₁₂-labeled PBDEs on a low-resolution MS with low picogram to femtogram instrument detection limits.

  1. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    Science.gov (United States)

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a lightweight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Assessment of a synchrotron X-ray method for quantitative analysis of calcium hydroxide

    International Nuclear Information System (INIS)

    Williams, P. Jason; Biernacki, Joseph J.; Bai Jianming; Rawn, Claudia J.

    2003-01-01

    Thermogravimetric analysis (TGA) and quantitative X-ray diffraction (QXRD) are widely used to determine the calcium hydroxide (CH) content in cementitious systems containing blends of Portland cement, fly ash, blast furnace slag, silica fume and other pozzolanic and hydraulic materials. These techniques, however, are destructive to cement samples and subject to various forms of error. While precise weight losses can be measured by TGA, extracting information from samples with multiple overlapping thermal events is difficult. However, while QXRD can offer easier deconvolution, its accuracy for components below about 5 wt.% is typically poor when a laboratory X-ray source is used. Furthermore, the destructive nature of both techniques prevents their use for studying the in situ hydration of a single contiguous sample for kinetic analysis. In an attempt to overcome these problems, the present research evaluated the use of synchrotron X-rays for quantitative analysis of CH. A synchrotron X-ray source was used to develop calibration data for quantification of the amount of CH in mixtures with fly ash. These data were compared to conventional laboratory XRD data for like samples. While both methods were found to offer good quantification, synchrotron XRD (SXRD) provided a broader range of detectability and higher accuracy than laboratory diffraction and removed the subjectivity associated with TGA analysis. Further, the sealed glass capillaries used with the synchrotron source provided a non-destructive, closed, in situ environment for tracking hydrating specimens from zero to any desired age.

  3. Quantitative analysis of the epitaxial recrystallization effect induced by swift heavy ions in silicon carbide

    International Nuclear Information System (INIS)

    Benyagoub, A.

    2015-01-01

    This paper discusses recent results on the recrystallization effect induced by swift heavy ions (SHI) in pre-damaged silicon carbide. The recrystallization kinetics was followed by using increasing SHI fluences and by starting from different levels of initial damage within the SiC samples. The quantitative analysis of the data shows that the recrystallization rate depends drastically on the local amount of crystalline material: it is nil in fully amorphous regions and becomes more significant with increasing amounts of crystalline material. For instance, in samples initially nearly half-disordered, the recrystallization rate per incident ion is found to be 3 orders of magnitude higher than what is observed with the well-known IBIEC process using low energy ions. This high rate can therefore not be accounted for by the existing IBIEC models. Moreover, decreasing the electronic energy loss leads to a drastic reduction of the recrystallization rate. A comprehensive quantitative analysis of all the experimental results shows that the SHI-induced high recrystallization rate can only be explained by a mechanism based on the melting of the amorphous zones through a thermal spike process, followed by epitaxial recrystallization initiated from the neighboring crystalline regions if the size of the latter exceeds a certain critical value. This quantitative analysis also reveals that recent molecular dynamics calculations supposed to reproduce this phenomenon are wrong, since they overestimated the recrystallization rate by a factor of ∼40.

  4. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    The SDN controller, which is responsible for the configuration and management of the network, is the core of a Software-Defined Network. Current methods, which focus on security mechanisms, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also different kinds of controllers, and provide a basis for controller selection and secure development. We evaluated our approach on four widely used SDN controllers: POX, OpenDaylight, Floodlight, and Ryu. The evaluation, which shows outcomes similar to those of traditional qualitative analysis, demonstrates that with our approach we are able to obtain specific security values for different controllers and present more accurate results.

  5. MEASURING ORGANIZATIONAL CULTURE: A QUANTITATIVE-COMPARATIVE ANALYSIS [doi: 10.5329/RECADM.20100902007]

    Directory of Open Access Journals (Sweden)

    Valderí de Castro Alcântara

    2010-11-01

    This article analyses the organizational culture of enterprises located in two towns with distinct quantitative traits, Rio Paranaíba and Araxá. While the surveyed enterprises in Rio Paranaíba are mostly micro and small enterprises (86%), in Araxá there are mostly medium and large companies (53%). The overall objective is to verify whether there are significant differences in organizational culture among these enterprises and whether they can be explained by organization size. The research was quantitative, and the instruments for data collection were a questionnaire and a scale for measuring organizational culture containing four dimensions: Hierarchical Distance Index (IDH), Individualism Index (INDI), Masculinity Index (MASC) and Uncertainty Control Index (CINC). Tabulation and analysis of data were performed using PASW Statistics 18, applying descriptive and inferential statistical procedures. Using a reduction factor (-21), the indexes achieved were classified into 5 intensity categories (from "very low" to "very high"). The Student t test for two means was performed, revealing significant differences in Hierarchical Distance and Individualism between Araxá and Rio Paranaíba enterprises (p < 0.05). Keywords: Organizational Culture; Dimensions of Organizational Culture; Araxá; Rio Paranaíba.

  6. Quantitative analysis of perfumes in talcum powder by using headspace sorptive extraction.

    Science.gov (United States)

    Ng, Khim Hui; Heng, Audrey; Osborne, Murray

    2012-03-01

    Quantitative analysis of perfume dosage in talcum powder has been a challenge due to interference of the matrix and has so far not been widely reported. In this study, headspace sorptive extraction (HSSE) was validated as a solventless sample preparation method for the extraction and enrichment of perfume raw materials from talcum powder. Sample enrichment is performed on a thick film of poly(dimethylsiloxane) (PDMS) coated onto a magnetic stir bar incorporated in a glass jacket. Sampling is done by placing the PDMS stir bar in the headspace vial by using a holder. The stir bar is then thermally desorbed online with capillary gas chromatography-mass spectrometry. The HSSE method is based on the same principles as headspace solid-phase microextraction (HS-SPME). Nevertheless, a relatively larger amount of extracting phase is coated on the stir bar as compared to SPME. Sample amount and extraction time were optimized in this study. The method has shown good repeatability (with relative standard deviation no higher than 12.5%) and excellent linearity with correlation coefficients above 0.99 for all analytes. The method was also successfully applied in the quantitative analysis of talcum powder spiked with perfume at different dosages. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the most widely consumed beverages in the world, and the 'cafezinho' is normally prepared from a blend of roasted powder of two species, Coffea arabica and Coffea canephora. Each exhibits differences in taste and in chemical composition, especially in the caffeine percentage. Several procedures have been proposed in the literature for caffeine determination in different samples such as soft drinks, coffee and medicines, but most of them require a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)

  8. Quantitative Analysis of 18F-Fluorodeoxyglucose Positron Emission Tomography Identifies Novel Prognostic Imaging Biomarkers in Locally Advanced Pancreatic Cancer Patients Treated With Stereotactic Body Radiation Therapy

    International Nuclear Information System (INIS)

    Cui, Yi; Song, Jie; Pollom, Erqi; Alagappan, Muthuraman; Shirato, Hiroki; Chang, Daniel T.; Koong, Albert C.; Li, Ruijiang

    2016-01-01

    Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT 18F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6)

  9. Quantitative Analysis of {sup 18}F-Fluorodeoxyglucose Positron Emission Tomography Identifies Novel Prognostic Imaging Biomarkers in Locally Advanced Pancreatic Cancer Patients Treated With Stereotactic Body Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yi [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Song, Jie; Pollom, Erqi; Alagappan, Muthuraman [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Shirato, Hiroki [Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Chang, Daniel T.; Koong, Albert C. [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Stanford Cancer Institute, Stanford, California (United States); Li, Ruijiang, E-mail: rli2@stanford.edu [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Stanford Cancer Institute, Stanford, California (United States)

    2016-09-01

    Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT {sup 18}F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6)
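
    A hedged sketch of the modelling strategy described above (elastic-net-penalised Cox regression on image features to obtain a sparse prognostic signature, summarised by the concordance index), using the lifelines package with random placeholder data rather than the study's PET features or outcomes.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n, p = 90, 20                                   # training cohort size, number of image features
      df = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"feat_{i}" for i in range(p)])
      df["os_months"] = rng.exponential(scale=14, size=n)   # simulated overall survival
      df["death"] = rng.integers(0, 2, size=n)               # simulated event indicator

      # Elastic-net penalty encourages a sparse set of features (the "signature")
      cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.7)
      cph.fit(df, duration_col="os_months", event_col="death")

      signature = cph.params_[cph.params_.abs() > 1e-3]   # coefficients that survive the penalty
      print(signature)
      print("training concordance index:", round(cph.concordance_index_, 3))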

  10. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  11. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms acting on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
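
    To make the censoring idea concrete, the self-contained sketch below treats log peak intensities as normally distributed, lets values below a detection limit contribute the left-censored (CDF) term to the likelihood, and compares two groups with a likelihood-ratio test. It is a simplified stand-in for the parametric AFT models discussed in the paper, run here on simulated data.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm, chi2

      def neg_loglik(params, y, censored, group):
          """Negative log-likelihood for log-intensities with left-censoring."""
          mu0, beta, log_sigma = params
          sigma = np.exp(log_sigma)
          mu = mu0 + beta * group
          ll = np.where(censored,
                        norm.logcdf(y, loc=mu, scale=sigma),   # below the detection limit
                        norm.logpdf(y, loc=mu, scale=sigma))   # observed intensity
          return -ll.sum()

      rng = np.random.default_rng(3)
      log_int = np.concatenate([rng.normal(10.0, 1.0, 30), rng.normal(10.8, 1.0, 30)])
      group = np.repeat([0, 1], 30)
      limit = 9.5
      censored = log_int < limit
      y = np.where(censored, limit, log_int)       # censored values recorded at the limit

      full = minimize(neg_loglik, x0=[10.0, 0.0, 0.0], args=(y, censored, group))
      null = minimize(lambda p: neg_loglik([p[0], 0.0, p[1]], y, censored, group), x0=[10.0, 0.0])
      lr = 2 * (null.fun - full.fun)
      print("log fold-change estimate:", round(full.x[1], 3))
      print("likelihood-ratio test p-value:", round(chi2.sf(lr, df=1), 4))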

  12. Quantitative mineralogical analysis of sandstones using x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Ward, C.R.; Taylor, J.C.

    1999-01-01

    X-ray diffraction has long been used as a definitive technique for mineral identification, based on measuring the internal atomic or crystal structures present in powdered rocks, soils and other mineral mixtures. Recent developments in data gathering and processing, however, have provided an improved basis for its use as a quantitative tool, determining not only the nature of the minerals but also the relative proportions of the different minerals present. The mineralogy of a series of sandstone samples from the Sydney and Bowen Basins of eastern Australia has been evaluated by X-ray diffraction (XRD) on a quantitative basis using the Australian-developed SIROQUANT data processing technique. Based on Rietveld principles, this technique generates a synthetic X-ray diffractogram by adjusting and combining full-profile patterns of minerals nominated as being present in the sample, and interactively matches the synthetic diffractogram, under operator instruction, to the observed diffractogram of the sample being analysed. The individual mineral patterns may be refined in the process, to allow for variations in the crystal structure of individual components or for factors such as preferred orientation in the sample mount. The resulting output provides mass percentages of the different minerals in the mixture, and an estimate of the error associated with each individual percentage determination. The chemical composition of the mineral mixture indicated by SIROQUANT for each individual sandstone studied was estimated using a spreadsheet routine, and the indicated proportion of each oxide in each sample was compared to the actual chemical analysis of the same sandstone as determined independently by X-ray fluorescence spectrometry. The results show a high level of agreement for all major chemical constituents, indicating consistency between the SIROQUANT XRD data and the whole-rock chemical composition. Supplementary testing with a synthetic corundum spike further

  13. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) responsible for notable antimicrobial and anti-inflammatory effects, as well as more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated following the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  14. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies at the same time that avoidable expenditures may exist. Nevertheless, drug expenditures are not usually included in analyses of clinical practice variability. The objective was to identify and quantify variability in drug expenditures among comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed of drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inhabitant to 8.90 €/inhabitant and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and that between expenditure and homogeneous production was 0.35 (p=0.32 and p=0.15, respectively), both coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.
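
    The correlation check underlying this analysis can be sketched as follows; the per-department expenditure and population figures below are hypothetical, not the study's data.

```python
# Sketch: Pearson correlation between adjusted drug expenditure and population
# across departments. A weak, non-significant correlation would support the
# conclusion that population size does not explain the observed variability.
# All numbers below are hypothetical.
import numpy as np
from scipy.stats import pearsonr

expenditure_eur_per_inh = np.array([0.97, 1.8, 2.4, 3.1, 4.5, 6.2, 7.3, 8.90])
population_thousands    = np.array([210, 340, 150, 480, 260, 390, 175, 520])

r, p_value = pearsonr(expenditure_eur_per_inh, population_thousands)
print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")
```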

  15. Analysis of Experts’ Quantitative Assessment of Adolescent Basketball Players and the Role of Anthropometric and Physiological Attributes

    Directory of Open Access Journals (Sweden)

    Štrumbelj Erik

    2014-10-01

    Full Text Available In this paper, we investigated two questions: (1) can measurements of anthropometric and physiological attributes substitute for expert assessment of adolescent basketball players, and (2) how much does the quantitative assessment of a player vary among experts? The first question is relevant to the potential simplification of the player selection process. The second question pertains directly to the validity of expert quantitative assessment. Our research was based on data from 148 U14 female and male basketball players. For each player, an array of anthropometric and physiological attributes was recorded, including body height, body mass, BMI, and several motor skill tests. Furthermore, each player's current ability and potential ability were quantitatively evaluated by two different experts from a group of seven experts. Analysis of the recorded data showed that the anthropometric and physiological attributes explained between 15% and 40% of the variance in experts' scores. The primary predictive attributes were speed and agility (for predicting current ability) and body height and growth potential (for predicting potential ability). We concluded that these attributes were not sufficiently informative to act as a substitute for expert assessment of the players' current or potential ability. There is substantial variability in different experts' scores of the same player's ability. However, the differences between experts are mostly in scale, and the relationships between experts' scores are monotonic. That is, different experts rank players on ability very similarly, but their scores are not well calibrated.
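
    A minimal sketch of the two analyses summarised above is given below: the variance in expert scores explained by measured attributes (R² of a linear model) and rank agreement between two experts whose scores differ in scale (Spearman correlation). The data are simulated placeholders, not the study's measurements.

```python
# Sketch: (1) variance in expert scores explained by attributes (R^2) and
# (2) rank agreement between two experts despite different score scales
# (Spearman correlation). All data below are simulated, hypothetical values.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(148, 4))                 # e.g. height, mass, speed, agility
expert_a = X @ np.array([0.2, 0.1, 0.5, 0.4]) + rng.normal(scale=1.0, size=148)
expert_b = 2.0 * expert_a + 10 + rng.normal(scale=0.5, size=148)  # rescaled scores

r2 = LinearRegression().fit(X, expert_a).score(X, expert_a)
rho, _ = spearmanr(expert_a, expert_b)
print(f"attributes explain R^2 = {r2:.2f} of expert A's scores")
print(f"Spearman rho between experts = {rho:.2f} (rank agreement despite scale)")
```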

  16. Problems with the quantitative spectroscopic analysis of oxygen rich Czech coals

    Energy Technology Data Exchange (ETDEWEB)

    Pavlikova, H.; Machovic, V.; Cerny, J. [Inst. of Chemical Technology, Prague (Czechoslovakia); Sebestova, E. [Inst. of Rock Structure and Mechanics, Prague (Czechoslovakia)

    1995-12-01

    Solid state NMR and FTIR spectroscopies are the two main methods used for the structural analysis of coals and their various products. Obtaining quantitative parameters for coals, such as aromaticity (f{sub a}), by the above-mentioned methods can be a rather difficult task. Coal samples of various rank were chosen for the quantitative NMR, FTIR and EPR analyses. The aromaticity was obtained by FTIR, {sup 13}C CP/MAS and SP/MAS NMR experiments. The content of radicals and the saturation characteristics of the coals were measured by EPR spectroscopy. The following problems are discussed: 1. The relationship between the amount of free radicals (N{sub g}) and f{sub a} by NMR. 2. The f{sub a} obtained by solid state NMR and FTIR spectroscopies. 3. The differences between the f{sub a} measured by CP and SP NMR experiments. 4. The relationship between the content of oxygen groups and the saturation responses of the coals. The reliability of our results was checked by measuring the structural parameters of the Argonne premium coals.
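
    Aromaticity from {sup 13}C NMR is commonly taken as the aromatic-carbon fraction of the total carbon signal, f{sub a} = (aromatic integral)/(total integral). The sketch below illustrates this on a synthetic spectrum; the chemical-shift limits and the toy spectrum are assumptions, not the authors' processing parameters.

```python
# Sketch: aromaticity f_a from a 13C NMR spectrum as the fraction of signal in
# the aromatic region. Shift limits and the toy spectrum are assumptions only.
import numpy as np

ppm = np.linspace(0, 200, 2000)
# Toy spectrum: an aliphatic band near 30 ppm and an aromatic band near 128 ppm.
spectrum = np.exp(-((ppm - 30) / 10) ** 2) + 1.8 * np.exp(-((ppm - 128) / 15) ** 2)

aromatic = spectrum[(ppm >= 90) & (ppm <= 165)].sum()   # assumed aromatic window
total = spectrum.sum()

f_a = aromatic / total
print(f"f_a = {f_a:.2f}")
```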

  17. Quantitative EEG analysis in minimally conscious state patients during postural changes.

    Science.gov (United States)

    Greco, A; Carboncini, M C; Virgillito, A; Lanata, A; Valenza, G; Scilingo, E P

    2013-01-01

    Mobilization and postural changes of patients with cognitive impairment are standard clinical practices useful for both the psychological and the physical rehabilitation process. During this process, several physiological signals, such as the Electroencephalogram (EEG), Electrocardiogram (ECG), Photoplethysmography (PPG), Respiration activity (RESP) and Electrodermal activity (EDA), are monitored and processed. In this paper we investigated how quantitative EEG (qEEG) changes with postural modifications in minimally conscious state patients. This study is quite novel and no similar experimental data can be found in the current literature; therefore, although the results are very encouraging, a quantitative analysis of the cortical areas activated during such postural changes still needs to be investigated in greater depth. More specifically, this paper shows EEG power spectra and brain symmetry index modifications during a verticalization procedure, from 0 to 60 degrees, of three patients in a Minimally Conscious State (MCS) with a focal region of impairment. Experimental results show a significant increase of power in the β band (12-30 Hz), commonly associated with human alertness, thus suggesting that mobilization and postural changes can have beneficial effects in MCS patients.
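
    A typical qEEG quantity of the kind discussed above is relative β-band power from a single channel, estimated with Welch's periodogram; a minimal sketch follows. The sampling rate, band limits and the synthetic signal are assumptions, not the paper's recording parameters.

```python
# Sketch: relative beta-band (12-30 Hz) power of one EEG channel via Welch's
# PSD estimate. Sampling rate, band limits and the toy signal are assumptions.
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # assumed sampling rate, Hz
t = np.arange(0, 30, 1 / fs)
eeg = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
       + 0.3 * np.random.default_rng(1).normal(size=t.size))   # toy alpha + beta + noise

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
beta = (freqs >= 12) & (freqs <= 30)
broadband = (freqs >= 1) & (freqs <= 45)

relative_beta_power = psd[beta].sum() / psd[broadband].sum()
print(f"relative beta power = {relative_beta_power:.2f}")
```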

  18. Coupling Reagent for UV/vis Absorbing Azobenzene-Based Quantitative Analysis of the Extent of Functional Group Immobilization on Silica.

    Science.gov (United States)

    Choi, Ra-Young; Lee, Chang-Hee; Jun, Chul-Ho

    2018-05-18

    A methallylsilane coupling reagent containing both an N-hydroxysuccinimidyl (NHS) ester group and a UV/vis-absorbing azobenzene linker undergoes acid-catalyzed immobilization on silica. Analysis of the UV/vis absorption band associated with the azobenzene group in the adduct enables facile quantitative determination of the extent of loading of the NHS groups. Reaction of the NHS groups on the silica surface with the amine groups of GOx or rhodamine can be employed to generate enzyme- or dye-immobilized silica for quantitative analysis.
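
    Quantitation from a chromophore absorption band of this kind follows the Beer-Lambert law (A = εcl); a minimal sketch of converting a measured absorbance into a surface loading is shown below. The molar absorptivity, path length, absorbance, volume and silica mass are illustrative assumptions, not values reported in the paper.

```python
# Minimal Beer-Lambert sketch: absorbance of the azobenzene band from a known
# mass of functionalized silica in a known volume gives the loading of the
# immobilized groups. All numeric values below are illustrative assumptions.

def loading_mmol_per_g(absorbance, epsilon_m1_cm1, path_cm, volume_l, silica_g):
    """Loading of chromophore-tagged groups on silica, in mmol per gram."""
    conc_mol_l = absorbance / (epsilon_m1_cm1 * path_cm)   # Beer-Lambert: A = e*c*l
    return conc_mol_l * volume_l * 1000.0 / silica_g

print(round(loading_mmol_per_g(absorbance=0.65, epsilon_m1_cm1=2.2e4,
                               path_cm=1.0, volume_l=0.010, silica_g=0.050), 4),
      "mmol/g")
```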

  19. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  20. Quantitative X-ray microanalysis of biological specimens

    International Nuclear Information System (INIS)

    Roomans, G.M.

    1988-01-01

    Quantitative X-ray microanalysis of biological specimens requires an approach that is somewhat different from that used in the materials sciences. The first step is deconvolution and background subtraction of the obtained spectrum. The further treatment depends on the type of specimen: thin, thick, or semithick. For thin sections, the continuum method of quantitation is most often used, but it should be combined with an accurate correction for extraneous background. However, alternative methods to determine local mass should also be considered. In the analysis of biological bulk specimens, the ZAF correction method appears to be less useful, primarily because of the uneven surface of biological specimens. The peak-to-local-background model may be a more adequate method for thick specimens that are not mounted on a thick substrate. Quantitative X-ray microanalysis of biological specimens generally requires the use of standards that should preferably resemble the specimen in chemical and physical properties. Special problems in biological microanalysis include low count rates, specimen instability and mass loss, extraneous contributions to the spectrum, and preparative artifacts affecting quantitation. A relatively recent development in X-ray microanalysis of biological specimens is the quantitative determination of local water content.
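
    The continuum (Hall) method mentioned above for thin sections relates elemental concentration to the ratio of characteristic peak counts to continuum (bremsstrahlung) counts, calibrated against a standard of known concentration measured under the same conditions. A minimal sketch follows; the count values and the standard concentration are illustrative assumptions.

```python
# Sketch of the continuum (Hall) method for thin sections: concentration is
# proportional to the peak-to-continuum ratio, calibrated against a standard.
# The counts and standard concentration below are illustrative assumptions.

def hall_concentration(peak_spec, cont_spec, peak_std, cont_std, conc_std):
    """Elemental concentration in the specimen (same units as conc_std)."""
    return conc_std * (peak_spec / cont_spec) / (peak_std / cont_std)

# Hypothetical counts after background subtraction and extraneous-background correction.
print(hall_concentration(peak_spec=1200, cont_spec=40000,
                         peak_std=3000, cont_std=50000,
                         conc_std=100.0), "mmol/kg dry weight")
```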