WorldWideScience

Sample records for quantitative chemical analysis

  1. Coulometry in quantitative chemical analysis and physico-chemical research

    International Nuclear Information System (INIS)

    Electroanalytical methods such as potentiometry, amperometry, coulometry and voltammetry are well established and routinely employed in quantitative chemical analysis as well as in chemical research. Coulometry is one of the most important electroanalytical techniques; it involves a change in the oxidation state of electroactive species by heterogeneous electron transfer. In the primary coulometric method, uranium is determined at a mercury pool electrode and plutonium at a platinum gauze electrode.

  2. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    Science.gov (United States)

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of the gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements with the best possible precision, so as to match the data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges that do not involve the intervalence charge transfer transitions (Fe(2+) → Fe(3+) and Fe(2+) → Ti(4+)) commonly considered responsible for the important features of blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the auto-absorption and secondary excitation effects that are frequent sources of significant error in quantitative EDXRF analysis. PMID:19821113

  3. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    This book covers quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts of matter, the meaning of analytical chemistry, and SI units; chemical equilibrium; basic preparations for quantitative analysis; an introduction to volumetric analysis; acid-base titration, with an outline and example experiments; chelate titration; oxidation-reduction titration, with an introduction, titration curves, and diazotization titration; precipitation titration; electrometric titration; and quantitative analysis.

  4. Quantitative analysis of abused drugs in physiological fluids by gas chromatography/chemical ionization mass spectrometry

    International Nuclear Information System (INIS)

    Methods have been developed for quantitative analysis of commonly abused drugs in physiological fluids using gas chromatography/chemical ionization mass spectrometry. The methods are being evaluated in volunteer analytical and toxicological laboratories, and analytical manuals describing the methods are being prepared. The specific drug and metabolites included in this program are: Δ9-tetrahydrocannabinol, methadone, phencyclidine, methaqualone, morphine, amphetamine, methamphetamine, mescaline, 2,5-dimethoxy-4-methyl amphetamine, cocaine, benzoylecgonine, diazepam, and N-desmethyldiazepam. The current analytical methods utilize relatively conventional instrumentation and procedures, and are capable of measuring drug concentrations as low as 1 ng/ml. Various newer techniques such as sample clean-up by high performance liquid chromatography, separation by glass capillary chromatography, and ionization by negative ion chemical ionization are being investigated with respect to their potential for achieving higher sensitivity and specificity, as well as their ability to facilitate simultaneous analysis of more than one drug and metabolite. (Auth.)

  5. A chemical profiling strategy for semi-quantitative analysis of flavonoids in Ginkgo extracts.

    Science.gov (United States)

    Yang, Jing; Wang, An-Qi; Li, Xue-Jing; Fan, Xue; Yin, Shan-Shan; Lan, Ke

    2016-05-10

    Flavonoid analysis in herbal products is challenged by their vast chemical diversity. This work aimed to develop a chemical profiling strategy for the semi-quantification of flavonoids, using extracts of Ginkgo biloba L. (EGB) as an example. The strategy was based on the principle that flavonoids in EGB have nearly equivalent molar absorption coefficients at a fixed wavelength. As a result, the molar contents of flavonoids could be semi-quantitatively determined from the molar-concentration calibration curves of common standards and recalculated as mass contents using the characterized molecular weight (MW). Twenty batches of EGB were subjected to HPLC-UV/DAD/MS fingerprinting analysis to test the feasibility and reliability of this strategy. The flavonoid peaks were distinguished from the other peaks by principal component analysis and Pearson correlation analysis of the normalized UV spectrometric dataset. Each flavonoid peak was then tentatively identified from the MS data to ascertain its MW. Notably, the flavonoid absorption at Band II (240-280 nm) was more suitable for semi-quantification because it varied less than that at Band I (300-380 nm); the semi-quantification was therefore conducted at 254 nm. Beyond the qualitative comparisons afforded by common chemical profiling techniques, the semi-quantitative approach provided detailed compositional information on the flavonoids in EGB and revealed how the adulteration of one batch was achieved. The developed strategy is believed to be useful for the advanced analysis of herbal extracts with a high flavonoid content, without laborious identification and isolation of individual components. PMID:26907698
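
    As an illustration of the semi-quantification arithmetic described above (peak response → molar concentration via a shared calibration curve → mass content via each flavonoid's MW), a minimal sketch with hypothetical calibration parameters and peak data might look like this:

```python
# Semi-quantification sketch: peak areas -> molar concentration via a common
# standard's calibration curve -> mass content via each flavonoid's MW.
# All numbers below are illustrative placeholders, not values from the paper.

# Calibration of a common standard at 254 nm:
# area = slope * molar_concentration (umol/mL) + intercept
slope, intercept = 1.25e4, 150.0

# (peak_area, molecular_weight g/mol) for tentatively identified flavonoid peaks
peaks = {
    "flavonoid_1": (5.2e4, 610.5),
    "flavonoid_2": (1.8e4, 432.4),
    "flavonoid_3": (9.6e3, 464.4),
}

dilution_volume_mL = 25.0   # extract volume (hypothetical)
sample_mass_g = 0.50        # weighed sample mass (hypothetical)

for name, (area, mw) in peaks.items():
    molar_conc = (area - intercept) / slope                              # umol/mL
    mass_content = molar_conc * mw * dilution_volume_mL / sample_mass_g  # ug/g
    print(f"{name}: {molar_conc:.3f} umol/mL -> {mass_content:.1f} ug/g")
```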

  6. GC-FID coupled with chemometrics for quantitative and chemical fingerprinting analysis of Alpinia oxyphylla oil.

    Science.gov (United States)

    Miao, Qing; Kong, Weijun; Zhao, Xiangsheng; Yang, Shihai; Yang, Meihua

    2015-01-01

    Analytical methods for quantitative analysis and chemical fingerprinting of volatile oils from Alpinia oxyphylla were established. The volatile oils were prepared by hydrodistillation, and the yields were between 0.82% and 1.33%. The developed gas chromatography-flame ionization detection (GC-FID) method showed good specificity, linearity, reproducibility, stability and recovery, and could be used satisfactorily for quantitative analysis. The results showed that the volatile oils contained 2.31-77.30 μL/mL p-cymene and 12.38-99.34 mg/mL nootkatone. A GC-FID fingerprinting method was established, and the profiles were analyzed using chemometrics. GC-MS was used to identify the principal compounds in the GC-FID profiles. The profiles of almost all the samples were consistent and stable. The harvesting time and source were major factors that affected the profile, while the volatile oil yield and the nootkatone content had minor secondary effects. PMID:25459943

  7. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    Science.gov (United States)

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

    Histone PTMs play a crucial role in regulating chromatin structure and function, with an impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shotgun approaches based on trypsin digestion are typically not employed for mapping histone modifications. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have recently been shown to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTM profiling. The work of Paternoster et al. provides a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results indicate that replacing methanol with isopropanol or ACN not only blocks methyl-esterification but also significantly reduces other undesired nonspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that reduces the generation of various artefacts during histone propionylation. PMID:27373704

  8. Analysis of abused drugs by selected ion monitoring: quantitative comparison of electron impact and chemical ionization

    International Nuclear Information System (INIS)

    A comparison was made of the relative sensitivities of electron impact and chemical ionization when used for selected ion monitoring analysis of commonly abused drugs. For most of the drugs examined, chemical ionization using ammonia as the reactant gas gave the largest single m/e ion current response per unit weight of sample. However, if maximum sensitivity is desired, it is important to evaluate electron impact and chemical ionization with respect to both maximum response and the degree of interference from background and endogenous materials.

  9. Quantitative analysis of chemical elements in single cells using nuclear microprobe and nano-probe

    International Nuclear Information System (INIS)

    The study of the role of trace elements at the cellular level requires state-of-the-art analytical tools that achieve sufficient sensitivity and spatial resolution. We developed a new methodology for the accurate quantification of chemical element distributions in single cells based on a combination of the ion beam analysis techniques STIM, PIXE and RBS. The quantification procedure relies on the development of STIM data analysis software (Paparamborde). The validity and limits of this methodology are discussed here. The method allows the quantification of trace elements (μg/g) with a 19.8% uncertainty in cellular compartments with masses below 0.1 ng. The main limit of the method lies in the small number of samples that can be analyzed, owing to the long irradiation times required and limited access to ion beam analysis facilities. This is why we developed a database for capitalizing cellular chemical compositions (BDC4). BDC4 has been designed to use cellular chemical composition as a tracer of biological activity and is expected in the future to provide reference chemical compositions for any cell type or compartment. Application of the STIM-PIXE-RBS methodology to the study of the nuclear toxicology of cobalt compounds is presented here, showing that STIM analysis is absolutely needed when organic mass loss occurs during PIXE-RBS irradiation. (author)

  10. Quantitative chemical state analysis of supported vanadium oxide catalysts by high resolution vanadium Kα spectroscopy.

    Science.gov (United States)

    Yamamoto, Takashi; Nanbu, Fumitaka; Tanaka, Tsunehiro; Kawai, Jun

    2011-03-01

    Oxidation states of vanadium species on Al(2)O(3), SiO(2), and TiO(2) were quantitatively analyzed by least-squares fitting of V Kα spectra recorded with a two-crystal X-ray fluorescence spectrometer. Uncertainties in the analytical results arising from the normalization procedure, the coefficient of variation, and the reduction of vanadium species under X-ray irradiation are discussed. The V(5+)/V(4+)/V(3+) ratios on Al(2)O(3), SiO(2), and TiO(2) calcined at 773 K in air were determined to be ca. 6/3/1, 3/6/1, and 5/4/1, respectively. Possible chemical states of the vanadium species on the supports are proposed. PMID:21302919
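
    As an illustration of the least-squares decomposition described above, the sketch below fits a measured Kα profile as a non-negative combination of V(5+)/V(4+)/V(3+) reference spectra using SciPy; the Gaussian reference shapes, energies and the "measured" spectrum are synthetic placeholders, not the published spectra.

```python
# Sketch: decompose a measured V K-alpha profile into V(5+)/V(4+)/V(3+)
# reference spectra by non-negative least squares.  Reference shapes and the
# "measured" spectrum are synthetic placeholders, not data from the paper.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(4945.0, 4960.0, 600)        # eV axis (illustrative)

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Hypothetical reference profiles with small chemical shifts
refs = np.column_stack([
    gaussian(energy, 4952.2, 1.1),   # V(5+)
    gaussian(energy, 4952.0, 1.1),   # V(4+)
    gaussian(energy, 4951.8, 1.1),   # V(3+)
])

rng = np.random.default_rng(0)
true_fractions = np.array([0.6, 0.3, 0.1])
measured = refs @ true_fractions + rng.normal(0, 0.005, energy.size)

coeffs, residual = nnls(refs, measured)          # non-negative weights
fractions = coeffs / coeffs.sum()                # normalise to unity
print("V5+/V4+/V3+ ~", np.round(fractions, 2), " residual:", round(residual, 3))
```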

  11. Quantitative analysis of cell surface membrane proteins using membrane-impermeable chemical probe coupled with 18O labeling

    Science.gov (United States)

    Zhang, Haizhen; Brown, Roslyn N.; Qian, Wei-Jun; Monroe, Matthew E.; Purvine, Samuel O.; Moore, Ronald J.; Gritsenko, Marina A.; Shi, Liang; Romine, Margaret F; Fredrickson, James K.; Paša-Tolić, Ljiljana; Smith, Richard D.; Lipton, Mary S.

    2010-01-01

    We report a mass spectrometry-based strategy for quantitative analysis of cell surface membrane proteome changes. The strategy includes enrichment of surface membrane proteins using a membrane-impermeable chemical probe followed by stable isotope 18O labeling and LC-MS analysis. We applied this strategy to enrich membrane proteins expressed by Shewanella oneidensis MR-1, a gram-negative bacterium with known metal-reduction capability via extracellular electron transfer between outer membrane proteins and extracellular electron receptors. LC/MS/MS analysis resulted in the identification of about 400 proteins, 79% of which were predicted to be membrane localized. Quantitative aspects of the membrane enrichment were shown by peptide-level 16O and 18O labeling of proteins from wild-type and mutant cells (generated by deletion of a type II secretion protein, GspD) prior to LC-MS analysis. Using a chemical-probe-labeled pure protein as an internal standard for normalization, the quantitative data revealed reduced abundances in ΔgspD mutant cells of many outer membrane proteins, including the outer membrane c-type cytochromes OmcA and MtrC, in agreement with previous investigations demonstrating that these proteins are substrates of the type II secretion system. PMID:20380418

  12. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    International Nuclear Information System (INIS)

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. Compared with integrator response normalization, the external standardization was statistically similar for methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method proved suitable for the classification and quality control of commercial samples of the oils. (author)

  13. Chemical fingerprint and quantitative analysis for quality control of polyphenols extracted from pomegranate peel by HPLC.

    Science.gov (United States)

    Li, Jianke; He, Xiaoye; Li, Mengying; Zhao, Wei; Liu, Liu; Kong, Xianghong

    2015-06-01

    A simple and efficient HPLC fingerprint method was developed and validated for quality control of the polyphenols extracted from pomegranate peel (PPPs). Ten batches of pomegranate collected from different orchards in Lintong, Shaanxi, China were used to establish the fingerprint. For the fingerprint analysis, 15 characteristic peaks were selected to evaluate the similarities of the 10 batches of PPPs. The similarities of the PPP samples were all greater than 0.968, indicating that the samples from different areas of Lintong were consistent. Additionally, simultaneous quantification of eight monophenols (including gallic acid, punicalagin, catechin, chlorogenic acid, caffeic acid, epicatechin, rutin, and ellagic acid) in the PPPs was conducted to support the consistency assessment. The results demonstrated that the HPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantitative analysis, can be successfully used to assess the quality and identify the authenticity of the PPPs. PMID:25624199

  14. Simultaneous Quantitative and Chemical Fingerprint Analysis of Receptaculum Nelumbinis Based on HPLC-DAD-MS Combined with Chemometrics.

    Science.gov (United States)

    Liu, Haitao; Liu, Jiushi; Zhang, Jin; Qi, Yaodong; Jia, Xiaoguang; Zhang, Bengang; Xiao, Peigen

    2016-04-01

    A rapid and sensitive method based on HPLC-DAD-MS was developed for quantitative analysis of two flavonoids and chemical fingerprint analysis to evaluate the quality of Receptaculum Nelumbinis. The analysis was conducted on a Poroshell 120 C18 column (100 × 4.6 mm, 2.7 μm) with 0.2% formic acid buffer solution and methanol as mobile phases under gradient elution. The method displayed good linearity with R(2) > 0.9999 and limits of quantification <0.37 μg mL(-1). Relative standard deviation values for intra- and interday precision were <0.82% and <1.03%, respectively. The mean recovery of hyperoside was 95.54% and of isoquercitrin was 92.10%. Hyperoside and isoquercitrin were determined simultaneously, and 12 peaks in the chemical fingerprint were identified. The chemometric methods, including similarity analysis, hierarchical clustering analysis and principal component analysis, were applied to distinguish 11 batches of Receptaculum Nelumbinis samples. The above results could validate each other and successfully divide these samples into two groups. Moreover, hyperoside and isoquercitrin could be selected as chemical markers to evaluate the quality of Receptaculum Nelumbinis from different localities. This study demonstrated that the developed method is a powerful and beneficial tool for the quality control of Receptaculum Nelumbinis. PMID:26921895

  15. Quantitative and qualitative analysis of common peaks in chemical fingerprint of Yuanhu Zhitong tablet by HPLC-DAD–MS/MS

    OpenAIRE

    Dao-Quan Tang; Xiao-Xiao Zheng; Xu Chen; Dong-Zhi Yang; Qian Du

    2014-01-01

    A quality control (QC) strategy for quantitative and qualitative analysis of “common peaks” in chemical fingerprint was proposed to analyze Yuanhu Zhitong tablet (YZT), using high performance liquid chromatography with diode array detector and tandem mass spectrometry (HPLC-DAD–MS/MS). The chromatographic separation was achieved on an Agilent Eclipse plus C18 column with a gradient elution using a mixture of 0.4‰ ammonium acetate aqueous (pH 6.0 adjusted with glacial acetic acid) and acetonit...

  16. A study of quantitative chemical state analysis on cerium surface by using auger electron spectroscopy and factor analysis

    International Nuclear Information System (INIS)

    A reaction with oxygen during oxygen exposure to Cerium metal surface under ultra high vacuum condition and depth profiling on formed Cerium oxide layer were investigated in term of chemical state analysis by Auger electron spectroscopy (AES) and by factor analysis. Principal component analysis (PCA) on Ce NON Auger spectra suggested that three physically meaningful components existed from the analyzed data in both cases. After the PCA, three spectra were extracted from the data and these showed significant peak shape changes in each spectrum which were corresponding to different chemical states. In addition, the profiles constructed by factor analysis showed the chemical state changes on the Cerium metal surface during oxidation or chemical depth distributions in the oxide layer. (author)

  17. Quantitative and qualitative analysis of common peaks in chemical fingerprint of Yuanhu Zhitong tablet by HPLC-DAD–MS/MS

    Directory of Open Access Journals (Sweden)

    Dao-Quan Tang

    2014-04-01

    Full Text Available A quality control (QC) strategy for quantitative and qualitative analysis of "common peaks" in chemical fingerprints was proposed to analyze Yuanhu Zhitong tablet (YZT), using high performance liquid chromatography with diode array detection and tandem mass spectrometry (HPLC-DAD–MS/MS). The chromatographic separation was achieved on an Agilent Eclipse Plus C18 column with gradient elution using a mixture of 0.4‰ aqueous ammonium acetate (pH 6.0, adjusted with glacial acetic acid) and acetonitrile. In the chemical fingerprint, 40 peaks were assigned as "common peaks". For quantification of the "common peaks", the detection wavelengths were set at 254 nm, 270 nm, 280 nm and 345 nm. The method was validated, and good results were obtained for the simultaneous determination of 10 analytes (protopine, jatrorrhizine, coptisine, palmatine, berberine, xanthotoxin, bergapten, tetrahydropalmatine, imperatorin and isoimperatorin). For qualification of the "common peaks", 33 compounds, including the 10 quantitative analytes, were identified or tentatively characterized using LC–MS/MS. These results demonstrate that the present approach may be a powerful and useful tool for tackling the complex quality issues of YZT.

  18. Quantitative chemical-structure evaluation using atom probe tomography: Short-range order analysis of Fe–Al

    Energy Technology Data Exchange (ETDEWEB)

    Marceau, R.K.W., E-mail: r.marceau@deakin.edu.au [Institute for Frontier Materials, Deakin University, Geelong, VIC 3216 (Australia); Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Straße 1, 40237 Düsseldorf (Germany); Ceguerra, A.V.; Breen, A.J. [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Raabe, D. [Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Straße 1, 40237 Düsseldorf (Germany); Ringer, S.P. [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia)

    2015-10-15

    Short-range-order (SRO) has been quantitatively evaluated in an Fe–18Al (at%) alloy using atom probe tomography (APT) data and by calculation of the generalised multicomponent short-range order (GM-SRO) parameters, which have been determined by shell-based analysis of the three-dimensional atomic positions. The accuracy of this method with respect to limited detector efficiency and spatial resolution is tested against simulated D0₃ ordered data. Whilst there is minimal adverse effect from limited atom probe instrument detector efficiency, the combination of this with imperfect spatial resolution has the effect of making the data appear more randomised. The value of lattice rectification of the experimental APT data prior to GM-SRO analysis is demonstrated through improved information sensitivity. - Highlights: • Short-range-order (SRO) is quantitatively evaluated using atom probe tomography data. • Chemical species-specific SRO parameters have been calculated. • The accuracy of this method is tested against simulated D0₃ ordered data. • Imperfect spatial resolution combined with finite detector efficiency causes a randomising effect. • Lattice rectification of the data prior to GM-SRO analysis is demonstrated to improve information sensitivity.

  19. Quantitative chemical-structure evaluation using atom probe tomography: Short-range order analysis of Fe–Al

    International Nuclear Information System (INIS)

    Short-range-order (SRO) has been quantitatively evaluated in an Fe–18Al (at%) alloy using atom probe tomography (APT) data and by calculation of the generalised multicomponent short-range order (GM-SRO) parameters, which have been determined by shell-based analysis of the three-dimensional atomic positions. The accuracy of this method with respect to limited detector efficiency and spatial resolution is tested against simulated D0₃ ordered data. Whilst there is minimal adverse effect from limited atom probe instrument detector efficiency, the combination of this with imperfect spatial resolution has the effect of making the data appear more randomised. The value of lattice rectification of the experimental APT data prior to GM-SRO analysis is demonstrated through improved information sensitivity. - Highlights: • Short-range-order (SRO) is quantitatively evaluated using atom probe tomography data. • Chemical species-specific SRO parameters have been calculated. • The accuracy of this method is tested against simulated D0₃ ordered data. • Imperfect spatial resolution combined with finite detector efficiency causes a randomising effect. • Lattice rectification of the data prior to GM-SRO analysis is demonstrated to improve information sensitivity.
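
    The full generalised multicomponent SRO treatment in these two records is more involved; as a rough illustration of the shell-based idea, the sketch below computes a simple pairwise Warren-Cowley-type parameter from a 3D atom list. The neighbour-shell radius, composition and atom positions are placeholders, not reconstructed APT data.

```python
# Simplified sketch of a shell-based short-range-order parameter, in the
# spirit of (but much simpler than) the GM-SRO analysis in the records above.
# Pairwise Warren-Cowley form: alpha = 1 - p(Al | neighbour of Fe) / c_Al,
# which is ~0 for a random solid solution.  All data below are synthetic.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
n_atoms = 100_000
positions = rng.uniform(0.0, 10.0, size=(n_atoms, 3))   # nm, toy volume
is_al = rng.random(n_atoms) < 0.18                       # ~Fe-18Al at%
c_al = is_al.mean()

tree = cKDTree(positions)
shell_radius = 0.25                                      # nm, "first shell" (toy)

p_fe_al = []                                             # Al fraction around Fe atoms
for i in np.flatnonzero(~is_al)[:2000]:                  # subsample Fe atoms
    neigh = tree.query_ball_point(positions[i], shell_radius)
    neigh = [j for j in neigh if j != i]
    if neigh:
        p_fe_al.append(is_al[neigh].mean())

alpha = 1.0 - np.mean(p_fe_al) / c_al
print(f"Warren-Cowley-like parameter (Fe-Al, 1st shell): {alpha:+.3f}")
```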

  20. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  1. Chemical fingerprinting and quantitative analysis of a Panax notoginseng preparation using HPLC-UV and HPLC-MS

    Directory of Open Access Journals (Sweden)

    Shao Qing

    2011-02-01

    Full Text Available Abstract Background: Xuesaitong (XST) injection, consisting of total saponins from Panax notoginseng, is widely used for the treatment of cardio- and cerebrovascular diseases in China. This study develops a simple and global quality evaluation method for the quality control of XST. Methods: High performance liquid chromatography with ultraviolet detection (HPLC-UV) was used to identify and quantify the chromatographic fingerprints of the XST injection. Characteristic common peaks were identified using HPLC with photodiode array detection/electrospray ionization tandem mass spectrometry (HPLC-PDA/ESI-MSn). Results: Representative fingerprints from ten batches of samples showed 27 'common saponins', all of which were identified and quantified using ten reference saponins. Conclusion: Chemical fingerprinting and quantitative analysis identified most of the common saponins for the quality control of P. notoginseng products such as the XST injection.

  2. Using Texas Instruments Emulators as Teaching Tools in Quantitative Chemical Analysis

    Science.gov (United States)

    Young, Vaneica Y.

    2011-01-01

    This technology report alerts upper-division undergraduate chemistry faculty and lecturers to the use of Texas Instruments emulators as virtual graphing calculators. These may be used in multimedia lectures to instruct students on the use of their graphing calculators to obtain solutions to complex chemical problems. (Contains 1 figure.)

  3. Physico-chemical studies of laser-induced plasmas for quantitative analysis of materials in nuclear systems

    International Nuclear Information System (INIS)

    Laser-Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique very well suited for analysis in hostile environments, particularly in the nuclear industry. Quantitative measurements are frequently performed on liquid or solid samples, but in some cases atypical signal behaviors have been observed in LIBS experiments. To avoid or minimize any impact on measurement accuracy, it is necessary to improve the understanding of these phenomena. In the framework of a three-year PhD thesis, the objective was to study the chemical reactions occurring within the laser-generated plasma during a LIBS analysis. Experiments on a model material (a pure aluminum sample) highlighted the dynamics of molecular recombination under different ambient gases. The temporal evolution of Al I atomic emission lines and of the molecular bands of AlO and AlN was studied. A collisional excitation effect was identified for a particular electronic energy level of aluminum in a nitrogen atmosphere; this effect disappeared in air. The aluminum plasma was also imaged during its expansion under the different atmospheres in order to localize the areas in which molecular recombination takes place. Spectacular particle projections were also observed. (author)

  4. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    The objectives of this work were to quantify the lead content in two types of canned chilli from three brands and determine whether it is within the maximum permissible level (2 ppm); to compare two brands sold in both glass-jar and canned presentations in order to determine the effect of the container on the final lead content; and to carry out a comparative study of the techniques in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The samples were pretreated by calcination, followed by dissolution of the ashes in acid medium and dilution to a fixed volume for analysis by atomic absorption and plasma emission. For the X-ray fluorescence analysis, after the ashes were dissolved, the lead was precipitated with PCDA (ammonium salt of pyrrolidine carbodithioic acid), the solution was filtered, and the dried filter paper was counted directly. The standards were prepared following the same procedure as the samples, using a lead Titrisol solution. For each technique, the recovery was determined by spiking with a known amount, and calibration curves were plotted; all three were linear over the established working range. The recovery in all three cases was above 95%. An analysis of variance showed that the lead content of the samples does not exceed 2 ppm, and that the lead content of canned chillies (1.7 ppm) is higher than that of chillies in glass containers (0.4 ppm). The X-ray fluorescence results differ from those obtained with the other two techniques because of its lower sensitivity. The most suitable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission spectrometry. (Author)

  5. Quantitative analysis of the relative mutagenicity of five chemical constituents of tobacco smoke in the mouse lymphoma assay.

    Science.gov (United States)

    Guo, Xiaoqing; Heflich, Robert H; Dial, Stacey L; Richter, Patricia A; Moore, Martha M; Mei, Nan

    2016-05-01

    Quantifying health-related biological effects, like genotoxicity, could provide a way of distinguishing between tobacco products. In order to develop tools for using genotoxicity data to quantitatively evaluate the risk of tobacco products, we tested five carcinogens found in cigarette smoke, 4-aminobiphenyl (4-ABP), benzo[a]pyrene (BaP), cadmium (in the form of CdCl2), 2-amino-3,4-dimethyl-3H-imidazo[4,5-f]quinoline (MeIQ) and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK), in the mouse lymphoma assay (MLA). The resulting mutagenicity dose responses were analyzed by various quantitative approaches, and their strengths and weaknesses for distinguishing responses in the MLA were evaluated. L5178Y/Tk (+/-) 3.7.2C mouse lymphoma cells were treated with four to seven concentrations of each chemical for 4 h. Only CdCl2 produced a positive response without metabolic activation (S9); all five chemicals produced dose-dependent increases in cytotoxicity and mutagenicity with S9. The lowest dose exceeding the global evaluation factor, the benchmark doses producing a 10%, 50%, 100% or 200% increase over the background frequency (BMD10, BMD50, BMD100 and BMD200), the no observed genotoxic effect level (NOGEL), the lowest observed genotoxic effect level (LOGEL) and the mutagenic potency expressed as a mutant frequency per micromole of chemical were calculated for all the positive responses. All the quantitative metrics gave similar rank orders for the agents' ability to induce mutation, from most to least potent: CdCl2(-S9) > BaP(+S9) > CdCl2(+S9) > MeIQ(+S9) > 4-ABP(+S9) > NNK(+S9). However, the spread of metric values across the different chemicals (i.e. the ratio of the greatest value to the least value) ranged from 16-fold (BMD10) to 572-fold (mutagenic potency). These results suggest that data from the MLA are capable of discriminating the mutagenicity of various constituents of cigarette smoke, and that quantitative analyses are available
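
    As an illustration of the quantitative metrics named above, the sketch below interpolates benchmark doses (BMD10-BMD200, defined here as the dose giving the stated percentage increase over the background frequency) and a simple per-micromole potency from a hypothetical mutant-frequency dose response; the doses and frequencies are placeholders, not the study's data.

```python
# Sketch: benchmark-dose-style metrics from a mutant-frequency dose response.
# BMD_x is taken as the dose giving an x% increase over background, found by
# linear interpolation on the (assumed monotonic) response.  All numbers are
# illustrative placeholders, not data from the study.
import numpy as np

doses = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 20.0])           # umol (hypothetical)
mutant_freq = np.array([80., 95., 130., 210., 380., 700.])   # x1e-6 (hypothetical)

background = mutant_freq[0]

def bmd(increase_pct):
    """Dose producing `increase_pct`% increase over the background frequency."""
    target = background * (1.0 + increase_pct / 100.0)
    return float(np.interp(target, mutant_freq, doses))

for pct in (10, 50, 100, 200):
    print(f"BMD{pct}: {bmd(pct):.2f} umol")

# Mutagenic potency as induced mutant frequency per micromole (top dose)
potency = (mutant_freq[-1] - background) / doses[-1]
print(f"Mutagenic potency: {potency:.1f} x1e-6 per umol")
```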

  6. Quantitative and chemical fingerprint analysis for the quality evaluation of Receptaculum Nelumbinis by RP-HPLC coupled with hierarchical clustering analysis.

    Science.gov (United States)

    Wu, Yan-Bin; Zheng, Li-Jun; Yi, Jun; Wu, Jian-Guo; Chen, Ti-Qiang; Wu, Jin-Zhong

    2013-01-01

    A simple and reliable method of high-performance liquid chromatography with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of Receptaculum Nelumbinis (dried receptacle of Nelumbo nucifera) through establishing chromatographic fingerprint and simultaneous determination of five flavonol glycosides, including hyperoside, isoquercitrin, quercetin-3-O-β-d-glucuronide, isorhamnetin-3-O-β-d-galactoside and syringetin-3-O-β-d-glucoside. In quantitative analysis, the five components showed good regression (R > 0.9998) within linear ranges, and their recoveries were in the range of 98.31%-100.32%. In the chromatographic fingerprint, twelve peaks were selected as the characteristic peaks to assess the similarities of different samples collected from different origins in China according to the State Food and Drug Administration (SFDA) requirements. Furthermore, hierarchical cluster analysis (HCA) was also applied to evaluate the variation of chemical components among different sources of Receptaculum Nelumbinis in China. This study indicated that the combination of quantitative and chromatographic fingerprint analysis can be readily utilized as a quality control method for Receptaculum Nelumbinis and its related traditional Chinese medicinal preparations. PMID:23337200

  7. Quantitative and Chemical Fingerprint Analysis for the Quality Evaluation of Receptaculum Nelumbinis by RP-HPLC Coupled with Hierarchical Clustering Analysis

    Directory of Open Access Journals (Sweden)

    Jin-Zhong Wu

    2013-01-01

    Full Text Available A simple and reliable method of high-performance liquid chromatography with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of Receptaculum Nelumbinis (dried receptacle of Nelumbo nucifera) through establishing a chromatographic fingerprint and simultaneous determination of five flavonol glycosides, including hyperoside, isoquercitrin, quercetin-3-O-β-d-glucuronide, isorhamnetin-3-O-β-d-galactoside and syringetin-3-O-β-d-glucoside. In quantitative analysis, the five components showed good regression (R > 0.9998) within linear ranges, and their recoveries were in the range of 98.31%–100.32%. In the chromatographic fingerprint, twelve peaks were selected as the characteristic peaks to assess the similarities of different samples collected from different origins in China according to the State Food and Drug Administration (SFDA) requirements. Furthermore, hierarchical cluster analysis (HCA) was also applied to evaluate the variation of chemical components among different sources of Receptaculum Nelumbinis in China. This study indicated that the combination of quantitative and chromatographic fingerprint analysis can be readily utilized as a quality control method for Receptaculum Nelumbinis and its related traditional Chinese medicinal preparations.
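
    As an illustration of the similarity-evaluation and hierarchical clustering steps used in these two records, the sketch below computes correlation-coefficient similarities of batch fingerprints against a mean reference fingerprint and then clusters the batches; the peak-area matrix is random placeholder data, not real chromatograms.

```python
# Sketch: similarity evaluation and hierarchical cluster analysis (HCA) of
# chromatographic fingerprints.  The fingerprint matrix is random placeholder
# data, not real peak areas from the studies above.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
n_batches, n_peaks = 10, 12
fingerprints = rng.random((n_batches, n_peaks)) + 1.0    # peak areas (toy)

reference = fingerprints.mean(axis=0)                    # reference fingerprint

# Correlation-coefficient similarity of each batch to the reference
for i, fp in enumerate(fingerprints, 1):
    sim = np.corrcoef(fp, reference)[0, 1]
    print(f"batch {i:2d}: similarity to reference = {sim:.4f}")

# Hierarchical cluster analysis on the batch fingerprints (Ward linkage)
Z = linkage(fingerprints, method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
print("HCA cluster assignment:", clusters)
```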

  8. Quantitative phase analysis and thickness measurement of surface-oxide layers in metal and alloy powders by the chemical-granular method

    Science.gov (United States)

    Bracconi, Pierre; Nyborg, Lars

    1998-05-01

    The principles of the chemical-granular analysis of metal and alloy powders are reviewed and the results are compared with those provided by the spectroscopic analytical techniques XPS, AES and SIMS, including ion etching in their depth-profiling mode, when they are applied to the same materials. Several examples are analysed and it is shown that the chemical-granular method alone can provide the very same information as depth profiling. However, it is averaged over a macroscopic powder sample in contrast to one or a few single particles. Nevertheless, it is the combination of the chemical-granular and depth-profiling analyses that really provides an unparalleled description in quantitative terms of the phase composition and microstructure of either multiphase and/or irregular surface layers resulting from oxidation, precipitation or contamination.

  9. The influence of multivariate analysis methods and target grain size on the accuracy of remote quantitative chemical analysis of rocks using laser induced breakdown spectroscopy

    Science.gov (United States)

    Anderson, Ryan B.; Morris, Richard V.; Clegg, Samuel M.; Bell, James F.; Wiens, Roger C.; Humphries, Seth D.; Mertzman, Stanley A.; Graff, Trevor G.; McInroy, Rhonda

    2011-10-01

    Laser-induced breakdown spectroscopy (LIBS) was used to quantitatively analyze 195 rock slab samples with known bulk chemical compositions, 90 pressed-powder samples derived from a subset of those rocks, and 31 pressed-powder geostandards under conditions that simulate the ChemCam instrument on the Mars Science Laboratory Rover (MSL), Curiosity. The low-volatile samples were divided into training, validation, and test sets. The LIBS spectra and chemical compositions of the training set were used with three multivariate methods to predict the chemical compositions of the test set. The methods were partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs. Both the full LIBS spectrum and the intensity at five pre-selected spectral channels per major element (feature selection) were used as input data for the multivariate calculations. The training spectra were supplied to the algorithms without averaging (i.e. five spectra per target) and with averaging (i.e. all spectra from the same target averaged and treated as one spectrum). In most cases neural networks did not perform better than PLS for our samples. PLS2 without spectral averaging outperformed all other procedures on the basis of the lowest quadrature root mean squared error (RMSE) for both the full test set and the igneous rocks test set. The RMSE for PLS2 using the igneous rock slab test set is: 3.07 wt.% SiO2, 0.87 wt.% TiO2, 2.36 wt.% Al2O3, 2.20 wt.% Fe2O3, 0.08 wt.% MnO, 1.74 wt.% MgO, 1.14 wt.% CaO, 0.85 wt.% Na2O, 0.81 wt.% K2O. PLS1 with feature selection and averaging had a higher quadrature RMSE than PLS2, but merits further investigation as a method of reducing data volume and computation time and potentially improving prediction accuracy, particularly for samples that differ significantly from the training set. Precision and accuracy were influenced by the ratio of laser beam diameter (~490 μm) to grain size, with coarse-grained rocks often
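
    As an illustration of the PLS2 step described above (a single multivariate model predicting all major-element oxides from full spectra), the sketch below uses scikit-learn's PLSRegression on synthetic spectra; the channel count, component number and linear mixing model are assumptions for demonstration, not the ChemCam setup or the paper's data.

```python
# Sketch: PLS2 calibration of LIBS-like spectra against major-element oxide
# compositions, with per-oxide RMSE on a held-out test set.  Spectra and
# compositions are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n_samples, n_channels, n_oxides = 200, 2048, 9

# Toy linear mixing model: spectra = compositions @ endmember_spectra + noise
compositions = rng.dirichlet(np.ones(n_oxides), size=n_samples) * 100  # wt.%
endmembers = rng.random((n_oxides, n_channels))
spectra = compositions @ endmembers + rng.normal(0, 0.5, (n_samples, n_channels))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, compositions, test_size=0.25, random_state=0)

pls2 = PLSRegression(n_components=10)   # one model predicting all oxides (PLS2)
pls2.fit(X_train, y_train)
pred = pls2.predict(X_test)

rmse = np.sqrt(mean_squared_error(y_test, pred, multioutput="raw_values"))
print("per-oxide RMSE (wt.%):", np.round(rmse, 2))
```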

  10. Quantitative Chemical Indices of Weathered Igneous Rocks

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A study was conducted to compare the effectiveness of different weathering indices for characterising weathered igneous rocks of Hong Kong. Among eight chemical indices evaluated in this study, the Parker index has been found most suitable for a quantitative description of state of weathering. Based on geochemical results of 174 samples, the index decreases almost linearly with an increasing extent of weathering. The results enable a better understanding of the modification of geotechnical properties of igneous rocks associated with weathering processes.
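
    The Parker index mentioned above has a commonly cited form in the general geochemical literature (Parker, 1970), based on the mobile alkali and alkaline-earth oxides in molecular proportions weighted by cation-oxygen bond strengths. The formula and the sample oxide values in the sketch below are taken from that general literature and stated as an assumption, not from this record's dataset.

```python
# Sketch: a commonly cited form of the Parker weathering index (WIP),
#   WIP = 100 * (2*Na2O/0.35 + MgO/0.9 + 2*K2O/0.25 + CaO/0.7),
# with the oxides expressed as molecular proportions (wt.% / molar mass).
# Formula quoted from general literature (assumption); oxide values are toy data.

MOLAR_MASS = {"Na2O": 61.98, "MgO": 40.30, "K2O": 94.20, "CaO": 56.08}  # g/mol

def parker_index(oxide_wt_pct):
    mp = {k: oxide_wt_pct[k] / MOLAR_MASS[k] for k in MOLAR_MASS}  # mol. proportions
    return 100.0 * (2 * mp["Na2O"] / 0.35 + mp["MgO"] / 0.9
                    + 2 * mp["K2O"] / 0.25 + mp["CaO"] / 0.7)

fresh = {"Na2O": 3.5, "MgO": 1.2, "K2O": 4.8, "CaO": 2.0}       # fresh granite (toy)
weathered = {"Na2O": 0.4, "MgO": 0.3, "K2O": 2.5, "CaO": 0.1}   # weathered (toy)
print("WIP fresh:    ", round(parker_index(fresh), 1))
print("WIP weathered:", round(parker_index(weathered), 1))      # decreases with weathering
```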

  11. Simultaneous chemical fingerprint and quantitative analysis of Rhizoma Smilacis Glabrae by accelerated solvent extraction and high-performance liquid chromatography with tandem mass spectrometry.

    Science.gov (United States)

    Dai, Weiquan; Zhao, Weiquan; Gao, Fangyuan; Shen, Jingjing; Lv, Diya; Qi, Yunpeng; Fan, Guorong

    2015-05-01

    Rhizoma Smilacis Glabrae (RSG) is a well-known herbal medicine that is used as both a medicine and a food. In this study, simultaneous chemical fingerprint and quantitative analysis of the bioactive flavonoid components of RSG was developed using accelerated solvent extraction and high-performance liquid chromatography coupled with ion trap tandem mass spectrometry. The operational parameters of accelerated solvent extraction, including extraction solvent, extraction temperature, static extraction time, solid-to-liquid ratio, and number of extraction cycles, were optimized. Hierarchical cluster analysis, similarity analysis, and principal component analysis were performed to evaluate the similarity and variation of the samples collected from several provinces in China. Subsequently, HPLC fingerprints were established for the discrimination of 16 batches of RSG samples, and the six major flavonoids, namely taxifolin, neoastilbin, astilbin, neoisoastilbin, isoastilbin, and engeletin, were quantitatively determined. The calibration curves for all six analytes showed good linearity (r(2) > 0.999), and the limits of detection and quantification were less than 0.10 and 0.27 μg·mL(-1), respectively. Therefore, the proposed extraction and determination methods proved to be robust and reliable for the quality control of RSG. PMID:25678068

  12. Quantitative analysis of adenosine using Liquid Chromatography/Atmospheric Pressure Chemical Ionization - tandem Mass Spectrometry (LC/APCI-MS/MS)

    OpenAIRE

    Van Dycke, Annelies; Verstraete, Alain; Pil, Kristof; Raedt, Robrecht; Vonck, Kristl; Boison, Detlev; Boon, Paul

    2010-01-01

    Adenosine-secreting cellular brain implants constitute a promising therapeutic approach for the treatment of epilepsy. To engineer neural stem cells for therapeutic adenosine delivery, a reliable and fast analytical method is necessary to quantify cell-based adenosine release. Here we describe the development, optimization and validation of adenosine measurement using liquid chromatography – atmospheric pressure chemical ionization – tandem mass spectrometry (LC-APCI-MS/MS). LC-MS/MS in posit...

  13. Raman scattering quantitative analysis of the anion chemical composition in kesterite Cu2ZnSn(SxSe1−x)4 solid solutions

    International Nuclear Information System (INIS)

    Highlights: • An optical method for the quantitative measurement of [S]/([S] + [Se]) in CZTSSe is presented. • It is based on Raman spectroscopy and covers the whole S–Se range of compositions. • The proposed method is independent of crystal quality, experimental conditions and type of material. • The validity of the technique is proven by comparison with independent composition measurements (XRD and EQE). • Testing of the method on data published in the literature has given satisfactory results. - Abstract: A simple and non-destructive optical methodology for the quantitative measurement of the [S]/([S] + [Se]) anion composition in kesterite Cu2ZnSn(SxSe1−x)4 (CZTSSe) solid solutions by means of Raman spectroscopy, covering the whole S–Se range of compositions, has been developed. This methodology is based on the dependence of the integral intensity ratio of Raman bands sensitive to anion vibrations on the [S]/([S] + [Se]) composition of the kesterite solid solutions. The calibration of the parameters used in this analysis involved the synthesis of a set of CZTSSe powders by the solid state reaction method, spanning the range from pure Cu2ZnSnS4 to pure Cu2ZnSnSe4. The validity of the methodology has been tested on different sets of independent samples, including non-stoichiometric device-grade CZTSSe layers with different compositions and films that were synthesized by solution-based processes with different crystalline quality. In all cases, the comparison of the results obtained from the analysis of the intensity of the Raman bands with independent composition measurements performed by different techniques such as X-ray diffraction and external quantum efficiency has confirmed the satisfactory performance of the developed methodology for the quantitative analysis of these compounds, independently of the crystal quality or the method of synthesis. Further strong support for the performance of the methodology has been obtained from the analysis of a wider range of samples.
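
    As an illustration of the intensity-ratio calibration idea, the sketch below inverts a hypothetical monotonic calibration between the S/Se band-area ratio and x = [S]/([S]+[Se]); the functional form, its parameter and the band areas are assumptions for demonstration, not the published calibration.

```python
# Sketch: estimating x = [S]/([S]+[Se]) in Cu2ZnSn(SxSe1-x)4 from the ratio of
# integrated intensities of S-related and Se-related Raman bands.  The
# calibration relation and its parameter are hypothetical placeholders.
from scipy.optimize import brentq

def intensity_ratio_model(x, k=1.6):
    """Hypothetical monotonic calibration: R = k * x / (1 - x)."""
    return k * x / (1.0 - x)

def composition_from_ratio(R, k=1.6):
    """Invert the calibration numerically to recover x from a measured ratio."""
    return brentq(lambda x: intensity_ratio_model(x, k) - R, 1e-6, 1 - 1e-6)

I_S_band, I_Se_band = 4200.0, 6100.0     # integrated band areas (illustrative)
R_measured = I_S_band / I_Se_band
x = composition_from_ratio(R_measured)
print(f"Measured ratio {R_measured:.2f} -> estimated [S]/([S]+[Se]) = {x:.2f}")
```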

  14. Quantitative analysis of methadone in biological fluids using deuterium-labeled methadone and GLC-chemical-ionization mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Hackey, D.L. (Argonne National Lab., IL); Kreek, M.J.; Mattson, D.H.

    1977-11-01

    The (+)-, (−)-, and (±)-²H₅-methadones, which contained five deuterium atoms in one aromatic ring, were synthesized for use in clinical pharmacological studies and as internal standards. GLC-chemical-ionization mass spectrometry was used to determine plasma and urinary methadone levels by an inverse isotope dilution assay. Plasma drug levels could be determined to 10 pmoles/ml, and urine levels could be measured to 5 pmoles/ml. Plasma methadone levels were examined in several patients undergoing methadone maintenance therapy. These levels generally ranged between 100 and 400 ng/ml (320 to 1300 pmoles/ml) after an average oral dose of 1 mg/kg/day. The methadone half-life was 28.8 ± 4.8 hr.

  15. Qualitative and quantitative analysis of chemical constituents of Centipeda minima by HPLC-QTOF-MS & HPLC-DAD.

    Science.gov (United States)

    Chan, Chi-On; Jin, Deng-Ping; Dong, Nai-Ping; Chen, Si-Bao; Mok, Daniel Kam Wah

    2016-06-01

    A high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (HPLC-QTOF-MS) method in both positive and negative ion modes was established to investigate the major constituents in the ethanolic extract of Centipeda minima (EBSC). Twelve common components, including flavones and their glycosides, phenolic and polyphenolic acids, and sesquiterpene lactones, were identified in ten batches of samples based on comparison with the retention times and accurate masses of external standards (mass accuracy within 3 ppm) or with the fragmentation patterns from tandem MS. Meanwhile, a simple, accurate and reliable HPLC-DAD method was also developed to determine the content of 10 chemical markers simultaneously. Results obtained from method validation, including linearity, accuracy and precision, showed that this new method is reliable and robust. Isochlorogenic acid A and brevilin A were found to be the most abundant components of EBSC and could serve as markers for its quality control. PMID:27131150

  16. Quantitative analysis of cadmium(II) and copper(II) by chemical stripping chronopotentiometry using dissolved oxygen as an oxidant

    International Nuclear Information System (INIS)

    Chemical stripping chronopotentiometry was applied to determine cadmium(II) and copper(II) using dissolved oxygen as an oxidant. The calibration curve for cadmium(II) was linear within the range of 10(-6) to 10(-4) mol dm(-3), while the calibration curve for copper(II) was distorted, since copper(II) ion in the sample solution also acted as an oxidant. The calibration curve for cadmium(II) in the presence of a constant concentration of copper(II) ion was linear within the range of 10(-5) to 2 x 10(-4) mol dm(-3). In order to determine copper(II) in the presence of cadmium(II), it was necessary to electrodeposit only copper by reduction at -0.5 V vs. SCE. The instrumentation used in this work consisted only of a simple voltage supply circuit, a stirrer, a y-t recorder and a pH meter used as a high-impedance potentiometer. (author)

  17. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  18. Quantitative visualization of a chemically reacting jet

    International Nuclear Information System (INIS)

    The sodium-water reaction must be precisely evaluated for the safety analysis of sodium-cooled nuclear power plants. To evaluate such a chemically reacting jet, the characteristics of the reaction and the mole-fraction distributions of the reacting species should be known. In this study, to evaluate the basic characteristics, a two-fluid jet and a chemically reacting jet were measured by PIV and DELIF. A new dye pair for the dual-emission LIF technique was proposed to measure the pH distribution: quinine as the pH-sensitive dye with blue emission and Rhodamine 6G as the pH-insensitive dye with orange emission, both excited by the third harmonic of an Nd:YAG laser (355 nm). Highly accurate measurement was achieved over the range pH 4.0 to 5.5. An ammonia jet injected into acetic acid was measured using the proposed dye pair, and the effectiveness of the present method was demonstrated. (author)
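
    As an illustration of the two-colour ratio step behind DELIF, the sketch below divides a pH-sensitive emission image by a pH-insensitive reference image and converts the ratio to pH through a linear calibration; the images and the calibration constants are hypothetical placeholders, not data from this experiment.

```python
# Sketch: dual-emission LIF (DELIF) processing - divide the pH-sensitive
# channel by the pH-insensitive reference channel and map the ratio to pH
# through a calibration curve.  All arrays and constants are synthetic.
import numpy as np

rng = np.random.default_rng(3)
blue = rng.uniform(50, 200, size=(128, 128))    # quinine channel (pH-sensitive)
orange = rng.uniform(80, 220, size=(128, 128))  # Rhodamine 6G channel (reference)

ratio = blue / orange                            # cancels laser-sheet non-uniformity

# Hypothetical linear calibration assumed valid over pH 4.0-5.5: ratio = a*pH + b
a, b = 0.8, -2.9
pH = (ratio - b) / a
pH = np.clip(pH, 4.0, 5.5)                       # restrict to calibrated range
print(f"pH field: min {pH.min():.2f}, mean {pH.mean():.2f}, max {pH.max():.2f}")
```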

  19. Chemical Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...

  20. Quantitative mineral salt evaluation in the calcaneous bone using computed tomography, 125I-photon absorption and chemical analysis to compare the value of the individual methods

    International Nuclear Information System (INIS)

    The aim of the study described here was to verify the accuracy of two different methods for the quantitative evaluation of mineral salts: the 125I-photon absorption technique on the one hand and whole-body CT on the other. For this purpose, post-mortem examinations of 31 calcaneous bones were carried out to evaluate their individual mineral salt contents in vitro using both of the above-mentioned methods. The results obtained were then compared with calcium concentrations determined by chemical analysis. A comparison of the individual mineral salt evaluations with the results of the calcium analyses showed a highly significant correlation (p=0.001) for both methods under investigation. The same held for the correlation between the findings from CT and the 125I-hydroxylapatite technique, where the level of significance was also p=0.001. These statements must, however, be qualified inasmuch as the mineral salt values measured by CT were consistently lower than those obtained by 125I-photon absorption. These deviations are chiefly attributable to the fact that the values provided by CT are more susceptible to influence from the fat contained in the bones. In 125I-photon absorption, a special formula may be derived to allow for this bias, provided that the composition of the bone is known. In summary, the relative advantages and drawbacks of CT and 125I-photon absorption are carefully balanced. Mineral salt evaluation by CT permits incipient losses to be ascertained even in the trunk, whereas the 125I-photon absorption technique is the obvious method for any kind of follow-up examination of the peripheral skeleton, as it is easily reproducible and radiation exposure can be kept to a minimum. (TRV)

  1. The uses and abuses of semi-quantitative analysis

    International Nuclear Information System (INIS)

    Full text: Semi-quantitative XRF analysis provides a fast, efficient analytical service necessary for quick overviews of chemical compositions. This analysis is especially timely when there are no relevant calibrations or standards available and analytical solutions are required. However, as with most matters analytical, semi-quantitative analysis has a tendency to become gospel: therefore, recognising the pitfalls of such analysis is imperative and must be imparted to the customer. Copyright (2002) Australian X-ray Analytical Association Inc

  2. Chemical exchange program analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Waffelaert, Pascale

    2007-09-01

    As part of its EMS, Sandia performs an annual environmental aspects/impacts analysis. The purpose of this analysis is to identify the environmental aspects associated with Sandia's activities, products, and services and the potential environmental impacts associated with those aspects. Division and environmental programs established objectives and targets based on the environmental aspects associated with their operations. In 2007 the most significant aspect identified was Hazardous Materials (Use and Storage). The objective for Hazardous Materials (Use and Storage) was to improve chemical handling, storage, and on-site movement of hazardous materials. One of the targets supporting this objective was to develop an effective chemical exchange program, making a business case for it in FY07, and fully implementing a comprehensive chemical exchange program in FY08. A Chemical Exchange Program (CEP) team was formed to implement this target. The team consists of representatives from the Chemical Information System (CIS), Pollution Prevention (P2), the HWMF, Procurement and the Environmental Management System (EMS). The CEP Team performed benchmarking and conducted a life-cycle analysis of the current management of chemicals at SNL/NM and compared it to Chemical Exchange alternatives. Those alternatives are as follows: (1) Revive the 'Virtual' Chemical Exchange Program; (2) Re-implement a 'Physical' Chemical Exchange Program using a Chemical Information System; and (3) Transition to a Chemical Management Services System. The analysis and benchmarking study shows that the present management of chemicals at SNL/NM is significantly disjointed and a life-cycle or 'Cradle-to-Grave' approach to chemical management is needed. This approach must consider the purchasing and maintenance costs as well as the cost of ultimate disposal of the chemicals and materials. A chemical exchange is needed as a mechanism to re-apply chemicals on site. This

  3. Optimization of quantitative infrared analysis

    Science.gov (United States)

    Duerst, Richard W.; Breneman, W. E.; Dittmar, Rebecca M.; Drugge, Richard E.; Gagnon, Jim E.; Pranis, Robert A.; Spicer, Colleen K.; Stebbings, William L.; Westberg, J. W.; Duerst, Marilyn D.

    1994-01-01

    A number of industrial processes, especially quality assurance procedures, accept information on relative quantities of components in mixtures, whenever absolute values for the quantitative analysis are unavailable. These relative quantities may be determined from infrared intensity ratios even though known standards are unavailable. Repeatability [vs precision] in quantitative analysis is a critical parameter for meaningful results. In any given analysis, multiple runs provide "answers" with a certain standard deviation. Obviously, the lower the standard deviation, the better the precision. In attempting to minimize the standard deviation and thus improve precision, we need to delineate which contributing factors we have control over (such as sample preparation techniques, data analysis methodology) and which factors we have little control over (environmental and instrument noise, for example). For a given set of conditions, the best instrumental precision achievable on an IR instrument should be determinable. Traditionally, the term "signal-to-noise" (S/N) has been used for a single spectrum, realizing that S/N improves with an increase in number of scans coadded for generation of that single spectrum. However, the S/N ratio does not directly reflect the precision achievable for an absorbing band. We prefer to use the phrase "maximum achievable instrument precision" (MAIP), which is equivalent to the minimum relative standard deviation for a given peak (either height or area) in spectra. For a specific analysis, the analyst should have in mind the desired precision. Only if the desired precision is less than the MAIP will the analysis be feasible. Once the MAIP is established, other experimental procedures may be modified to improve the analytical precision, if it is below that which is expected (the MAIP).
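
    As a small illustration of the precision bookkeeping described above, the sketch below computes the relative standard deviation of replicate peak intensities and compares it with a target precision; the replicate values and the 0.5% target are hypothetical placeholders, and the comparison is only a stand-in for the MAIP concept.

```python
# Sketch: relative standard deviation (RSD) of replicate peak measurements
# compared against a desired precision, in the spirit of the MAIP discussion
# above.  The replicate values and the target are illustrative placeholders.
import statistics

replicate_peak_areas = [12.31, 12.36, 12.28, 12.40, 12.33, 12.35]  # a.u.

mean_area = statistics.mean(replicate_peak_areas)
rsd_pct = 100 * statistics.stdev(replicate_peak_areas) / mean_area

desired_precision_pct = 0.5     # analyst's target relative precision (assumed)
print(f"observed RSD = {rsd_pct:.2f}%  (target {desired_precision_pct}%)")
print("analysis feasible" if rsd_pct <= desired_precision_pct else
      "target precision not reachable under these conditions")
```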

  4. Automated quantitative analysis for pneumoconiosis

    Science.gov (United States)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    An automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities, and the size and shape of the opacities are further classified from measurements of the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired features such as the images of blood vessels and ribs in the chest X-ray image. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.

  5. A quantitative fitness analysis workflow.

    Science.gov (United States)

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and
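
    A minimal sketch of the kind of growth-curve fit the workflow relies on (an illustration only, not the authors' published QFA software): logistic growth is fitted to the cell-density estimates obtained from successive photographs, and the doubling time is derived from the fitted exponential-phase rate. All numbers below are hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, g0):
            """Logistic growth: carrying capacity K, rate r (1/h), inoculum density g0."""
            return K * g0 * np.exp(r * t) / (K + g0 * (np.exp(r * t) - 1.0))

        # Hypothetical cell-density estimates from plate photographs (arbitrary units).
        t = np.array([0, 6, 12, 18, 24, 30, 36, 48], dtype=float)        # hours
        density = np.array([0.002, 0.004, 0.012, 0.04, 0.11, 0.21, 0.27, 0.30])

        popt, _ = curve_fit(logistic, t, density, p0=(0.3, 0.3, 0.002), maxfev=10000)
        K, r, g0 = popt
        doubling_time = np.log(2.0) / r   # hours, from the exponential-phase rate
        print(f"K = {K:.3f}, r = {r:.3f} /h, doubling time = {doubling_time:.2f} h")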

  6. Global identification and quantitative analysis of chemical constituents in traditional Chinese medicinal formula Qi-Fu-Yin by ultra-high performance liquid chromatography coupled with mass spectrometry.

    Science.gov (United States)

    Li, Meng-Ning; Dong, Xin; Gao, Wen; Liu, Xin-Guang; Wang, Rui; Li, Ping; Yang, Hua

    2015-10-10

    Qi-Fu-Yin (QFY), a classical traditional Chinese medicine formula, has been shown by modern pharmacological studies to have significant neuroprotective effects. However, the chemical constituents of QFY have not been fully explored. In this study, an ultra-high performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry (UHPLC-QTOF MS) method was developed for comprehensive analysis of the chemical constituents in QFY. By using characteristic ions and fragmentation rules, a reliable identification of 156 compounds is described here, including 69 triterpene saponins, 23 oligosaccharide esters, 22 flavonoids, 9 alkaloids, 9 phenolic acids, 8 phthalides, 7 phenylethanoid glycosides, 3 xanthones, 3 sesquiterpene lactones, 2 ionones and 1 iridoid glycoside. Twenty-six major compounds were then determined in a single run by UHPLC coupled with triple quadrupole tandem mass spectrometry (QQQ MS) with fast positive/negative polarity switching, which allows the acquisition of MS data in both ionization modes in a single run. The proposed method was then validated in terms of linearity, accuracy, precision and recovery. The overall recoveries for the 26 analytes ranged from 91.35% to 109.58%, with RSDs ranging from 0.82% to 4.83%. In addition, the content of the 26 analytes in QFY prepared from five batches of herbal materials was also analyzed. These results demonstrated that our present method was effective and reliable for comprehensive quality evaluation of QFY. Meanwhile, the study might provide the chemical evidence for revealing the material basis of its therapeutic effects. PMID:26112926

  7. Quantitative analysis of endogenous compounds.

    Science.gov (United States)

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-01

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, background subtraction, surrogate matrix, and surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. However, in the surrogate matrix approach, various matrices such as artificial, stripped, and neat matrices are used as surrogate matrices for the actual matrix of the study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantify endogenous compounds and, regardless of what approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses. PMID
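
    As an illustration of the first approach named above, the standard-addition calculation reduces to a linear fit of signal versus added concentration, with the endogenous concentration obtained from the magnitude of the x-intercept. The sketch below is generic and uses hypothetical numbers.

        import numpy as np

        # Hypothetical standard-addition data: known analyte amounts spiked into
        # aliquots of the same study sample, and the corresponding LC-MS/MS responses.
        added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])       # ng/mL added
        signal = np.array([1.52, 2.31, 3.05, 4.66, 7.78])    # arbitrary response units

        slope, intercept = np.polyfit(added, signal, 1)

        # Extrapolating the fitted line to zero signal gives the endogenous level.
        endogenous = intercept / slope
        print(f"estimated endogenous concentration: {endogenous:.1f} ng/mL")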

  8. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, ``Process Safety Management for Highly Hazardous Chemicals`` and ``Chemical Process Hazards Analysis,`` is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, ``Chemical Process Hazards Analysis,`` is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook ``Process Safety Management for Highly Hazardous Chemicals`` (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  9. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for the quantitative analysis of submarine pipeline routing risk is provided, developing the study from qualitative to quantitative analysis. The characteristics of the potential risk of the submarine pipeline system were considered, and grey-mode identification theory was used. The study process was composed of three parts: establishing the index system for routing risk quantitative analysis, establishing the grey-mode identification model for routing risk quantitative analysis, and establishing the standard for the mode identification result. A computed example shows that this model can directly and concisely reflect the hazard degree of a routing and supports future routing selection.

  10. A traceable quantitative infrared spectral database of chemical agents

    Science.gov (United States)

    Samuels, Alan C.; Williams, Barry R.; Ben-David, Avishai; Hulet, Melissa; Roelant, Geoffrey J.; Miles, Ronald W., Jr.; Green, Norman; Zhu, Changjiang

    2004-12-01

    Recent experimental field trials have demonstrated the ability of both Fourier transform infrared (FTIR) and active light detection and ranging (LIDAR) sensors to detect particulate matter, including simulants for biological materials. Both systems require a reliable, validated, quantitative database of the mid-infrared spectra of the targeted threat agents. While several databases are available, none are validated and traceable to primary standards for reference-quality reliability. Most of the existing chemical agent databases have been developed using a bubbler or syringe-fed vapor generator, and all are fraught with errors and uncertainties as a result. In addition, no quantitative condensed-phase data on low-volatility chemical and biological agents have been reported. We are filling this data gap through the systematic measurement of gas-phase chemical agent materials generated using a unique vapor-liquid equilibrium approach that allows the quantitation of the cross-sections using a mass measurement calibrated to primary, National Institute of Standards and Technology (NIST) standards. In addition, we have developed quantitative methods for the measurement of condensed-phase materials in both transmission and diffuse reflectance modes. The latter data are valuable for the development of complex index of refraction data, which is required for both system modeling and algorithm development of both FTIR- and LIDAR-based sensor systems. We will describe our measurement approach and progress toward compiling the first known comprehensive and validated database of both vapor- and condensed-phase chemical warfare agents.
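
    For context, converting a measured vapour-phase absorbance into an absorption cross-section is a straightforward Beer-Lambert calculation once the analyte number density is fixed by the mass-based quantitation. The sketch below is generic; the pressure, temperature, path length, and absorbance values are hypothetical and not taken from the database described here.

        import numpy as np

        K_B = 1.380649e-23  # Boltzmann constant, J/K

        def number_density(partial_pressure_pa, temperature_k):
            """Ideal-gas number density (molecules per m^3)."""
            return partial_pressure_pa / (K_B * temperature_k)

        def cross_section(absorbance_base10, n_per_m3, path_m):
            """Absorption cross-section (m^2/molecule) from a base-10 absorbance."""
            return absorbance_base10 * np.log(10.0) / (n_per_m3 * path_m)

        # Hypothetical example: 10 Pa of analyte at 298 K in a 10 cm cell, A = 0.25.
        n = number_density(10.0, 298.0)
        sigma = cross_section(0.25, n, 0.10)
        print(f"n = {n:.3e} m^-3, sigma = {sigma:.3e} m^2/molecule")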

  11. Quantitative Analysis of Glaciated Landscapes

    Science.gov (United States)

    Huerta, A. D.

    2005-12-01

    The evolution of glaciated mountains is at the heart of the debate over Late Cenozoic linkages between climate and tectonics. Traditionally, the development of high summit elevations is attributed to tectonic processes. However, much of the high elevation of the Transantarctic Mountains can be attributed solely to uplift in response to glacial erosion (Stern et al., 2005). The Transantarctic Mountains (TAM) provide an unparalleled opportunity to study glacial erosion. The mountain range has experienced glacial conditions since Oligocene time. In the higher and drier regions of the TAM there is only a thin veneer of ice and snow draping the topography. In these regions landforms that were shaped during earlier climatic conditions are preserved. In fact, both glacial and fluvial landforms dating as far back as 18 Ma are preserved locally. In addition, the TAM are ideal for studying glacial erosion since the range has experienced minimal tectonic uplift since late Oligocene time, thus isolating the erosion signal from any tectonic signal. With the advent of digital data sets and GIS methodologies, quantitative analysis can identify key aspects of glaciated landscape morphology, and thus develop powerful analytical techniques for objective study of glaciation. Inspection of USGS topographic maps of the TAM reveals that mountain tops display an extreme range of glacial modification. For example, in the Mt. Rabot region (83°-84° S), mountain peaks are strongly affected by glaciation; cirque development is advanced, with cirque diameters in the range of several kilometers, and cirque confluence has resulted in the formation of ``knife-edge'' arêtes up to 10 km long. In contrast, in the Mt. Murchison area (73°-74° S) cirque development is youthful, and there is minimal development of arêtes. Preliminary work indicates that analysis of DEMs and contour lines can be used to distinguish the degree of glaciation. In particular, slope, curvature, and power spectrum analysis
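
    The slope and curvature metrics mentioned above can be computed directly from a gridded DEM with finite differences; the sketch below is generic, with a synthetic cirque-like surface standing in for real topography.

        import numpy as np

        def slope_and_laplacian(dem, cell_size):
            """Slope (degrees) and Laplacian curvature of a gridded DEM via finite differences."""
            dz_dy, dz_dx = np.gradient(dem, cell_size)
            slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
            d2_dy, _ = np.gradient(dz_dy, cell_size)
            _, d2_dx = np.gradient(dz_dx, cell_size)
            return slope_deg, d2_dx + d2_dy

        # Synthetic depression as a stand-in for a real DEM tile (100 m cells).
        y, x = np.mgrid[0:200, 0:200]
        dem = 2000.0 - 500.0 * np.exp(-(((x - 100) ** 2 + (y - 100) ** 2) / 2000.0))

        slope, curv = slope_and_laplacian(dem, cell_size=100.0)
        print(f"max slope: {slope.max():.1f} deg, mean curvature: {curv.mean():.2e}")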

  12. Scale Alpha and Beta of Quantitative Convergence and Chemical Reactivity Analysis in Dual Cholinesterase/Monoamine Oxidase Inhibitors for the Alzheimer Disease Treatment Using Density Functional Theory (DFT)

    Directory of Open Access Journals (Sweden)

    Alejandro Morales-Bayuelo

    2013-01-01

    Full Text Available Molecular quantum similarity descriptors and Density Functional Theory (DFT)-based reactivity descriptors were studied for a series of cholinesterase/monoamine oxidase inhibitors used for the treatment of Alzheimer's disease (AD). This theoretical study is expected to shed some light on molecular aspects that could contribute to the knowledge of the molecular mechanics behind the interactions of these molecules with acetylcholinesterase (AChE) and butyrylcholinesterase (BuChE), as well as with monoamine oxidase (MAO) A and B. The Topogeometrical Superposition Algorithm to handle flexible molecules (TGSA-Flex) alignment method was used to solve the problem of relative orientation in the quantum similarity (QS) field. Using the molecular quantum similarity (MQS) field and reactivity descriptors supported by DFT, it was possible to quantify the steric and electrostatic effects through the Coulomb and Overlap quantitative convergence scales (alpha and beta). In addition, an analysis of reactivity indexes is developed, using global and local descriptors, identifying the binding sites and selectivity of the cholinesterase/monoamine oxidase inhibitors, understanding the retrodonation process, and providing new insight for drug design in a disease as difficult to control as Alzheimer's.

  13. Semi-quantitative chemical analysis of hard coatings by Raman micro-spectroscopy: the aluminium chromium nitride system as an example.

    Science.gov (United States)

    Kaindl, R; Sartory, B; Neidhardt, J; Franz, R; Reiter, A; Polcik, P; Tessadri, R; Mitterer, C

    2007-11-01

    A new method for chemical analyses of nitride-based hard coatings is presented. Raman band shifts in the spectra of Al(x)Cr(1-x)N coatings, deposited by physical vapour deposition from Al(x)Cr(1-x) targets with x (T,Al) = 0, 0.25, 0.50, 0.70 and 0.85, are calibrated using compositional data of the coatings derived by elastic recoil detection analysis (ERDA) and electron probe micro-analysis (EPMA). Inserting the composition-dependent Raman shift of a combinatorial acoustic-optic lattice mode into an empirically derived equation allows the determination of Al/Cr ratios of the coating with an accuracy of about +/-2%. Spot, line and area analyses of coated cemented carbide and cold work steel samples by using a computer-controlled, motorized x,y-stage are demonstrated and the most important errors influencing precision and accuracy are discussed. [Figure: Raman map of a coated cold-work steel sample.] PMID:17932660

  14. Chemical fingerprint analysis and quantitative determination of pregnanes from aerial parts of caralluma species using HPLC-UV and identification by LC-ESI-TOF

    Science.gov (United States)

    A HPLC method is developed for the quantitative determination of five pregnane derivatives from aerial parts of Caralluma species and dietary supplements. The method is validated for linearity, repeatability, limits of detection (LOD) and limits of quantification (LOQ). The limits of detection and l...

  15. Radiometric chemical analysis

    International Nuclear Information System (INIS)

    The radiometric method of analysis is noted for its sensitivity and its simplicity in both apparatus and procedure. A few inexpensive radioactive reagents permit the analysis of a wide variety of chemical elements and compounds. Any particular procedure is generally applicable over a very wide range of concentrations. It is potentially an analytical method of great industrial significance. Specific examples of analyses are cited to illustrate the potentialities of ordinary equipment. Apparatus specifically designed for radiometric chemistry may shorten the time required, and increase the precision and accuracy for routine analyses. A sensitive and convenient apparatus for the routine performance of radiometric chemical analysis is a special type of centrifuge which has been used in obtaining the data presented in this paper. The radioactivity of the solution is measured while the centrifuge is spinning. This device has been used as the basis for an automatic analyser for phosphate ion, programmed to follow a sequence of sampling of the unknown, reagent mixing, centrifugation, counting, data presentation, and phosphate replenishment. This analyser can repeatedly measure phosphate concentration in the range of 5 to 50 ppm with an accuracy of ±5%. (author)

  16. Quantitative genetic activity graphical profiles for use in chemical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Waters, M.D. [Environmental Protection Agency, Washington, DC (United States); Stack, H.F.; Garrett, N.E.; Jackson, M.A. [Environmental Health Research and Testing, Inc., Research Triangle Park, NC (United States)

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provided useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  17. Chemical fingerprinting and quantitative constituent analysis of Siwu decoction categorized formulae by UPLC-QTOF/MS/MS and HPLC-DAD

    OpenAIRE

    Su, Shulan; Cui, Wenxia; Zhou, Wei; Duan, Jin-Ao; Shang, Erxin; Yuping TANG

    2013-01-01

    Background Siwu decoction categorized formulae (SWDCF) are widely used for treating gynecological diseases. This study aims to elucidate the differences in bioactive constituents in SWDCF by ultra-high performance liquid chromatography coupled with time-of-flight mass spectrometry (UPLC-QTOF-MS/MS) and HPLC-DAD. Methods An efficient method based on UPLC-QTOF-MS/MS was developed for identifying the chemical profiles of SWDCF. An HPLC-DAD method was used for quantifying seven chemical ma...

  18. Quantitative analysis of PET measurements in tumors

    International Nuclear Information System (INIS)

    Positron emission tomography (PET) has been used for the evaluation of the characteristics of various tumors. The role of PET in oncology has evolved from a pure research tool to a methodology of enormous clinical potential. The unique characteristics of PET imaging make sophisticated quantitation possible. Several quantitative methods, such as standardized uptake values (SUV), the simplified quantitation method, Patlak graphical analysis, and Sokoloff's glucose metabolism measurement, have been used in the field of oncology. However, each quantitative method has limitations of its own. For example, the SUV has been used as a quantitative index of glucose metabolism for tumor classification and for monitoring response to treatment, even though it depends on the blood glucose level, the body configuration of the patient, and the scanning time. The quantitative methods of PET are reviewed and strategies for implementing these methods are presented
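
    One of the indices named above, the SUV, amounts to normalizing the measured tissue activity concentration by the injected dose per unit body weight; a minimal sketch with hypothetical values follows (decay correction to scan time is omitted for brevity).

        def standardized_uptake_value(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
            """Body-weight-normalized SUV, assuming a tissue density of 1 g/mL."""
            tissue_kbq_per_g = tissue_kbq_per_ml          # 1 mL ~ 1 g assumption
            dose_kbq_per_g = injected_dose_mbq * 1000.0 / (body_weight_kg * 1000.0)
            return tissue_kbq_per_g / dose_kbq_per_g

        # Hypothetical example: 12 kBq/mL lesion uptake, 370 MBq injected, 70 kg patient.
        print(f"SUV = {standardized_uptake_value(12.0, 370.0, 70.0):.2f}")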

  19. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed that are based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait. PMID:26080172

  20. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use in order to make the memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and especially analyzed Boehm's C code, which is a real-time mark-sweep GC running under Linux and the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. Reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory management techniques, even for low-level languages. It is more a trade-off for a given system than an all-or-nothing proposition.

  1. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  2. Raman scattering quantitative analysis of the anion chemical composition in kesterite Cu{sub 2}ZnSn(S{sub x}Se{sub 1−x}){sub 4} solid solutions

    Energy Technology Data Exchange (ETDEWEB)

    Dimitrievska, Mirjana, E-mail: mdimitrievska@irec.cat [Catalonia Institute for Energy Research (IREC), Jardins de les Dones de Negre 1 2pl., 08930 Sant Adrià del Besòs, Barcelona (Spain); Gurieva, Galina [Helmholtz Centre Berlin for Materials and Energy, Department Crystallography, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Xie, Haibing; Carrete, Alex [Catalonia Institute for Energy Research (IREC), Jardins de les Dones de Negre 1 2pl., 08930 Sant Adrià del Besòs, Barcelona (Spain); Cabot, Andreu [Catalonia Institute for Energy Research (IREC), Jardins de les Dones de Negre 1 2pl., 08930 Sant Adrià del Besòs, Barcelona (Spain); Institució Catalana de Recerca i Estudis Avançats – ICREA, Passeig Lluís Companys 23, 08010 Barcelona (Spain); Saucedo, Edgardo [Catalonia Institute for Energy Research (IREC), Jardins de les Dones de Negre 1 2pl., 08930 Sant Adrià del Besòs, Barcelona (Spain); Pérez-Rodríguez, Alejandro [Catalonia Institute for Energy Research (IREC), Jardins de les Dones de Negre 1 2pl., 08930 Sant Adrià del Besòs, Barcelona (Spain); IN2UB, Departament d’Electrònica, Universitat de Barcelona, C. Martí i Franquès 1, 08028 Barcelona (Spain); and others

    2015-04-15

    Highlights: • An optical method for the quantitative measurement of [S]/([S] + [Se]) in CZTSSe is presented. • It is based on Raman spectroscopy and covers the whole S–Se range of compositions. • The proposed method is independent of crystal quality, experimental conditions and type of material. • The validity of the technique is proven by comparison with independent composition measurements (XRD and EQE). • Test of the method on data published in the literature has given satisfactory results. - Abstract: A simple and non-destructive optical methodology for the quantitative measurement of the [S]/([S] + [Se]) anion composition in kesterite Cu{sub 2}ZnSn(S{sub x}Se{sub 1−x}){sub 4} (CZTSSe) solid solutions by means of Raman spectroscopy in the whole S–Se range of compositions has been developed. This methodology is based on the dependence of the integral intensity ratio of Raman bands sensitive to anion vibrations on the [S]/([S] + [Se]) composition of the kesterite solid solutions. The calibration of the parameters used in this analysis involved the synthesis of a set of CZTSSe powders by the solid state reaction method, spanning the range from pure Cu{sub 2}ZnSnS{sub 4} to pure Cu{sub 2}ZnSnSe{sub 4}. The validity of the methodology has been tested on different sets of independent samples, including also non-stoichiometric device-grade CZTSSe layers with different compositions and films that were synthesized by solution-based processes with different crystalline quality. In all cases, the comparison of the results obtained from the analysis of the intensity of the Raman bands with independent composition measurements performed by different techniques such as X-ray diffraction and external quantum efficiency has confirmed the satisfactory performance of the developed methodology for the quantitative analysis of these compounds, independently of the crystal quality or the method of synthesis. Further strong support for the methodology performance has been

  3. A quantitative assessment of chemical perturbations in thermotropic cyanobiphenyls.

    Science.gov (United States)

    Guerra, Sebastiano; Dutronc, Thibault; Terazzi, Emmanuel; Guénée, Laure; Piguet, Claude

    2016-05-25

    Chemical programming of the temperature domains of existence of liquid crystals is greatly desired by both academic workers and industrial partners. This contribution proposes to combine empirical approaches, which rely on systematic chemical substitutions of mesogenic molecules followed by thermal characterizations, with a rational thermodynamic assessment of the effects induced by chemical perturbations. Taking into account the similarities which exist between temperature-dependent cohesive Gibbs free energy densities (CFEDs) and pressure-temperature phase diagrams modeled with the Clapeyron equation, chemical perturbations are considered as pressure increments along phase boundaries, which control the thermotropic liquid crystalline properties. Taking the familiar calamitic amphiphilic cyanobiphenyl-type mesogens as models, the consequences of (i) methyl substitution of the aromatic polar heads and (ii) connections of bulky silyl groups at the termini of the apolar flexible alkyl chain on the melting and clearing temperatures are quantitatively analyzed. Particular efforts were focused on the translation of the thermodynamic rationalization into a predictive tool accessible to synthetic chemists mainly interested in designing liquid crystals with specific technological applications. PMID:27173940
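
    The pressure-temperature analogy invoked above rests on the Clausius-Clapeyron relation for a phase boundary; in this picture a chemical substitution plays the role of a pressure increment displacing the transition temperature along that boundary. A schematic statement of the relation (standard thermodynamics, not a formula quoted from the paper) is

        \frac{dP}{dT} = \frac{\Delta H_{\mathrm{tr}}}{T\,\Delta V_{\mathrm{tr}}}
        \quad\Longrightarrow\quad
        \Delta T \approx \frac{T\,\Delta V_{\mathrm{tr}}}{\Delta H_{\mathrm{tr}}}\,\Delta P ,

    where ΔH_tr and ΔV_tr are the enthalpy and volume changes at the transition (melting or clearing) and ΔP stands for the effective perturbation introduced by the chemical substitution.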

  4. Quantitative Analysis in Multimodality Molecular Imaging

    International Nuclear Information System (INIS)

    PET offers the possibility of truly quantitative (physiological) measurements of tracer concentration in vivo. However, there are several issues limiting both visual qualitative interpretation and quantitative analysis capabilities of reconstructed PET images that must be considered in order to fully realize this potential. The major challenges to quantitative PET can be categorized into 5 classes: (i) factors related to imaging system performance and data acquisition protocols (instrumentation and measurement factors), (ii) those related to the physics of photon interaction with biologic tissues (physical factors), (iii) image reconstruction (reconstruction factors), (iv) factors related to patient motion and other physiological issues (physiological factors), and (v) issues related to difficulties in developing accurate tracer kinetic models, especially at the voxel level (methodological factors). This paper reflects the tremendous increase in interest in quantitative molecular imaging using PET as both a clinical and a research imaging modality in the past decade. It offers an overview of the entire range of quantitative PET imaging from basic principles to the various steps required for obtaining quantitatively accurate data from dedicated standalone PET and combined PET/CT and PET/MR systems, including data collection methods and algorithms used to correct for physical degrading factors as well as image processing and analysis techniques and their clinical and research applications. The impact of physical degrading factors, including photon attenuation, the contribution from photons scattered in the patient, and the partial volume effect, on the diagnostic quality and quantitative accuracy of PET data will be discussed. Considerable advances have been made and much worthwhile research has focused on the development of quantitative imaging protocols incorporating accurate data correction techniques and sophisticated image reconstruction algorithms. The fundamental concepts of

  5. Quantitative analysis of learning object repositories

    OpenAIRE

    Ochoa, Xavier; Duval, Erik

    2008-01-01

    This paper conducts the first detailed quantitative study of the process of publication of learning objects in repositories. This process has often been discussed theoretically, but never empirically evaluated. Several questions related to basic characteristics of the publication process are raised at the beginning of the paper and answered through quantitative analysis. To provide a wide view of the publication process, this paper analyzes four types of repositories: Learning Object Repositor...

  6. Quantitative analysis of learning object repositories

    OpenAIRE

    Ochoa X.; Duval E.

    2009-01-01

    This paper conducts the first detailed quantitative study of the process of publication of learning objects in repositories. This process has often been discussed theoretically, but never empirically evaluated. Several questions related to basic characteristics of the publication process are raised at the beginning of the paper and answered through quantitative analysis. To provide a wide view of the publication process, this paper analyzes four types of repositories: Learning Object Repositor...

  7. Editorial: quantitative analysis of neuroanatomy

    OpenAIRE

    Julian M L Budd; Cuntz, Hermann; Eglen, Stephen J.; Krieger, Patrik

    2015-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. Large-scale projects were recently launched with the aim of providing infrastructure for brain simulations. These projects will increase the need for a precise understanding of brain structure, e.g., through statistical analysis and models. From articles in this Research Topic, we identify three main theme...

  8. Quantitative histogram analysis of images

    Science.gov (United States)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with the necessary files for the LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
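
    The histogram parameters listed above are straightforward to reproduce outside LabVIEW; the sketch below is a generic Python illustration (not part of the HAWGC package) for an intensity-linear greyscale image, with an optional lower threshold to drop a constant background.

        import numpy as np
        from scipy.stats import skew, kurtosis

        def histogram_stats(grey, threshold=None):
            """Basic brightness statistics of a greyscale image, optionally above a threshold."""
            values = grey.ravel().astype(float)
            if threshold is not None:
                values = values[values > threshold]   # remove constant background signal
            hist, _ = np.histogram(values, bins=256)
            return {
                "mean": values.mean(),
                "std": values.std(ddof=1),
                "min": values.min(),
                "max": values.max(),
                "mode_bin": int(hist.argmax()),       # index of the most populated of 256 bins
                "skewness": skew(values),
                "kurtosis": kurtosis(values),
            }

        # Hypothetical image: dim background plus a brighter, fluorescence-like region.
        rng = np.random.default_rng(2)
        img = rng.normal(20.0, 3.0, (512, 512))
        img[200:300, 200:300] += 60.0

        print(histogram_stats(img, threshold=10.0))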

  9. Simultaneous Qualitative and Quantitative Analysis of Multiple Chemical Constituents in YiQiFuMai Injection by Ultra-Fast Liquid Chromatography Coupled with Ion Trap Time-of-Flight Mass Spectrometry.

    Science.gov (United States)

    Liu, Chunhua; Ju, Aichun; Zhou, Dazheng; Li, Dekun; Kou, Junping; Yu, Boyang; Qi, Jin

    2016-01-01

    YiQiFuMai injection (YQFM) is a modern lyophilized powder preparation derived from the traditional Chinese medicine Sheng-mai san (SMS) used for treating cardiovascular diseases, such as chronic heart failure. However, its chemical composition has not been fully elucidated, particularly for the preparation derived from Ophiopogon japonicus. This study aimed to establish a systematic and reliable method to quickly and simultaneously analyze the chemical constituents in YQFM by ultra-fast liquid chromatography coupled with ion trap time-of-flight mass spectrometry (UFLC-IT-TOF/MS). Sixty-five compounds in YQFM were tentatively identified by comparison with reference substances or literature data. Furthermore, twenty-one compounds, including three ophiopogonins, fifteen ginsenosides and three lignans were quantified by UFLC-IT-TOF/MS. Notably, this is the first determination of steroidal saponins from O. japonicus in YQFM. The relative standard deviations (RSDs) of intra- and inter-day precision, reproducibility and stability were <4.9% and all analytes showed good linearity (R² ≥ 0.9952) and acceptable recovery of 91.8%-104.2% (RSD ≤ 5.4%), indicating that the methods were reliable. These methods were successfully applied to quantitative analysis of ten batches of YQFM. The developed approach can provide useful and comprehensive information for quality control, further mechanistic studies in vivo and clinical application of YQFM. PMID:27213307

  10. Simultaneous Qualitative and Quantitative Analysis of Multiple Chemical Constituents in YiQiFuMai Injection by Ultra-Fast Liquid Chromatography Coupled with Ion Trap Time-of-Flight Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Chunhua Liu

    2016-05-01

    Full Text Available YiQiFuMai injection (YQFM) is a modern lyophilized powder preparation derived from the traditional Chinese medicine Sheng-mai san (SMS) used for treating cardiovascular diseases, such as chronic heart failure. However, its chemical composition has not been fully elucidated, particularly for the preparation derived from Ophiopogon japonicus. This study aimed to establish a systematic and reliable method to quickly and simultaneously analyze the chemical constituents in YQFM by ultra-fast liquid chromatography coupled with ion trap time-of-flight mass spectrometry (UFLC-IT-TOF/MS). Sixty-five compounds in YQFM were tentatively identified by comparison with reference substances or literature data. Furthermore, twenty-one compounds, including three ophiopogonins, fifteen ginsenosides and three lignans, were quantified by UFLC-IT-TOF/MS. Notably, this is the first determination of steroidal saponins from O. japonicus in YQFM. The relative standard deviations (RSDs) of intra- and inter-day precision, reproducibility and stability were <4.9% and all analytes showed good linearity (R2 ≥ 0.9952) and acceptable recovery of 91.8%–104.2% (RSD ≤ 5.4%), indicating that the methods were reliable. These methods were successfully applied to quantitative analysis of ten batches of YQFM. The developed approach can provide useful and comprehensive information for quality control, further mechanistic studies in vivo and clinical application of YQFM.

  11. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  12. Quantitative surface analysis without reference samples

    International Nuclear Information System (INIS)

    X-ray photoelectron spectroscopy and X-ray induced Auger electron spectroscopy can be applied in quantitative analysis without reference samples. The theory is presented in comparison to X-ray fluorescence analysis. The simplicity of evaluation of measured data is shown and practical examples are given. (Author)

  13. Qualitative and quantitative spectro-chemical analysis of dates using UV-pulsed laser induced breakdown spectroscopy and inductively coupled plasma mass spectrometry.

    Science.gov (United States)

    Mehder, A O; Habibullah, Y B; Gondal, M A; Baig, Umair

    2016-08-01

    Laser Induced Breakdown Spectroscopy (LIBS) is demonstrated for the spectral analysis of nutritional and toxic elements present in several varieties of date fruit samples available in the Saudi Arabian market. The method analyzes the optical emission of a test sample when subjected to pulsed laser ablation. In this demonstration, our primary focus is on calcium (Ca) and magnesium (Mg), as nutritional elements, and on chromium (Cr), as a toxic element. The local thermodynamic equilibrium (LTE) condition was confirmed prior to the elemental characterization of date samples to ensure accuracy of the LIBS analysis. This was achieved by measuring parameters associated with the plasma, such as the electron temperature and the electron number density. These plasma parameters aid interpretation of processes such as ionization, dissociation, and excitation occurring in the plasma plume formed by ablating the date palm sample. The minimum detection limit was established from calibration curves that involved plotting the LIBS signal intensity as a function of the known concentrations of standard date samples. The concentrations of Ca and Mg detected in different varieties of date samples were between 187 and 515 mg L(-1) and between 35 and 196 mg L(-1), respectively, while the Cr concentration measured between 1.72 and 7.76 mg L(-1). In order to optimize our LIBS system, we have studied how the LIBS signal intensity depends on the incident laser energy and the delay time. In order to validate our LIBS analysis results, standard techniques such as inductively coupled plasma mass spectrometry (ICP-MS) were also applied to identical (duplicate) date samples to those used for the LIBS analysis. The LIBS results exhibit remarkable agreement with those obtained from the ICP-MS analysis. In addition, the fingerprint wavelengths of other elements present in date samples were also identified and are reported here, which, to the best of our knowledge, has not been previously reported. PMID:27216665
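
    The calibration-curve step described above also fixes the detection limit: with a linear fit of line intensity against known concentration, the limit of detection is commonly taken as three times the blank standard deviation divided by the slope. The sketch below is generic, with hypothetical numbers.

        import numpy as np

        # Hypothetical calibration data: emission-line intensity vs. known concentration.
        conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])            # mg/L
        intensity = np.array([120.0, 980.0, 1890.0, 3720.0, 7450.0])
        blank_replicates = np.array([118.0, 123.0, 119.0, 125.0, 121.0])

        slope, intercept = np.polyfit(conc, intensity, 1)
        lod = 3.0 * blank_replicates.std(ddof=1) / slope              # 3-sigma criterion

        print(f"slope = {slope:.2f} counts per mg/L, LOD = {lod:.2f} mg/L")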

  14. Application of ARAMIS in the quantitative risk analysis of a chemical tank farm

    Institute of Scientific and Technical Information of China (English)

    张新梅; 曾岳梅; 陈晨

    2013-01-01

    ARAMIS (Accidental Risk Assessment Methodology for Industries), proposed by the European Commission, was introduced for the quantitative risk analysis of a chemical tank farm. On the basis of the equipment types selected and the critical events they may give rise to, the probability of each critical event was calculated by constructing a bow-tie diagram. The fire accident scenarios that could occur in the tank farm were then selected using a risk matrix combining the probability and the consequence of each accident. The vulnerability of the receptors in the surroundings of the tank farm was assessed with the vulnerability models of ARAMIS, and quantitative risk analysis results were obtained for the specific accident scenarios. The results indicate that ARAMIS performs well in accident risk characterization and risk-propagation analysis and can provide technical support for risk assessment and optimization in chemical industry zones.
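
    On the fault-tree side of the bow-tie, the critical-event probability is obtained by propagating basic-event probabilities through AND/OR gates; the sketch below is a generic illustration (the events and numbers are hypothetical and not taken from the ARAMIS study).

        from math import prod

        def p_and(probabilities):
            """AND gate: all independent basic events must occur."""
            return prod(probabilities)

        def p_or(probabilities):
            """OR gate: at least one independent basic event occurs."""
            return 1.0 - prod(1.0 - p for p in probabilities)

        # Hypothetical annual probabilities of basic events for a tank overfill release.
        p_level_sensor_fails = 1e-2
        p_operator_misses_alarm = 5e-2
        p_overfill_protection_fails = p_and([p_level_sensor_fails, p_operator_misses_alarm])

        p_hose_rupture = 1e-3
        p_critical_event = p_or([p_overfill_protection_fails, p_hose_rupture])

        print(f"critical-event probability per year: {p_critical_event:.2e}")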

  15. Quantitative Reconstructions of 3D Chemical Nanostructures in Nanowires.

    Science.gov (United States)

    Rueda-Fonseca, P; Robin, E; Bellet-Amalric, E; Lopez-Haro, M; Den Hertog, M; Genuist, Y; André, R; Artioli, A; Tatarenko, S; Ferrand, D; Cibert, J

    2016-03-01

    Energy dispersive X-ray spectrometry is used to extract a quantitative 3D composition profile of heterostructured nanowires. The analysis of hypermaps recorded along a limited number of projections, with a preliminary calibration of the signal associated with each element, is compared to the intensity profiles calculated for a model structure with successive shells of circular, elliptic, or faceted cross sections. This discrete tomographic technique is applied to II-VI nanowires grown by molecular beam epitaxy, incorporating ZnTe and CdTe and their alloys with Mn and Mg, with typical size down to a few nanometers and Mn or Mg content as low as 10%. PMID:26837636

  16. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, from X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained demonstrate the usefulness of these methods, but their accuracy still needs to be improved. (Author)
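
    For orientation, the 'matrix-flushing' approach is closely related to the reference-intensity-ratio (RIR) formulation of quantitative XRD, in which each phase's strongest-line intensity is scaled by its RIR and the scaled intensities are normalized. The sketch below is a generic illustration of that calculation; the phases, intensities, and RIR values are hypothetical and do not reproduce this paper's procedure.

        def weight_fractions(intensities, rirs):
            """Weight fractions from peak intensities and reference intensity ratios (RIR)."""
            scaled = [i / k for i, k in zip(intensities, rirs)]
            total = sum(scaled)
            return [s / total for s in scaled]

        # Hypothetical pigment/extender mixture.
        phases = ["rutile", "calcite", "barite"]
        intensities = [5200.0, 3100.0, 1800.0]   # integrated intensities of strongest lines
        rirs = [3.4, 2.0, 4.4]                   # assumed RIR (I/Ic) values

        for name, w in zip(phases, weight_fractions(intensities, rirs)):
            print(f"{name}: {100.0 * w:.1f} wt%")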

  17. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x...

  18. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36 Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatographic techniques (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading should be enough to use it properly.

  19. Quantitative Proteomics Analysis of Leukemia Cells.

    Science.gov (United States)

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their roles in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  20. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    X-ray diffraction (XRD) is the only technique able to identify phases; all other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes

  1. Localized Quantitative Characterization of Chemical Functionalization Effects on Adhesion Properties of SWNT

    International Nuclear Information System (INIS)

    Chemical modification of single-walled carbon nanotubes (SWNT) has been found to be an excellent method to promote SWNT dispersion, and possibly to improve interaction with matrices via covalent bonding. It is thus a quite promising technique to enhance the mechanical properties of SWNT-reinforced nanocomposites. However, the underlying mechanism of SWNT chemical functionalization effects on interfacial strength is not quantitatively understood, limiting its usefulness in the design of nanocomposites. In this work, an atomic force microscopy (AFM)-based adhesive force mapping technique combined with a statistical analysis method was developed and implemented to study the adhesive interactions of small SWNT bundles functionalized by amino, epoxide, and hydroperoxide groups as compared to SDS-treated SWNT in a controlled environment. Finally, the importance of such localized quantitative measurements in SWNT-reinforced nanocomposite design and fabrication is also discussed.

  2. A comparative quantitative analysis of the IDEAL (iterative decomposition of water and fat with echo asymmetry and least-squares estimation) and the CHESS (chemical shift selection suppression) techniques in 3.0 T L-spine MRI

    Science.gov (United States)

    Kim, Eng-Chan; Cho, Jae-Hwan; Kim, Min-Hye; Kim, Ki-Hong; Choi, Cheon-Woong; Seok, Jong-min; Na, Kil-Ju; Han, Man-Seok

    2013-03-01

    This study was conducted on 20 patients who had undergone pedicle screw fixation between March and December 2010 to quantitatively compare a conventional fat suppression technique, CHESS (chemical shift selection suppression), and a new technique, IDEAL (iterative decomposition of water and fat with echo asymmetry and least squares estimation). The general efficacy and usefulness of the IDEAL technique were also evaluated. Fat-suppressed transverse-relaxation-weighted images and longitudinal-relaxation-weighted images were obtained before and after contrast injection by using these two techniques with a 1.5T MR (magnetic resonance) scanner. The obtained images were analyzed for image distortion, susceptibility artifacts and homogeneous fat removal in the target region. The results showed that the image distortion due to the susceptibility artifacts caused by implanted metal was lower in the images obtained using the IDEAL technique compared to those obtained using the CHESS technique. The results of a qualitative analysis also showed that compared to the CHESS technique, fewer susceptibility artifacts and more homogeneous fat removal were found in the images obtained using the IDEAL technique in a comparative image evaluation of the axial plane images before and after contrast injection. In summary, compared to the CHESS technique, the IDEAL technique showed a lower occurrence of susceptibility artifacts caused by metal and lower image distortion. In addition, more homogeneous fat removal was shown in the IDEAL technique.

  3. Quantitative Proteomic Analysis of the Human Nucleolus.

    Science.gov (United States)

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  4. New software for XRF quantitative analysis

    International Nuclear Information System (INIS)

    It is well known that in quantitative XRF analysis with empirical calibrations, even in the simplest case of binary mixtures, a relatively large number of standards is required. For samples containing more than 3 elements, the number of standards needed for calibration suddenly becomes prohibitive and the calibration curve has to be obtained by complicated multidimensional fits. In order to overcome this difficulty, new XRF analysis software has been developed, based exclusively on a theoretical treatment of photon interactions in the sample. Starting from the theoretical formulas of Shiraiwa and Fujino for primary and secondary fluorescence, modified to take into account the finite sample thickness, the total yield for a characteristic line Xj in a sample can be calculated as a function of its composition w vector = (w1,...,wn), where {wj} are the concentrations of all n constituent elements. A non-linear system can be written for a given sample with unknown composition. Choosing a number of equations equal to the number of identified elements, we obtain a non-linear system that can be solved numerically by Newton's algorithm. When a light element is known to be present in the sample but its lines cannot be seen in the spectrum (e.g. Al, C), the completeness equation Σwi = 1 must be added to the system to take into account the true composition. Based on the algorithm sketched above, a set of computer codes has been written, each one specific to one of the three types of excitation sources usually used in XRF: a collimated beam from an X-ray tube, a collimated isotopic source, and a ring-like isotopic source. The ring-source version is completed by a Monte Carlo code for the calculation of the incidence vs. detection angle weight matrix. Also, a version taking into account the chemical compounds present in the sample has been written for each type of excitation source. The programs were tested on many samples with known composition and the results were always below 10
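
    A schematic of the numerical core described above: given a forward model that predicts each characteristic-line yield from a trial composition, the concentrations are found by solving the resulting non-linear system together with the completeness equation. The forward model below is a deliberately simplified linear placeholder (it is not the Shiraiwa-Fujino formulation), so the sketch only illustrates the solver structure.

        import numpy as np
        from scipy.optimize import fsolve

        # Placeholder forward model: predicted line yields for composition w.
        # A real implementation would evaluate the primary/secondary fluorescence
        # expressions with absorption coefficients and the sample thickness.
        A = np.array([[1.00, -0.20, -0.10],
                      [-0.15, 0.90, -0.05],
                      [-0.05, -0.10, 0.80]])

        def predicted_yields(w):
            return A @ w

        measured = np.array([0.45, 0.28, 0.18])   # hypothetical measured relative yields

        def equations(w):
            # n-1 yield equations plus the completeness equation sum(w) = 1.
            residual = predicted_yields(w) - measured
            return np.append(residual[:-1], w.sum() - 1.0)

        w0 = np.full(3, 1.0 / 3.0)                # initial guess: equal concentrations
        solution = fsolve(equations, w0)
        print("estimated concentrations:", np.round(solution, 3))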

  5. Quantitative Remote Laser-Induced Breakdown Spectroscopy by Multivariate Analysis

    Science.gov (United States)

    Clegg, S. M.; Sklute, E. C.; Dyar, M. D.; Barefield, J. E.; Wiens, R. C.

    2007-12-01

    The ChemCam instrument selected for the Mars Science Laboratory (MSL) rover includes a remote Laser-Induced Breakdown Spectrometer (LIBS) that will quantitatively probe samples up to 9 m from the rover mast. LIBS is fundamentally an elemental analysis technique. LIBS involves focusing a Nd:YAG laser operating at 1064 nm onto the surface of the sample. The laser ablates material from the surface, generating an expanding plasma containing electronically excited ions, atoms, and small molecules. As these electronically excited species relax back to the ground state, they emit light at wavelengths characteristic of the species present in the sample. Some of this emission is directed into one of three dispersive spectrometers. In this paper, we studied a suite of 18 igneous and highly-metamorphosed samples from a wide variety of parageneses for which chemical analyses by XRF were already available. Rocks were chosen to represent a range of chemical composition from basalt to rhyolite, thus providing significant variations in all of the major element contents (Si, Fe, Al, Ca, Na, K, O, Ti, Mg, and Mn). These samples were probed at a 9 m standoff distance under experimental conditions that are similar to ChemCam. Extracting quantitative elemental concentrations from LIBS spectra is complicated by chemical matrix effects. Conventional methods for obtaining quantitative chemical data from LIBS analyses are compared with new multivariate analysis (MVA) techniques that appear to compensate for these chemical matrix effects. The traditional analyses use specific elemental peak heights or areas, which are compared with calibration curves for each element at one or more emission lines for a series of standard samples. Because of matrix effects, the calibration standards generally must have similar chemistries to the unknown samples, and thus this conventional approach imposes severe limitations on application of the technique to remote analyses. In this suite of samples, the use
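
    As a sketch of the multivariate approach referred to above (a generic partial-least-squares calibration, not the authors' exact model), full spectra are regressed against known element or oxide concentrations so that inter-element matrix effects are absorbed into the latent variables. The data below are synthetic stand-ins.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Synthetic stand-in data: 18 spectra x 2048 channels and one oxide concentration.
        rng = np.random.default_rng(3)
        n_samples, n_channels = 18, 2048
        spectra = rng.normal(size=(n_samples, n_channels))
        sio2_wt = rng.uniform(45.0, 75.0, n_samples)
        spectra[:, 300] += 0.05 * sio2_wt          # embed a concentration-dependent line

        pls = PLSRegression(n_components=5)
        predicted = cross_val_predict(pls, spectra, sio2_wt, cv=6).ravel()
        rmse = float(np.sqrt(np.mean((predicted - sio2_wt) ** 2)))
        print(f"cross-validated RMSE: {rmse:.2f} wt% SiO2")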

  6. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    The application of microprocessors to the programming and computation of chemical analysis of solutions by a sequential technique is examined. Safety, performance and reliability are compared with those of other methods. An example is given on uranium titration by spectrophotometry

  7. Macro-quantitative Analysis of Kyoto Mechanism

    OpenAIRE

    Ryoichi Komiyama

    2007-01-01

    The Kyoto Protocol, as adopted at the Third Conference of the Parties to the U.N. Framework Convention on Climate Change (COP3), has introduced the Kyoto Mechanism (including the Emissions Trading, Joint Implementation, and Clean Development Mechanism systems) to promote the efficient achievement of greenhouse gas emission reduction targets. Calls have grown for global efforts to enhance the Kyoto Mechanism. This study represents a quantitative analysis of the Kyoto Mechanism, taking advantag...

  8. Chemical substructure analysis in toxicology

    International Nuclear Information System (INIS)

    A preliminary examination of chemical-substructure analysis (CSA) demonstrates the effective use of the Chemical Abstracts compound connectivity file in conjunction with the bibliographic file for relating chemical structures to biological activity. The importance of considering the role of metabolic intermediates under a variety of conditions is illustrated, suggesting structures that should be examined that may exhibit potential activity. This CSA technique, which utilizes existing large files accessible with online personal computers, is recommended for use as another tool in examining chemicals in drugs. 2 refs., 4 figs

  9. Chemical substructure analysis in toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Beauchamp, R.O. Jr. [Center for Information on Toxicology and Environment, Raleigh, NC (United States)

    1990-12-31

    A preliminary examination of chemical-substructure analysis (CSA) demonstrates the effective use of the Chemical Abstracts compound connectivity file in conjunction with the bibliographic file for relating chemical structures to biological activity. The importance of considering the role of metabolic intermediates under a variety of conditions is illustrated, suggesting structures that should be examined that may exhibit potential activity. This CSA technique, which utilizes existing large files accessible with online personal computers, is recommended for use as another tool in examining chemicals in drugs. 2 refs., 4 figs.

  10. A quantitative strategy to detect changes in accessibility of protein regions to chemical modification on heterodimerization

    Science.gov (United States)

    Dreger, Mathias; Leung, Bo Wah; Brownlee, George G; Deng, Tao

    2009-01-01

    We describe a method for studying quantitative changes in accessibility of surface lysine residues of the PB1 subunit of the influenza RNA polymerase as a result of association with the PA subunit to form a PB1-PA heterodimer. Our method combines two established methods: (i) the chemical modification of surface lysine residues of native proteins by N-hydroxysuccinimidobiotin (NHS-biotin) and (ii) the stable isotope labeling of amino acids in cell culture (SILAC) followed by tryptic digestion and mass spectrometry. By linking the chemical modification with the SILAC methodology for the first time, we obtain quantitative data on chemical modification, allowing subtle changes in accessibility to be described. Five regions in the PB1 monomer showed altered reactivity to NHS-biotin when compared with the PB1-PA heterodimer. Mutational analysis of residues in two such regions, K265 and K481 of PB1 (about three- and twofold less accessible to biotinylation, respectively, in the PB1-PA heterodimer than in the PB1 monomer), demonstrated that both K265 and K481 are crucial for polymerase function. This novel assay of quantitative profiling of biotinylation patterns (Q-POP assay) highlights likely conformational changes at important functional sites, as observed here for PB1, and may provide information on protein–protein interaction interfaces. The Q-POP assay should be a generally applicable approach and may detect novel functional sites suitable for targeting by drugs. PMID:19517532

  11. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
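
    As a loose illustration of the prioritization step, the sketch below scores invented hazard scenarios on severity, likelihood and modeling difficulty and ranks them. The ordinal scales and the simple additive priority score are assumptions made for illustration, not the paper's actual scheme.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    severity: int             # 1 (minor) .. 5 (catastrophic)
    likelihood: int           # 1 (remote) .. 5 (frequent)
    modeling_difficulty: int  # 1 (easy to model) .. 5 (very hard)

    @property
    def priority(self) -> int:
        # High severity and likelihood raise priority; difficulty lowers it.
        return self.severity + self.likelihood - self.modeling_difficulty

# Invented example scenarios for closely spaced parallel runway operations.
scenarios = [
    Scenario("wake encounter on parallel approach", 5, 3, 2),
    Scenario("runway incursion", 5, 2, 4),
    Scenario("missed approach conflict", 3, 4, 3),
]
for s in sorted(scenarios, key=lambda s: s.priority, reverse=True):
    print(f"{s.priority:3d}  {s.name}")
```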

  12. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as received, with a proton beam, and quantitative analysis was carried out by means of the standard-free method developed by the authors. First, the concentration of zinc was derived; the concentrations of the other elements were then obtained by treating zinc as an internal standard. As a result, the sulphur concentrations for 40 samples agree well with the average value for a typical Japanese subject, and with each other, within 20%, confirming the validity of the present method. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent concentration values for many elements were obtained by the standard-free method
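
    A minimal sketch of the internal-standard step, assuming the usual ratio form with relative sensitivity factors; the element names, yields and sensitivities below are invented placeholders rather than values from the record.

```python
import numpy as np

def internal_standard(c_ref, yield_ref, sens_ref, yields, sens):
    """Concentrations from the internal-standard relation
    c_i = c_ref * (Y_i / Y_ref) * (S_ref / S_i),
    where Y are measured X-ray yields and S relative sensitivity factors."""
    return c_ref * (np.asarray(yields) / yield_ref) * (sens_ref / np.asarray(sens))

c_zn = 150.0                                    # ppm zinc from the standard-free step (placeholder)
print(internal_standard(c_zn, 1.0e4, 2.5,
                        yields=[3.2e4, 8.0e3],  # e.g. S and Fe line yields (placeholders)
                        sens=[1.1, 0.6]))       # relative sensitivities (placeholders)
```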

  13. Quantitative Bias Analysis in Regulatory Settings.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652

  14. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round-robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for limited samples. Neutron diffraction is especially capable for oxides, owing to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. Through this study, we hope not only to perform one of the instrument performance tests on our HRPD but also to improve our ability to analyse neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  15. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the elements contained in a sample and their concentrations can be determined; the analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of gauging the photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the measured data is applied to obtain a graph, so that the density of a dark spectral line can be related to the incident light intensity indicated by the microphotometer. 2. Working curves. The known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. Automatic data acquisition, calculation and reporting of the results are performed by means of a personal computer (PC) and a computer program. Signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
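
    The three steps map naturally onto two least-squares fits and a final interpolation; the sketch below uses invented microphotometer readings and assumes a logarithmic emulsion response, which may differ from the program described in the record.

```python
import numpy as np

# 1. Emulsion calibration: line density vs. incident intensity (placeholder data).
intensity = np.array([10., 20., 40., 80., 160.])
density   = np.array([0.12, 0.25, 0.48, 0.95, 1.80])
emulsion  = np.polyfit(np.log10(intensity), density, deg=1)   # density ~ a*log10(I) + b

def density_to_log_intensity(d):
    a, b = emulsion
    return (d - b) / a

# 2. Working curve: log concentration of the element vs. log intensity for
#    standards of known concentration (placeholder data).
conc_std = np.array([0.1, 0.3, 1.0, 3.0])       # percent
dens_std = np.array([0.20, 0.42, 0.80, 1.25])
working  = np.polyfit(density_to_log_intensity(dens_std), np.log10(conc_std), deg=1)

# 3. Analytical result: unknown line density -> concentration.
def concentration(d_unknown):
    return 10 ** np.polyval(working, density_to_log_intensity(d_unknown))

print(concentration(0.60))
```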

  16. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.

  17. 10031 Executive Summary -- Quantitative Models: Expressiveness and Analysis

    OpenAIRE

    Baier, Christel; Droste, Manfred; Gastin, Paul; Larsen, Kim Guldstrand

    2010-01-01

    Quantitative models and quantitative analysis in Computer Science are currently intensively studied, resulting in a revision of the foundation of Computer Science where classical yes/no answers are replaced by quantitative analyses. The potential application areas are huge, e.g., performance analysis, operational research or embedded systems. The aim of the seminar was to address three fundamental topics which are closely related: quantitative analysis of real-time and h...

  18. Automated quantitative image analysis of nanoparticle assembly

    Science.gov (United States)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
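
    PICT itself is separate, published software; purely as an independent illustration, the sketch below computes two of the listed descriptors (radius of gyration and an anisotropy ratio from the gyration tensor) for one segmented cluster given its pixel coordinates. The test cluster is synthetic.

```python
import numpy as np

def cluster_descriptors(pixels):
    """Radius of gyration and principal-axis anisotropy of one NP cluster,
    given the (N, 2) row/column coordinates of its segmented pixels
    (segmentation itself is not shown here)."""
    pixels = np.asarray(pixels, dtype=float)
    d = pixels - pixels.mean(axis=0)
    rg = np.sqrt((d ** 2).sum(axis=1).mean())            # radius of gyration
    gyration_tensor = d.T @ d / len(d)
    eigvals = np.sort(np.linalg.eigvalsh(gyration_tensor))
    anisotropy = np.sqrt(eigvals[-1] / eigvals[0])        # ratio of principal axes
    return rg, anisotropy

rng = np.random.default_rng(1)
blob = rng.normal(scale=(3.0, 9.0), size=(500, 2))        # elongated test cluster
print(cluster_descriptors(blob))                          # rg ~ 9.5, anisotropy ~ 3
```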

  19. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    Science.gov (United States)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Particular concern regards the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available. We aim to close this knowledge gap by providing a quantitative overview of chemical

  20. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...

  1. Materials characterization through quantitative digital image analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, and grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  2. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  3. Quantitative chemical-shift MR imaging cutoff value: Benign versus malignant vertebral compression – Initial experience

    Directory of Open Access Journals (Sweden)

    Dalia Z. Zidan

    2014-09-01

    Conclusion: Quantitative chemical shift MR imaging could be a valuable addition to standard MR imaging techniques and represent a rapid problem solving tool in differentiating benign from malignant vertebral compression, especially in patients with known primary malignancies.

  4. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and in the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims at supporting and providing orientation for the development of emergency planning from the data obtained in Quantitative Risk Analysis, elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the development of the research that the data generated in these studies need to be complemented and analyzed more deeply, so that they can be used in the emergency plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analyses for response planning, risk communication, and the preparation of the communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data derived from the Quantitative Risk Analysis to develop the emergency plans. (author)

  5. Quantitative analysis of galaxy-galaxy lensing

    CERN Document Server

    Schneider, P J; Schneider, Peter; Rix, Hans Walter

    1996-01-01

    In this paper we explore a quantitative and efficient method to constrain the halo properties of distant galaxy populations through "galaxy-galaxy" lensing and show that the mean masses and sizes of halos can be estimated accurately, without excessive data requirements. Specifically, we propose a maximum-likelihood analysis which takes full account of the actual image ellipticities, positions and apparent magnitudes. We apply it to simulated observations, using the same model for the lensing galaxy population as in BBS, where the galaxy halos are described by isothermal spheres with velocity dispersion σ, truncated at a radius s. Both parameters are assumed to scale with the luminosity of the galaxy. The best-fitting values are then determined with the maximum-likelihood analysis. We explore two different observing strategies: (a) taking deep images (e.g., with HST) on small fields, and (b) using shallower images on larger fields. We find that σ_* can be determined to ≲10% accuracy if a sa...

  6. Advances in quantitative electroencephalogram analysis methods.

    Science.gov (United States)

    Thakor, Nitish V; Tong, Shanbao

    2004-01-01

    Quantitative electroencephalogram (qEEG) plays a significant role in EEG-based clinical diagnosis and studies of brain function. In past decades, various qEEG methods have been extensively studied. This article provides a detailed review of the advances in this field. qEEG methods are generally classified into linear and nonlinear approaches. The traditional qEEG approach is based on spectrum analysis, which hypothesizes that the EEG is a stationary process. EEG signals are nonstationary and nonlinear, especially in some pathological conditions. Various time-frequency representations and time-dependent measures have been proposed to address those transient and irregular events in EEG. With regard to the nonlinearity of EEG, higher order statistics and chaotic measures have been put forward. In characterizing the interactions across the cerebral cortex, an information theory-based measure such as mutual information is applied. To improve the spatial resolution, qEEG analysis has also been combined with medical imaging technology (e.g., CT, MR, and PET). With these advances, qEEG plays a very important role in basic research and clinical studies of brain injury, neurological disorders, epilepsy, sleep studies and consciousness, and brain function. PMID:15255777

  7. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography; Analise quimica quantitativa para a padronizacao do oleo de copaiba por cromatografia em fase gasosa de alta resolucao

    Energy Technology Data Exchange (ETDEWEB)

    Tappin, Marcelo R.R.; Pereira, Jislaine F.G.; Lima, Lucilene A.; Siani, Antonio C. [Farmanguinhos - Inst. de Tecnologia em Farmacos, Rio de Janeiro, RJ (Brazil)]. E-mail: siani@far.fiocruz.br; Mazzei, Jose L. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Escola de Quimica; Ramos, Monica F.S. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Faculdade de Farmacia. Dept. de Medicamentos

    2004-04-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar in the case of methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method proved suitable for the classification and quality control of commercial samples of the oils. (author)

  8. Possibilities of Moessbauer spectroscopy for chemical analysis

    International Nuclear Information System (INIS)

    Full text: Moessbauer spectroscopy is one of the few methods able to define the phase state or crystallographic sites of a substance. Moessbauer spectra bear information on various hyperfine interactions, many of which are indirectly related to the chemical nature of the Moessbauer atom and its nearest environment. Determination of the hyperfine interaction parameters that can be extracted from Moessbauer spectra and used for qualitative analysis is a routine task. In the present work, we studied the influence of various factors on the experimental errors encountered in quantitatively defining the phase composition or site populations of the substance under study, such as the measurement geometry, Lamb-Moessbauer coefficients, absorber thickness, efficiency and dead time of the detection system, and spectral line shape. The absolute f measurements were made using the 'black' absorber method. Moessbauer measurements were carried out with carefully controlled background intensities, since the accuracy of the f evaluation directly depends on the measurement of the background. The influence of sample non-uniformity on the results of the quantitative analysis is discussed. The data analysis was divided into two parts: removal of instrumental artifacts by folding and baseline correction, and deconvolution to extract the hyperfine parameters of individual local environments. In our approach, calibration graphs were drawn by measuring the spectra of a series of analogous samples having different known concentrations. For the same purpose, the internal standard method was also used. Experimental data are presented for phase analyses of different samples. (author)

  9. Quantitative Chemical Imaging with Multiplex Stimulated Raman Scattering Microscopy

    OpenAIRE

    Fu, Dan; Lu, Fake; Zhang, Xu; Freudiger, Christian Wilhelm; Pernik, Douglas R.; Holtom, Gary; Xie, Xiaoliang Sunney

    2012-01-01

    Stimulated Raman scattering (SRS) microscopy is a newly developed label-free chemical imaging technique that overcomes the speed limitation of confocal Raman microscopy while avoiding the nonresonant background problem of coherent anti-Stokes Raman scattering (CARS) microscopy. Previous demonstrations have been limited to single Raman band measurements. We present a novel modulation multiplexing approach that allows real-time detection of multiple species using the fast Fourier transform. ...

  10. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    Directory of Open Access Journals (Sweden)

    Jana Tillack

    2012-11-01

    Full Text Available Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected.
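
    The record describes an analytical model of each processing step; a quick Monte Carlo sketch conveys the same idea of errors accumulating along the chain. The processing steps, means and relative errors below are invented placeholders, not values from the Corynebacterium glutamicum dataset.

```python
import numpy as np

# Monte Carlo propagation of single-step errors through a simplified
# metabolome sample-processing chain: measurement, calibration, extraction
# recovery, dilution and biovolume normalisation (all numbers invented).
rng = np.random.default_rng(42)
n = 100_000

peak_area = rng.normal(1.00, 0.03, n)      # measurement error (3 % RSD)
cal_slope = rng.normal(0.50, 0.02, n)      # calibration uncertainty
recovery  = rng.normal(0.85, 0.05, n)      # extraction recovery
dilution  = rng.normal(10.0, 0.10, n)      # dilution factor
biovolume = rng.normal(2.0e-3, 1.0e-4, n)  # cell volume per sample, litres

conc = peak_area / cal_slope * dilution / recovery / biovolume
print(f"mean = {conc.mean():.1f}, relative sd = {conc.std() / conc.mean():.1%}")
```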

  11. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations - COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
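
    For readers unfamiliar with the workflow, the sketch below reproduces the style of analysis (a logistic model for risk occurrence followed by a Hosmer-Lemeshow goodness-of-fit test) on synthetic data. The predictors and coefficients are invented and do not reproduce the study's results.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression

# Synthetic data: three ordinal predictors loosely mirroring the governing
# factors named in the record (all values and effect sizes invented).
rng = np.random.default_rng(7)
n = 100
X = np.column_stack([
    rng.integers(1, 6, n),      # threat-source motivation/capability score
    rng.integers(1, 6, n),      # nature of the vulnerability score
    rng.integers(1, 6, n),      # effectiveness of current controls
])
logit = -2.0 + 0.6 * X[:, 0] + 0.5 * X[:, 1] - 0.7 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow chi-square statistic and p-value (groups - 2 df)."""
    stat = 0.0
    for idx in np.array_split(np.argsort(p), groups):
        obs, exp, nk = y[idx].sum(), p[idx].sum(), len(idx)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / nk))
    return stat, chi2.sf(stat, groups - 2)

print(hosmer_lemeshow(y, p))
```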

  12. Quantitative Survey and Structural Classification of Hydraulic Fracturing Chemicals Reported in Unconventional Gas Production.

    Science.gov (United States)

    Elsner, Martin; Hoelzer, Kathrin

    2016-04-01

    Much interest is directed at the chemical structure of hydraulic fracturing (HF) additives in unconventional gas exploitation. To bridge the gap between existing alphabetical disclosures by function/CAS number and emerging scientific contributions on fate and toxicity, we review the structural properties which motivate HF applications, and which determine environmental fate and toxicity. Our quantitative overview relied on voluntary U.S. disclosures evaluated from the FracFocus registry by different sources and on a House of Representatives ("Waxman") list. Out of over 1000 reported substances, classification by chemistry yielded succinct subsets able to illustrate the rationale of their use, and physicochemical properties relevant for environmental fate, toxicity and chemical analysis. While many substances were nontoxic, frequent disclosures also included notorious groundwater contaminants like petroleum hydrocarbons (solvents), precursors of endocrine disruptors like nonylphenols (nonemulsifiers), toxic propargyl alcohol (corrosion inhibitor), tetramethylammonium (clay stabilizer), biocides or strong oxidants. Application of highly oxidizing chemicals, together with occasional disclosures of putative delayed acids and complexing agents (i.e., compounds designed to react in the subsurface) suggests that relevant transformation products may be formed. To adequately investigate such reactions, available information is not sufficient, but instead a full disclosure of HF additives is necessary. PMID:26902161

  13. Quantitative determinations of chemical compounds with nutritional value from Inca crops: Chenopodium quinoa ('quinoa').

    Science.gov (United States)

    González, J A; Roldán, A; Gallardo, M; Escudero, T; Prado, F E

    1989-12-01

    Quantitative determinations of total and soluble proteins, total and free sugars, starch, total lipids, tannins, ash (Ca, Na, K, Fe, and P), and caloric value were carried out on quinoa flour. Results show that the amount of soluble proteins was higher than the standard values for wheat and maize and very close to that of barley. The yields of free sugars such as glucose (4.55%), fructose (2.41%) and sucrose (2.39%) were also significant. Iron and calcium levels were higher than the reported values for maize and barley, as was the caloric value (435.5 kcal/100 g). The content of saponins was also examined, since their effect on red blood cells of groups A and O has been reported as a potential problem for the Andean population. From the chemical analysis, a more complete view of quinoa as a human food is presented. PMID:2631089

  14. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Coffee is one of the most widely consumed beverages in the world, and the 'cafezinho' is normally prepared from a blend of the roasted powder of two species, Coffea arabica and Coffea canephora. Each one exhibits differences in taste and in chemical composition, especially in the caffeine percentage. Several procedures have been proposed in the literature for caffeine determination in different samples such as soft drinks, coffee and medicines, but most of them require a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)
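
    The record does not give the calculation itself; assuming the usual internal-reference qHNMR relation, the sketch below shows how a caffeine content would follow from signal integrals, proton counts, molar masses and weighed masses. All numbers are invented placeholders, and the internal-reference setup is an assumption rather than the exact procedure used in the paper.

```python
def qnmr_content(area_analyte, n_h_analyte, m_analyte,
                 area_ref, n_h_ref, m_ref, mass_ref_mg, mass_sample_mg):
    """Analyte content in mg per g of sample from the qHNMR relation
    w = (I_a/N_a) / (I_ref/N_ref) * (M_a/M_ref) * (m_ref/m_sample)."""
    w = (area_analyte / n_h_analyte) / (area_ref / n_h_ref) \
        * (m_analyte / m_ref) * (mass_ref_mg / mass_sample_mg)
    return w * 1000.0

caffeine = qnmr_content(
    area_analyte=3.05, n_h_analyte=3, m_analyte=194.19,   # caffeine N-CH3 singlet
    area_ref=1.00, n_h_ref=2, m_ref=116.07,               # e.g. maleic acid reference
    mass_ref_mg=5.0, mass_sample_mg=250.0)                # weighed masses (placeholders)
print(f"{caffeine:.1f} mg caffeine per g of coffee")
```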

  15. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system

  16. Quantitative analysis of phytosterols in edible oils using APCI liquid chromatography-tandem mass spectrometry

    OpenAIRE

    Mo, Shunyan; Dong, Linlin; Hurst, W Jeffrey; van Breemen, Richard B

    2013-01-01

    Previous methods for the quantitative analysis of phytosterols have usually used GC-MS and require elaborate sample preparation including chemical derivatization. Other common methods such as HPLC with absorbance detection do not provide information regarding the identity of the analytes. To address the need for an assay that utilizes mass selectivity while avoiding derivatization, a quantitative method based on LC-tandem mass spectrometry (LC-MS-MS) was developed and validated for the measur...

  17. Quantitative biometric phenotype analysis in mouse lenses

    OpenAIRE

    Reilly, Matthew A.; Andley, Usha P.

    2010-01-01

    The disrupted morphology of lenses in mouse models for cataracts precludes accurate in vitro assessment of lens growth by weight. To overcome this limitation, we developed morphometric methods to assess defects in eye lens growth and shape in mice expressing the αA-crystallin R49C (αA-R49C) mutation. Our morphometric methods determine quantitative shape and dry weight of the whole lens from histological sections of the lens. This method was then used to quantitatively compare the biometric gr...

  18. Quantitative phase analysis of iron ore concentrates

    Directory of Open Access Journals (Sweden)

    Geraldo Magela da Costa

    2002-10-01

    Full Text Available The quantification of goethite, magnetite, martite and specularite in iron ores was successfully achieved by a combination of wet chemical analysis and X-ray diffraction. It was found that the intensity of the goethite (111) peak is constant for a given sample provided that the same sample holder is used. Calibration curves with a linear behavior have been derived using the areas of the above-mentioned peak and the amounts of goethite obtained by Mössbauer spectroscopy and optical microscopy. In addition, the integral width of the hematite (012) line broadens linearly as the amount of martite increases, thus allowing an estimation of the amounts of martite and specularite.

  19. Quantitative Analysis of Radar Returns from Insects

    Science.gov (United States)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of the heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  20. Quantitative risk analysis of urban flooding in lowland areas

    OpenAIRE

    J. A. E. ten Veldhuis

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study reported in this thesis reviews existing approaches to quantitative flood risk analysis and evaluation of urban flooding guidelines. It proceeds to explore historical data on flooding incidents from mun...

  1. Automatic quantitative morphological analysis of interacting galaxies

    OpenAIRE

    Shamir, Lior; Holincheck, Anthony; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyz...

  2. A quantitative comparison between electrocoagulation and chemical coagulation for boron removal from boron-containing solution

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, A. Erdem [Atatuerk University, Faculty of Engineering, Department of Environmental Engineering, 25240 Erzurum (Turkey)], E-mail: aerdemy@atauni.edu.tr; Boncukcuoglu, Recep [Atatuerk University, Faculty of Engineering, Department of Environmental Engineering, 25240 Erzurum (Turkey); Kocakerim, M. Muhtar [Atatuerk University, Faculty of Engineering, Department of Chemical Engineering, 25240 Erzurum (Turkey)

    2007-10-22

    This paper provides a quantitative comparison of electrocoagulation and chemical coagulation approaches based on boron removal. The electrocoagulation process delivers the coagulant in situ as the sacrificial anode corrodes, due to a fixed current density, while the simultaneous evolution of hydrogen at the cathode allows for pollutant removal by flotation. By comparison, conventional chemical coagulation typically adds a salt of the coagulant, with settling providing the primary pollutant removal path. Chemical coagulation was carried out via jar tests using aluminum chloride. The comparison was done with the same amount of coagulant in the electrocoagulation and chemical coagulation processes. Boron removal was higher with the electrocoagulation process. In addition, it was seen that chemical coagulation has little effect on boron removal from boron-containing solution. At optimum conditions (e.g. pH 8.0 and an aluminum dose of 7.45 g/L), boron removal efficiencies for electrocoagulation and chemical coagulation were 94.0% and 24.0%, respectively.

  3. A quantitative comparison between electrocoagulation and chemical coagulation for boron removal from boron-containing solution

    International Nuclear Information System (INIS)

    This paper provides a quantitative comparison of electrocoagulation and chemical coagulation approaches based on boron removal. The electrocoagulation process delivers the coagulant in situ as the sacrificial anode corrodes, due to a fixed current density, while the simultaneous evolution of hydrogen at the cathode allows for pollutant removal by flotation. By comparison, conventional chemical coagulation typically adds a salt of the coagulant, with settling providing the primary pollutant removal path. Chemical coagulation was carried out via jar tests using aluminum chloride. The comparison was done with the same amount of coagulant in the electrocoagulation and chemical coagulation processes. Boron removal was higher with the electrocoagulation process. In addition, it was seen that chemical coagulation has little effect on boron removal from boron-containing solution. At optimum conditions (e.g. pH 8.0 and an aluminum dose of 7.45 g/L), boron removal efficiencies for electrocoagulation and chemical coagulation were 94.0% and 24.0%, respectively

  4. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  5. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  6. Development of a Quantitative Analysis Program for WDS-SEM

    International Nuclear Information System (INIS)

    We have developed a quantitative analysis program for a WDS-SEM that includes a function for inspecting the beam stability. This program will be applied to analyze the Xe behavior inside irradiated nuclear fuel pellets and can be helpful for determining the uncertainty of quantitative analysis by WDS-SEM. WDS (Wavelength Dispersive Spectrometer) analysis with an electron beam has been applied in various research areas and is an essential technique for analyzing the quantity of fission products inside a PWR fuel pellet. A WDS installed in an ordinary SEM (Scanning Electron Microscope) is not optimized for quantitative analysis, unlike EPMA (Electron Probe Micro-Analyzer). Electron beam stability is very important for quantitative analysis with a WDS-SEM

  7. Porosity determination on pyrocarbon using automatic quantitative image analysis

    International Nuclear Information System (INIS)

    Methods of porosity determination are reviewed and applied to the measurement of the porosity of pyrocarbon. Specifically, the mathematical basis of stereology and the procedures involved in quantitative image analysis are detailed

  8. Quantitative analysis of magnetic resonance time domain signals

    International Nuclear Information System (INIS)

    A magnetic resonance time domain signal is often made up of a limited number of exponentially decaying sinusoids plus white noise. Traditionally, quantitative analysis of the signal is carried out in the frequency domain, after applying FFT in conjunction with a time window. It is shown that quantitative analysis directly in the time domain is feasible, and in fact yields several advantages. Various methods are applied and compared. (Auth.)
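
    A minimal sketch of the direct time-domain approach, assuming a model of two exponentially damped cosines plus white noise fitted by non-linear least squares; the parameter values are invented and no frequency-domain processing (FFT or windowing) is needed.

```python
import numpy as np
from scipy.optimize import least_squares

t = np.arange(2048) * 1e-3                                     # seconds (placeholder sampling)

def fid(params, t):
    """Sum of two exponentially decaying cosines: amplitude, frequency (Hz), decay rate each."""
    a1, f1, r1, a2, f2, r2 = params
    return (a1 * np.exp(-r1 * t) * np.cos(2 * np.pi * f1 * t)
            + a2 * np.exp(-r2 * t) * np.cos(2 * np.pi * f2 * t))

true = [1.0, 50.0, 8.0, 0.4, 120.0, 15.0]                      # invented "true" parameters
rng = np.random.default_rng(3)
signal = fid(true, t) + rng.normal(0, 0.02, t.size)            # add white noise

fit = least_squares(lambda p: fid(p, t) - signal,
                    x0=[0.8, 48.0, 5.0, 0.5, 118.0, 10.0])     # rough starting guesses
print(fit.x)    # amplitudes, frequencies and decay rates of both components
```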

  9. Enabling quantitative data analysis through e-infrastructures

    OpenAIRE

    Tan, K.L.L.; P.S. Lambert; Turner, K J; J. Blum; Bowes, A.; Bell, D.N.F.; Gayle, V.; S. B. Jones; Maxwell, M.; Sinnott, R. O.; Warner, G.

    2009-01-01

    This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as ‘data management’, can benefit from e-infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantita...

  10. Joint association analysis of bivariate quantitative and qualitative traits

    OpenAIRE

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a presp...

  11. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  12. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF. PMID:22373162
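
    A sketch of the likelihood described above for a single SNP, under the stated probit/bivariate-normal model with the threshold fixed at zero; the simulation settings and the particular parameterisation are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Simulate a quantitative trait Y and a binary trait D driven by a latent
# variable L; (Y, L) are bivariate normal with correlation rho and both carry
# a genotype effect (all parameter values invented).
rng = np.random.default_rng(11)
n = 2000
g = rng.binomial(2, 0.3, n)                        # genotypes 0/1/2
rho_true, b_y, b_l = 0.4, 0.3, 0.25
z1 = rng.normal(size=n)
z2 = rho_true * z1 + np.sqrt(1 - rho_true**2) * rng.normal(size=n)
y = 1.0 + b_y * g + z1                             # quantitative trait (sigma = 1)
d = ((-0.5 + b_l * g + z2) > 0).astype(int)        # binary trait via latent L > 0

def neg_loglik(theta):
    mu_y, beta_y, sigma, mu_l, beta_l, rho = theta
    if sigma <= 0 or not (-0.999 < rho < 0.999):
        return np.inf
    resid = (y - mu_y - beta_y * g) / sigma
    m = mu_l + beta_l * g + rho * resid            # conditional mean of L given Y
    s = np.sqrt(1 - rho**2)                        # conditional sd of L given Y
    ll = (norm.logpdf(resid) - np.log(sigma)
          + np.where(d == 1, norm.logcdf(m / s), norm.logcdf(-m / s)))
    return -ll.sum()

start = [y.mean(), 0.0, y.std(), 0.0, 0.0, 0.0]
fit = minimize(neg_loglik, start, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
print(fit.x)   # estimates of (mu_y, beta_y, sigma, mu_l, beta_l, rho)
```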

  13. Quantitative X ray analysis system. User's manual and guide to X ray fluorescence technique

    International Nuclear Information System (INIS)

    This guide covers the trimmed and re-arranged version 3.6 of the Quantitative X ray Analysis System (QXAS) software package, which includes the most frequently used methods of quantitative analysis. QXAS is a comprehensive quantitative analysis package that has been developed by the IAEA through research and technical contracts. Additional development has also been carried out in the IAEA Laboratories in Seibersdorf, where QXAS was extensively tested. New in this version of the manual are the descriptions of Voigt-profile peak fitting, the backscatter fundamental parameters and emission-transmission methods of chemical composition analysis, an expanded chapter on X ray fluorescence physics, and a completely revised and enlarged set of practical examples of the use of the QXAS software package. The analytical data accompanying this manual were collected in the IAEA Seibersdorf Laboratories in the years 2006/2007

  14. Ozonized oils: a qualitative and quantitative analysis.

    Science.gov (United States)

    Guinesi, Adriana Simionatto; Andolfatto, Carolina; Bonetti Filho, Idomeo; Cardoso, Arnaldo Alves; Passaretti Filho, Juliano; Farac, Roberta Vieira

    2011-01-01

    Most problems of endodontic origin have a bacterial etiological agent. Thus, there is continued interest in seeking more effective chemical substances that can replace camphorated paramonochlorophenol or antibiotics as intracanal medicaments. Among the possible substances, ozone has some interesting biological characteristics: bactericidal action, a debriding effect, angiogenesis stimulation capacity and high oxidizing power. The purpose of this study was to chemically evaluate the presence of ozone in sunflower, castor, olive and almond oil, as well as in propylene glycol, and of byproducts of ozonation, such as formaldehyde. These compounds were ozonized, inserted into empty, sterile vials, and analyzed by testing the reaction between ozone and indigo to determine the presence of ozone, and subjected to the chromotropic acid test to determine the presence of formaldehyde. A complete absence of ozone and the presence of formaldehyde were observed in all samples tested. The bactericidal and healing action of ozonized oils could be attributed to products formed by the ozonation of mineral oils, such as formaldehyde, not to the ozone itself. PMID:21519646

  15. Quantitative multi-modal NDT data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heideklang, René; Shokouhi, Parisa [Division 8.5, BAM Fed. Institute for Materials Research and Testing, Unter den Eichen 87, 12205 Berlin (Germany)

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  16. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity

  17. Methods of quantitative fire hazard analysis

    International Nuclear Information System (INIS)

    Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period

  18. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
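
    The record does not disclose the actual quantitation algorithm, so the sketch below is only a schematic stand-in: each pixel is classified from its intensities in two autofluorescence channels and area fractions are reported. The channel roles, thresholds and synthetic input are assumptions, not values from the study.

```python
import numpy as np

# Two-wavelength autofluorescence quantitation sketch: classify each pixel
# from its intensities in two spectral channels and report area fractions.
def quantitate(ch1, ch2, t1=0.5, t2=0.5):
    myocyte       = (ch1 >= t1) & (ch2 <  t2)
    fibrosis      = (ch1 >= t1) & (ch2 >= t2)
    lipofuscin    = (ch1 <  t1) & (ch2 >= t2)
    extracellular = ~(myocyte | fibrosis | lipofuscin)
    n = ch1.size
    return {
        "myocytes": myocyte.sum() / n,
        "fibrous tissue": fibrosis.sum() / n,
        "lipofuscin": lipofuscin.sum() / n,
        "extracellular": extracellular.sum() / n,
    }

rng = np.random.default_rng(0)
ch1, ch2 = rng.random((2, 512, 512))   # stand-ins for the two scanner channels
print(quantitate(ch1, ch2))
```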

  19. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  20. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources exists on how to run analyses, difficulties may be encountered when explicit direction is not provided on how one should run a model.

  1. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis was performed for CO, NO, C2H2, C6H6 and other species, and quantitative analysis for C3H5N, C10H10, C8H8N2 and others. The method used in the article is feasible. The results show that the detonation composition in this study has a negative oxygen balance, and that the detonation products contained many pollutants. (authors)

  2. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  3. Quantitive and sociological analysis of blog networks

    CERN Document Server

    Bachnik, W; Leszczynski, P; Podsiadly, R; Rymszewicz, E; Kurylo, L; Makowiecki, D S; Bykowska, B; Bachnik, Wiktor; Szymczyk, Stanislaw; Leszczynski, Piotr; Podsiadlo, Rafal; Rymszewicz, Ewa; Kurylo, Lukasz; Makowiec, Danuta; Bykowska, Beata

    2005-01-01

    This paper examines the emerging phenomenon of blogging, using three different Polish blogging services as the base of the research. The authors show that blog networks share the characteristics of complex networks (gamma coefficients, small worlds, cliques, etc.). Elements of sociometric analysis were used to prove the existence of some social structures in the blog networks.

  4. A Quantitative Analysis of Countries' Research Strengths

    Science.gov (United States)

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  5. Quantitative analysis of diamond deposition reactor efficiency

    International Nuclear Information System (INIS)

    Graphical abstract: surface H atom densities in a diamond deposition plasma reactor and the highest predicted value; a 350 μm diamond crystal grown at 70 μm/h. Highlights: electron temperature measurement at high pressure in a diamond deposition reactor; H-atom density measurements at high pressure and high power; surface H-atom density measurements at high pressure and high power; microwave-cavity-based reactor efficiency compared to other reactors. Abstract: Optical emission spectroscopy has been used to characterize diamond deposition microwave chemical vapour deposition (MWCVD) plasmas operating at high power density. The electron temperature has been deduced from H atom emission lines, while H-atom mole fraction variations have been estimated using the actinometry technique for a wide range of working conditions: pressure 25–400 hPa and MW power 600–4000 W. An increase of the pressure from 14 hPa to 400 hPa with a simultaneous increase in power causes an electron temperature decrease from 17,000 K to 10,000 K and an H atom mole fraction increase from 0.1 up to 0.6. This last value, however, must be considered an upper estimate owing to the assumptions made as well as experimental uncertainties.

  6. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    OpenAIRE

    Demyanenko DV; Demyanenko VG; Breusova SV

    2016-01-01

    Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower in order to make amendments to existing pharmacopoeian monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects and also more polar phenolic compounds with antiulcer activity. Considering this, it’s necessary to regulate all these groups of BAS quantitatively. Materials and...

  7. Policy networks : a citation analysis of the quantitative literature

    OpenAIRE

    Leifeld, Philip

    2007-01-01

    Since the mid-1970s, the quantitative literature on political networks has grown to approximately 200 publications. A number of scholars have recently tried to organize the "Babylonian variety" of different policy network concepts and schools of thought in political network analysis. It will be demonstrated that they fail to grasp the important distinctions between the research specialties, and an empirical assessment of the quantitative literature is offered by analyzing co-citation data and...

  8. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present an algorithm for the translation of such models into Markov Decision Processes expressed in the syntax of the PRISM model checker. This enables analysis of business processes for the following properties: transient and steady-state probabilities, the timing, occurrence and ordering of...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
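
    The abstract names PRISM as the target model checker; the toy sketch below does not reproduce that translation, but illustrates what the quoted properties mean on a hand-written three-state chain (a plain discrete-time Markov chain rather than a full Markov Decision Process, with invented transition probabilities).

```python
import numpy as np

# Toy discrete-time Markov chain for a 3-state workflow:
# 0 = task running, 1 = retry/error handling, 2 = completed.
# Transition matrix rows sum to 1; values are illustrative only.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.0, 0.0, 1.0],
])

def transient(p0, P, n):
    """Distribution over states after n steps, starting from p0."""
    return p0 @ np.linalg.matrix_power(P, n)

def steady_state(P):
    """Stationary distribution: left eigenvector of P for eigenvalue 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

p0 = np.array([1.0, 0.0, 0.0])
print("after 5 steps:", transient(p0, P, 5))
print("steady state:  ", steady_state(P))
```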

  9. A Quantitative Analysis of Countries’ Research Strengths

    OpenAIRE

    Anurag Saxena; S. David Brazer; B M Gupta

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding. Research output from 34 countries is examined using a conceptual framework that emphasizes the ratio of research resources devoted to a particular fi...

  10. Quantitative analysis of cascade impactor samples - revisited

    International Nuclear Information System (INIS)

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected by using a fine aerosol sampler (PM2.5) and occasionally with a single orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of Nuclear Microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that thicknesses of the lower CI stages exceeded thick target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique and concentrations corrected for absorption and proton energy loss. After correcting results for the actual sample thickness, concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick target corrections in analysis of CI samples, especially those collected in urban environments, is emphasized. The broad beam PIXE analysis approach is certainly not adequate in these cases.

  11. Quantitative analysis of cascade impactor samples - revisited

    Energy Technology Data Exchange (ETDEWEB)

    Orlic, I.; Chiam, S.Y.; Sanchez, J.L.; Tang, S.M

    1999-04-02

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected by using a fine aerosol sampler (PM2.5) and occasionally with a single orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of Nuclear Microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that thicknesses of the lower CI stages exceeded thick target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique and concentrations corrected for absorption and proton energy loss. After correcting results for the actual sample thickness, concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick target corrections in analysis of CI samples, especially those collected in urban environments, is emphasized. The broad beam PIXE analysis approach is certainly not adequate in these cases.
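
    The thick-target correction described above combines X-ray self-absorption with proton energy loss; the sketch below covers only the self-absorption part for a uniform deposit, using invented attenuation and areal-density values. It illustrates the form of such a correction rather than the procedure actually used in the paper.

```python
import numpy as np

def self_absorption_factor(mu_rho, rho_t, takeoff_deg=45.0):
    """Approximate X-ray self-absorption correction for a uniform layer.

    mu_rho     : mass attenuation coefficient of the matrix at the line energy [cm^2/g]
    rho_t      : areal density of the impactor deposit [g/cm^2]
    takeoff_deg: X-ray take-off angle towards the detector [degrees]

    Returns the factor by which a thin-target concentration should be
    multiplied to correct for absorption in a thick deposit. The proton
    energy-loss term used in the paper is ignored in this simplified sketch.
    """
    chi = mu_rho / np.sin(np.radians(takeoff_deg))
    x = chi * rho_t
    # For x -> 0 the factor tends to 1 (thin-target limit).
    return x / (1.0 - np.exp(-x)) if x > 1e-9 else 1.0

# Illustrative numbers only (not from the paper):
print(self_absorption_factor(mu_rho=2500.0, rho_t=0.8e-3))
```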

  12. A Quantitative Analysis of Countries’ Research Strengths

    Directory of Open Access Journals (Sweden)

    Anurag Saxena

    2009-05-01

    Full Text Available This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding. Research output from 34 countries is examined using a conceptual framework that emphasizes the ratio of research resources devoted to a particular field to research output measured by publications in peer-reviewed journals. Using cluster analysis and k-means analysis, we conclude that countries’ research output (as measured by the number of published peer-reviewed articles) and their efficiency (as measured by a ratio of research output to dollars allocated to research) together indicate a comparative advantage within any given country’s own menu of research choices and an absolute advantage relative to other countries. This study implies that the more countries engage in publication in areas of relative strength and consume research in areas of relative weakness, the stronger their entire research agenda will become.
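
    For readers unfamiliar with the clustering step, the sketch below shows a minimal k-means grouping of countries by output and efficiency. The feature values, the scaling choice and the number of clusters are all assumptions for illustration; the study's actual variables and cluster count are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per country,
# columns = [peer-reviewed articles, articles per $M of research funding].
X = np.array([
    [12000, 35.0],
    [ 9500, 28.0],
    [ 3000, 55.0],
    [  800, 12.0],
    [15000, 20.0],
])

# Standardize so output and efficiency contribute on comparable scales,
# then group countries into k clusters of similar research profiles.
Xs = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
print(labels)
```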

  13. The EVB as a quantitative tool for formulating simulations and analyzing biological and chemical reactions

    OpenAIRE

    Kamerlin, Shina C. L.; Warshel, Arieh

    2010-01-01

    Recent years have seen dramatic improvements in computer power, allowing ever more challenging problems to be approached. In light of this, it is imperative to have a quantitative model for examining chemical reactivity, both in the condensed phase and in solution, as well as to accurately quantify physical organic chemistry (particularly as experimental approaches can often be inconclusive). Similarly, computational approaches allow for great progress in studying enzyme catalysis, as they al...

  14. Quantitative Chemical-Genetic Interaction Map Connects Gene Alterations to Drug Responses | Office of Cancer Genomics

    Science.gov (United States)

    In a recent Cancer Discovery report, CTD2 researchers at the University of California in San Francisco developed a new quantitative chemical-genetic interaction mapping approach to evaluate drug sensitivity or resistance in isogenic cell lines. Performing a high-throughput screen with isogenic cell lines allowed the researchers to explore the impact of a panel of emerging and established drugs on cells overexpressing a single cancer-associated gene in isolation.

  15. Linking tumor mutations to drug responses via a quantitative chemical-genetic interaction map

    OpenAIRE

    Maria M. Martins; Zhou, Alicia Y.; Corella, Alexandra; Horiuchi, Dai; Yau, Christina; Rakshandehroo, Taha; Gordan, John D; Levin, Rebecca S.; Johnson, Jeff; Jascur, John; Shales, Mike; Sorrentino, Antonio; Cheah, Jaime; Clemons, Paul A.; Shamji, Alykhan F.

    2014-01-01

    There is an urgent need in oncology to link molecular aberrations in tumors with therapeutics that can be administered in a personalized fashion. One approach identifies synthetic-lethal genetic interactions or dependencies that cancer cells acquire in the presence of specific mutations. Using engineered isogenic cells, we generated a systematic and quantitative chemical-genetic interaction map that charts the influence of 51 aberrant cancer genes on 90 drug responses. The dataset strongly pr...

  16. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies accomplishing complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values for some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material.
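
    The record does not give the detector-efficiency model or the genetic-algorithm settings, so the sketch below is a generic illustration: a toy three-parameter efficiency curve is fitted to synthetic reference points with a minimal selection/crossover/mutation loop. All functional forms and numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy efficiency model: eps(E) = a * E**b * exp(-c / E)
def efficiency(E, params):
    a, b, c = params
    return a * E**b * np.exp(-c / E)

# Hypothetical "reference" efficiencies the GA should reproduce.
E_ref = np.array([2.0, 4.0, 8.0, 12.0, 16.0])          # keV
eps_ref = efficiency(E_ref, (0.9, -0.3, 1.5))

def fitness(params):
    return -np.sum((efficiency(E_ref, params) - eps_ref) ** 2)

# Minimal genetic algorithm: selection, uniform crossover, Gaussian mutation.
pop = rng.uniform([0.1, -1.0, 0.1], [2.0, 0.0, 3.0], size=(50, 3))
for _ in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]                 # keep the fittest
    children = parents[rng.integers(0, 20, 50)].copy()
    mates = parents[rng.integers(0, 20, 50)]
    mask = rng.random((50, 3)) < 0.5                        # uniform crossover
    children[mask] = mates[mask]
    children += rng.normal(0.0, 0.02, children.shape)       # mutation
    pop = children

best = max(pop, key=fitness)
print("recovered parameters:", best)
```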

  17. QUANTITATIVE ANALYSIS OF DRAWING TUBES MICROSTRUCTURE

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2009-05-01

    Full Text Available The final properties of formed pieces are affected by the production conditions, above all by the conditions of mechanical working. The application of stereological methods to the statistical reconstruction of the three-dimensional structure of material plastically deformed by bulk forming led to a detailed analysis of material structure changes. The microstructure of cold-drawn tubes of STN 411353 steel was analyzed. Grain boundary orientation was measured on perpendicular and parallel sections of tubes with different degrees of deformation. Macroscopic deformation leads to deformation of the grain boundaries, and the two were compared.

  18. Financial indicators for municipalities: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    Srečko Devjak

    2009-12-01

    Full Text Available From the characterization of Local Authority financing models and structures in Portugal and Slovenia, a set of financial and generic budget indicators has been established. These indicators may be used in a comparative analysis considering the Bragança District in Portugal and municipalities of similar population size in Slovenia. The research identified significant differences in terms of financing sources, due to discrepancies in the financial models and competences of municipalities in each country. The results show that Portuguese and Slovenian municipalities had similar ranking behaviour for the economy indicator in 2003, but changed this behaviour in 2004.

  19. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    International Nuclear Information System (INIS)

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels: a first quantitative level, which comprises an acid leaching procedure, and a second, selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of material. The combination of a previously reported solid chemical homogenization procedure with the quantitative methodologies presented here allows total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  20. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-Ruiz, R. [Servicio Interdepartamental de Investigacion, Facultad de Ciencias, Universidad Autonoma de Madrid, Modulo C-9, Laboratorio de TXRF, Crta. Colmenar, Km 15, Cantoblanco, E-28049, Madrid (Spain)], E-mail: ramon.fernandez@uam.es; Garcia-Heras, M. [Grupo de Arqueometria de Vidrios y Materiales Ceramicos, Instituto de Historia, Centro de Ciencias Humanas y Sociales, CSIC, C/ Albasanz, 26-28, 28037 Madrid (Spain)

    2008-09-15

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels: a first quantitative level, which comprises an acid leaching procedure, and a second, selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of material. The combination of a previously reported solid chemical homogenization procedure with the quantitative methodologies presented here allows total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  1. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  2. Quantitative analysis of in vivo cell proliferation.

    Science.gov (United States)

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity are discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  3. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon in plutonium standard is described, then the content of this substance is determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. Owing to the presence of this standard we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and finally method of cleaning plutonium samples

  4. Quantitative group-type analysis of coal-tar pitches

    Energy Technology Data Exchange (ETDEWEB)

    Membrado, L.; Cebolla, V.L.; Vela, J. [Instituto de carboquimica, Zaragoza (Spain)

    1995-12-31

    Preparative liquid chromatographic (LC) and related techniques (e.g., extrography) are mostly used for quantitative compound class or group-type analysis of coal-tar pitches. TLC-FID has hardly been used for this purpose because of the time-consuming calibration steps required. As the FID response of each peak depends on its nature, the classical approach to quantitative analysis in fossil fuel work is absolute calibration using fractions derived from the fossil fuel itself (previously isolated by LC) as external standards. An added problem is the isolation of these fractions with the required purity. A TLC-FID system has previously been described in this issue, which gives adequate repeatability and precision, and gives a quantitative FID response. In this work, a rapid calibration procedure which allows a quantitative group-type analysis of a whole coal-tar pitch (without any prefractionation) using TLC-FID is presented as an alternative to absolute calibration. This method considerably reduces the total time of analysis. Likewise, the use of TLC-FID as a monitoring technique to improve the classical absolute calibration is also proposed. Pros and cons of group-type analysis techniques are finally discussed with regard to TLC-FID.

  5. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    Science.gov (United States)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals, benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid, in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in good agreement with the actual content, which suggests that this could be an effective method for quantitative identification of illicit drugs.
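
    The paper obtains mass percentages by linear regression against known component spectra; the sketch below illustrates the same idea with a non-negative least-squares fit (a constrained variant) on synthetic Gaussian absorption features. The spectra, mixing weights and noise level are all invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical reference absorption spectra of the pure components,
# sampled on a common frequency grid (columns = components).
freqs = np.linspace(0.2, 2.5, 200)                     # THz
A = np.column_stack([
    np.exp(-((freqs - 0.8) / 0.05) ** 2),              # component 1
    np.exp(-((freqs - 1.2) / 0.07) ** 2),              # component 2
    np.exp(-((freqs - 1.8) / 0.06) ** 2),              # component 3
])

# Simulated mixture spectrum: 50 % / 30 % / 20 % by mass plus noise.
true_w = np.array([0.5, 0.3, 0.2])
mix = A @ true_w + np.random.default_rng(1).normal(0, 0.01, freqs.size)

# Non-negative least squares keeps the fitted weights physical;
# normalising them gives estimated mass percentages.
w, _ = nnls(A, mix)
print("estimated mass fractions:", w / w.sum())
```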

  6. Control of separation and quantitative analysis by GC-FTIR

    Science.gov (United States)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.

  7. Multiple Trait Analysis of Genetic Mapping for Quantitative Trait Loci

    OpenAIRE

    Jiang, C.; Zeng, Z B

    1995-01-01

    We present in this paper models and statistical methods for performing multiple trait analysis on mapping quantitative trait loci (QTL) based on the composite interval mapping method. By taking into account the correlated structure of multiple traits, this joint analysis has several advantages, compared with separate analyses, for mapping QTL, including the expected improvement on the statistical power of the test for QTL and on the precision of parameter estimation. Also this joint analysis ...

  8. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Scanning electron microscopy (SEM) is a method to inspect the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification, and various chemical analyses can be performed from the SEM images. It is therefore widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using it in a nuclear system. In our previous study, the SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information of the SEM images.
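
    The record does not specify the stochastic grayscale statistic that was used, so the sketch below substitutes a simple stand-in: the coefficient of variation of tile-averaged grayscale values, computed on synthetic images. The tile size and test data are assumptions.

```python
import numpy as np

def homogeneity_score(image, tile=32):
    """Coefficient of variation of tile-averaged grayscale values.

    A crude stand-in for the stochastic grayscale analysis described in the
    abstract: lower values indicate a more homogeneous SEM image.
    """
    h, w = image.shape
    means = [
        image[i:i + tile, j:j + tile].mean()
        for i in range(0, h - tile + 1, tile)
        for j in range(0, w - tile + 1, tile)
    ]
    means = np.asarray(means)
    return means.std() / means.mean()

# Synthetic example: uniform noise vs. an image with a bright segregation spot.
rng = np.random.default_rng(0)
uniform = rng.normal(128, 5, (256, 256))
segregated = uniform.copy()
segregated[64:128, 64:128] += 60
print(homogeneity_score(uniform), homogeneity_score(segregated))
```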

  9. Quantitative and Qualitative Analysis of Surface Modified Cellulose Utilizing TGA-MS

    OpenAIRE

    Daniel Loof; Matthias Hiller; Hartmut Oschkinat; Katharina Koschek

    2016-01-01

    With the aim to enhance interfacial adhesion of a hydrophobic polymer matrix and cellulosic fibers and fillers, chemical surface modifications with silane coupling agents are performed. Thermogravimetric analysis (TGA) could be used to determine the degree of surface functionalization. However, similar thermal properties of treated and untreated cellulose hamper a precise determination of silane loading. This contribution deals with quantitative determination of silane loading combining both ...

  10. Chemical analysis of water in hydrogeology

    International Nuclear Information System (INIS)

    The aim of the monograph is to give complete information on the chemical analysis of water in hydrogeology, not only for students of the Geology (Bachelor's degree), Engineering Geology and Hydrogeology (Master's degree) and Engineering Geology (doctoral) study programmes, but also for students from other colleges and universities in Slovakia, as well as in the Czech Republic, dealing with the chemical composition of water and its quality from different perspectives. It will also benefit professionals in hydrogeological, water and environmental practice, who can find there all the necessary information about proper water sampling, the units used in the chemical analysis of water, the expression of the chemical composition of water in its various parameters, the classification of the chemical composition of water, and the basics of physical chemistry needed for thermodynamic calculations and hydrogeochemical modelling.

  11. Rapid inorganic ion analysis using quantitative microchip capillary electrophoresis

    NARCIS (Netherlands)

    Vrouwe, Elwin X.; Lüttge, Regina; Olthuis, Wouter; Berg, van den Albert

    2006-01-01

    Rapid quantitative microchip capillary electrophoresis (CE) for online monitoring of drinking water enabling inorganic ion separation in less than 15s is presented. Comparing cationic and anionic standards at different concentrations the analysis of cationic species resulted in non-linear calibratio

  12. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis

    DEFF Research Database (Denmark)

    Queiroz, Rayner M L; Charneau, Sebastien; Mandacaru, Samuel C;

    2014-01-01

    well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of the T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled and...

  13. Quantitative Analysis for Authentication of Low-cost RFID Tags

    CERN Document Server

    Paparrizos, Ioannis; Petridou, Sophia

    2011-01-01

    Formal analysis techniques are widely used today in order to verify and analyze communication protocols. In this work, we launch a quantitative verification analysis for the low-cost Radio Frequency Identification (RFID) protocol proposed by Song and Mitchell. The analysis exploits a Discrete-Time Markov Chain (DTMC) using the well-known PRISM model checker. We have managed to represent up to 100 RFID tags communicating with a reader and quantify each RFID session according to the protocol's computation and transmission cost requirements. As a consequence, not only does the proposed analysis provide quantitative verification results, but also it constitutes a methodology for RFID designers who want to validate their products under specific cost requirements.

  14. Terahertz Chemical Analysis of Exhaled Human Breath - Broad Essay of Chemicals

    Science.gov (United States)

    Branco, Daniela R.; Fosnight, Alyssa M.; Thomas, Jessica R.; Medvedev, Ivan R.

    2013-06-01

    Approximately 3000 chemicals are thought to be present in human breath. Of these chemicals, many are considered typical of exhaled air. Yet, others can allude to different disease pathologies. The detection of chemicals in breath could have many practical purposes in medicine and provide a noninvasive means of diagnostics. We have previously reported on detection of ethanol, methanol, and acetone in exhaled human breath using a novel sub-millimeter/THz spectroscopic approach. This paper reports on our most recent study. A tentative list has been made of approximately 20 chemicals previously found in breath using other methods. Though many of these chemicals are only expressed in samples from donors with certain pathologies, at the time of this submission we are able to detect and quantitatively measure acetaldehyde and dimethyl sulfide in the breath of several healthy donors. Additional tentatively identified chemicals have been seen using this approach. This presentation will explain our experimental procedures and present our most recent results in THz breath analysis. Prospects, challenges and future plans will be outlined and discussed.

  15. Visualisation and quantitative analysis of flat continuous water jet structure

    OpenAIRE

    Ščučka, J. (Jiří); M. Zeleňák; Foldyna, J.; Lehocká, D.; Votavová, H.

    2015-01-01

    The results of an experiment focused on the visualisation and structural analysis of flat continuous high-speed water jet used in descaling process are presented in this paper. The aim of the work was to test the applicability of the shadowgraph technique, combined with image processing and analysis methods, to visualise the water jet structure and analyse its main quantitative parameters. Volume percentage of water and air in the water jet structure, size of droplets and water bunches ...

  16. Quantitative numerical analysis of transient IR-experiments on buildings

    Science.gov (United States)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
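
    The record mentions finite difference calculations without giving the model; the sketch below is a minimal one-dimensional explicit finite-difference simulation of a heated concrete surface (imposed front-face heat flux, adiabatic back face). Material properties, heating power and durations are rough illustrative assumptions, not parameters from the study.

```python
import numpy as np

# Minimal 1-D explicit finite-difference model of impulse thermography:
# a concrete slab is heated at the front surface and the surface
# temperature rise is recorded during heating and cooling.
alpha = 8e-7          # thermal diffusivity of concrete [m^2/s]
k = 1.8               # thermal conductivity [W/(m K)]
L, nx = 0.10, 101     # slab thickness [m], number of nodes
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha                      # stable step for explicit scheme
q_flux, t_heat, t_end = 2000.0, 60.0, 600.0   # W/m^2, s, s

T = np.zeros(nx)                              # temperature rise above ambient
surface = []
t = 0.0
while t < t_end:
    Tn = T.copy()
    # interior nodes: T_i += alpha*dt/dx^2 * (T_{i+1} - 2 T_i + T_{i-1})
    T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
    # front surface: imposed heat flux during the heating phase (ghost node)
    q = q_flux if t < t_heat else 0.0
    T[0] = Tn[0] + alpha * dt / dx**2 * (2 * Tn[1] - 2 * Tn[0] + 2 * dx * q / k)
    T[-1] = T[-2]                             # adiabatic back face
    surface.append((t, T[0]))
    t += dt

print("peak surface temperature rise: %.1f K" % max(Ts for _, Ts in surface))
```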

  17. Scanning tunneling microscopy on rough surfaces-quantitative image analysis

    Science.gov (United States)

    Reiss, G.; Brückl, H.; Vancea, J.; Lecheler, R.; Hastreiter, E.

    1991-07-01

    In this communication, the application of scanning tunneling microscopy (STM) for a quantitative evaluation of roughnesses and mean island sizes of polycrystalline thin films is discussed. Provided strong conditions concerning the resolution are satisfied, the results are in good agreement with standard techniques as, for example, transmission electron microscopy. Owing to its high resolution, STM can supply a better characterization of surfaces than established methods, especially concerning the roughness. Microscopic interpretations of surface dependent physical properties thus can be considerably improved by a quantitative analysis of STM images.
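
    As an illustration of the kind of quantitative image analysis described above, the sketch below computes an RMS roughness and a crude lateral correlation length from a height map; the synthetic random surface stands in for a real STM image, and the 1/e correlation-length definition is one of several common conventions.

```python
import numpy as np

def rms_roughness(z):
    """Root-mean-square roughness of a height map z (2-D array)."""
    return np.sqrt(np.mean((z - z.mean()) ** 2))

def correlation_length(z):
    """Lateral correlation length: lag at which the row-averaged 1-D
    height autocorrelation drops below 1/e (in pixels)."""
    zc = z - z.mean()
    acf = np.zeros(z.shape[1])
    for row in zc:
        full = np.correlate(row, row, mode="full")[row.size - 1:]
        acf += full / full[0]
    acf /= z.shape[0]
    below = np.where(acf < 1.0 / np.e)[0]
    return below[0] if below.size else None

# Synthetic rough surface as a stand-in for an STM image.
rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.2, (256, 256))
print("RMS roughness:", rms_roughness(z))
print("correlation length [px]:", correlation_length(z))
```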

  18. Toxicity challenges in environmental chemicals: Prediction of human plasma protein binding through quantitative structure-activity relationship (QSAR) models

    Science.gov (United States)

    The present study explores the merit of utilizing available pharmaceutical data to construct a quantitative structure-activity relationship (QSAR) for prediction of the fraction of a chemical unbound to plasma protein (Fub) in environmentally relevant compounds. Independent model...

  19. 40 CFR 761.253 - Chemical analysis.

    Science.gov (United States)

    2010-07-01

    40 CFR Part 761 (Protection of Environment; Environmental Protection Agency; Toxic Substances Control Act), Section 761.253 Chemical analysis (2010-07-01): (a) Extract PCBs from the standard wipe sample collection medium and clean up the extracted...

  20. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivne and Zhytomyr regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that by its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by infusion of six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours; the quantity of lipophilic extractives was then determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. The total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  1. Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution

    Science.gov (United States)

    Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo

    2016-05-01

    We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems.

  2. Quantitative nanoscale analysis in 3D using electron tomography

    International Nuclear Information System (INIS)

    State-of-the-art electron tomography has been established as a powerful tool to image complex structures with nanometer resolution in 3D. STEM tomography in particular is used extensively in materials science in such diverse areas as catalysis, semiconductor materials, and polymer composites, mainly providing qualitative information on morphology, shape and distribution of materials. However, for an increasing number of studies quantitative information, e.g. surface area, fractal dimensions, particle distribution or porosity, is needed. A quantitative analysis is typically performed after segmenting the tomographic data, which is one of the main sources of error for the quantification. In addition to noise, systematic errors due to the missing wedge and due to artifacts from the reconstruction algorithm itself are responsible for these segmentation errors and improved algorithms are needed. This presentation will provide an overview of the possibilities and limitations of quantitative nanoscale analysis by electron tomography. Using catalysts and nano composites as application examples, intensities and intensity variations observed for the 3D volume reconstructed by WBP and SIRT will be quantitatively compared to alternative reconstruction algorithms; implications for quantification of electron (or X-ray) tomographic data will be discussed and illustrated for quantification of particle size distributions, particle correlations, surface area, and fractal dimensions in 3D.

  3. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. PMID:26556680

  4. Chemical analysis of high purity graphite

    International Nuclear Information System (INIS)

    The Sub-Committee on Chemical Analysis of Graphite was organized in April 1989, under the Committee on Chemical Analysis of Nuclear Fuels and Reactor Materials, JAERI. The Sub-Committee carried out collaborative analyses among eleven participating laboratories for the certification of the Certified Reference Materials (CRMs), JAERI-G5 and G6, after developing and evaluating analytical methods during the period of September 1989 to March 1992. The certified values were given for ash, boron and silicon in the CRM based on the collaborative analysis. The values for ten elements (Al, Ca, Cr, Fe, Mg, Mo, Ni, Sr, Ti, V) were not certified, but given for information. Preparation, homogeneity testing and chemical analyses for certification of reference materials were described in this paper. (author) 52 refs

  5. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
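
    As a schematic of how fragility curves are crossed with probabilistic seismic hazard results, the sketch below integrates a lognormal fragility over a discretised hazard curve to obtain an annual failure probability. The hazard-curve shape, fragility median and dispersion are invented for illustration and do not come from the study.

```python
import numpy as np
from scipy.stats import lognorm

# Annual probability of failure of a component obtained by combining a
# lognormal seismic fragility curve with a discretised hazard curve.
pga = np.linspace(0.05, 1.5, 30)                    # peak ground acceleration [g]
annual_rate = 1e-2 * (pga / 0.05) ** -2.5           # assumed hazard curve lambda(PGA)

# P(failure | PGA): lognormal fragility with median 0.6 g, beta = 0.45
median, beta = 0.6, 0.45
p_fail_given_im = lognorm.cdf(pga, s=beta, scale=median)

# Rate of exceeding each PGA level -> occurrence rate of each PGA bin,
# then total annual failure probability by the total probability theorem.
d_rate = -np.diff(annual_rate, append=0.0)
annual_p_fail = np.sum(p_fail_given_im * d_rate)
print("annual probability of seismic failure: %.2e" % annual_p_fail)
```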

  6. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and allow a proper number of replicates to be defined for future quantitative analysis. PMID:27358910

  7. Spectroscopic chemical analysis methods and apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  8. Service activities of chemical analysis division

    International Nuclear Information System (INIS)

    Progress of the Division during 1988 is described with respect to service activities for various R and D projects carried out in the Institute, for the fuel fabrication and conversion plant, and for the post-irradiation examination facility. Relevant analytical methodologies developed for the chemical analysis of irradiated fuel, safeguards chemical analysis, and pool water monitoring are included, such as chromatographic separation of lanthanides, polarographic determination of dissolved oxygen in water, and automation of the potentiometric titration of uranium. Some of the revised laboratory manuals are also included in this progress report. (Author)

  9. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  10. Quantitative investigation of red blood cell three-dimensional geometric and chemical changes in the storage lesion using digital holographic microscopy.

    Science.gov (United States)

    Jaferzadeh, Keyvan; Moon, Inkyu

    2015-11-01

    Quantitative phase information obtained by digital holographic microscopy (DHM) can provide new insight into the functions and morphology of single red blood cells (RBCs). Since the functionality of a RBC is related to its three-dimensional (3-D) shape, quantitative 3-D geometric changes induced by storage time can help hematologists realize its optimal functionality period. We quantitatively investigate RBC 3-D geometric changes in the storage lesion using DHM. Our experimental results show that the substantial geometric transformation of the biconcave-shaped RBCs to the spherocyte occurs due to RBC storage lesion. This transformation leads to progressive loss of cell surface area, surface-to-volume ratio, and functionality of RBCs. Furthermore, our quantitative analysis shows that there are significant correlations between chemical and morphological properties of RBCs. PMID:26502322
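
    The morphological quantities discussed above (surface area, volume, surface-to-volume ratio, sphericity) reduce to two simple indices once area and volume have been extracted from the phase maps; the sketch below computes them for illustrative values that are only typical orders of magnitude, not measurements from the study.

```python
import numpy as np

def shape_indices(volume_um3, surface_um2):
    """Surface-to-volume ratio and sphericity of a cell.

    Sphericity = (surface of a sphere with the same volume) / (actual surface);
    it equals 1 for a perfect sphere and decreases for biconcave discocytes.
    """
    s_equiv = np.pi ** (1.0 / 3.0) * (6.0 * volume_um3) ** (2.0 / 3.0)
    return surface_um2 / volume_um3, s_equiv / surface_um2

# Illustrative values for a healthy discocyte vs. a stored, near-spherical cell
# (typical orders of magnitude, not data from the study).
print(shape_indices(volume_um3=90.0, surface_um2=135.0))   # discocyte
print(shape_indices(volume_um3=90.0, surface_um2=97.0))    # near-sphere
```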

  11. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
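
    A common way to extract the symmetric and asymmetric mode frequencies from the accelerometer record is a spectral peak search; the sketch below demonstrates this on a synthesised signal, since no actual data accompany the record. The sampling rate, mode frequencies and amplitudes are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# The "measurement" is synthesised here as a sum of two normal modes plus
# noise; with real data one would load the logged acceleration samples.
fs = 100.0                                   # sampling rate [Hz]
t = np.arange(0.0, 20.0, 1.0 / fs)
f_sym, f_asym = 1.1, 1.7                     # assumed mode frequencies [Hz]
a = (np.cos(2 * np.pi * f_sym * t) + 0.6 * np.cos(2 * np.pi * f_asym * t)
     + 0.05 * np.random.default_rng(0).normal(size=t.size))

spec = np.abs(np.fft.rfft(a * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Spectral peaks above 20 % of the maximum are taken as normal modes.
peak_idx, _ = find_peaks(spec, height=0.2 * spec.max())
print("estimated mode frequencies [Hz]:", freqs[peak_idx])
```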

  12. Quantitative analysis of phenolic compounds of Inula candida (L.) Cass.

    OpenAIRE

    Maleš, Željan; HAZLER PILEPIĆ, KROATA; PETROVIĆ, LINA; BAGARIĆ, IVA

    2010-01-01

    Background and Purpose: Some species of the genus Inula L. are used in herbal medicine. Phenolic compounds contribute to many biological activities of the plants. No literature could be found in respect of the determination of the quantities of phenolic compounds of Inula candida. Quantitative analysis of phenolic compounds in different plant parts was therefore performed. Materials and Methods: Plant material was collected from different locations in Croatia. The content of phenoli...

  13. Modeling of Safety Functions in Quantitative Risk Analysis

    OpenAIRE

    Nguyen, Thien Duy

    2012-01-01

    Quantitative risk analysis in the offshore industry is mandated by the Norwegian legislation. A literature survey is carried out, related to the current legislation from the Norwegian Petroleum Safety Authority (PSA) and supporting NORSOK standards. Process accidents on offshore installations, operating on the Norwegian continental shelf are emphasized. A risk picture is the synthesis of a risk assessment, describing the risk level. Requirements to the risk picture are discussed, and associat...

  14. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  15. Quantitative Analysis of Physical and Statistical Properties of Flocks.

    OpenAIRE

    Bergdahl, Joakim; Almgren, Lars

    2014-01-01

    Flocking behavior is a common phenomenon in nature in the form of, for instance, flocks of birds or schools of fish. Making the assumption that the members of a flock can be considered a system of interacting particles it is possible to use methods from statistical physics to quantitatively analyze flocks and their properties. This way, a flock can exist in various thermodynamic phases and exhibit phase transitions depending on changes within the flock. In this report the flock analysis is pe...

  16. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma*, Ytao; Shakiryanova*, Dinara; Vardya, Irina;

    2004-01-01

    translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required for...... delivery of newly synthesized tubulin to the growing nerve processes. Publication date: 2004...

  17. Automatic quantitative analysis of morphology of apoptotic HL-60 cells

    OpenAIRE

    Liu, Yahui; Lin, Wang; Yang, Xu; Liang, Weizi; Zhang, Jun; Meng, Maobin; Rice, John R.; Sa, Yu; Feng, Yuanming

    2014-01-01

    Morphological identification is a widespread procedure to assess the presence of apoptosis by visual inspection of the morphological characteristics or the fluorescence images. The procedure is lengthy and results are observer dependent. A quantitative automatic analysis is objective and would greatly help the routine work. We developed an image processing and segmentation method which combined the Otsu thresholding and morphological operators for apoptosis study. An automatic determina...
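
    The segmentation step described above (global Otsu thresholding followed by morphological clean-up and per-cell measurements) can be reproduced with standard open-source tools. The sketch below is a generic illustration on a synthetic image, not the authors' exact pipeline; the structuring-element radius and the reported descriptors are assumptions:

      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.morphology import binary_opening, disk
      from skimage.measure import label, regionprops

      # Synthetic fluorescence-like image: bright cell-sized blobs on a noisy background.
      rng = np.random.default_rng(0)
      img = rng.normal(0.1, 0.02, (256, 256))
      yy, xx = np.mgrid[:256, :256]
      for cy, cx in [(60, 70), (150, 180), (200, 60)]:
          img += 0.8 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 12 ** 2))

      # 1) Global Otsu threshold separates cells from background.
      mask = img > threshold_otsu(img)

      # 2) Morphological opening removes small noise specks (radius is an assumption).
      mask = binary_opening(mask, disk(3))

      # 3) Label connected components and report per-cell morphology descriptors.
      for region in regionprops(label(mask)):
          print(f"cell: area={region.area} px, eccentricity={region.eccentricity:.2f}")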

  18. Quantitative Analysis of AGV System in FMS Cell Layout

    OpenAIRE

    B Ramana; S. Sudhakara Reddy; B. Ramprasad

    1997-01-01

    Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing Systems (FMSs) due to their flexibility. The quantitative analysis of an AGV system is useful for determining the material flow rates, operation times, length of delivery, length of empty move of AGV and the number of AGVs required for a typical FMS cell layout. The efficiency of the material handling system...

  19. A Quantitative Analysis of IRAS Maps of Molecular Clouds

    OpenAIRE

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps; this procedure allows us to measure quantitatively the difference bet...

  20. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  1. Quantitative Analysis of Explanatory Factors Disclosures in Hedge Fund

    OpenAIRE

    Zhou, Zhanxu

    2010-01-01

    In this paper, we carry out a series of quantitative analyses with the aim of providing deeper insight into the adjusted excess returns of hedge fund indices and the potential risk factors that influence the strategies. In order to explore which potential explanatory factors are involved in hedge funds, we extract eight hedge fund style indices from each of two well-known databases covering the period from January 2001 to June 2010, and select typical potential risk factors which included common market in...

  2. Entropy generation reduction through chemical pinch analysis

    International Nuclear Information System (INIS)

    The pinch analysis (PA) concept emerged in the late 1980s as one of the methods to address energy management in the new era of sustainable development. It was derived from combined first and second law analysis, as a technique ensuring better thermal integration, aiming at the minimization of entropy production or, equivalently, exergy destruction by heat exchanger networks (HEN). Although its ascendance from the second law analysis is questionable, PA has become a widespread tool nowadays, helping to achieve energy savings mostly through a more rational use of utilities. Unfortunately, as its principal downside, one should be aware that the global minimum entropy production is seldom attained, since PA does not tackle the whole plant, leaving aside the chemical reactors or separation trains. The chemical reactor network (CRN) is responsible for large amounts of entropy generation (exergy losses), mainly due to the combined composition and temperature change. The chemical pinch analysis (CPA) concept focuses on the simultaneous reduction of entropy generation in both the CRN and the HEN, while keeping the state and working parameters of the plant in the range of industrial interest. The fundamental idea of CPA is to include the CRN (through the chemical reaction heat developed in the reactors) into the HEN and to submit this extended system to PA. This is accomplished by replacing the chemical reactor with a virtual heat exchanger system producing the same amount of entropy. For an endothermic non-adiabatic chemical reactor, the (stepwise infinitesimal) supply heat δq flows from a source (an external/internal heater) to the stream undergoing the chemical transformation through the reactor, which in turn releases the heat of reaction ΔHR to a virtual cold stream flowing through a virtual cooler. For an exothermic non-adiabatic chemical reactor, the replacement is analogous, but the heat flows in the opposite direction. Thus, in the practice of designing or retrofitting a flowsheet, in order to

  3. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    This paper reports that, in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  4. Research in quantitative microscopic X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    A feasibility study of quantitative elemental microanalysis of biological materials and glass samples by microbeam X-ray fluorescence spectroscopy was completed. The research included testing the homogeneity of existing standards for X-ray fluorescence calibration and verification of a fundamental parameters method for quantitative analysis. The goal was to evaluate the X-ray fluorescence spectrometer as a tool for elemental analysis at the microscale level. Glass Standard Reference Materials were analyzed. The glass specimens consisted of flat, optically polished slabs having three different thicknesses. For calibration, metal thin films were used. The microbeam X-ray fluorescence spectrometer utilizing capillary optics with effective beam diameter equal to about 30 μm has been applied in this research. Sources of uncertainties considered in this work were detector and X-ray tube stability, specimen movement, and spectral deconvolution. Concentrations of analytes were calculated using a fundamental parameters approach. Coherently and incoherently scattered lines of tube target were used for matrix correction and to estimate the mass thickness of the sample. The synchrotron microbeam X-ray fluorescence technique was used for quantitative analysis of human brain tissue samples. In measurements the monochromatic and polychromatic synchrotron microbeams were applied. The same area of tissue sample was scanned with the use of both X-ray microbeams. The concentrations of selected elements were computed. A reasonably good agreement between results of both analyses was obtained

  5. Development of quantitative interspecies toxicity relationship modeling of chemicals to fish.

    Science.gov (United States)

    Fatemi, M H; Mousa Shahroudi, E; Amini, Z

    2015-09-01

    In this work, quantitative interspecies-toxicity relationship methodologies were used to improve the predictive power of interspecies toxicity models. The most relevant descriptors, selected by stepwise multiple linear regression, together with the toxicity of each chemical to Daphnia magna, were used to predict the toxicities of the chemicals to fish. The modeling methods used to develop linear and nonlinear models were multiple linear regression (MLR), random forest (RF), artificial neural network (ANN) and support vector machine (SVM). The obtained results indicate the superiority of the SVM model over the other models. The robustness and reliability of the constructed SVM model were evaluated using the leave-one-out cross-validation method (Q(2)=0.69, SPRESS=0.822) and a Y-randomization test (R(2)=0.268 for 30 trials). Furthermore, the chemical applicability domains of these models were determined via the leverage approach. The developed SVM model was used to predict the toxicity to fish of 46 compounds whose experimental fish toxicities had not been reported previously, from their toxicities to D. magna and the relevant molecular descriptors. PMID:26002421
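
    The validation statistics quoted above (leave-one-out Q² and a Y-randomization check) follow standard QSAR practice. A minimal scikit-learn sketch of that workflow, using random placeholder descriptors and responses rather than the authors' data set or tuned hyperparameters:

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 5))     # placeholder predictors (descriptors + D. magna toxicity)
      y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.1]) + rng.normal(0, 0.3, 60)  # placeholder fish toxicity

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

      def q2_loo(X, y):
          """Leave-one-out cross-validated Q^2 = 1 - PRESS / TSS."""
          y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
          return 1.0 - np.sum((y - y_pred) ** 2) / np.sum((y - y.mean()) ** 2)

      print(f"Q2(LOO), real response:     {q2_loo(X, y):.2f}")

      # Y-randomization: shuffling the response should destroy the correlation.
      print(f"Q2(LOO), shuffled response: {q2_loo(X, rng.permutation(y)):.2f}")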

  6. QuantUM: Quantitative Safety Analysis of UML Models

    CERN Document Server

    Leitner-Fischer, Florian; 10.4204/EPTCS.57.2

    2011-01-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysi...

  7. Quantitative analysis for nonlinear fluorescent spectra based on edges matching

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A novel spectra-edge-matching approach is proposed for the quantitative analysis of the nonlinear fluorescence spectra of air impurities excited by a femtosecond laser. The fluorescence spectra are first denoised and compressed, both by wavelet transform, and several peak groups are then picked from each spectrum according to an intensity threshold and used to extract the spectral features through principal component analysis. It is shown that the first two principal components actually cover up to 98% of the total information and are sufficient for the final concentration analysis. The analysis reveals a monotone relationship between the spectral intensity and the concentration of the air impurities, suggesting that femtosecond laser-induced fluorescence spectroscopy, along with the proposed spectral analysis method, can become a powerful tool for monitoring environmental pollutants.
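
    The processing chain sketched above (wavelet denoising/compression followed by principal component analysis, with the leading components retained for concentration analysis) maps onto widely available libraries. The sketch below uses synthetic spectra whose peak heights scale with an assumed concentration; the wavelet family, decomposition level and retained-coefficient fraction are illustrative choices, not the paper's settings:

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA

      # Synthetic fluorescence spectra: two peaks whose heights scale with an assumed
      # impurity concentration, plus noise (placeholder for the femtosecond-laser data).
      rng = np.random.default_rng(2)
      wavelengths = np.linspace(300, 500, 512)
      concentrations = np.linspace(0.1, 1.0, 20)
      spectra = np.array([
          c * np.exp(-(wavelengths - 357) ** 2 / 30) +
          0.5 * c * np.exp(-(wavelengths - 391) ** 2 / 40) +
          rng.normal(0, 0.02, wavelengths.size)
          for c in concentrations
      ])

      def wavelet_compress(spectrum, wavelet="db4", level=4, keep=0.1):
          """Denoise/compress one spectrum by zeroing the smaller wavelet coefficients."""
          coeffs = pywt.wavedec(spectrum, wavelet, level=level)
          threshold = np.quantile(np.abs(np.concatenate(coeffs)), 1 - keep)
          coeffs = [pywt.threshold(c, threshold, mode="hard") for c in coeffs]
          return pywt.waverec(coeffs, wavelet)[: spectrum.size]

      compressed = np.array([wavelet_compress(s) for s in spectra])

      # The first two principal components carry nearly all of the spectral variance.
      pca = PCA(n_components=2)
      scores = pca.fit_transform(compressed)
      print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
      print("PC1 score vs concentration correlation:",
            np.corrcoef(scores[:, 0], concentrations)[0, 1].round(3))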

  8. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, passing through their origins and fitted using the least squares method.
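
    The arithmetic behind the ratio-of-slopes method is compact: two straight lines are forced through the origin by least squares, and the weight fraction follows from the ratio of their slopes. A generic numerical sketch (the intensity-ratio data below are invented, and the labelling of the two data sets is only meant to convey the calculation, not the exact experimental arrangement):

      import numpy as np

      def slope_through_origin(x, y):
          """Least-squares slope of y = m * x (line forced through the origin)."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          return np.sum(x * y) / np.sum(x * x)

      # Invented data: intensity ratio of the phase of interest to the internal
      # standard versus the amount of material, for the reference mixtures (pure
      # phase) and for the ceramic body being analysed.
      amount          = np.array([0.1, 0.2, 0.3, 0.4, 0.5])        # g
      ratio_reference = np.array([0.21, 0.39, 0.62, 0.80, 1.01])   # reference line
      ratio_analysis  = np.array([0.08, 0.17, 0.25, 0.33, 0.42])   # analysis line

      m_ref = slope_through_origin(amount, ratio_reference)
      m_ana = slope_through_origin(amount, ratio_analysis)

      # The ratio of the two slopes gives the weight fraction of the phase.
      print(f"weight fraction of the phase = {m_ana / m_ref:.2%}")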

  9. Chemical Bond Analysis of Single Crystal Growth of Magnesium Oxide

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Starting from the crystallographic structure of magnesium oxide (MgO), both the chemical bond model of solids and Pauling's third rule (polyhedral sharing rule) were employed to quantitatively analyze the chemical bonding structure of the constituent atoms and single crystal growth. Our analytical results show that MgO single crystals prefer to grow along the direction and that the growth rate of the {100} plane is the slowest one. Therefore, the {100} plane of MgO crystals can be the ultimate morphology face, which is in good agreement with our previous experimental results. The study indicates that such structure analysis is an effective tool for controlling single-crystal growth.

  10. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    The application of laser interferometry to industrial non-destructive testing and material characterization is becoming more prevalent since this method provides non-contact, full-field inspection of the test object. However, its application has largely been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterize the detected defect in detail. This drives the design features for the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is raising the requirements on the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. The difference between the measurement analyses can be attributed to these error factors. (Author)

  11. Automatic Facial Measurements for Quantitative Analysis of Rhinoplasty

    Directory of Open Access Journals (Sweden)

    Mousa Shamsi

    2007-08-01

    Proposing automated algorithms for quantitative analysis of facial images based on facial features may assist surgeons to validate the success of nose surgery in an objective and reproducible manner. In this paper, we attempt to develop automatic procedures for quantitative analysis of the rhinoplasty operation based on several standard linear and spatial features. The main processing steps include image enhancement, correction of varying illumination effects, automatic facial skin detection, automatic feature extraction, facial measurements and surgery analysis. For quantitative analysis of nose surgery, we randomly selected 100 patients from the database provided by the ENT division of Imam Hospital, Tehran, Iran. The frontal and profile images of these patients before and after rhinoplasty were available for the experiments. For statistical analysis, two clinical nasal parameters, i.e., the nasolabial angle and the nasal projection ratio, are computed. The mean and standard deviation of the nasolabial angle by manual measurement of a specialist were 95.98° (±9.58°) and 111.02° (±10.07°) before and after nose surgery, respectively. The proposed algorithm automatically computed this parameter as 94.12° (±8.86°) and 109.65° (±8.86°) before and after nose surgery. In addition, the proposed algorithm automatically computed the nasal projection by Good's method as 0.584 (±0.0491) and 0.537 (±0.066) before and after nose surgery, respectively. Meanwhile, this parameter was manually measured by a specialist as 0.576 (±0.052) and 0.537 (±0.077) before and after nose surgery, respectively. The results of the proposed facial skin segmentation and feature detection algorithms, and the estimated values of the above two clinical parameters on the described dataset, indicate that the techniques are applicable in the common clinical practice of nose surgery.

  12. Quantitative epistasis analysis and pathway inference from genetic interaction data.

    Directory of Open Access Journals (Sweden)

    Hilary Phenix

    2011-05-01

    Inferring regulatory and metabolic network models from quantitative genetic interaction data remains a major challenge in systems biology. Here, we present a novel quantitative model for interpreting epistasis within pathways responding to an external signal. The model provides the basis of an experimental method to determine the architecture of such pathways, and establishes a new set of rules to infer the order of genes within them. The method also allows the extraction of quantitative parameters enabling a new level of information to be added to genetic network models. It is applicable to any system where the impact of combinatorial loss-of-function mutations can be quantified with sufficient accuracy. We test the method by conducting a systematic analysis of a thoroughly characterized eukaryotic gene network, the galactose utilization pathway in Saccharomyces cerevisiae. For this purpose, we quantify the effects of single and double gene deletions on two phenotypic traits, fitness and reporter gene expression. We show that applying our method to fitness traits reveals the order of metabolic enzymes and the effects of accumulating metabolic intermediates. Conversely, the analysis of expression traits reveals the order of transcriptional regulatory genes, secondary regulatory signals and their relative strength. Strikingly, when the analyses of the two traits are combined, the method correctly infers ~80% of the known relationships without any false positives.
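
    At its simplest, quantitative epistasis analysis compares the measured double-mutant phenotype with the value expected from the two single mutants and uses the sign and size of the deviation to order genes. The sketch below uses the common multiplicative null model as a stand-in; the paper derives its own pathway-specific expectations, and the fitness numbers here are placeholders:

      def epistasis(w_a, w_b, w_ab, w_wt=1.0):
          """Multiplicative epistasis: deviation of the double mutant from the
          product of the single-mutant relative fitnesses."""
          rel_a, rel_b, rel_ab = w_a / w_wt, w_b / w_wt, w_ab / w_wt
          return rel_ab - rel_a * rel_b

      # Placeholder fitness measurements for two single deletions and the double mutant.
      eps = epistasis(w_a=0.70, w_b=0.80, w_ab=0.56)
      # eps ~ 0: independent effects; eps < 0: synergistic (aggravating);
      # eps > 0: alleviating, as expected for genes acting in the same pathway.
      print(f"epsilon = {eps:+.3f}")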

  13. Quantitative Chemically-Specific Coherent Diffractive Imaging of Buried Interfaces using a Tabletop EUV Nanoscope

    CERN Document Server

    Shanblatt, Elisabeth R; Gardner, Dennis F; Mancini, Giulia F; Karl, Robert M; Tanksalvala, Michael D; Bevis, Charles S; Vartanian, Victor H; Kapteyn, Henry C; Adams, Daniel E; Murnane, Margaret M

    2016-01-01

    Characterizing buried layers and interfaces is critical for a host of applications in nanoscience and nano-manufacturing. Here we demonstrate non-invasive, non-destructive imaging of buried interfaces using a tabletop, extreme ultraviolet (EUV), coherent diffractive imaging (CDI) nanoscope. Copper nanostructures inlaid in SiO2 are coated with 100 nm of aluminum, which is opaque to visible light and thick enough that neither optical microscopy nor atomic force microscopy can image the buried interfaces. Short wavelength (29 nm) high harmonic light can penetrate the aluminum layer, yielding high-contrast images of the buried structures. Moreover, differences in the absolute reflectivity of the interfaces before and after coating reveal the formation of interstitial diffusion and oxidation layers at the Al-Cu and Al-SiO2 boundaries. Finally, we show that EUV CDI provides a unique capability for quantitative, chemically-specific imaging of buried structures, and the material evolution that occurs at these buried ...

  14. Quantitative study of nanoscale precipitates in Al–Zn–Mg–Cu alloys with different chemical compositions

    International Nuclear Information System (INIS)

    The present work gives a quantitative study on the nanoscale precipitates in Al–Zn–Mg–Cu alloys with 6 different chemical compositions (all are in peak-aged state), by combining synchrotron-based small-angle X-ray scattering (SAXS) and transmission electron microscopy (TEM) techniques. Based on the TEM observations, the size, shape, interprecipitate distance, volume fraction and number density of precipitates are extracted from SAXS data through model fitting. The results show that after peak-aging treatment, the average precipitate sizes in different alloys are close to each other, whereas the volume fraction and number density of nanoscale precipitates increase with the increasing content of Zn+Mg+Cu as well as the Zn/Mg ratio. The results also show that the alloy with a higher number density/volume fraction of nano-scale precipitates tends to have a higher mechanical strength

  15. Quantitative analysis on tectonic deformation of active rupture zones

    Institute of Scientific and Technical Information of China (English)

    JIANG Zai-sen; NIU An-fu; WANG Min; LI Kai-wu; FANG Ying; ZHANG Xi; ZHANG Xiao-liang

    2005-01-01

    Based on the regional GPS data of high spatial resolution, we present a method of quantitative analysis on the tectonic deformation of active rupture zones in order to predict the location of forthcoming major earthquakes. Firstly we divide the main fault area into certain deformation units, then derive the geometric deformation and relative dislocation parameters of each unit and finally estimate quantitatively the slip and strain rates in each segment of the rupture zone. Furthermore, by comparing the consistency of deformation in all segments of the whole rupture zone, we can determine the possible anomalous segments as well as their properties and amplitudes. In analyzing the eastern boundaries of Sichuan-Yunnan block with the GPS velocity data for the period of 1991~2001, we have discovered that the Mianning-Ningnan-Dongchuan segment on the Zemuhe-Xiaojiang fault zone is relatively locked and the left-lateral shear strain rate here is higher.

  16. Segregation Analysis on Genetic System of Quantitative Traits in Plants

    Institute of Scientific and Technical Information of China (English)

    Gai Junyi

    2006-01-01

    Based on the traditional polygene inheritance model of quantitative traits, the author suggests the major gene and polygene mixed inheritance model. The model was considered as a general one, while the pure major gene and pure polygene inheritance models were specific cases of the general model. Based on the proposed theory, the author established the segregation analysis procedure to study the genetic system of quantitative traits of plants. At present, this procedure can be used to evaluate the genetic effect of individual major genes (up to two to three major genes), the collective genetic effect of polygenes, and their heritability values. This paper introduces how to establish the procedure, its main achievements, and its applications. An example is given to illustrate the steps, methods, and effectiveness of the procedure.

  17. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    Science.gov (United States)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  18. Developing a Semi-Quantitative Occupational Risk Prediction Model for Chemical Exposures and Its Application to a National Chemical Exposure Databank

    Directory of Open Access Journals (Sweden)

    Chiu-Ying Chen

    2013-07-01

    In this study, a semi-quantitative occupational chemical exposure risk prediction model, based on the calculation of exposure hazard indexes, was proposed, corrected, and applied to a national chemical exposure databank. The model comprises one factor used to describe toxicity (i.e., the toxicity index) and two factors used to reflect the exposure potential (i.e., the exposure index and the protection deficiency index) of workers exposed to chemicals. An expert system was used to correct the above proposed model. By applying the corrected model to data obtained from a national occupational chemical hazard survey program, chemical exposure risks of various manufacturing industries were determined and a national control strategy for the abatement of occupational chemical exposures was proposed. The results of the present study would provide useful information for governmental agencies to allocate their limited resources effectively for reducing chemical exposures of workers.
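
    The abstract names the three ingredients of the exposure hazard index (a toxicity index, an exposure index and a protection deficiency index) but not the scoring scales or the combination rule. The sketch below therefore only illustrates one common semi-quantitative convention, multiplying ordinal scores and banding the result; the scales, the rule and the band cut-offs are hypothetical, not those of the cited model:

      from dataclasses import dataclass

      @dataclass
      class ChemicalExposure:
          toxicity_index: int          # 1 (low toxicity) .. 5 (high); hypothetical scale
          exposure_index: int          # 1 (rare/low exposure) .. 5 (frequent/high); hypothetical scale
          protection_deficiency: int   # 1 (good controls) .. 5 (no controls); hypothetical scale

      def hazard_index(e: ChemicalExposure) -> int:
          """One common semi-quantitative convention: multiply the ordinal scores.
          (The combination rule of the cited model is not reproduced here.)"""
          return e.toxicity_index * e.exposure_index * e.protection_deficiency

      def risk_band(score: int) -> str:
          """Hypothetical banding used only to illustrate prioritisation."""
          if score >= 60:
              return "high - immediate control measures"
          if score >= 20:
              return "medium - schedule improvements"
          return "low - periodic review"

      workplace = ChemicalExposure(toxicity_index=4, exposure_index=3, protection_deficiency=4)
      score = hazard_index(workplace)
      print(score, risk_band(score))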

  19. Qualitative and quantitative analysis of volatile constituents from latrines.

    Science.gov (United States)

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl-mono-(di;tri) sulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short chain fatty acids such as butyric acid (13 mg/g) explained the strong rancid, manure and farm yard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  20. Quantitative analysis of radiation-induced changes in sperm morphology

    International Nuclear Information System (INIS)

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure
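
    The doubling dose quoted above follows from two steps: establishing the 95% normal range of the shape features on the control animals, then regressing the fraction of abnormal sperm on dose and solving for the dose at which that fraction is twice the control value. A schematic numerical illustration of the second step (the dose-response numbers are invented for the example; only the arithmetic mirrors the described procedure):

      import numpy as np

      # Invented example data: percentage of sperm falling outside the 95 % normal
      # range of the control shape features, at each X-ray dose (rad).
      dose            = np.array([0.0, 30.0, 60.0, 90.0, 120.0])
      percent_abnormal = np.array([5.0, 8.6, 12.9, 16.8, 20.4])

      # Linear dose-response fit: abnormal% = baseline + slope * dose.
      slope, baseline = np.polyfit(dose, percent_abnormal, 1)

      # Doubling dose = dose at which the abnormal fraction is twice the control value.
      doubling_dose = baseline / slope
      print(f"baseline = {baseline:.1f}%, slope = {slope:.3f} %/rad")
      print(f"doubling dose = {doubling_dose:.0f} rad")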

  1. Quantitative analysis of radiation-induced changes in sperm morphology

    Energy Technology Data Exchange (ETDEWEB)

    Young, I.T.; Gledhill, B.L.; Lake, S.; Wyrobek, A.J.

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure.

  2. Quantitative analysis of radiation-induced changes in sperm morphology.

    Science.gov (United States)

    Young, I T; Gledhill, B L; Lake, S; Wyrobek, A J

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure. PMID:6184000

  3. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials and is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth groundwork was necessary to know precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained makes it possible to perform, on a routine basis, a quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the sulphiding degree of molybdenum quite reliably and reproducibly. The use of this method is illustrated by two examples for which XPS spectroscopy has provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
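
    In its simplest form, the routine quantification described above divides each element's peak area by its response (sensitivity) factor and normalises the corrected areas to unity. The sketch below shows only that arithmetic with invented peak areas and generic relative sensitivity factors; the IFP response-factor set and the transmission-function correction are not reproduced:

      def xps_atomic_fractions(peak_areas, response_factors):
          """Relative atomic concentrations from XPS peak areas:
          C_i = (A_i / S_i) / sum_j (A_j / S_j)."""
          corrected = {el: peak_areas[el] / response_factors[el] for el in peak_areas}
          total = sum(corrected.values())
          return {el: value / total for el, value in corrected.items()}

      # Invented peak areas (counts.eV) and generic relative sensitivity factors.
      areas = {"Mo3d": 15000.0, "S2p": 10200.0, "Al2p": 45000.0, "O1s": 160000.0}
      rsf   = {"Mo3d": 3.32,    "S2p": 0.67,    "Al2p": 0.54,    "O1s": 0.78}

      for element, fraction in xps_atomic_fractions(areas, rsf).items():
          print(f"{element:5s}  {fraction:6.1%}")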

  4. Chemical abundance analysis of 19 barium stars

    CERN Document Server

    Yang, G C; Spite, M; Chen, Y Q; Zhao, G; Zhang, B; Liu, G Q; Liu, Y J; Liu, N; Deng, L C; Spite, F; Hill, V; Zhang, C X

    2016-01-01

    We aim to derive accurate atmospheric parameters and chemical abundances for 19 barium (Ba) stars, including both strong and mild Ba stars, based on high signal-to-noise ratio, high-resolution Echelle spectra obtained with the 2.16 m telescope at the Xinglong station of the National Astronomical Observatories, Chinese Academy of Sciences. The chemical abundances of the sample stars were obtained from an LTE, plane-parallel and line-blanketed atmospheric model by inputting the atmospheric parameters (effective temperature, surface gravity, metallicity and microturbulent velocity) and the equivalent widths of stellar absorption lines. The atmospheric parameters, metallicities and a kinematic analysis of the UVW velocities indicate that these Ba stars are giants. Chemical abundances of 17 elements were obtained for these Ba stars. Their light elements (O, Na, Mg, Al, Si, Ca, Sc, Ti, V, Cr, Mn and Ni) are similar to the solar abundances. Our sample of Ba stars shows obvious overabundances of neutron-capture (n-ca...

  5. Quantitative option analysis for implementation and management of landfills.

    Science.gov (United States)

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented with five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of the probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of the ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised to be implemented, after which a long-term operating contract may follow. PMID:27354014

  6. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    Science.gov (United States)

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to clinical environment. These limitations enforce an urgent need for the creation of a new system to quantify facial movement and allow for an easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assess complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic) making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  7. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation on the potentials of neutron diffraction to characterize archaeological bronze artifacts. The preliminary feasibility of phase and structural analysis was demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The Bragg peak widths that resulted were strictly dependent on the working treatment, thus providing an important analytical element to investigate ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some peculiar perspective of the neutron diffraction diagnostics in archeometric applications. (orig.)

  8. Controlling the accuracy of chemical analysis

    International Nuclear Information System (INIS)

    Most of the IAEA reference materials are certified in intercomparisons by calculation of the overall mean of reported laboratory mean values. IAEA certification is provided at 'A level' (satisfactory, or high degree of confidence) or at 'B level' (acceptable, or reasonable degree of confidence). Sampling, storage and preliminary processing, the use of reliable analytical methods, and internal and external control of accuracy and reliability result in excellent certified reference materials for inorganic, geologic, environmental, biological and other quantitative analysis by means of conventional and nuclear methods. 34 refs, 4 figs, 3 tabs

  9. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, which are applied widely in industry, are nonlinear dynamic systems of multi-degree-of-freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using trajectory-based stability-preserving and dimensional reduction, a quantitative stability analysis method for rotor systems is presented. At first, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  10. Using Matlab in quantitative analysis of yeast growth

    Czech Academy of Sciences Publication Activity Database

    Schier, Jan; Kovář, Bohumil

    Praha : Humusoft, s.r.o, 2009, s. 1-9. ISBN 978-80-7080-733-0. [Technical Computing Prague 2009. Praha (CZ), 19.11.2009] R&D Projects: GA MŠk(CZ) 1M0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : image processing * Petri dish * Matlab * image segmentation Subject RIV: JC - Computer Hardware ; Software http://library.utia.cas.cz/separaty/2009/ZS/schier-using matlab in quantitative analysis of yeast growth.pdf

  11. Quantitative analysis of nuclear personnel and problems of personnel training

    International Nuclear Information System (INIS)

    Recently it has come to be understood that there is no sensible alternative to nuclear power if we are to cope with global warming and sustain civilization. To construct new or additional nuclear power plants in the world, a great number of engineers and scientists will be needed. Results of a quantitative analysis of nuclear personnel trends, including the personnel demand based on future domestic and overseas markets, are presented together with the necessity of, and problems in, nuclear personnel training in Japan. The Japanese government has run the 'Nuclear Personnel Training Program' since FY2007. (T. Tanaka)

  12. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    in a number of work cases. These mainly fall into three categories: (i) Description of coarse-scale measures to quantify surface structure or texture (topography); (ii) Characterization of fracture surfaces in steels (fractography); (iii) Grain boundary segmentation in sintered ceramics. The...... theoretical foundations of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the...

  13. Micro-PIXE for the quantitative imaging of chemical elements in single cells

    Energy Technology Data Exchange (ETDEWEB)

    Ortega, R. [Univ. Bordeaux, CENBG, Gradignan (France); CNRS, IN2P3, CENBG, Gradignan (France)

    2013-07-01

    The knowledge of the intracellular distribution of biologically relevant metals is important to understand their mechanisms of action in cells, whether for physiological, toxicological or pathological processes. However, the direct detection of trace metals in single cells is a challenging task that requires sophisticated analytical developments. The aim of this seminar will be to present the recent achievements in this field using micro-PIXE analysis. The combination of micro-PIXE with RBS (Rutherford Backscattering Spectrometry) and STIM (Scanning Transmission Ion Microscopy) allows the quantitative determination of trace metal content within sub-cellular compartments. The application of STIM analysis will be more specifically highlighted as it provides high spatial resolution imaging (<200 nm) and excellent mass sensitivity (<0.1 ng). Application of the STIM-PIXE-RBS methodology is absolutely needed when organic mass loss appears during PIXE-RBS irradiation. This combination of STIM-PIXE-RBS provides fully quantitative determination of trace element content, expressed in μg/g, which is a quite unique capability for micro-PIXE compared to other micro-analytical methods such as electron and synchrotron X-ray fluorescence or the techniques based on mass spectrometry. Examples of micro-PIXE studies for subcellular imaging of trace elements in various fields of interest will be presented, such as metal-based toxicology, pharmacology, and neurodegeneration. [1] R. Ortega, G. Devés, A. Carmona. J. R. Soc. Interface, 6, (2009) S649-S658. (author)

  14. Micro-PIXE for the quantitative imaging of chemical elements in single cells

    International Nuclear Information System (INIS)

    The knowledge of the intracellular distribution of biologically relevant metals is important to understand their mechanisms of action in cells, whether for physiological, toxicological or pathological processes. However, the direct detection of trace metals in single cells is a challenging task that requires sophisticated analytical developments. The aim of this seminar will be to present the recent achievements in this field using micro-PIXE analysis. The combination of micro-PIXE with RBS (Rutherford Backscattering Spectrometry) and STIM (Scanning Transmission Ion Microscopy) allows the quantitative determination of trace metal content within sub-cellular compartments. The application of STIM analysis will be more specifically highlighted as it provides high spatial resolution imaging (<200 nm) and excellent mass sensitivity (<0.1 ng). Application of the STIM-PIXE-RBS methodology is absolutely needed when organic mass loss appears during PIXE-RBS irradiation. This combination of STIM-PIXE-RBS provides fully quantitative determination of trace element content, expressed in μg/g, which is a quite unique capability for micro-PIXE compared to other micro-analytical methods such as electron and synchrotron X-ray fluorescence or the techniques based on mass spectrometry. Examples of micro-PIXE studies for subcellular imaging of trace elements in various fields of interest will be presented, such as metal-based toxicology, pharmacology, and neurodegeneration. [1] R. Ortega, G. Devés, A. Carmona. J. R. Soc. Interface, 6, (2009) S649-S658. (author)

  15. Mass spectrometry analysis of polychlorinated biphenyls: chemical ionization and selected ion chemical ionization using methane as a reagent gas

    OpenAIRE

    RAYMOND E. MARCH; MILA D. LAUSEVIC; TATJANA M. VASILJEVIC

    2000-01-01

    In the present paper a quadrupole ion trap mass spectrometer, coupled with a gas chromatograph, was used to compare the electron impact ionization (EI) and chemical ionization (Cl) technique, in terms of their selectivity in polychlorinated biphenyls (PCBs) quantitative analysis. The experiments were carried out with a modified Varian SATURN III quadrupole ion-storage mass spectrometer equipped with Varian waveform generator, coupled with a gas chromatograph with DB-5 capillary column. The di...

  16. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were enrolled in this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity in each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
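
    The drainage half-time used as the index above is typically obtained by fitting an exponential washout to the post-diuretic counts in the region of interest and converting the rate constant to a half-time. A schematic sketch with synthetic counts (the fitting window, the mono-exponential model and the quoted decision threshold are generic conventions, not necessarily the authors' protocol):

      import numpy as np

      # Synthetic post-diuretic renogram: ROI counts draining exponentially.
      t_min = np.arange(0.0, 20.0, 0.5)          # minutes after the diuretic
      true_half_time = 12.0                      # minutes (illustrative)
      counts = 5000.0 * np.exp(-np.log(2) * t_min / true_half_time)
      counts += np.random.default_rng(3).normal(0, 50, t_min.size)

      # Fit ln(counts) = ln(C0) - k * t, then T1/2 = ln(2) / k.
      slope, log_c0 = np.polyfit(t_min, np.log(np.clip(counts, 1.0, None)), 1)
      half_time = np.log(2) / -slope
      print(f"drainage half-time T1/2 = {half_time:.1f} min")
      # Long half-times (commonly, well above ~20 min) are read as suggestive of
      # obstruction, but the decision thresholds belong to the clinical protocol.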

  17. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  18. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm2 and 5–10 mm2 and calculated the percentages of the lung area taken up by each (%CSA) in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). After adjusting for DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  19. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 test subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
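
    A common first step when quantifying EOG traces of this kind is a velocity-threshold detector that separates saccades from smooth pursuit. The sketch below applies that generic idea to a synthetic horizontal EOG channel; the sampling rate, signal shape and velocity threshold are assumptions, not the study's settings:

      import numpy as np

      fs = 250.0                                   # sampling rate, Hz (assumed)
      t = np.arange(0, 10, 1 / fs)

      # Synthetic horizontal EOG: slow pursuit component with two step-like saccades.
      eog = 2.0 * np.sin(2 * np.pi * 0.2 * t)      # smooth pursuit component (deg)
      eog += 8.0 * (t > 3.0) - 6.0 * (t > 7.0)     # saccadic jumps at 3 s and 7 s
      eog += np.random.default_rng(4).normal(0, 0.05, t.size)

      # Velocity-threshold detection (the 100 deg/s threshold is an assumed value).
      velocity = np.gradient(eog, 1 / fs)
      saccade_mask = np.abs(velocity) > 100.0

      # Report the onset time of each detected saccade.
      onsets = np.flatnonzero(np.diff(saccade_mask.astype(int)) == 1)
      print("saccade onsets (s):", np.round(t[onsets], 2))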

  20. Quantitative and fingerprinting analysis of Pogostemon cablin based on GC-FID combined with chemometrics.

    Science.gov (United States)

    Yang, Yinhui; Kong, Weijun; Feng, Huanhuan; Dou, Xiaowen; Zhao, Lianhua; Xiao, Qiang; Yang, Meihua

    2016-03-20

    In this study, a simple, sensitive and reliable gas chromatography-flame ionization detection (GC-FID) method is established for quantitative chemical fingerprinting of essential oils from Pogostemon cablin. Oil samples are prepared by hydrodistillation, with yields ranging from 0.73% to 2.02%. The two main components of the oil, patchouli alcohol and pogostone, were detected simultaneously in 36 samples and were found to have average contents of 43.07% and 7.84%, respectively. The method was validated in terms of linearity, sensitivity, precision, stability, and accuracy. All calibration curves showed excellent linearity (r(2)>0.9992) within the test ranges, and the relative standard deviation (RSD) values for intra- and inter-day precision were less than 1.5%, indicating a high degree of precision. The GC-FID chemical fingerprints of the 36 samples were established using 12 common peaks which account for over 90% of the total peak area. Chemometric techniques, including similarity analysis and hierarchical cluster analysis, were also employed to explore the similarities and outstanding consistencies among different P. cablin oil samples. The results demonstrate that chromatographic fingerprinting and quantitative analysis can be achieved simultaneously when evaluating quality and authenticating samples of P. cablin. PMID:26799976
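
    The two chemometric steps mentioned above (similarity analysis against a reference fingerprint and hierarchical cluster analysis of the samples) can be sketched with standard scientific-Python tools. The peak-area table below is invented; only the workflow mirrors the description:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Invented peak-area table: rows = oil samples, columns = the 12 common GC-FID peaks.
      rng = np.random.default_rng(5)
      reference = rng.uniform(1.0, 10.0, 12)                  # reference (mean) fingerprint
      samples = reference + rng.normal(0.0, 0.8, (36, 12))    # 36 samples around it
      samples[30:] *= rng.uniform(0.3, 1.7, (6, 12))          # a few atypical samples

      # Similarity analysis: correlation of each sample with the reference fingerprint.
      similarity = np.array([np.corrcoef(s, reference)[0, 1] for s in samples])
      print(f"similarity range: {similarity.min():.3f} - {similarity.max():.3f}")

      # Hierarchical cluster analysis on the normalised fingerprints (Ward linkage).
      normalised = samples / samples.sum(axis=1, keepdims=True)
      tree = linkage(normalised, method="ward")
      clusters = fcluster(tree, t=3, criterion="maxclust")
      print("cluster sizes:", np.bincount(clusters)[1:])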

  1. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604
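    A toy illustration, not the paper's actual PNAS-derived network, of how hub indicators can be read off a discipline co-occurrence graph; the edge list below is invented for demonstration.

    ```python
    import networkx as nx

    # Hypothetical interdisciplinary network: nodes are disciplines, an edge means
    # the two disciplines co-occur in at least one paper's subject classification.
    edges = [
        ("applied mathematics", "biology"),
        ("applied mathematics", "physics"),
        ("applied mathematics", "economics"),
        ("applied mathematics", "computer science"),
        ("physics", "chemistry"),
        ("biology", "chemistry"),
    ]

    G = nx.Graph()
    G.add_edges_from(edges)

    # Simple hub indicators: degree and betweenness centrality
    degree = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G)

    for node in G.nodes:
        print(f"{node:20s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")
    ```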

  2. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  3. Graphene-Semiconductor Catalytic Nanodiodes for Quantitative Detection of Hot Electrons Induced by a Chemical Reaction.

    Science.gov (United States)

    Lee, Hyosun; Nedrygailov, Ievgen I; Lee, Young Keun; Lee, Changhwan; Choi, Hongkyw; Choi, Jin Sik; Choi, Choon-Gi; Park, Jeong Young

    2016-03-01

    Direct detection of hot electrons generated by exothermic surface reactions on nanocatalysts is an effective strategy to obtain insight into electronic excitation during chemical reactions. For this purpose, we fabricated a novel catalytic nanodiode based on a Schottky junction between a single layer of graphene and an n-type TiO2 layer that enables the detection of hot electron flows produced by hydrogen oxidation on Pt nanoparticles. By making a comparative analysis of data obtained from measuring the hot electron current (chemicurrent) and turnover frequency, we demonstrate that graphene's unique electronic structure and extraordinary material properties, including its atomically thin nature and ballistic electron transport, allow improved conductivity at the interface between the catalytic Pt nanoparticles and the support. Thereby, graphene-based nanodiodes offer an effective and facile way to approach the study of chemical energy conversion mechanisms in composite catalysts with carbon-based supports. PMID:26910271

  4. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heating steps at medium power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis proves to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix based on uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters of the images and analyses. The corium sample studied consisted of two zones displaying very

  5. Analysis of Chemical Technology Division waste streams

    International Nuclear Information System (INIS)

    This document is a summary of the sources, quantities, and characteristics of the wastes generated by the Chemical Technology Division (CTD) of the Oak Ridge National Laboratory. The major contributors of hazardous, mixed, and radioactive wastes in the CTD as of the writing of this document were the Chemical Development Section, the Isotopes Section, and the Process Development Section. The objectives of this report are to identify the sources and to summarize the quantities and characteristics of hazardous, mixed, gaseous, and solid and liquid radioactive wastes that are generated by the Chemical Technology Division (CTD) of the Oak Ridge National Laboratory (ORNL). This study was performed in support of the CTD waste-reduction program -- the goals of which are to reduce both the volume and hazard level of the waste generated by the division. Prior to the initiation of any specific waste-reduction projects, an understanding of the overall waste-generation system of CTD must be developed. Therefore, the general approach taken in this study is that of an overall CTD waste-systems analysis, which is a detailed presentation of the generation points and general characteristics of each waste stream in CTD. The goal of this analysis is to identify the primary waste generators in the division and determine the most beneficial areas to initiate waste-reduction projects. 4 refs., 4 figs., 13 tabs

  6. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to provide transportability, and it must be extremely simple to use in order to keep user interaction at a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly

  7. Studies on qualitative and quantitative chemical changes in gamma irradiated fish and fishery products in India

    International Nuclear Information System (INIS)

    Studies on qualitative and quantitative changes in lipids and allied constituents of fresh as well as salted dehydrated Indian mackerel (Rastrelliger kanagurta) during storage after gamma-irradiation have been carried out. The samples were evaluated subjectively as well as by determining various physico-chemical parameters such as moisture content, total volatile basic nitrogen (TVBN), trimethylamine nitrogen (TMAN), lipid content, iodine value (IV), peroxide value (PV), thiobarbituric acid (TBA) value, glyceride and free fatty acid (FFA) content. Shelf-life extension of radurized fresh (150 krad) as well as salted dehydrated (200 krad) Indian mackerel during ice-temperature (0-2°C) and ambient-temperature (25-30°C) storage, respectively, was observed without detectable rancidity and off-flavours. However, the extended storage life was dependent upon storage temperature in the case of fresh fish and upon moisture content in the case of salted dehydrated fish. Besides a suppression in TVBN and TMAN values, the changes in the physico-chemical parameters, including the lipid composition, of the irradiated samples in both cases were parallel to those in the unirradiated controls. No new compounds were detected in any of the lipid samples of the irradiated fish by thin-layer and gas-liquid chromatography. The pasteurization dose of irradiation (200 krad) did not influence the yield or the composition of the total volatiles of salted dehydrated fish. The commercial sun-dried products gave rise to a two-fold increase in the yield of total volatiles, which showed a composition comparable to that of the laboratory-processed irradiated samples.

  8. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    Sujay Rakshit; Arunita Rakshit; J. V. Patil

    2011-04-01

    Most traits of interest to medical, agricultural and animal scientists show continuous variation and a complex mode of inheritance. DNA-based markers are being deployed to analyse such complex traits, whose underlying loci are known as quantitative trait loci (QTL). In conventional QTL analysis, F2 and backcross populations, recombinant inbred lines, backcross inbred lines and doubled haploids from biparental crosses are commonly used. Introgression lines and near-isogenic lines are also being used for QTL analysis. However, such populations have major limitations, such as relying predominantly on the recombination events taking place in the F1 generation and mapping only the allelic pairs present in the two parents. Second-generation mapping resources like association mapping, nested association mapping and multiparent intercross populations potentially address the major limitations of available mapping resources. The potential of multiparent intercross populations in gene mapping is discussed here. In such populations both linkage and association analysis can be conducted without encountering the limitations of structured populations. In such populations, larger genetic variation in the germplasm is accessed and various allelic and cytoplasmic interactions are assessed. For all practical purposes, across crop species, the use of eight founders and a fixed population of 1000 individuals is most appropriate. Limitations of multiparent intercross populations are that they require more time and resources to be generated and that they are likely to show extensive segregation for developmental traits, limiting their use in the analysis of complex traits. However, multiparent intercross population resources are likely to bring a paradigm shift towards QTL analysis in plant species.

  9. Screening of 397 chemicals and development of a quantitative structure-activity relationship model for androgen receptor antagonism

    DEFF Research Database (Denmark)

    Vinggaard, Annemarie; Niemelä, Jay Russell; Wedebye, Eva Bay; Jensen, Gunde Egeskov

    2008-01-01

    We have screened 397 chemicals for human androgen receptor (AR) antagonism by a sensitive reporter gene assay to generate data for the development of a quantitative structure-activity relationship (QSAR) model. A total of 523 chemicals comprising data on 292 chemicals from our laboratory and data...... synthetic androgen R1881. The MultiCASE expert system was used to construct a QSAR model for AR antagonizing potential. A "5 Times, 2-Fold 50% Cross Validation" of the model showed a sensitivity of 64%, a specificity of 84%, and a concordance of 76%. Data for 102 chemicals were generated for an external...... validation of the model resulting in a sensitivity of 57%, a specificity of 98%, and a concordance of 92% of the model. The model was run on a set of 176103 chemicals, and 47% were within the domain of the model. Approximately 8% of the chemicals were predicted active for AR antagonism. We conclude that the...
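    For reference, the validation figures quoted above come from a standard confusion-matrix calculation. The sketch below shows that calculation on invented labels; it is not the study's data or the MultiCASE software.

    ```python
    def classification_metrics(y_true, y_pred):
        """Sensitivity, specificity and concordance for binary labels (1 = active)."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        sensitivity = tp / (tp + fn) if tp + fn else float("nan")
        specificity = tn / (tn + fp) if tn + fp else float("nan")
        concordance = (tp + tn) / len(y_true)
        return sensitivity, specificity, concordance

    # toy external-validation set: 1 = AR antagonist, 0 = inactive
    y_true = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
    y_pred = [1, 0, 1, 0, 0, 0, 0, 1, 1, 0]
    print(classification_metrics(y_true, y_pred))
    ```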

  10. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  11. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-05-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
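    As a rough illustration of the two non-parametric tests named above, the sketch below runs a Kruskal-Wallis test across three location groups and a Friedman test across paired cost categories using SciPy; all numbers are synthetic placeholders, not the survey data.

    ```python
    import numpy as np
    from scipy.stats import kruskal, friedmanchisquare

    rng = np.random.default_rng(42)

    # Hypothetical net-profit figures for ACI units in the three industrial
    # estates (Ambattur, Thirumalizai, Thirumudivakkam); units are arbitrary.
    ambattur        = rng.normal(100, 15, 34)
    thirumalizai    = rng.normal(102, 15, 33)
    thirumudivakkam = rng.normal(98, 15, 33)

    # Kruskal-Wallis: do the three locations differ?
    h_stat, p_loc = kruskal(ambattur, thirumalizai, thirumudivakkam)

    # Friedman: do cost categories differ within the same units (paired measures)?
    production = rng.normal(50, 5, 30)
    marketing  = rng.normal(48, 5, 30)
    technology = rng.normal(52, 5, 30)
    chi2, p_cost = friedmanchisquare(production, marketing, technology)

    print(f"Kruskal-Wallis H={h_stat:.2f} (p={p_loc:.3f}); Friedman chi2={chi2:.2f} (p={p_cost:.3f})")
    ```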

  12. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.
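    A minimal sketch of how the two densitometric quantities named above (LAA% and mean parenchymal CT value) can be computed from a segmented CT volume. The -950 HU cut-off and the toy volume are assumptions for illustration; the record does not state the threshold actually used.

    ```python
    import numpy as np

    def cystic_destruction_metrics(hu_volume, lung_mask, laa_threshold=-950):
        """Low-attenuation area percentage and mean parenchymal CT value.

        hu_volume     : 3-D array of CT attenuation values in Hounsfield units
        lung_mask     : boolean array of the same shape marking lung voxels
        laa_threshold : assumed HU cut-off below which a voxel counts as destroyed tissue
        """
        lung_hu = hu_volume[lung_mask]
        laa_percent = 100.0 * np.mean(lung_hu < laa_threshold)
        mean_parenchymal_ct = float(lung_hu.mean())
        return laa_percent, mean_parenchymal_ct

    # toy volume: mostly normal parenchyma around -800 HU with some cystic voxels
    volume = np.random.normal(-800, 60, size=(20, 64, 64))
    volume[:, :10, :10] = -980                      # simulated cystic destruction
    mask = np.ones_like(volume, dtype=bool)
    print(cystic_destruction_metrics(volume, mask))
    ```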

  13. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  14. Quantitative Determination of Catechin as Chemical Marker in Pediatric Polyherbal Syrup by HPLC/DAD.

    Science.gov (United States)

    Sheikh, Zeeshan A; Siddiqui, Zafar A; Naveed, Safila; Usmanghani, Khan

    2016-09-01

    Vivabon syrup is a balanced composition of dietary ingredients of phytopharmaceutical nature for maintaining the physique, vigor, vitality and balanced growth of children. The herbal ingredients of the pediatric syrup are rich in bioflavonoids, proteins, vitamins, glycosides and trace elements. Vivabon is formulated with herbal drugs such as Phoenix sylvestris, Emblica officinalis, Withania somnifera, Centella asiatica, Amomum subulatum, Zingiber officinale, Trigonella foenum-graecum, Centaurea behen and Piper longum. Catechins are flavan-3-ols that are found widely in medicinal herbs and are utilized for anti-inflammatory, cardioprotective, hepatoprotective, neuroprotective and other biological activities. In general, the dietary intake of flavonoids has been regarded traditionally as beneficial for body growth. Standardization of the Vivabon syrup dosage form using HPLC/DAD has been developed for quantitative estimation of Catechin as a chemical marker. The method was validated as per ICH guidelines. Validation studies demonstrated that the developed HPLC method is distinct, reproducible and rapid. The relatively high recovery and comparably low standard deviation confirm the suitability of the developed method for the determination of Catechin in the syrup. PMID:27165575

  15. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  16. Quantitative analysis of vascular network of oculogyric nerve nuclei

    Directory of Open Access Journals (Sweden)

    Sladojević Igor

    2011-01-01

    Introduction. The nuclei of the oculogyric nerves (the principal oculomotor nucleus, the trochlear nucleus and the abducens nucleus) are densely vascularized brainstem structures. The aim of this study was to determine the quantitative characteristics of the vascular network of these nuclei. Material and methods. The study was done on 30 adult brainstems, both male and female, without diagnosed neurological disturbances. Three-millimetre-thick strata were taken in the transversal plane and cut into 0.3-micrometre semi-serial sections stained with the Mallory method. The images of the studied nuclei were taken with a Leica DM 1000 microscope and a Leica EC3 digital camera under 400x magnification, and analyzed by ImageJ software with an A 100 grid. The statistical analysis was performed with the Statistical Package for the Social Sciences software at the 5% level of significance. Results. A statistically significant difference was found in the volume and surface density between the principal oculomotor nucleus and the trochlear nucleus, and between the trochlear nucleus and the abducens nucleus. No difference was found in the length density. Discussion. The results of this research match the results of studies on the characteristics of the vascular network of the oculogyric nerve nuclei, while the comparison of the vascular networks of these nuclei, the substantia nigra, the vestibulocochlear nuclei and the precentral gyrus illustrates differences in the quantitative characteristics of blood vessels in these structures. Conclusion. The blood vessels of the principal oculomotor nucleus and the abducens nucleus have similar dimensions and approximately the same arborization pattern, while the vessels of the trochlear nucleus have significantly smaller dimensions and density.
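    The volume, surface and length densities reported above are classical stereological estimators. As a point of reference (not the authors' exact ImageJ workflow), a sketch of how such densities are typically derived from grid counts on a micrograph:

    ```python
    def stereology_estimates(points_hit, total_points,
                             intersections, test_line_length,
                             profiles_counted, test_area):
        """Classical stereological estimators for a vascular network.

        volume density  Vv = P_P           (point-counting fraction)
        surface density Sv = 2 * I / L_T   (intersections per test-line length)
        length density  Lv = 2 * Q / A_T   (vessel profiles per test area)
        """
        vv = points_hit / total_points
        sv = 2.0 * intersections / test_line_length
        lv = 2.0 * profiles_counted / test_area
        return vv, sv, lv

    # toy counts from one micrograph analysed with a 100-point grid (invented values)
    print(stereology_estimates(points_hit=18, total_points=100,
                               intersections=42, test_line_length=2.5,   # mm
                               profiles_counted=30, test_area=0.09))     # mm^2
    ```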

  17. Analysis of generalized interictal discharges using quantitative EEG.

    Science.gov (United States)

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges, and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  18. Quantitative trace element analysis with sub-micron lateral resolution

    International Nuclear Information System (INIS)

    In recent years many nuclear microprobes have developed into sophisticated tools for elemental analysis with high resolutions down to about 1 μm. The application to trace element analysis is mainly in the field of biological and medical research. Numerous successful studies on microscopic-scale structures, e.g. cells, have led to the demand for higher spatial resolution or lower detection limits. Therefore, several labs have started new efforts towards sub-micron resolutions, sometimes aiming at 100 nm. The Leipzig microprobe laboratory LIPSION has recently improved its analytical capabilities. We are now able to perform quantitative trace element analysis with sub-micron spatial resolution (beam diameter 0.5 μm at 120 pA). As an example we give the trace element distribution in neuromelanin (the intracellular pigment of neurons). Furthermore, when the scan size is reduced from the cellular level, i.e. about 50 μm, to the sub-cellular level of about 10 μm, the beam diameter can be reduced further by choosing smaller object diaphragms. The unavoidable reduction in beam current will not affect the mapping sensitivity as long as the accumulated charge per spatial resolution element is not decreased. The smallest beam diameter with analytical capabilities for elemental analysis we have achieved thus far was about 300 nm. It enables an outstanding microPIXE resolution. However, some difficulties appeared in high-resolution work, which limited the acquisition time to less than 30 min

  19. Quantitative trace element analysis with sub-micron lateral resolution

    Energy Technology Data Exchange (ETDEWEB)

    Reinert, Tilo [Nukleare Festkoerperphysik, Universitaet Leipzig, Linnestr. 5, D-04103 Leipzig (Germany)]. E-mail: reinert@physik.uni-leipzig.de; Spemann, Daniel [Nukleare Festkoerperphysik, Universitaet Leipzig, Linnestr. 5, D-04103 Leipzig (Germany); Morawski, Markus [Paul-Flechsig-Institut fuer Hirnforschung, Universitaet Leipzig, Jahnallee 59, 04109 Leipzig (Germany); Arendt, Thomas [Paul-Flechsig-Institut fuer Hirnforschung, Universitaet Leipzig, Jahnallee 59, 04109 Leipzig (Germany)

    2006-08-15

    In recent years many nuclear microprobes have developed into sophisticated tools for elemental analysis with high resolutions down to about 1 μm. The application to trace element analysis is mainly in the field of biological and medical research. Numerous successful studies on microscopic-scale structures, e.g. cells, have led to the demand for higher spatial resolution or lower detection limits. Therefore, several labs have started new efforts towards sub-micron resolutions, sometimes aiming at 100 nm. The Leipzig microprobe laboratory LIPSION has recently improved its analytical capabilities. We are now able to perform quantitative trace element analysis with sub-micron spatial resolution (beam diameter 0.5 μm at 120 pA). As an example we give the trace element distribution in neuromelanin (the intracellular pigment of neurons). Furthermore, when the scan size is reduced from the cellular level, i.e. about 50 μm, to the sub-cellular level of about 10 μm, the beam diameter can be reduced further by choosing smaller object diaphragms. The unavoidable reduction in beam current will not affect the mapping sensitivity as long as the accumulated charge per spatial resolution element is not decreased. The smallest beam diameter with analytical capabilities for elemental analysis we have achieved thus far was about 300 nm. It enables an outstanding microPIXE resolution. However, some difficulties appeared in high-resolution work, which limited the acquisition time to less than 30 min.

  20. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    Science.gov (United States)

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. PMID:26833935
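    A compact sketch of the multi-image workflow outlined above: unfolding several Raman images into one spectra matrix, normalising each spectrum, and clustering across all images with k-means. Background subtraction and the multivariate unmixing step are omitted, and the arrays are random placeholders rather than measured spectra.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def combine_raman_images(images):
        """Unfold a list of hyperspectral Raman images into one spectra matrix.

        images : list of arrays shaped (rows, cols, n_wavenumbers)
        Each spectrum is vector-normalised so images acquired separately become
        comparable before multivariate analysis.
        """
        unfolded = [img.reshape(-1, img.shape[-1]) for img in images]
        spectra = np.vstack(unfolded)
        norms = np.linalg.norm(spectra, axis=1, keepdims=True)
        return spectra / np.where(norms == 0, 1, norms)

    # toy data: four control and four treated images, 16 x 16 pixels, 500 bands
    images = [np.random.rand(16, 16, 500) for _ in range(8)]
    spectra = combine_raman_images(images)

    # k-means clustering across all images, as used to discriminate cell types
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
    print(np.bincount(labels))
    ```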

  1. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    Science.gov (United States)

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  2. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  3. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  4. Sensitive LC MS quantitative analysis of carbohydrates by Cs+ attachment.

    Science.gov (United States)

    Rogatsky, Eduard; Jayatillake, Harsha; Goswami, Gayotri; Tomuta, Vlad; Stein, Daniel

    2005-11-01

    The development of a sensitive assay for the quantitative analysis of carbohydrates from human plasma using LC/MS/MS is described in this paper. After sample preparation, carbohydrates were cationized by Cs(+) after their separation by normal phase liquid chromatography on an amino based column. Cesium is capable of forming a quasi-molecular ion [M + Cs](+) with neutral carbohydrate molecules in the positive ion mode of electrospray ionization mass spectrometry. The mass spectrometer was operated in multiple reaction monitoring mode, and transitions [M + 133] --> 133 were monitored (M, carbohydrate molecular weight). The new method is robust, highly sensitive, rapid, and does not require postcolumn addition or derivatization. It is useful in clinical research for measurement of carbohydrate molecules by isotope dilution assay. PMID:16182559

  5. Standard guide for quantitative analysis by energy-dispersive spectroscopy

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1998-01-01

    1.1 This guide is intended to assist those using energy-dispersive spectroscopy (EDS) for quantitative analysis of materials with a scanning electron microscope (SEM) or electron probe microanalyzer (EPMA). It is not intended to substitute for a formal course of instruction, but rather to provide a guide to the capabilities and limitations of the technique and to its use. For a more detailed treatment of the subject, see Goldstein, et al. This guide does not cover EDS with a transmission electron microscope (TEM). 1.2 Units—The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the cooling time and of the irradiation time. The variation of the ratio (144Ce + 144Pr activity)/(137Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141Ce, 144Ce + 144Pr, 103Ru, 106Ru + 106Rh, 137Cs, 95Zr + 95Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, the least-squares method was used for the quantitative analysis of the afore-mentioned fission products. The accuracy attained varies from 3 to 10%. (author)
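    In the same spirit as the least-squares approach mentioned above, a measured γ spectrum can be decomposed into a linear combination of reference spectra of the individual emitters. The sketch below uses non-negative least squares on synthetic spectra; the reference shapes and activities are invented for illustration and are not the original data.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical reference spectra (columns) for a few fission products and a
    # measured spectrum that is a mixture of them plus counting noise.
    n_channels = 256
    rng = np.random.default_rng(0)
    reference = np.abs(rng.normal(size=(n_channels, 3)))   # e.g. 141Ce, 137Cs, 95Zr+95Nb
    true_activities = np.array([2.0, 5.0, 1.0])
    measured = reference @ true_activities + rng.normal(0, 0.05, n_channels)

    # Non-negative least squares gives the relative activity of each component
    activities, residual = nnls(reference, measured)
    print(np.round(activities, 2))
    ```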

  7. Quantitative x-ray fluorescence analysis using monochromatic synchrotron radiation

    International Nuclear Information System (INIS)

    The use of high-intensity, tunable monochromatic x-rays for the quantitative analysis of biological and geochemical specimens at the 10^-8 g level is described. Incident x-rays were obtained from the new LBL-EXXON permanent magnet wiggler beamline at the Stanford Synchrotron Radiation Laboratory. The sample-detector geometry was designed to make optimal use of polarization advantages for background reduction. Questions regarding the sensitivity and accuracy of the measurements were studied, with particular emphasis on the advantages of tuning the x-ray energies for optimum excitation of specific elements. The implications of these measurements with respect to the use of x-ray microprobe beams will be discussed

  8. Quantitative genetic analysis of injury liability in infants and toddlers

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  9. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  10. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, there are 317 proteins predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  11. QTL analysis for some quantitative traits in bread wheat

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits utilizing data from four different mapping populations involving different approaches of QTL analysis. Analysis for grain protein content (GPC) suggested that the major part of genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main effect QTL (M-QTL) with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. QTL for 4 growth related traits taken together detected by different methods ranged from 37 to 40; nine QTL that were detected by single-locus as well as two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield contributing traits in two populations respectively allowed detection of 25 and 50 QTL by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM) and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker aided selection.

  12. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
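    As a rough illustration of the parameters typically read from such time-intensity profiles (the record does not list the exact parameters measured), a sketch that derives peak enhancement, time to peak, and maximum upslope from a synthetic first-pass curve:

    ```python
    import numpy as np

    def first_pass_parameters(time_s, intensity):
        """Simple perfusion indices from a myocardial time-intensity curve.

        Returns peak enhancement, time to peak, and maximum upslope, all measured
        relative to the pre-contrast baseline (here, the first three time points).
        """
        baseline = intensity[:3].mean()
        enhancement = intensity - baseline
        peak = enhancement.max()
        time_to_peak = time_s[int(np.argmax(enhancement))]
        upslope = np.max(np.gradient(enhancement, time_s))
        return peak, time_to_peak, upslope

    # toy curve: gamma-variate-like first pass sampled once per heartbeat (~1 s)
    t = np.arange(0, 40, 1.0)
    curve = 100 * (t / 8) ** 2 * np.exp(-t / 4) + np.random.normal(0, 1, t.size)
    print(first_pass_parameters(t, curve))
    ```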

  13. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates a modified set of components with sparse loadings, as compared to standard principal component analysis (PCA), and use it in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
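    The Hotelling T2 statistic mentioned above can be computed from principal-component scores. The sketch below uses ordinary PCA rather than the authors' modified sparse PCA, and random placeholder metrics, purely to show the form of the calculation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Rows = scans of the phantom, columns = image-quality metrics (e.g. MTF,
    # NEQ, CT-number uniformity); values here are synthetic placeholders.
    rng = np.random.default_rng(1)
    metrics = rng.normal(size=(40, 10))

    pca = PCA(n_components=3).fit(metrics)
    scores = pca.transform(metrics)

    # Hotelling T^2 per scan: squared scores weighted by the inverse variance of
    # each retained component; unusually large values flag a possible fault.
    t_squared = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
    print(np.round(t_squared[:5], 2))
    ```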

  14. Value of MRI in differentiating adrenal masses: Quantitative analysis of tumor signal intensity

    International Nuclear Information System (INIS)

    Several reports show that MR imaging, especially using the chemical-shift sequence, provides highly reliable differentiation of adrenal adenomas from non-adenomas. The aim of the study was to evaluate the ability of MRI to distinguish adenomas from other adrenal masses using quantitative analysis of the tumors' signal intensity. Fifty-four patients with 57 adrenal masses underwent MRI. The tumors were determined during surgery as pheochromocytomas, metastases, adrenal cortical carcinoma, and adenomas. Nineteen masses were diagnosed as adenomas on the basis of stability on imaging followup and the absence of clinical and endocrinological dysfunction. Chemical-shift-weighted images (T1TFE sequence) and T2-weighted images (TSE sequence) were used for quantitative analysis which included the T2 index (adrenal mass SI to liver SI ratio) and the CSI ratio (the adrenal mass SI on the in-phase image minus the adrenal mass SI on the opposed-phase image divided by the adrenal mass SI on the in-phase image). Statistical analysis was performed with the Mann-Whitney U test. Receiver operating characteristic (ROC) analysis of the calculated parameters was performed. Significant differences in T2 index between adenomas (mean: 1.43±0.50) and pheochromocytomas (2.66±0.67) as well as between metastases (1.64±0.22) and pheochromocytomas were noted (p≤0.05). The Mann-Whitney U test revealed no significant difference in T2 index for adenomas vs. metastases (p=0.1). The CSI ratio was significantly different for adenomas (0.36±0.18) vs. pheochromocytomas (-0.15±0.16) as well as for adenomas vs. metastases (-0.23±0.26). No significant difference occurred in the CSI ratios between pheochromocytomas and metastases. ROC analysis showed that the discriminatory ability of adenoma diagnosis with the CSI ratio is better than with the T2 index (areas under the ROC curve: 0.980 vs. 0.867). Quantitative methods using signal intensity ratios and indexes calculated from MR images are helpful
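    Written out as formulas, the two indices defined verbally in the record are (SI denotes signal intensity, with subscripts and superscripts following the verbal definitions above):

    ```latex
    \[
    \text{T2 index} = \frac{SI^{T2}_{\text{adrenal mass}}}{SI^{T2}_{\text{liver}}},
    \qquad
    \text{CSI ratio} = \frac{SI^{\text{in}}_{\text{mass}} - SI^{\text{opposed}}_{\text{mass}}}{SI^{\text{in}}_{\text{mass}}}
    \]
    ```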

  15. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG site) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  16. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Science.gov (United States)

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG site) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
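    A minimal sketch of the per-gene Methylation Index, the >15% binary cut-point, and an AUC calculation as described above. The per-sample methylation values are randomly generated stand-ins, not study data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def methylation_index(percent_methylation):
        """Mean percent methylation across the CpG sites assayed for one gene."""
        return float(np.mean(percent_methylation))

    # toy data: per-sample methylation index for one gene and case/control labels
    cases    = [methylation_index(np.random.uniform(10, 60, 6)) for _ in range(49)]
    controls = [methylation_index(np.random.uniform(0, 12, 6)) for _ in range(22)]
    indices = np.array(cases + controls)
    labels  = np.array([1] * 49 + [0] * 22)

    # binary call at the >15% cut-point used in the study
    calls = (indices > 15).astype(int)
    print("Positive calls at >15%:", int(calls.sum()))

    # discriminatory ability of the continuous index
    print("AUC:", round(roc_auc_score(labels, indices), 3))
    ```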

  17. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Science.gov (United States)

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281
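
    The replicate-based selection of regulated phosphorylation sites can be illustrated with a generic statistical step. The sketch below is not the authors' workflow; it applies a one-sample t-test to hypothetical log2 NRG1/control ratios from three replicates and flags sites passing example p-value and fold-change thresholds.

```python
# Illustrative sketch of the statistical step only: given log2 NRG1/control
# ratios for each phosphosite from three replicate experiments, flag sites
# whose ratios differ significantly from zero. Site names, ratios and
# thresholds are invented for illustration.
import numpy as np
from scipy import stats

phosphosites = {            # hypothetical site -> log2 ratios in 3 replicates
    "MAPK1_Y187": [2.1, 1.8, 2.4],
    "AKT1_S473":  [1.2, 0.9, 1.5],
    "CTTN_Y421":  [0.1, -0.2, 0.3],
}

for site, ratios in phosphosites.items():
    t, p = stats.ttest_1samp(ratios, popmean=0.0)
    mean_log2 = np.mean(ratios)
    regulated = p < 0.05 and abs(mean_log2) > 1.0   # example thresholds
    print(f"{site}: mean log2={mean_log2:+.2f}, p={p:.3f}, regulated={regulated}")
```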

  18. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  19. Quantitative structure activity relationship model for predicting the depletion percentage of skin allergic chemical substances of glutathione.

    Science.gov (United States)

    Si, Hongzong; Wang, Tao; Zhang, Kejun; Duan, Yun-Bo; Yuan, Shuping; Fu, Aiping; Hu, Zhide

    2007-05-22

    A quantitative model was developed to predict the depletion percentage of glutathione (DPG) compounds by gene expression programming (GEP). Each kind of compound was represented by several calculated structural descriptors involving constitutional, topological, geometrical, electrostatic and quantum-chemical features of compounds. The GEP method produced a nonlinear and five-descriptor quantitative model with a mean error and a correlation coefficient of 10.52 and 0.94 for the training set, 22.80 and 0.85 for the test set, respectively. It is shown that the GEP predicted results are in good agreement with experimental ones, better than those of the heuristic method. PMID:17481417
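
    The reported model statistics, a mean error and a correlation coefficient between predicted and experimental depletion percentages, can be computed as in the sketch below; the values are invented and the GEP model itself is not reproduced.

```python
# Sketch of the model-assessment step only: compare predicted depletion
# percentages of glutathione (DPG) with experimental values via the mean
# absolute error and the correlation coefficient. All values are hypothetical.
import numpy as np

experimental = np.array([12.0, 45.0, 67.0, 80.0, 23.0, 55.0])
predicted    = np.array([15.0, 40.0, 70.0, 75.0, 30.0, 50.0])

mean_error = np.mean(np.abs(predicted - experimental))
r = np.corrcoef(experimental, predicted)[0, 1]
print(f"mean error = {mean_error:.2f}, correlation coefficient = {r:.2f}")
```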

  20. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  1. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-01

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, and which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  2. Chinese SPECT semi-quantitative analysis of striatum

    International Nuclear Information System (INIS)

    To study and analyze reconstruction parameters in 99mTc-TRODAT-1 SPECT (single-photon emission computed tomography) brain semi-quantitative analysis, together with the methods used to outline regions of interest (ROIs). The 99mTc-TRODAT-1 SPECT brain images were processed and the ROIs were outlined four times between December 2009 and July 2011; the results of each outlining method were analyzed and compared to refine the procedure. There was no statistically significant difference between the number of pixels outlined on the two sides of the cerebellum or in the ratio of their radioactive counts. For both the LEHR and the fan-beam collimator, the average ratio improved with each successive outlining method, most notably the fourth. The mean striatal volume (±SD) was estimated at 39.51±9.54 mL (range 19.97-51.98 mL) and the mean weight (±SD) at 44.09±10.64 g (range 22.28-58.01 g). The details of image processing and data analysis should not be neglected when defining an appropriate outlining method, which must withstand repeated inspection. The total striatum was outlined and its volume and weight were analyzed. (authors)
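
    The semi-quantitative step behind such striatal ROI analysis is typically a count ratio against a cerebellar reference region. The sketch below uses hypothetical ROI mean counts and the commonly used specific uptake ratio; it is not necessarily the exact quantity computed in this record.

```python
# Hedged sketch of a semi-quantitative striatal measure: from ROI mean counts
# drawn on striatum and cerebellum, compute a specific uptake ratio commonly
# used for 99mTc-TRODAT-1 SPECT. ROI values are hypothetical.
def specific_uptake_ratio(striatum_mean_counts, cerebellum_mean_counts):
    """(striatum - cerebellum) / cerebellum, per hemisphere."""
    return (striatum_mean_counts - cerebellum_mean_counts) / cerebellum_mean_counts

left  = specific_uptake_ratio(striatum_mean_counts=185.0, cerebellum_mean_counts=92.0)
right = specific_uptake_ratio(striatum_mean_counts=178.0, cerebellum_mean_counts=95.0)
print(f"left ratio = {left:.2f}, right ratio = {right:.2f}")
```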

  3. Inside single cells: quantitative analysis with advanced optics and nanomaterials.

    Science.gov (United States)

    Cui, Yi; Irudayaraj, Joseph

    2015-01-01

    Single-cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites, and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single-cell activity. To obtain quantitative information (e.g., molecular quantity, kinetics, and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single-cell studies, both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live-cell analysis. Although a considerable proportion of single-cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single-cell analysis. PMID:25430077

  4. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  5. Systems analysis of past, present, and future chemical terrorism scenarios.

    Energy Technology Data Exchange (ETDEWEB)

    Hoette, Trisha Marie

    2012-03-01

    Throughout history, as new chemical threats arose, strategies for the defense against chemical attacks have also evolved. As a part of an Early Career Laboratory Directed Research and Development project, a systems analysis of past, present, and future chemical terrorism scenarios was performed to understand how the chemical threats and attack strategies change over time. For the analysis, the difficulty of executing a chemical attack was evaluated within a framework of three major scenario elements. First, historical examples of chemical terrorism were examined to determine how the use of chemical threats, versus other weapons, contributed to the successful execution of the attack. Using the same framework, the future of chemical terrorism was assessed with respect to the impact of globalization and new technologies. Finally, the efficacy of the current defenses against contemporary chemical terrorism was considered briefly. The results of this analysis justify the need for continued diligence in chemical defense.

  6. A Quantitative High-Throughput Screening Data Analysis Pipeline for Activity Profiling.

    Science.gov (United States)

    Huang, Ruili

    2016-01-01

    The US Tox21 program has developed in vitro assays to test large collections of environmental chemicals in a quantitative high-throughput screening (qHTS) format, using triplicate 15-dose titrations to generate over 50 million data points to date. Counter screens are also employed to minimize interferences from non-target-specific assay artifacts, such as compound autofluorescence and cytotoxicity. New data analysis approaches are needed to integrate these data and characterize the activities observed from these assays. Here, we describe a complete analysis pipeline that evaluates these qHTS data for technical quality in terms of signal reproducibility. We integrate signals from repeated assay runs, primary readouts, and counter screens to produce a final call on on-target compound activity. PMID:27518629
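
    A central step in this kind of activity profiling is summarizing each 15-point titration with a concentration-response fit. The sketch below is illustrative only (synthetic data, a plain Hill fit); it is not the Tox21 pipeline code.

```python
# Illustrative sketch: fit a Hill equation to a 15-point concentration-response
# series and derive AC50, efficacy and slope, the kind of per-compound summary
# an activity-profiling pipeline works from. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, slope):
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

conc = np.logspace(-9, -4, 15)                       # 15-dose titration, molar
true = hill(conc, 0.0, 95.0, 1e-6, 1.2)
resp = true + np.random.default_rng(0).normal(0, 3, conc.size)  # noisy readout

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 1e-6, 1.0], maxfev=10000)
bottom, top, ac50, slope = params
print(f"AC50 = {ac50:.2e} M, efficacy = {top - bottom:.1f}%, Hill slope = {slope:.2f}")
```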

  7. Quantitative analysis of liquid jets breakup with SAXS

    International Nuclear Information System (INIS)

    Full text: The breakup of liquid jets represents a wide area of research in the field of multiphase flows, fully justified by their wide presence in both industrial and scientific applications. Moreover, the recent development of microfluidic systems has raised great interest in understanding flows at small spatial dimensions. Such interest has been further increased by the evolution of free electron lasers and the consequent need to develop new, high-throughput techniques to characterize biological macromolecules. Jet instability has been widely described both theoretically and through simulations; however, there is still a need for accurate sets of experimental data. Most existing data are based on light scattering, which is disturbed by reflection, absorption and multiple scattering at droplet and air interfaces, or on camera imaging, which is limited by the size of the phenomena that can be resolved. In the present communication we want to show the potential of synchrotron SAXS in providing quantitative information on the dynamics of liquid jets at the nanoscale. To this purpose, we have investigated free liquid jets in air with circular nozzle geometries of different diameters (450 μm-100 μm), flow rates (2-10 mL/min), and solvents (water, ethanol, isopropanol and their mixtures). We determined their time-dependent morphology and their breakup length in the Rayleigh and first wind-induced regimes. The resulting data are considered a basis for the use of free-jet micromixers to examine the evolution of chemical and biological reactions by SAXS. (author)
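
    For orientation on the scales involved, the classical Weber estimate for laminar jet breakup length, L_b/d ≈ C·√We·(1 + 3·Oh) with C on the order of 12, can be evaluated for nozzle diameters and flow rates of the kind quoted above. This relation, the constant C and the fluid properties below are textbook values, not results from the record.

```python
# Order-of-magnitude sketch using the classical Weber estimate for laminar jet
# breakup, L_b/d ≈ C * sqrt(We) * (1 + 3*Oh), with C ~ 12. This textbook
# relation is not taken from the record; it only illustrates the scales probed.
import math

def breakup_length(d, v, rho, sigma, mu, C=12.0):
    we = rho * v**2 * d / sigma                 # Weber number
    oh = mu / math.sqrt(rho * sigma * d)        # Ohnesorge number
    return C * math.sqrt(we) * (1.0 + 3.0 * oh) * d

# Water jet from a 100 um nozzle at ~5 ml/min (illustrative numbers)
d = 100e-6                                      # nozzle diameter, m
q = 5e-6 / 60.0                                 # flow rate, m^3/s
v = q / (math.pi * (d / 2) ** 2)                # mean jet velocity, m/s
L = breakup_length(d, v, rho=1000.0, sigma=0.072, mu=1.0e-3)
print(f"jet velocity ~ {v:.1f} m/s, estimated breakup length ~ {L*1e3:.1f} mm")
```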

  8. Quantitative solid state NMR analysis of residues from acid hydrolysis of loblolly pine wood.

    Science.gov (United States)

    Sievers, Carsten; Marzialetti, Teresita; Hoskins, Travis J C; Valenzuela Olarte, Mariefel B; Agrawal, Pradeep K; Jones, Christopher W

    2009-10-01

    The composition of solid residues from hydrolysis reactions of loblolly pine wood with dilute mineral acids is analyzed by (13)C Cross Polarization Magic Angle Spinning (CP MAS) NMR spectroscopy. Using this method, the carbohydrate and lignin fractions are quantified in less than 3 h, compared to over a day using wet chemical methods. In addition to the quantitative information, (13)C CP MAS NMR spectroscopy provides information on the formation of additional extractives and pseudo lignin from the carbohydrates. Being a non-destructive technique, NMR spectroscopy provides unambiguous evidence of the presence of side reactions and products, which is a clear advantage over the wet chemical analytical methods. Quantitative results from NMR spectroscopy and proximate analysis are compared for the residues from hydrolysis of loblolly pine wood under 13 different conditions; samples were treated either at 150 °C or 200 °C in the presence of various acids (HCl, H(2)SO(4), H(3)PO(4), HNO(3) and TFA) or water. The lignin content determined by both methods differed on average by 2.9 wt%, with a standard deviation of 3.5 wt%. It is shown that solid degradation products are formed from saccharide precursors under harsh reaction conditions. These degradation reactions limit the total possible yield of monosaccharides from any subsequent reaction. PMID:19477123

  9. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    Science.gov (United States)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 quantitative AE analysis. We found AE magnitudes in the range -7.8 quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation which we proved was continuously evolving, (2) the use of a narrow frequency band for acquisition, (3) the inability to identify P and S waves given the small sample size, and (4) acquisition using a narrow amplitude range given a low signal to noise ratio. Moving forward to the final stage of this thesis, with the ability to characterize the sources of AE, we applied our method to study an engineering problem. We chose hydraulic fracturing because of its obvious importance in the future of Canadian energy production. During a hydraulic fracture treatment, whether in a lab or in the field, energy is added to the system via hydraulic pressure. The injection

  10. Quantitative and Qualitative Analysis of Surface Modified Cellulose Utilizing TGA-MS

    Directory of Open Access Journals (Sweden)

    Daniel Loof

    2016-05-01

    Full Text Available With the aim to enhance interfacial adhesion of a hydrophobic polymer matrix and cellulosic fibers and fillers, chemical surface modifications with silane coupling agents are performed. Thermogravimetric analysis (TGA could be used to determine the degree of surface functionalization. However, similar thermal properties of treated and untreated cellulose hamper a precise determination of silane loading. This contribution deals with quantitative determination of silane loading combining both TGA and elemental analysis. Firstly, silane modified celluloses were studied by FT-IR, Raman, solid state NMR spectroscopy, and polarized light microscopy in order to determine functional groups and to study the impact of chemical treatment on cellulose morphology. Secondly, thermal stability and pyrolysis processes were studied by TG-MS analysis. In order to determine the exact silane loading, the mass percentages of the appropriate elements were quantified by elemental analysis and correlated with the charred residues determined by TGA yielding a linear dependency. With that correlation, it was possible to determine silane loadings for additional samples utilizing simple TGA measurements. The main advantage of that approach is that only one calibration is necessary for routine analyses of further samples and TGA-MS coupling gives additional information on thermal stability and pyrolysis routes, simultaneously.
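
    The calibration idea, correlating TGA char residue with elemental-analysis results and then using the fitted line for routine TGA-only determinations, can be sketched as follows with hypothetical numbers.

```python
# Minimal sketch of the calibration idea described above (hypothetical data):
# correlate TGA char residue with silicon content from elemental analysis for
# a few reference samples, then use the fitted line to estimate silane loading
# of further samples from a TGA measurement alone.
import numpy as np

char_residue_pct = np.array([1.2, 2.5, 3.9, 5.1])      # TGA residue, wt% (calibration set)
silicon_pct      = np.array([0.30, 0.65, 1.02, 1.33])  # elemental analysis, wt% Si

slope, intercept = np.polyfit(char_residue_pct, silicon_pct, deg=1)

def silicon_from_tga(residue_pct):
    """Predicted wt% Si (a proxy for silane loading) from TGA residue alone."""
    return slope * residue_pct + intercept

print(f"Si(wt%) ~ {slope:.3f} * residue + {intercept:.3f}")
print(f"new sample with 3.0 wt% residue -> {silicon_from_tga(3.0):.2f} wt% Si")
```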

  11. Principal Component Analysis on Chemical Abundances Spaces

    CERN Document Server

    Ting, Y S; Kobayashi, C; De Silva, G M; Bland-Hawthorn, J

    2011-01-01

    [Shortened] In preparation for the HERMES chemical tagging survey of about a million Galactic FGK stars, we estimate the number of independent dimensions of the space defined by the stellar chemical element abundances [X/Fe]. [...] We explore abundances in several environments, including solar neighbourhood thin/thick disk stars, halo metal-poor stars, globular clusters, open clusters, the Large Magellanic Cloud and the Fornax dwarf spheroidal galaxy. [...] We find that, especially at low metallicity, the production of r-process elements is likely to be associated with the production of alpha-elements. This may support core-collapse supernovae as the r-process site. We also verify the over-abundances of light s-process elements at low metallicity, and find that the relative contribution decreases at higher metallicity, which suggests that this light-element primary process may be associated with massive stars. [...] Our analysis reveals two types of core-collapse supernovae: one produces mainly alpha-e...
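
    The basic dimensionality estimate underlying such work is a principal component analysis of the stars-by-elements abundance matrix. The sketch below shows only the mechanics, on a random matrix; it does not reproduce the paper's data or conclusions.

```python
# Sketch of the dimensionality estimate: PCA on a matrix of [X/Fe] abundances
# (stars x elements) and a count of the components needed to capture most of
# the variance. The matrix here is random noise purely to show the mechanics.
import numpy as np

rng = np.random.default_rng(1)
abundances = rng.normal(0.0, 0.1, size=(200, 12))   # 200 stars x 12 [X/Fe] ratios

centered = abundances - abundances.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]     # descending eigenvalues
explained = np.cumsum(eigvals) / eigvals.sum()

n_dims = int(np.searchsorted(explained, 0.90) + 1)
print(f"components explaining 90% of the variance: {n_dims}")
```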

  12. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  13. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two tools were under
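
    The construction of the simple benchmark, and the genome length bias it exposes, can be sketched with toy numbers: equal genome copies of three taxa with different genome lengths yield very unequal read counts.

```python
# Sketch of the "simple benchmark" idea with synthetic toy genome lengths (not
# real data): equal copy numbers of genomes of different lengths are fragmented
# into ~150 bp reads; counting reads per taxon then over-represents the longer
# genome, which is the genome length bias the benchmark is meant to expose.
genomes = {"taxonA": 2_000_000, "taxonB": 4_000_000, "taxonC": 6_000_000}  # bp
copies, read_len = 10, 150

reads_per_taxon = {t: copies * (length // read_len) for t, length in genomes.items()}
total = sum(reads_per_taxon.values())

for taxon, n in reads_per_taxon.items():
    print(f"{taxon}: {n} reads ({100 * n / total:.1f}% of reads, "
          f"but {100 / len(genomes):.1f}% of genome copies)")
```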

  14. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present the trend in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searches. An overview of patents related to nanotechnology in the automobile industry is provided. The work started with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the parts of an automobile to which they relate and the solutions they provide. Various graphs were then produced to gain insight into the trends, and the patents were analyzed within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of nanotechnology inventions. Another objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting related inventions. For example, US Patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification; after studying it, it was deduced that the patent addresses the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved. Hence, two classes, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  15. Compositions and chemical bonding in ceramics by quantitative electron energy-loss spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, J.; Horton, L.L. [Oak Ridge National Lab., TN (United States); McHargue, C.J. [Tennessee Univ., Knoxville, TN (United States); McKernan, S.; Carter, C.B. [Minnesota Univ., Minneapolis, MN (United States). Dept. of Chemical Engineering; Revcolevschi, A. [Univ. de Paris-Sud, Lab. de Chemie des Solides (France); Tanaka, S.; Davis, R.F. [North Carolina State Univ., Raleigh, NC (United States). Dept. of Materials Science and Engineering

    1993-12-31

    Quantitative electron energy-loss spectrometry was applied to a range of ceramic materials at a spatial resolution of <5 nm. Analysis of Fe L23 white lines indicated a low-spin state with a charge transfer of ~1.5 electrons/atom onto the Fe atoms implanted into (amorphized) silicon carbide. Gradients of 2 to 5% in the Co:O stoichiometry were measured across 100-nm-thick Co3O4 layers in an oxidized directionally solidified CoO-ZrO2 eutectic, with the highest O levels near the ZrO2. The energy-loss near-edge structures were dramatically different for the two cobalt oxides; those for Co3O4 have been incorrectly ascribed to CoO in the published literature. Kinetically stabilized solid solubility occurred in an AlN-SiC film grown by low-temperature molecular beam epitaxy (MBE) on α(6H)-SiC, and no detectable interdiffusion occurred in couples of MBE-grown AlN on SiC following annealing at up to 1750 °C. In diffusion couples of polycrystalline AlN on SiC, interfacial 8H sialon (aluminum oxy-nitride) and pockets of Si3N4-rich β′ sialon in the SiC were detected.

  16. Advanced development in chemical analysis of Cordyceps.

    Science.gov (United States)

    Zhao, J; Xie, J; Wang, L Y; Li, S P

    2014-01-01

    Cordyceps sinensis, also called DongChongXiaCao (winter worm summer grass) in Chinese, is a well-known and valued traditional Chinese medicine. In 2006, we wrote a review for discussing the markers and analytical methods in quality control of Cordyceps (J. Pharm. Biomed. Anal. 41 (2006) 1571-1584). Since then this review has been cited by others for more than 60 times, which suggested that scientists have great interest in this special herbal material. Actually, the number of publications related to Cordyceps after 2006 is about 2-fold of that in two decades before 2006 according to the data from Web of Science. Therefore, it is necessary to review and discuss the advanced development in chemical analysis of Cordyceps since then. PMID:23688494

  17. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation. PMID:27501287
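
    A landscape-based cumulative effects model of the kind described is, at its core, a multiple linear regression of a biological condition score on stressor gradients, which can then be evaluated under a future land-use scenario. The sketch below uses invented data and is not the authors' model.

```python
# Minimal sketch of a landscape-based cumulative effects model (hypothetical
# data): fit a multiple linear regression of a biological condition score on
# land-use stressor gradients, then predict the score under a future land-use
# scenario. This mirrors the type of model described, not the authors' model.
import numpy as np

# columns: % mining cover, % residential cover (stressor gradients per site)
X = np.array([[5, 2], [20, 5], [35, 10], [50, 8], [10, 25], [60, 30]], dtype=float)
y = np.array([85, 70, 55, 45, 60, 30], dtype=float)   # biological condition score

A = np.column_stack([np.ones(len(X)), X])             # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

scenario = np.array([1.0, 40.0, 15.0])                # proposed future land use
print(f"intercept/slopes = {np.round(coef, 2)}")
print(f"predicted condition under scenario = {scenario @ coef:.1f}")
```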

  18. Qualitative and Quantitative Analysis of Andrographis paniculata by Rapid Resolution Liquid Chromatography/Time-of-Flight Mass Spectrometry

    OpenAIRE

    Jian-Fei Qin; Zhi-Yuan Jiang; Zhao Jin; Shi-Ping Liu; Yong-Xi Song

    2013-01-01

    A rapid resolution liquid chromatography/time-of-flight tandem mass spectrometry (RRLC-TOF/MS) method was developed for qualitative and quantitative analysis of the major chemical constituents in Andrographis paniculata. Fifteen compounds, including flavonoids and diterpenoid lactones, were unambiguously or tentatively identified in 10 min by comparing their retention times and accurate masses with standards or literature data. The characteristic fragmentation patterns of flavonoids and diter...

  19. Quantitative structure-property relationships for chemical functional use and weight fractions in consumer articles

    Science.gov (United States)

    Chemical functional use -- the functional role a chemical plays in processes or products -- may be a useful heuristic for predicting human exposure potential in that it comprises information about the compound's likely physical properties and the product formulations or articles ...

  20. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    OpenAIRE

    Marcelo R. R. Tappin; Jislaine F. G. Pereira; Lucilene A. Lima; Siani, Antonio C.; José L. Mazzei; Mônica F. S. Ramos

    2004-01-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization w...
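
    External-standard GC-FID quantification reduces to a response factor that converts peak areas into concentrations; the sketch below illustrates this with invented areas and a single standard concentration.

```python
# Sketch of external-standard GC-FID quantification (illustrative numbers, not
# the paper's data): a response factor from a standard of known concentration
# converts sample peak areas into concentrations.
standard_conc_mg_ml = 0.50          # e.g., methyl copalate external standard
standard_peak_area  = 125_000.0

response_factor = standard_peak_area / standard_conc_mg_ml   # area per (mg/mL)

sample_peak_areas = {"sesquiterpene_1": 84_000.0, "diterpenic_acid_1": 41_500.0}
for analyte, area in sample_peak_areas.items():
    conc = area / response_factor
    print(f"{analyte}: {conc:.3f} mg/mL")
```

    In practice a multi-point calibration curve, as described in the abstract, replaces the single-point response factor, but the conversion step is the same.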

  1. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system can be obtained. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process; the resulting occurrence probabilities of each feasible situation thus help the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to post hoc explanation of the system's survivable situation, the proposed model is well suited to quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative analysis of survivability and has good prospects for practical application.
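
    The Markov step of such an analysis amounts to finding the stationary distribution of the state transition matrix. The sketch below uses a toy three-state matrix, not the model of the paper.

```python
# Sketch of the Markov step only (toy 3-state transition matrix, not the
# paper's model): iterate the state distribution until it converges, giving
# the occurrence probability of each feasible situation.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],      # rows: current state, columns: next state
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

dist = np.array([1.0, 0.0, 0.0])     # start in state 0
for _ in range(200):                  # power iteration to the stationary distribution
    dist = dist @ P

print("steady-state occurrence probabilities:", np.round(dist, 3))
```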

  2. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study repor

  3. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    Full Text Available

    In this study, five prototypes of catchup were developed by replacing partially or totally the sucrose in the formulation with the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes concerned color, consistency, mouthfeel, sweet taste and tomato taste, for which lower means were obtained as the sugar level was decreased, and salty taste, which had higher means as the sugar was decreased. In terms of bitter and sweetener aftertastes, the prototype 100% sweetened with NTM presented the highest mean score, but with no significant difference compared to the other prototypes containing sucrose; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. In terms of physicochemical characteristics, the differences were mainly in consistency, solids and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for reduced-calorie and no-sugar versions.

  4. Quantitative phase imaging applied to laser damage detection and analysis.

    Science.gov (United States)

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test that is provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to a hundred microns in depth, is shown. PMID:26479612

  5. Quantitative analysis of iron oxides using Fourier transform infrared spectrophotometry

    International Nuclear Information System (INIS)

    In this study, a systematic approach based on the application of Fourier transform infrared spectrophotometry (FTIR) was taken, in order to quantitatively analyze the corrosion products formed in the secondary cycle of pressurized water reactors (PWR). Binary mixtures of iron oxides were prepared with known compositions containing pure commercial magnetite (Fe3O4), maghemite (γ-Fe2O3), and hematite (α-Fe2O3) for calibration purposes. Calcium oxide (lime) was added to all samples as a standard reference in obtaining the calibration curves. Using regression analysis, relationships were developed for intensity versus concentration for absorption bands corresponding to each of the phases in their corresponding FTIR spectrum. Correlation coefficients, R2, of 0.82, 0.87, and 0.86 were obtained for maghemite-magnetite, hematite-magnetite, and hematite-maghemite systems, respectively. The calibration curves generated were used to quantify phases in multi-component unknown field samples that were obtained from different components (moisture separators, condensers, and high- and low-pressure heaters) of the two units (units 1 and 2) of the secondary cycle of the Comanche Peak PWR

  6. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. O. Domanov

    2015-09-01

    Full Text Available The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St-Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial characteristics and citizens' identities. As the geographical position at the border of Russia provides the citizens with geopolitical alternatives to identify their location as a fortress defending the nation (as in the case of Kronstadt) or a bridge between cultures, the given study allows us to compare the reasons for these geopolitical choices of inhabitants. Furthermore, the research aims at bridging the gap in the studies of European and multiple identity in Russian regions and provides a Northwest Russian perspective on the perpetual discussion about the subjective Eastern border of Europe.

  7. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 x 10^4, 2 x 10^4, and 2 x 10^5 dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  8. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbour cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and output power of 90 mW for the halved cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and output power of 60 mW for the halved cell module. Also, we investigate theoretically how this effect carries over to the case of large-size modules. It is found that the performance increment of halved cell PV modules is even higher for high-efficiency solar cells. After that, the resistive loss of large-size modules with different interconnection schemes is analysed. Finally, factors influencing the performance and cost of industrial halved cell PV modules are discussed.
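
    The benefit of halved cells can be rationalized with a simple lumped resistive model: interconnect loss scales with the square of the cell current, so halving the current roughly halves the total ribbon loss even though the number of interconnections doubles. The numbers below are illustrative, not the module data of this study.

```python
# Back-of-the-envelope sketch of why halving cells cuts ribbon loss (simplified
# lumped model, illustrative values only): the interconnect loss scales as
# I^2 * R per cell-to-cell ribbon; halving the cell halves the current, and even
# with twice as many interconnections the total ribbon loss drops by ~half.
def ribbon_loss(cell_current_a, ribbon_resistance_ohm, n_interconnects):
    return cell_current_a**2 * ribbon_resistance_ohm * n_interconnects

full   = ribbon_loss(cell_current_a=9.0, ribbon_resistance_ohm=0.002, n_interconnects=60)
halved = ribbon_loss(cell_current_a=4.5, ribbon_resistance_ohm=0.002, n_interconnects=120)
print(f"full cells: {full:.2f} W, halved cells: {halved:.2f} W "
      f"({100 * (full - halved) / full:.0f}% less ribbon loss)")
```

    The simplification ignores layout and ribbon-length details, which is why measured gains (fill factor, output power) are smaller than the idealized 50% loss reduction.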

  9. Quantitative Analysis of AGV System in FMS Cell Layout

    Directory of Open Access Journals (Sweden)

    B. Ramana

    1997-01-01

    Full Text Available Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing systems (FMSs) due to their flexibility. The quantitative analysis of an AGV system is useful for determining the material flow rates, operation times, length of delivery, length of empty move of the AGV and the number of AGVs required for a typical FMS cell layout. The efficiency of a material handling system such as an AGV can be improved by reducing the length of the empty move, which depends on the despatching and scheduling methods. If these methods are not properly planned, the length of the empty move of the AGV is greater than the length of delivery. This results in an increase in material handling time, which in turn increases the number of AGVs required in the FMS cell. This paper presents a method for optimising the length of empty travel of the AGV in a typical FMS cell layout.
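
    A common first-pass estimate of the number of AGVs required divides the total handling workload per period, including empty travel, by the time one vehicle has available. The sketch below uses this textbook-style estimate with invented figures; it is not necessarily the author's formulation.

```python
# Hedged sketch of a standard AGV fleet-sizing estimate (textbook-style): the
# number of AGVs follows from total handling time per period, including empty
# travel, divided by the time one vehicle has available. Figures are invented.
import math

deliveries_per_hour = 30
loaded_travel_min   = 4.0     # average per delivery
empty_travel_min    = 2.5     # average per delivery (depends on dispatching)
load_unload_min     = 1.0
traffic_factor      = 0.85    # fraction of the hour an AGV is effectively available

time_per_delivery = loaded_travel_min + empty_travel_min + load_unload_min
workload_min      = deliveries_per_hour * time_per_delivery
n_agvs            = math.ceil(workload_min / (60.0 * traffic_factor))
print(f"estimated AGVs required: {n_agvs} (workload {workload_min:.0f} min/h)")
```

    Reducing the empty-travel term through better dispatching and scheduling directly lowers the estimate, which is the point made in the abstract.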

  10. Quantitative analysis of piperine in ayurvedic formulation by UV Spectrophotometry

    Directory of Open Access Journals (Sweden)

    Gupta Vishvnath

    2011-02-01

    Full Text Available A simple and reproducible UV-spectrophotometric method for the quantitative determination of piperine in Sitopaladi churna (STPLC) was developed and validated in the present work. The parameters linearity, precision, accuracy, and standard error were studied according to the Indian Herbal Pharmacopoeia. In the present study a new, simple, rapid, sensitive, precise and economic spectrophotometric method in the ultraviolet region has been developed for the determination of piperine in market and laboratory herbal formulations of Sitopaladi churna, which were procured and purchased, respectively, from the local market, and evaluated as per the Indian Herbal Pharmacopoeia and WHO guidelines. The concentration of piperine in the raw material, Piper longum fruits, was found to be 1.45±0.014 w/w. Piperine shows an absorption maximum at 342.5 nm, and hence the UV-spectrophotometric measurements were performed at 342.5 nm. The samples were prepared in methanol, and the method obeys Beer's law in the concentration ranges employed for evaluation. The content of piperine in the ayurvedic formulation was determined. The results of the analysis have been validated statistically, and recovery studies confirmed the accuracy of the proposed method. Hence the proposed method can be used for the reliable quantification of piperine in the crude drug and its herbal formulation.

  11. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear
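
    The core alignment idea, estimating a spectral shift by FFT cross-correlation between a reference and a target segment, can be shown in a few lines; the full CluPA algorithm (hierarchical clustering of peak lists and recursive segmentation) is not reproduced here.

```python
# Sketch of shift estimation by FFT cross-correlation between a reference and a
# target spectrum segment; the synthetic "spectra" are a single shifted Gaussian
# peak, not real NMR data.
import numpy as np

x = np.arange(1024)
reference = np.exp(-0.5 * ((x - 500) / 8.0) ** 2)
target    = np.exp(-0.5 * ((x - 517) / 8.0) ** 2)      # same peak, shifted by 17 points

# circular cross-correlation via FFT
xcorr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(target))).real
lag = np.argmax(xcorr)
if lag > len(x) // 2:
    lag -= len(x)                                       # map to a signed shift
print(f"estimated misalignment: {lag} data points")     # about -17 here

aligned = np.roll(target, lag)                          # shift target onto the reference
```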

  12. Quantitatively integrating molecular structure and bioactivity profile evidence into drug-target relationship analysis

    Directory of Open Access Journals (Sweden)

    Xu Tianlei

    2012-05-01

    Full Text Available Abstract Background Public resources of chemical compounds are growing rapidly, both in quantity and in the types of data representation. Comprehensively understanding the relationship between the intrinsic features of chemical compounds and protein targets is an essential task in evaluating potential protein-binding function for virtual drug screening. In previous studies, correlations were proposed between bioactivity profiles and target networks, especially when chemical structures were similar. Given the lack of effective quantitative methods to uncover such correlation, it is necessary to integrate information from multiple data sources to produce a comprehensive assessment of the similarity between small molecules, as well as to quantitatively uncover the relationship between compounds and their targets through such an integrated schema. Results In this study a multi-view based clustering algorithm was introduced to quantitatively integrate compound similarity from both bioactivity profiles and structural fingerprints. Firstly, hierarchical clustering was performed with the fused similarity on 37 compounds curated from PubChem. Compared to clustering in a single view, the overall number of common targets within the fused classes was improved by using the integrated similarity, which indicated that the present multi-view based clustering is more efficient in identifying clusters whose members share more common targets. Analysis within certain classes reveals that the mutual complementarity of the two views for compound description helps to discover missing similar compounds when only a single view is applied. Then, a large-scale virtual drug screen was performed on 1267 compounds curated from the Connectivity Map (CMap) dataset based on the fused similarity, which obtained a better ranking result compared to that of a single view. These comprehensive tests indicated that by combining different data representations, an improved
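
    The fusion step can be illustrated by averaging two compound-similarity matrices, one structural and one bioactivity-based, and clustering on the result. The matrices below are invented, and the algorithm is a plain equal-weight average, not the paper's multi-view method.

```python
# Toy sketch of similarity fusion: average a structural-fingerprint similarity
# matrix and a bioactivity-profile similarity matrix, then cluster compounds on
# the fused similarity. Matrices are invented for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

structure_sim = np.array([[1.0, 0.8, 0.2, 0.1],       # e.g., Tanimoto on fingerprints
                          [0.8, 1.0, 0.3, 0.2],
                          [0.2, 0.3, 1.0, 0.7],
                          [0.1, 0.2, 0.7, 1.0]])
bioactivity_sim = np.array([[1.0, 0.6, 0.1, 0.2],     # e.g., correlation of activity profiles
                            [0.6, 1.0, 0.2, 0.1],
                            [0.1, 0.2, 1.0, 0.9],
                            [0.2, 0.1, 0.9, 1.0]])

fused = 0.5 * (structure_sim + bioactivity_sim)        # simple equal-weight fusion
distance = 1.0 - fused
np.fill_diagonal(distance, 0.0)

Z = linkage(squareform(distance, checks=False), method="average")
print("cluster labels:", fcluster(Z, t=2, criterion="maxclust"))
```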

  13. Los Alamos National Laboratory Center for direct chemical analysis of materials

    International Nuclear Information System (INIS)

    The Center for Direct Chemical Analysis at Los Alamos National Laboratory is undertaking a major effort to develop, improve, and implement direct analysis techniques for radionuclide, organic, and inorganic constituents. The Center consists of a multidisciplinary team of researchers who possess expertise in the quantitative and qualitative characterization of solid materials using a variety of analytical technologies. Materials include soils and sludges, building materials, foods, chemicals, and atmospheric gases. Direct chemical analysis techniques measure the analytes directly in the solid material with minimal sample pretreatment, whereas conventional techniques, such as atomic absorption and emission spectrochemistry, require that the solid materials be rendered in aqueous solution using concentrated acids prior to measurement. Direct chemical analysis completely bypasses the digestion process, thereby increasing sample throughput and saving both time and money. Direct chemical analysis is unique in that it alone can conduct certain specialized but highly useful types of analysis, such as depth-profiling and the chemical structural characterization of surfaces. In addition, some direct analytical techniques eliminate the sampling step and permit rapid analysis of samples at the point of origin. Direct analysis in situ would further reduce costs and potential hazards related to sample collection and transport to the analytical laboratory

  14. Chemical abundance analysis of 19 barium stars

    Science.gov (United States)

    Yang, Guo-Chao; Liang, Yan-Chun; Spite, Monique; Chen, Yu-Qin; Zhao, Gang; Zhang, Bo; Liu, Guo-Qing; Liu, Yu-Juan; Liu, Nian; Deng, Li-Cai; Spite, Francois; Hill, Vanessa; Zhang, Cai-Xia

    2016-01-01

    We aim at deriving accurate atmospheric parameters and chemical abundances of 19 barium (Ba) stars, including both strong and mild Ba stars, based on high signal-to-noise ratio, high resolution Echelle spectra obtained with the 2.16 m telescope at Xinglong station of the National Astronomical Observatories, Chinese Academy of Sciences. The chemical abundances of the sample stars were obtained from an LTE, plane-parallel and line-blanketed atmospheric model by inputting the atmospheric parameters (effective temperature Teff, surface gravity log g, metallicity [Fe/H] and microturbulence velocity ξt) and the equivalent widths of stellar absorption lines. These Ba stars are giants, as indicated by their atmospheric parameters, metallicities and a kinematic analysis of their UVW velocities. Chemical abundances of 17 elements were obtained for these Ba stars. Their Na, Al, α-element and iron-peak element abundances (O, Na, Mg, Al, Si, Ca, Sc, Ti, V, Cr, Mn, Ni) are similar to the solar values. Our sample of Ba stars shows obvious overabundances of neutron-capture (n-capture) process elements relative to the Sun. Their median abundances of [Ba/Fe], [La/Fe] and [Eu/Fe] are 0.54, 0.65 and 0.40, respectively. The Y I and Zr I abundances are lower than those of Ba, La and Eu, but higher than the α- and iron-peak elements for the strong Ba stars, and similar to the iron-peak elements for the mild Ba stars. There is a positive correlation between Ba intensity and [Ba/Fe]. For the n-capture elements (Y, Zr, Ba, La), there is an anti-correlation between their [X/Fe] and [Fe/H]. We identify nine of our sample stars as strong Ba stars with [Ba/Fe] > 0.6; seven of them have Ba intensity Ba = 2-5, one has Ba = 1.5 and one has Ba = 1.0. The remaining ten stars are classified as mild Ba stars with 0.17 < [Ba/Fe] < 0.54.
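    For reference, the square-bracket abundance notation used throughout this record is the standard logarithmic number ratio relative to the Sun:

```latex
\[
\left[\mathrm{X}/\mathrm{Fe}\right] =
\log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Fe}}\right)_{\star} -
\log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Fe}}\right)_{\odot}
\]
```

    On this scale, the quoted median [Ba/Fe] = 0.54 corresponds to a Ba-to-Fe number ratio roughly 3.5 times the solar value.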

  15. Quantitative analysis of radiation-induced DNA deoxyribose oxidation products

    International Nuclear Information System (INIS)

    Deoxyribose oxidation plays an important role in the chemistry and biology of radical-mediated DNA damage beyond the simple interruption of the DNA backbone, including involvement in complex DNA lesions, cross-linking with DNA repair proteins and the formation of endogenous DNA adducts. This is illustrated by our discovery that 3'-phosphoglycolaldehyde residues, arising from 3'-oxidation of deoxyribose in DNA, form glyoxal and the glyoxal adduct of dG. Our research is driven by the lack of information about the spectrum and quantity of deoxyribose lesions in isolated DNA, human cells and tissues. This problem is compounded by the fact that oxidation of each of the five possible positions in deoxyribose can generate several unique damage products, most of which are toxic to cells. To this end, we have developed a sensitive GC/MS method to identify and quantify virtually all deoxyribose oxidation products in isolated DNA and in cells exposed to oxidizing agents under biological conditions. This method was applied to quantify 3'-phosphoglycolaldehyde residues in DNA oxidized by Fe-EDTA, gamma-radiation and alpha-particles, with a detection limit of 30 femtomoles/sample, corresponding to two phosphoglycolaldehyde molecules in 10^6 nucleotides for a 170 μg DNA sample. A 13C2-labeled phosphoglycolaldehyde was used as the internal standard. The method was verified by analysis of a synthetic, phosphoglycolaldehyde-containing oligonucleotide. It is widely believed that Fe-EDTA and gamma-radiation induce DNA damage by the formation of hydroxyl radicals, and we therefore expected to see similar efficiencies of phosphoglycolaldehyde formation. However, the results reveal large differences in the efficiency of phosphoglycolaldehyde formation by these oxidants and suggest weaknesses in models relating DNA structure to chemical reactivity of DNA. An understanding of the relative quantities of various deoxyribose oxidation products will provide important insights into the basic

  16. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and to the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in microscope or macroscope configurations to provide measurement of Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  17. Modern quantitative microstructure analysis on the example of AlCu5Mg1 alloys

    Directory of Open Access Journals (Sweden)

    Zlatičanin Biljana V.

    2002-01-01

    Full Text Available Using an automatic QUANTIMET 500 MC device for quantitative image analysis and applying the linear method of measurement to the example of AlCu5Mg1 alloys, the grain size (minimum, maximum and mean values, together with relative standard measuring errors, RSE), the dendrite arm spacing (DAS) and the eutectic length (Le) have been determined, as well as the size distribution (histogram) and the volume fractions of the α-solid solution and the eutectic. We have also studied the influence of the grain-refining additive AlTi5B1 for the same chemical composition of the aluminium-copper-magnesium alloy. It has been concluded that the mean grain size decreases with increasing titanium content. We have also examined hardness and compressive strength.
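    As a minimal illustration of the linear (intercept) method of measurement mentioned above, the mean grain size can be estimated by dividing the total length of the test lines by the number of grain-boundary intersections; the numbers below are hypothetical.

```python
def mean_linear_intercept(total_line_length_um, n_grain_boundary_intercepts):
    """Mean linear intercept (a standard grain-size measure) in micrometres."""
    return total_line_length_um / n_grain_boundary_intercepts

# Hypothetical measurement: five 1000 um test lines crossing 62 grain boundaries in total
print(round(mean_linear_intercept(5 * 1000.0, 62), 1))  # mean intercept length in um
```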

  18. Full quantitative phase analysis of hydrated lime using the Rietveld method

    International Nuclear Information System (INIS)

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2–15 wt.%.
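    The glass-spiking check described above is closely related to the standard internal-standard relation used to recover amorphous content in Rietveld FQPA. The sketch below uses that textbook relation with a hypothetical crystalline spike; it is not the control file, data, or exact procedure of this study.

```python
def amorphous_fraction(w_spike, r_spike):
    """Amorphous weight fraction of the original (unspiked) sample.

    w_spike : weighed fraction of a fully crystalline internal standard in the mixture (0-1)
    r_spike : fraction of that standard returned by the Rietveld refinement (0-1)
    Uses the usual internal-standard relation A = (1 - w_spike / r_spike) / (1 - w_spike).
    """
    return (1.0 - w_spike / r_spike) / (1.0 - w_spike)

# Hypothetical example: 20 wt.% corundum added; Rietveld reports it as 21.7 wt.%
print(round(100 * amorphous_fraction(0.20, 0.217), 1))  # ~9.8 wt.% amorphous in the original sample
```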

  19. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    The author introduces the quantitative psychological research on personality carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of the psychological personality assessment of Chinese Nuclear Power Plant (NPP) operators, which is based on the MMPI survey, and presents the main contents of quantitative personality research in Chinese NPPs. The need to carry out psychological selection and training in the nuclear industry is emphasized.

  20. Compound quantitative ultrasonic tomography of long bones using wavelets analysis

    OpenAIRE

    Lasaygues, Philippe

    2005-01-01

    Compound Quantitative Ultrasonic Tomography (CQUT) is applied to long-bone imaging. In previous works, we showed that an iterative tool can be used to provide, from reflection tomography, qualitative images of the shape of the object and, from transmission tomography, quantitative images of the velocity map. Both tomographies are based on ultrasonic propagation in bones, which is strongly perturbed by this high-contrast heterogeneous medium. Reflected and transmitted signals are compo...

  1. Quantitative Exposure Assessment of Various Chemical Substances in a Wafer Fabrication Industry Facility

    OpenAIRE

    Park, Hyunhee; Jang, Jae-Kil; Shin, Jung-Ah

    2011-01-01

    Objectives This study was designed to evaluate exposure levels of various chemicals used in wafer fabrication product lines in the semiconductor industry where work-related leukemia has occurred. Methods The research focused on 9 representative wafer fabrication bays among a total of 25 bays in a semiconductor product line. We monitored the chemical substances categorized as human carcinogens with respect to leukemia as well as harmful chemicals used in the bays and substances with hematologi...

  2. The potential of computer-based quantitative structure activity approaches for predicting acute toxicity of chemicals

    OpenAIRE

    Zvinavashe, E.

    2008-01-01

    Within the EU, the management of the risks of chemicals currently falls under a new legislation called Registration, Evaluation, and Authorization of Chemicals (REACH). Within the next 10 years, existing (eco)toxicological data gaps for the more than 100 000 chemicals on the European Inventory of Existing Commercial Substances (EINECS) should be filled. The challenge is to provide this toxicity information in a fast, cost effective manner, avoiding the use of experimental animals as much as p...

  3. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
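    Transfer Entropy, the measure named above, can be estimated in several ways; the following is only a minimal histogram-based sketch for two generic time series. It is not the estimator, the delay-differential model, or the data of the cited study, and the synthetic series at the end are purely illustrative.

```python
import numpy as np
from collections import defaultdict

def transfer_entropy(x, y, bins=8):
    """Histogram-based estimate of transfer entropy TE(X -> Y) in bits.

    TE(X->Y) = sum over (y_{t+1}, y_t, x_t) of
               p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Discretise both series into equal-width bins
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))

    # Joint probabilities over the triples (y_{t+1}, y_t, x_t)
    triples = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)
    states, counts = np.unique(triples, axis=0, return_counts=True)
    p_xyz = counts / counts.sum()

    # Marginals needed for the two conditional probabilities
    p_y0x0, p_y1y0, p_y0 = defaultdict(float), defaultdict(float), defaultdict(float)
    for (y1, y0, x0), p in zip(states, p_xyz):
        p_y0x0[(y0, x0)] += p
        p_y1y0[(y1, y0)] += p
        p_y0[y0] += p

    te = 0.0
    for (y1, y0, x0), p in zip(states, p_xyz):
        p_cond_full = p / p_y0x0[(y0, x0)]         # p(y_{t+1} | y_t, x_t)
        p_cond_marg = p_y1y0[(y1, y0)] / p_y0[y0]  # p(y_{t+1} | y_t)
        te += p * np.log2(p_cond_full / p_cond_marg)
    return te

# Toy check on synthetic series where y is driven by the past of x
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))  # the first value should be larger
```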

  4. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, and hence of the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  5. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    Science.gov (United States)

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, in both same-sex and opposite-sex pairings, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, which remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and merits further enquiry. PMID:27053654
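    The Hilbert spectral analysis mentioned above rests on extracting a time-dependent (instantaneous) frequency from each recorded tone. Below is a minimal sketch using SciPy's Hilbert transform on a synthetic, hypothetical wing-beat-like signal; it is not the authors' processing pipeline.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(signal, fs):
    """Instantaneous frequency (Hz) from the analytic signal (Hilbert transform)."""
    analytic = hilbert(signal)
    phase = np.unwrap(np.angle(analytic))
    # Time derivative of the phase -> angular frequency -> Hz
    return np.diff(phase) * fs / (2.0 * np.pi)

# Hypothetical wing-beat-like tone: a 480 Hz carrier with a slow upward frequency drift
fs = 8000.0
t = np.arange(0, 2.0, 1.0 / fs)
tone = np.sin(2 * np.pi * (480 + 20 * t) * t)
f_inst = instantaneous_frequency(tone, fs)
print(f_inst.mean())  # close to the average drifting frequency
```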

  6. Significance analysis of microarray for relative quantitation of LC/MS data in proteomics

    OpenAIRE

    Li Qingbo; Roxas Bryan AP

    2008-01-01

    Abstract Background Although fold change is a commonly used criterion in quantitative proteomics for differentiating regulated proteins, it does not provide an estimation of false positive and false negative rates that is often desirable in a large-scale quantitative proteomic analysis. We explore the possibility of applying the Significance Analysis of Microarray (SAM) method (PNAS 98:5116-5121) to a differential proteomics problem of two samples with replicates. The quantitative proteomic a...

  7. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino eSalminen

    2015-10-01

    Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥ 6 mm pockets, and alveolar bone loss (ABL). High levels of T. forsythia were also associated with bleeding on probing (BOP). The combination of the four bacteria, i.e. the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio, 3.59 (95% CI 1.94–6.63), was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary
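    The adjusted odds ratios reported above come from logistic regression. The sketch below shows that kind of model with statsmodels; the data are simulated and the variable names (high_pg, teeth_implants, and so on) are invented for illustration, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
# Hypothetical data frame: one row per subject
df = pd.DataFrame({
    "perio":          rng.binomial(1, 0.4, n),   # 1 = moderate/severe periodontitis
    "high_pg":        rng.binomial(1, 0.3, n),   # high salivary P. gingivalis (yes/no)
    "age":            rng.normal(63, 9, n),
    "male":           rng.binomial(1, 0.7, n),
    "diabetes":       rng.binomial(1, 0.2, n),
    "teeth_implants": rng.poisson(22, n),
})

X = sm.add_constant(df[["high_pg", "age", "male", "diabetes", "teeth_implants"]])
model = sm.Logit(df["perio"], X).fit(disp=False)

odds_ratios = np.exp(model.params)       # OR per covariate
conf_int = np.exp(model.conf_int())      # 95% CI on the OR scale
print(odds_ratios["high_pg"], conf_int.loc["high_pg"].values)
```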

  8. Chemical analysis of Argonne premium coal samples. Bulletin

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, C.A.

    1997-11-01

    Contents: The Chemical Analysis of Argonne Premium Coal Samples: An Introduction; Rehydration of Desiccated Argonne Premium Coal Samples; Determination of 62 Elements in 8 Argonne Premium Coal Ash Samples by Automated Semiquantitative Direct-Current Arc Atomic Emission Spectrography; Determination of 18 Elements in 5 Whole Argonne Premium Coal Samples by Quantitative Direct-Current Arc Atomic Emission Spectrography; Determination of Major and Trace Elements in Eight Argonne Premium Coal Samples (Ash and Whole Coal) by X-Ray Fluorescence Spectrometry; Determination of 29 Elements in 8 Argonne Premium Coal Samples by Instrumental Neutron Activation Analysis; Determination of Selected Elements in Coal Ash from Eight Argonne Premium Coal Samples by Atomic Absorption Spectrometry and Atomic Emission Spectrometry; Determination of 25 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Atomic Emission Spectrometry; Determination of 33 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Mass Spectrometry; Determination of Mercury and Selenium in Eight Argonne Premium Coal Samples by Cold-Vapor and Hydride-Generation Atomic Absorption Spectrometry; Determination of Carbon, Hydrogen, and Nitrogen in Eight Argonne Premium Coal Samples by Using a Gas Chromatographic Analyzer with a Thermal Conductivity Detector; and Compilation of Multitechnique Determinations of 51 Elements in 8 Argonne Premium Coal Samples.

  9. Vibrational spectroscopy and chemometrics for rapid, quantitative analysis of bitter acids in hops (Humulus lupulus).

    Science.gov (United States)

    Killeen, Daniel P; Andersen, David H; Beatson, Ron A; Gordon, Keith C; Perry, Nigel B

    2014-12-31

    Hops, Humulus lupulus, are grown worldwide for use in the brewing industry to impart characteristic flavor and aroma to finished beer. Breeders produce many varietal crosses with the aim of improving and diversifying commercial hops varieties. The large number of crosses critical to a successful breeding program imposes high demands on the supporting chemical analytical laboratories. With the aim of reducing the analysis time associated with hops breeding, quantitative partial least-squares regression (PLS-R) models have been produced, relating reference data acquired by the industry-standard HPLC and UV methods to vibrational spectra of the same chemically diverse hops sample set. These models, produced from rapidly acquired infrared (IR), near-infrared (NIR), and Raman spectra, were appraised using standard statistical metrics. Results demonstrated that all three spectroscopic methods could be used to screen powdered hops for α-acid, total bitter acids, and cohumulone concentrations. Models generated from Raman and IR spectra also showed potential for use in screening hops varieties for xanthohumol concentrations. NIR analysis was performed using both a standard benchtop spectrometer and a portable NIR spectrometer, with comparable results obtained by both instruments. Finally, some important vibrational features of cohumulone, colupulone, and xanthohumol were assigned using DFT calculations, which allows more insightful interpretation of PLS-R latent variable plots. PMID:25485767
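    The PLS-R calibration described above maps vibrational spectra to HPLC/UV reference values. The following is a minimal cross-validated sketch with scikit-learn; the spectra and reference values are random placeholders standing in for the hops data, and the number of latent variables is an arbitrary assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: rows = hop samples, columns = spectral intensities (e.g. Raman/IR/NIR wavenumbers)
# y: reference alpha-acid concentrations from the standard HPLC/UV assay
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))          # placeholder spectra
y = rng.normal(10, 2, size=60)          # placeholder reference values (% w/w)

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
print(rmsecv, r2)
```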

  10. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    Science.gov (United States)

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
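    The isotope dilution calculation behind this kind of experiment reduces to scaling the analyte/internal-standard peak-area ratio by the amount of labeled spike. A minimal sketch with hypothetical peak areas, assuming equal MS response for the labeled and unlabeled forms:

```python
def isotope_dilution_conc(area_analyte, area_labelled, spike_ng, sample_ml):
    """Analyte concentration (ng/mL) from the analyte/internal-standard peak-area ratio."""
    return (area_analyte / area_labelled) * spike_ng / sample_ml

# Hypothetical GC-MS run: 100 ng of isotopically labelled caffeine spiked into 2 mL of beverage
print(isotope_dilution_conc(area_analyte=2.4e6, area_labelled=1.2e6, spike_ng=100, sample_ml=2.0))
# -> 100.0 ng/mL
```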

  11. Quantitative Analysis of Berberine in Processed Coptis by Near-Infrared Diffuse Reflectance Spectroscopy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong; XIE Yun-fei; SONG Feng-rui; LIU Zhi-qiang; CONG Qian; ZHAO Bing

    2008-01-01

    Near-infrared (NIR) diffuse reflectance spectroscopy was used to study the content of berberine in processed Coptis. The proportions of Coptis to ginger, yellow liquor or Evodia rutaecarpa, as well as the processing temperature, were varied according to an orthogonal design. To extract as much useful and effective information from the spectral data as possible, the spectra were preprocessed by first derivative and multiplicative scatter correction (MSC), selected after comparing different preprocessing methods. Firstly, a model was established by partial least squares (PLS); the coefficient of determination (R2) of the prediction was 0.839, the root mean squared error of prediction (RMSEP) was 0.1422, and the mean relative error (RME) was 0.0276. Secondly, to reduce dimensionality and remove noise, the spectral variables were efficiently compressed via the wavelet transform (WT), with the Haar wavelet selected to decompose the spectral signals. After the wavelet coefficients from the WT were input into an artificial neural network (ANN) instead of the raw spectra, a quantitative analysis model for berberine in processed Coptis was established. The R2 of this model was 0.9153, the RMSEP was 0.0444, and the RME was 0.0091. The appraisal indices, namely R2, RMSEP and RME, indicate that the generalization ability and prediction precision of the ANN model are superior to those of PLS. The overall results show that NIR spectroscopy combined with ANN can be efficiently utilized for the rapid and accurate analysis of routine chemical compositions in Coptis. Accordingly, the results can provide technical support for further analysis of berberine and other components in processed Coptis, and a foundation for quantitative analysis in other NIR applications.
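    The WT-plus-ANN pipeline described above can be sketched as follows, assuming PyWavelets for the Haar decomposition and a small scikit-learn MLP standing in for the authors' network; the spectra and reference contents here are random placeholders, and the decomposition level and network size are arbitrary assumptions.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
spectra = rng.normal(size=(80, 1024))     # placeholder NIR spectra (already pretreated)
berberine = rng.uniform(3, 9, size=80)    # placeholder reference contents (%)

def haar_features(spectrum, level=4):
    """Compress one spectrum to its Haar approximation coefficients."""
    coeffs = pywt.wavedec(spectrum, "haar", level=level)
    return coeffs[0]                      # keep only the coarse approximation

X = np.array([haar_features(s) for s in spectra])
X_tr, X_te, y_tr, y_te = train_test_split(X, berberine, test_size=0.25, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print(ann.score(X_te, y_te))              # R^2 on the held-out samples
```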

  12. Análise química quantitativa para a padronização do óleo de copaíba por cromatografia em fase gasosa de alta resolução Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatograpy

    Directory of Open Access Journals (Sweden)

    Marcelo R. R. Tappin

    2004-04-01

    Full Text Available Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of the sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar in the case of methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). This method proved suitable for the classification and quality control of commercial samples of the oils.
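    External-standard GC-FID quantification of this kind rests on a linear calibration curve fitted to standards of known concentration. A minimal sketch with hypothetical standard concentrations and peak areas (not the published data):

```python
import numpy as np

# Hypothetical external-standard calibration (e.g. methyl copalate by GC-FID)
std_conc = np.array([0.05, 0.10, 0.25, 0.50, 1.00])       # mg/mL injected standards
peak_area = np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5, 2.17e5])

slope, intercept = np.polyfit(std_conc, peak_area, 1)      # linear calibration
r = np.corrcoef(std_conc, peak_area)[0, 1]

sample_area = 7.9e4
sample_conc = (sample_area - intercept) / slope             # back-calculated concentration
print(round(r**2, 4), round(sample_conc, 3))
```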

  13. Chemical sensors in water analysis - pH value, oxygen and chlorine. Chemische Sensoren in der Wasseranalytik - pH, Sauerstoff und Chlor

    Energy Technology Data Exchange (ETDEWEB)

    Straub, H.

    1993-02-01

    Chemical sensors are used in many sectors of industry as measuring devices providing quantitative information about various quantities. Depending upon the task at hand, amperometric, conductimetric, voltammetric, and potentiometric approaches are all used in water analysis. (orig.)

  14. Application of quantitative proteomics expression analysis using stable isotope labeling

    International Nuclear Information System (INIS)

    Quantitative protein expression profiling is a crucial part of proteomics and requires techniques that can efficiently provide accurate, high-throughput and reproducible differential expression values for proteins in two or more biological samples. At present, stable isotope labeling is regarded as one of the most accurate ways to quantify relative protein expression levels, and it can be combined directly with LC-MS/MS approaches. In summary, this technique has clear advantages in quantitative proteomics. Its applications and the latest progress are discussed. (authors)

  15. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Linear accelerators represent the most important, practical and versatile source of ionizing radiation in radiotherapy. Their functional characteristics influence the geometric and dosimetric accuracy of the therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures or mechanical breakdowns, or due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on quality control by the institutions to monitor deviations in the beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a radiotherapy quality control program. The average calibration factors of the accelerators over a period of approximately four years were (0.998 ± 0.012) for the Clinac 600C and (0.996 ± 0.014) for the Clinac 6EX. For the Clinac 2100CD, the 6 MV and 15 MV values were (1.008 ± 0.009) and (1.006 ± 0.010), respectively, also over a period of approximately four years. Statistical analysis of the three linear accelerators showed that the coefficient of variation of the calibration factors was below 2%, which demonstrates the consistency of the data. By fitting a normal distribution to the calibration factors, we found that for the Clinac 600C and Clinac 2100CD more than 90% of the values are expected to lie within the acceptable limits of TG-142, while for the Clinac 6EX the expectation is around 85%, since this machine underwent several exchanges of accelerator components. The TPR20,10 values of the three accelerators are practically constant and within the acceptable limits of TG-142. It can be concluded that a detailed quantitative study of the accelerator calibration factors and TPR20,10 data is extremely useful in a quality assurance program. (author)
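    The quantities discussed above, the coefficient of variation of the calibration factor and the expected fraction of measurements inside a tolerance band, can be computed as sketched below. The calibration factors are simulated and the ±2% band is an assumed tolerance used only for illustration.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical monthly calibration factors for one linac over ~4 years
factors = np.random.default_rng(2).normal(loc=0.998, scale=0.012, size=48)

mean, sd = factors.mean(), factors.std(ddof=1)
cv_percent = 100 * sd / mean

# Probability of staying inside an assumed +/-2% tolerance band,
# treating the calibration factors as normally distributed
low, high = 0.98, 1.02
p_within = norm.cdf(high, loc=mean, scale=sd) - norm.cdf(low, loc=mean, scale=sd)
print(round(cv_percent, 2), round(100 * p_within, 1))
```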

  16. Quantitative chemical tagging, stellar ages and the chemo-dynamical evolution of the Galactic disc

    CERN Document Server

    Mitschang, A W; Zucker, D B; Anguiano, B; Bensby, T; Feltzing, S

    2013-01-01

    The early science results from the new generation of high-resolution stellar spectroscopic surveys, such as GALAH and the Gaia-ESO survey, will represent major milestones in the quest to chemically tag the Galaxy. Yet this technique to reconstruct dispersed coeval stellar groups has remained largely untested until recently. We build on previous work that developed an empirical chemical tagging probability function, which describes the likelihood that two field stars are conatal, that is, they were formed in the same cluster environment. In this work we perform the first ever blind chemical tagging experiment, i.e., tagging stars with no known or otherwise discernable associations, on a sample of 714 disc field stars with a number of high quality high resolution homogeneous metal abundance measurements. We present evidence that chemical tagging of field stars does identify coeval groups of stars, yet these groups may not represent distinct formation sites, e.g. as in dissolved open clusters, as previously thou...

  17. A rapid fluorescence based method for the quantitative analysis of cell culture media photo-degradation.

    Science.gov (United States)

    Calvet, Amandine; Li, Boyan; Ryder, Alan G

    2014-01-01

    Cell culture media are very complex chemical mixtures and are one of the most important aspects of biopharmaceutical manufacturing. The complex composition of many media leads to materials that are inherently unstable; of particular concern is media photo-damage, which can adversely affect cell culture performance. This can be particularly significant when small-scale transparent bioreactors and media containers are used for process development or research. Chromatographic and/or mass spectrometry based analyses are often time-consuming and expensive for routine high-throughput media analysis, particularly during scale-up or development processes. Fluorescence excitation-emission matrix (EEM) spectroscopy combined with multi-way chemometrics is a robust methodology applicable to the analysis of raw materials, media, and bioprocess broths. Here we demonstrate how EEM spectroscopy was used for the rapid, quantitative analysis of media degradation caused by ambient visible light exposure. The primary degradation pathways involve riboflavin (leading to the formation of lumichrome, LmC), which also causes photo-sensitised degradation of tryptophan; this was validated using high pressure liquid chromatography (HPLC) measurements. The use of PARallel FACtor analysis (PARAFAC), multivariate curve resolution (MCR), and N-way partial least squares (NPLS) enabled the rapid and easy monitoring of the compositional changes in tryptophan (Trp), tyrosine (Tyr), and riboflavin (Rf) concentration caused by ambient light exposure. Excellent agreement between HPLC and EEM methods was found for the change in Trp, Rf, and LmC concentrations. PMID:24356227

  18. Quantitative determination of terbutaline and orciprenaline in human plasma by gas chromatography/negative ion chemical ionization/mass spectrometry.

    Science.gov (United States)

    Leis, H J; Gleispach, H; Nitsche, V; Malle, E

    1990-06-01

    A method for the determination of unconjugated terbutaline and orciprenaline in human plasma is described. The assay is based on stable isotope dilution gas chromatography/negative ion chemical ionization/mass spectrometry. An inexpensive and rapid method for preparation of stable isotope labelled analogues as well as their use in quantitative gas chromatography/mass spectrometry is shown. A highly efficient sample work-up procedure with product recoveries of more than 95% is presented. The method developed permits quantitative measurement of terbutaline and orciprenaline in human plasma down to 100 pg ml-1, using 1 ml of sample. Plasma levels of terbutaline after oral administration of 5 mg of terbutaline sulphate were estimated. PMID:2357489

  19. Quantitative modeling of bioconcentration factors of carbonyl herbicides using multivariate image analysis.

    Science.gov (United States)

    Freitas, Mirlaine R; Barigye, Stephen J; Daré, Joyce K; Freitas, Matheus P

    2016-06-01

    The bioconcentration factor (BCF) is an important parameter used to estimate the propensity of chemicals to accumulate in aquatic organisms from the ambient environment. While simple regressions for estimating the BCF of chemical compounds from water solubility or the n-octanol/water partition coefficient have been proposed in the literature, these models do not always yield good correlations, and more descriptive variables are required for better modeling of BCF data for a given series of organic pollutants, such as some herbicides. Thus, the logBCF values for a set of carbonyl herbicides comprising amide, urea, carbamate and thiocarbamate groups were quantitatively modeled using multivariate image analysis (MIA) descriptors, derived from colored image representations of the chemical structures. The logBCF model was calibrated and rigorously validated (r² = 0.79, q² = 0.70 and r²test = 0.81), providing a comprehensive three-parameter linear equation after variable selection (logBCF = 5.682 - 0.00233 × X9774 - 0.00070 × X813 - 0.00273 × X5144); the variables represent pixel coordinates in the multivariate image. Finally, chemical interpretation of the obtained models in terms of the structural characteristics responsible for enhanced or reduced logBCF values was performed, providing key leads for the prospective development of more eco-friendly synthetic herbicides. PMID:26971171
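    The three-parameter equation quoted above can be evaluated directly once the three MIA pixel intensities are known; the pixel values in this sketch are hypothetical placeholders, not descriptors of any real herbicide.

```python
def log_bcf(x9774, x813, x5144):
    """Evaluate the published three-descriptor MIA model for logBCF."""
    return 5.682 - 0.00233 * x9774 - 0.00070 * x813 - 0.00273 * x5144

# Hypothetical pixel intensities (0-255 grey levels) for an illustrative structure image
print(round(log_bcf(x9774=210, x813=180, x5144=95), 3))
```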

  20. Qualitative and Quantitative Analysis of Lignan Constituents in Caulis Trachelospermi by HPLC-QTOF-MS and HPLC-UV

    OpenAIRE

    Xiao-Ting Liu; Xu-Guang Wang; Rui Xu; Fan-Hua Meng; Neng-Jiang Yu; Yi-Min Zhao

    2015-01-01

    A high-performance liquid chromatography method coupled with quadrupole tandem time-of-flight mass spectrometry (HPLC-QTOF-MS) and with ultraviolet detection (HPLC-UV) was established for qualitative and quantitative analysis, respectively, of the major chemical constituents in Caulis Trachelospermi. The analysis was performed on an Agilent Zorbax Eclipse Plus C18 column (4.6 mm × 150 mm, 5 μm) using a binary gradient system of water and methanol, with ultraviolet absorption at 230 nm. Based on high-re...

  1. The approach to risk analysis in three industries: nuclear power, space systems, and chemical process

    International Nuclear Information System (INIS)

    The aerospace, nuclear power, and chemical processing industries are providing much of the incentive for the development and application of advanced risk analysis techniques to engineered systems. Risk analysis must answer three basic questions: What can go wrong? How likely is it? and What are the consequences? The result of such analyses is not only a quantitative answer to the question of 'What is the risk', but, more importantly, a framework for intelligent and visible risk management. Because of the societal importance of the subject industries and the amount of risk analysis activity involved in each, it is interesting to look for commonalities, differences, and, hopefully, a basis for some standardization. Each industry has its strengths: the solid experience base of the chemical industry, the extensive qualification and testing procedures of the space industry, and the integrative and quantitative risk and reliability methodologies developed for the nuclear power industry. In particular, most advances in data handling, systems interaction modeling, and uncertainty analysis have come from the probabilistic risk assessment work in the nuclear safety field. In the final analysis, all three industries would greatly benefit from a more deliberate technology exchange program in the rapidly evolving discipline of quantitative risk analysis. (author)

  2. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt;

    The quantitative data observed from analysing STR DNA are a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise...... controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark....

  3. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  4. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Science.gov (United States)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  5. Quantitative mineralogical analysis of sandstones using x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Full text: X-ray diffraction has long been used as a definitive technique for mineral identification, based on measuring the internal atomic or crystal structures present in powdered rocks, soils and other mineral mixtures. Recent developments in data gathering and processing, however, have provided an improved basis for its use as a quantitative tool, determining not only the nature of the minerals but also the relative proportions of the different minerals present. The mineralogy of a series of sandstone samples from the Sydney and Bowen Basins of eastern Australia has been evaluated by X-ray diffraction (XRD) on a quantitative basis using the Australian-developed SIROQUANT data processing technique. Based on Rietveld principles, this technique generates a synthetic X-ray diffractogram by adjusting and combining full-profile patterns of minerals nominated as being present in the sample, and interactively matches the synthetic diffractogram, under operator instruction, to the observed diffractogram of the sample being analysed. The individual mineral patterns may be refined in the process to allow for variations in the crystal structure of individual components or for factors such as preferred orientation in the sample mount. The resulting output provides mass percentages of the different minerals in the mixture, and an estimate of the error associated with each individual percentage determination. The chemical composition of the mineral mixtures indicated by SIROQUANT for each individual sandstone studied was estimated using a spreadsheet routine, and the indicated proportion of each oxide in each sample was compared to the actual chemical analysis of the same sandstone as determined independently by X-ray fluorescence spectrometry. The results show a high level of agreement for all major chemical constituents, indicating consistency between the SIROQUANT XRD data and the whole-rock chemical composition. Supplementary testing with a synthetic corundum spike further

  6. Multivariate data analysis for depth resolved chemical classification and quantification of sulfur in SNMS

    Science.gov (United States)

    Sommer, M.; Goschnick, J.

    2005-09-01

    The quantification of elements in quadrupole-based SNMS is hampered by superpositions of atomic and cluster signals. Moreover, the conventional SNMS data evaluation employs only atomic signals to determine elemental concentrations, which does not allow any chemical specification of the determined elements. Improvements in elemental quantification and additional chemical information can be obtained from kinetic energy analysis and the inclusion of molecular signals in the mass spectra evaluation. With the help of multivariate data analysis techniques, the combined information is used for the first time for a quantitative and chemically distinctive determination of sulfur. The kinetic energy analysis, used to resolve the interference of sulfur with O2 at masses 32-34 D, turned out to be highly important for the new type of evaluation.

  7. Semiquantitative and quantitative measurements for EDXRF in elemental chemical composition of pigments

    International Nuclear Information System (INIS)

    The X-ray fluorescence technique is widely used in the characterization of art and archaeological objects for restoration and conservation, allowing a multi-elemental, simultaneous and non-destructive analysis. In this work a portable XRF system was used, consisting of a 238Pu source (13.6 and 17.2 keV; 95 mCi) and a Si-PIN detector coupled to an 8 k multichannel analyser. The results were collected by a palmtop computer and later analysed on a PC with the program AXIL-QXAS. The acquisition time for each measurement was 500 s. The measurements were carried out on a wood sculpture (Santa Luzia image, number 164) from the collection of the Museu de Etnologia da Universidade de Sao Paulo (MAE-USP), in the following regions: (STL1) inferior side of the wood base exposed without finishing, (STL2) frontal inferior base of the pedestal (dark blue), (STL3) inferior part of the frontal dress (gold), (STL4) middle part of the dress (light blue), (STL5) mantle (red), (STL6) back central lock of the hair (black), (STL7) right cheek (flesh-coloured) and (STL8) mantle (gold). The elements found in the STL1 region were Al, Ca, Fe and a high concentration of Zn. In the region STL2 were found Al, Ca, Fe, Zn and the key element Cu. In the region STL3 were found Ca, Zn and the key element Au. In the region STL4 were found Zn and the key element Cu. In the region STL5 were found the key elements S and Hg. In the region STL6 were found Fe, Ca, S and Hg. In the region STL7 were found Al, Cu, Hg and Zn. In the region STL8 were found Ca, Al and Au, the latter in high concentration. It was concluded that the possible pigments would be: STL2 and STL4 - CuCO3·Cu(OH)2 + ZnO; STL3 and STL8 - Au; STL5 - HgS; STL6 - HgS mixed with other oxides, possibly of Fe and Mn; and STL7 - HgS + ZnO. Standard samples of wood painted with pigments of the colors white, blue, red, rose, flesh color and green were also prepared. Through the XRF method it was verified that the white pigment is TiO2, the red one

  8. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromatography of amino acids by a spot comparison technique

    International Nuclear Information System (INIS)

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. So far, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods that depend on the use of relative spot area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity. No complicated equipment or procedures are necessary

  9. Quantitative mutagenesis by chemicals and by radiations: prerequisites for the establishment of rad-equivalences

    International Nuclear Information System (INIS)

    The lesions produced in the genetic material by chemical mutagens, on the one hand, and radiations, on the other, are very similar. In both cases, they are either lesions in DNA or changes in the bonds between this DNA and the proteins which surround it. The lesions are sufficiently similar to elicit, in both cases, the activity of the same repair systems. The similarity between chemically and radiation-induced mutagenesis can be demonstrated by checking that a strain which is hyper-sensitive to radiation because it lacks some repair system is also hyper-sensitive to most chemical mutagens. These similarities between the lesions suggest that one can establish an equivalence between the 'dose' of a chemical and a dose of radiation, on the basis of the effects produced on some biological reference systems. Once such an equivalence has been established, one could extrapolate the rules of radiation protection to protection against that chemical. Is this principle applicable, and under which conditions? What prerequisites must be fulfilled? The goal of this paper is to answer these questions

  10. [Influence of ancient glass samples surface conditions on chemical composition analysis using portable XRF].

    Science.gov (United States)

    Liu, Song; Li, Qing-hui; Gan, Fu-xi

    2011-07-01

    Portable X-ray fluorescence analysis (PXRF) is a surface analysis technique, and the sample surface condition is an important factor that influences the quantitative analysis results. The ancient glass samples studied in the present paper were excavated in Xinjiang, Guangxi and Jiangsu provinces, and belong to the Na2O-CaO-SiO2, K2O-SiO2, and PbO-BaO-SiO2 systems, respectively. Quantitative analysis results from the weathered surface and from the interior of the ancient glass samples were compared, and the change in the concentration of the main fluxes in different parts of the samples was pointed out. Meanwhile, the authors studied the effects of the distance between the sample and the reference plane, and of the curvature of the sample surface, on the quantitative results. The results obtained were calibrated by three methods, and the validity of these three methods was demonstrated. Finally, the normalization method proved to be the better method for quantitative analysis of antiques. This paper also has guiding significance for chemical composition analysis of ancient jade samples using PXRF. PMID:21942060
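    The normalization method favored above typically rescales the measured composition so that the analysed components sum to 100 wt.%. A minimal sketch with hypothetical oxide values:

```python
def normalize_to_100(oxide_wt_percent):
    """Scale raw oxide concentrations so that they sum to 100 wt.%."""
    total = sum(oxide_wt_percent.values())
    return {oxide: 100.0 * w / total for oxide, w in oxide_wt_percent.items()}

# Hypothetical raw PXRF result for a lead-barium glass (does not sum to 100 because of
# surface effects and unanalysed light elements)
raw = {"SiO2": 31.5, "PbO": 38.2, "BaO": 11.4, "K2O": 1.2, "CaO": 2.1}
print(normalize_to_100(raw))
```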

  11. Comprehensive and Quantitative Profiling of the Human Sweat Submetabolome Using High-Performance Chemical Isotope Labeling LC-MS.

    Science.gov (United States)

    Hooton, Kevin; Han, Wei; Li, Liang

    2016-07-19

    Human sweat can be noninvasively collected and used as a medium for the diagnosis of certain diseases as well as for drug detection. However, because of the very low concentrations of endogenous metabolites present in sweat, metabolomic analysis of sweat with high coverage is difficult, making it less widely used for metabolomics research. In this work, a high-performance method for profiling the human sweat submetabolome based on chemical isotope labeling (CIL) liquid chromatography-mass spectrometry (LC-MS) is reported. Sweat was collected using a gauze sponge style patch, extracted from the gauze by centrifugation, and then derivatized using CIL. Differential (12)C- and (13)C-dansylation labeling was used to target the amine/phenol submetabolome. Because of large variations in the total amount of sweat metabolites in individual samples, sample amount normalization was first performed using liquid chromatography with UV detection (LC-UV) after dansylation. The (12)C-labeled individual sample was then mixed with an equal amount of (13)C-labeled pooled sample. The mixture was subjected to LC-MS analysis. A total of 2707 unique metabolites were detected across 54 sweat samples collected from six individuals, with an average of 2002 ± 165 metabolites detected per sample from a total of 108 LC-MS runs. Using a dansyl standard library, we were able to identify 83 metabolites with high confidence; many of them have never been reported to be present in sweat. Using accurate mass searches against human metabolome libraries, we putatively identified an additional 2411 metabolites. Uni- and multivariate analyses of these metabolites showed significant differences in the sweat submetabolomes between male and female, as well as between early and late exercise. These results demonstrate that the CIL LC-MS method described can be used to profile the human sweat submetabolome with high metabolomic coverage and high quantification accuracy to reveal metabolic differences in different sweat

  12. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    OpenAIRE

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a l...

  13. Research on Petroleum Reservoir Diagenesis and Damage Using EDS Quantitative Analysis Method With Standard Samples

    Institute of Scientific and Technical Information of China (English)

    包书景; 陈文学; et al.

    2000-01-01

    In recent years, the X-ray spectrometer has developed not only in terms of enhanced resolution, but also towards dynamic analysis, computer modeling and processing, standard-sample quantitative analysis and ultra-light element analysis. With the gradual sophistication of quantitative analysis system software, the soundness and accuracy of the established sample reference documents have become the most important guarantee of the reliability of quantitative sample analysis. This work is an important technical subject in Chinese petroleum reservoir research. Through two years of research and experimental work, the EDS quantitative analysis method for petroleum geology and reservoir research has been established, and reference documents for five mineral (silicate, etc.) specimen standards have been compiled. By closely combining the morphological and compositional characteristics of the minerals and applying them to reservoir diagenesis research and the prevention of damage to oil formations, clear geological benefits have been obtained.

  14. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    Science.gov (United States)

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC method for the qualitative and quantitative analysis of seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. Within 45 min the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 μm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients of the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578

  15. Truly quantitative analysis of the firefly luciferase complementation assay

    Directory of Open Access Journals (Sweden)

    Renee Dale

    2016-04-01

    Full Text Available Luciferase complementation assays detect protein-protein interactions within living cells using bioluminescence. Since the first report using plant cells was published in 2007, over 100 peer-reviewed articles have described the detection of protein-protein interactions within plant cells by these assays. The assays have also been used to analyze networks of protein-protein interactions in plants. Although the assays have a high dynamic range, they remain qualitative with respect to determining the affinities of interactions. In this article, we first summarize the luciferase complementation assays developed in past years. We then describe the mechanism of the firefly luciferase complementation assay that is most widely used in plants and, using a mathematical model, the reason it is qualitative rather than quantitative. Finally, we discuss possible procedures to quantitatively determine the affinity of a protein pair using the firefly luciferase complementation assay.

  16. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    For the quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and the aluminum sheet. The diffraction patterns of the samples were measured with neutrons and refined by the Rietveld method. The preferred-orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)

  17. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    The classical fluorescent microscopy method has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows it to be used both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches provides an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions with added nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting DNA division of gram-negative bacteria was modified, and the method provides quantification of this bacterial group by fluorescent microscopy. The established approach allowed 3-4 times more cells of gram-negative bacteria to be counted in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant in comparison to the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland shows that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on changes in gram-negative bacterial population density in chernozem, but this nutrient provided fast growth dynamics in the first 3 days of the experiment under both aerobic and anaerobic conditions. This confirms the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantification of soil gram-negative bacteria was compared to fluorescent in situ

  18. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    OpenAIRE

    Erin M Siegel; Riggs, Bridget M; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and ...

  19. Quantitative structural reliability assurance through finite element analysis

    OpenAIRE

    Rice, Christopher W.

    1998-01-01

    Approved for public release; distribution is unlimited. Risk assessment of aging aircraft components can be achieved by operational de-rating using a safety factor subjectively selected from experience and heuristics. This investigation involves synthesizing currently available, maturing computer-aided methods into a format of objective quantitative risk assessment. The methodology is applied to quantify the effect of corrosion on P-3C main landing gear lower drag struts. This kind of synt...

  20. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    OpenAIRE

    Aino Salminen; K.A. Elisa Kopra; Kati Hyvärinen; Susanna Paju; Päivi Mäntylä; Kåre Buhlin; Nieminen, Markku S; Juha Sinisalo; Pirkko J Pussinen

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9±9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia...

  1. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    OpenAIRE

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pirkko J Pussinen

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromona...

  2. The Determinants of Geographic Concentration of Industry: A Quantitative Analysis

    OpenAIRE

    Zhu Wang; Daniel Yi Xu; Luis Cabral

    2012-01-01

    Taking the early U.S. automobile industry as an example, we evaluate two competing hypotheses on geographic concentration of industry: local externalities versus employee spinoffs. Our findings suggest that both forces contribute to industry agglomeration through their specific channels, and the spinoff effect can be viewed as a special form of local externalities. Calibrating our model to the quantitative pattern of industry evolution reveals that traditional local externalities are main dri...

  3. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    OpenAIRE

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individua...

  4. Rietveld quantitative phase analysis of Yeelimite-containing cements

    OpenAIRE

    Álvarez-Pinazo, Gema; Cuesta, Ana; García-Maté, Marta; Santacruz, Isabel; Losilla, Enrique R.; De la Torre, Ángeles G.; León-Reina, Laura; Aranda, Miguel A. G.

    2012-01-01

    Yeelimite-containing cements are attracting attention for their tailored properties. Calcium sulfoaluminate, CSA, cements have high contents of Yeelimite and they are used for special applications. Belite calcium sulfoaluminate, BCSA or sulfobelite, cements have high contents of belite and intermediate contents of Yeelimite, and they may become an alternative to OPC. Here, we report Rietveld quantitative phase analyses for three commercially available CSA clinkers, one CSA cement,...

  5. Scanning tunneling microscopy on rough surfaces: quantitative image analysis

    OpenAIRE

    Reiss, Günter; Bruckl, Hubert; Vancea, Johann; Lecheler, R.; Hastreiter, E.

    1991-01-01

    In this communication, the application of scanning tunneling microscopy (STM) for a quantitative evaluation of roughnesses and mean island sizes of polycrystalline thin films is discussed. Provided strong conditions concerning the resolution are satisfied, the results are in good agreement with standard techniques as, for example, transmission electron microscopy. Owing to its high resolution, STM can supply a better characterization of surfaces than established methods, especially concerning...

  6. Thermal Imaging of Nanostructures by Quantitative Optical Phase Analysis

    OpenAIRE

    Baffou, Guillaume; Bon, Pierre; Savatier, Julien; Polleux, Julien; Zhu, Min; Merlin, Marine; Rigneault, Herve; Monneret, Serge

    2012-01-01

    We introduce an optical microscopy technique aimed at characterizing the heat generation arising from nanostructures, in a comprehensive and quantitative manner. Namely, the technique permits (i) mapping the temperature distribution around the source of heat, (ii) mapping the heat power density delivered by the source, and (iii) retrieving the absolute absorption cross section of light-absorbing structures. The technique is based on the measure of the thermal-induced refractive index variatio...

  7. Quantitative analysis on electric dipole energy in Rashba band splitting

    OpenAIRE

    Jisook Hong; Jun-Won Rhim; Changyoung Kim; Seung Ryong Park; Ji Hoon Shim

    2015-01-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calc...
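
    For orientation, the magnitude of a Rashba splitting can be illustrated with the textbook dispersion E±(k) = ħ²k²/(2m*) ± α_R|k|, for which the splitting at a given k is 2α_R|k|. The sketch below is not the paper's tight-binding or first-principles model; the effective mass and Rashba parameter are illustrative values only.

```python
# Textbook Rashba dispersion with illustrative (not fitted) parameters.
import numpy as np

hbar = 1.054571817e-34                 # J s
m_e = 9.1093837015e-31                 # kg
eV = 1.602176634e-19                   # J

m_eff = 0.3 * m_e                      # effective mass (assumed)
alpha_R = 1.0 * eV * 1.0e-10           # Rashba parameter ~1 eV*Angstrom (assumed)

k = np.linspace(-2e9, 2e9, 401)        # wave vector (1/m)
E_plus = hbar**2 * k**2 / (2 * m_eff) + alpha_R * np.abs(k)
E_minus = hbar**2 * k**2 / (2 * m_eff) - alpha_R * np.abs(k)

splitting_eV = (E_plus - E_minus) / eV
print(f"splitting at k = 1 nm^-1: {splitting_eV[np.searchsorted(k, 1e9)]:.3f} eV")
```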

  8. Quantitative Analysis of Tumor Burden in Mouse Lung via MRI

    OpenAIRE

    Tidwell, Vanessa K.; Garbow, Joel R.; Krupnick, Alexander S.; Engelbach, John A.; Nehorai, Arye

    2011-01-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Pre-clinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metasta...

  9. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A., E-mail: clayton.pereira.silva@usp.b [Instituto de Pesquisas Energeticas e Nucleares (CQMA/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente

    2011-07-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements, based on the interaction between the sample and radiation, and allows direct and nondestructive analysis. In uranium matrices, however, conventional approaches are inefficient because the characteristic emission lines of elements such as S, Cl, Zn, Zr and Mo overlap the characteristic emission lines of uranium, so chemical procedures for the separation of uranium are normally needed to perform this sort of analysis. In this paper a deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr and Mo in uranium compounds. The detection limits were below 50 μg g⁻¹ and the uncertainty was below 10% for the determined elements. (author)
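
    The overlap-correction idea can be illustrated with a minimal least-squares deconvolution of two overlapping Gaussian lines. This is a generic sketch, not the authors' algorithm; the peak energies, widths and intensities below are invented.

```python
# Generic two-peak deconvolution by least squares (illustrative numbers only).
import numpy as np
from scipy.optimize import curve_fit

def gaussian(e, area, center, sigma):
    """A single Gaussian emission line as a function of energy e (keV)."""
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((e - center) / sigma) ** 2)

def two_peaks(e, a1, c1, s1, a2, c2, s2):
    """An analyte line overlapping a matrix (uranium) line."""
    return gaussian(e, a1, c1, s1) + gaussian(e, a2, c2, s2)

# synthetic spectrum with purely illustrative energies and intensities
e = np.linspace(15.0, 16.5, 300)
rng = np.random.default_rng(0)
spectrum = two_peaks(e, 50.0, 15.77, 0.08, 400.0, 15.73, 0.09) + rng.normal(0.0, 2.0, e.size)

p0 = [40.0, 15.8, 0.1, 350.0, 15.7, 0.1]        # rough initial guesses
popt, _ = curve_fit(two_peaks, e, spectrum, p0=p0)
print("fitted analyte peak area:", popt[0])      # proportional to the analyte concentration
```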

  10. Simultaneous HPLC Quantitative Analysis of Nine Bioactive Constituents in Scirpus Yagara Ohwi. (Cyperaceae).

    Science.gov (United States)

    Zhang, Jianfang; Zou, Nuoshu; Liang, Qiaoli; Tang, Yamin; Duan, Jin'ao

    2016-03-01

    The tuber of Scirpus yagara Ohwi. (Cyperaceae) has long been used in traditional Chinese medicine (TCM). Several chemical constituents isolated from it possess a variety of physiological activities, including anti-inflammatory, antitumor and antioxidant effects. A simultaneous high-performance liquid chromatography (HPLC) method was developed and validated for the determination of nine active components in the tubers and aerial parts of S. yagara. The analysis was performed on a YMC-Pack ODS-A column (4.6 × 250 mm, 5 μm, 30 nm) with a multilinear gradient mobile phase of water-formic acid (100 : 0.2, v/v) and methanol. The established HPLC method was validated in terms of linearity, sensitivity, precision, accuracy, recovery and stability. All analyzed components were detected in all of the tested samples, and the contents of most components in the aerial parts were even higher than those in the tubers. Moreover, the best harvest period was found to be November, which differs from the traditional harvest time. The method was successfully applied for the simultaneous qualitative and quantitative analysis of nine active components in S. yagara. PMID:26657411

  11. Quantitative XRD Analysis of Cement Clinker by the Multiphase Rietveld Method

    Institute of Scientific and Technical Information of China (English)

    HONG Han-lie; FU Zheng-yi; MIN Xin-min

    2003-01-01

    Quantitative phase analysis of Portland cement clinker samples was performed using an adaptation of the Rietveld method. The Rietveld quantitative analysis program, originally written in Fortran 77, was substantially rewritten in Visual Basic with a Windows 9X graphical user interface, which removes the 640 KB conventional-memory constraint and allows the program to be operated conveniently under Windows. The Rietveld quantitative method offers numerous advantages over conventional XRD quantitative methods, especially where intensity anomalies and peak-superposition problems occur. Examples of its use are given together with results from other methods. It is concluded that, at present, the Rietveld method is the most suitable one for quantitative phase analysis of Portland cement clinker.
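
    The arithmetic behind Rietveld quantitative phase analysis is the standard scale-factor relation W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i. The sketch below applies it with invented scale factors and cell data, not values refined from a real clinker pattern.

```python
# Rietveld weight fractions from refined scale factors (all numbers invented).
def weight_fractions(phases):
    """phases: dicts with scale factor S, formula units per cell Z,
    formula mass M (g/mol) and unit-cell volume V (A^3)."""
    zmv = [p["S"] * p["Z"] * p["M"] * p["V"] for p in phases]
    total = sum(zmv)
    return [x / total for x in zmv]

clinker = [
    {"name": "alite (C3S)", "S": 1.2e-4, "Z": 9, "M": 228.3, "V": 1100.0},
    {"name": "belite (C2S)", "S": 0.9e-4, "Z": 4, "M": 172.2, "V": 345.0},
]
for p, w in zip(clinker, weight_fractions(clinker)):
    print(f'{p["name"]}: {100 * w:.1f} wt%')
```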

  12. Observations on the morphology and chemical analysis of medullary granules in chinchilla hair. Research letters

    Energy Technology Data Exchange (ETDEWEB)

    Keogh, H.J. (South African Inst. for Medical Research, Johannesburg); Haylett, T. (Council for Scientific and Industrial Research, Pretoria (South Africa). National Chemical Research Lab.)

    1983-02-01

    The ultrastructure of the medullary granules of white and grey chinchilla hair was investigated by scanning electron microscopy and chemical analysis in an attempt to clarify their structure and function. Atomic absorption spectroscopy and amino acid analysis showed them to be composed of melanin. The sample preparation for scanning electron microscopy is discussed. The metal content was qualitatively established by X-ray fluorescence spectrometry and quantitatively determined on a Varian Techtron model AAs atomic absorption spectrophotometer. Amino acid analysis of the granules was carried out on a Beckman 121 amino acid analyser. Information is provided on the amino acid composition of the medullary granules as well as their metal content.

  13. Observations on the morphology and chemical analysis of medullary granules in chinchilla hair

    International Nuclear Information System (INIS)

    The ultrastructure of the medullary granules of white and grey chinchilla hair was investigated by scanning electron microscopy and chemical analysis in an attempt to clarify their structure and function. Atomic absorption spectroscopy and amino acid analysis showed them to be composed of melanin. The sample preparation for scanning electron microscopy is discussed. The metal content was qualitatively established by X-ray fluorescence spectrometry and quantitatively determined on a Varian Techtron model AAs atomic absorption spectrophotometer. Amino acid analysis of the granules was carried out on a Beckman 121 amino acid analyser. Information is provided on the amino acid composition of the medullary granules as well as their metal content.

  14. Quantitation of DNA methylation by melt curve analysis

    Directory of Open Access Journals (Sweden)

    Jones Michael E

    2009-04-01

    Full Text Available Abstract Background Methylation of DNA is a common mechanism for silencing genes, and aberrant methylation is increasingly being implicated in many diseases such as cancer. There is a need for robust, inexpensive methods to quantitate methylation across a region containing a number of CpGs. We describe and validate a rapid, in-tube method to quantitate DNA methylation using the melt data obtained following amplification of bisulfite modified DNA in a real-time thermocycler. Methods We first describe a mathematical method to normalise the raw fluorescence data generated by heating the amplified bisulfite modified DNA. From this normalised data the temperatures at which melting begins and finishes can be calculated, which reflect the less and more methylated template molecules present respectively. Also the T50, the temperature at which half the amplicons are melted, which represents the summative methylation of all the CpGs in the template mixture, can be calculated. These parameters describe the methylation characteristics of the region amplified in the original sample. Results For validation we used synthesized oligonucleotides and DNA from fresh cells and formalin fixed paraffin embedded tissue, each with known methylation. Using our quantitation we could distinguish between unmethylated, partially methylated and fully methylated oligonucleotides mixed in varying ratios. There was a linear relationship between T50 and the dilution of methylated into unmethylated DNA. We could quantitate the change in methylation over time in cell lines treated with the demethylating drug 5-aza-2'-deoxycytidine, and the differences in methylation associated with complete, clonal or no loss of MGMT expression in formalin fixed paraffin embedded tissues. Conclusion We have validated a rapid, simple in-tube method to quantify methylation which is robust and reproducible, utilizes easily designed primers and does not need proprietary algorithms or software. The
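
    A minimal sketch of the T50 idea is given below; it is not the authors' exact normalisation procedure. The raw melt fluorescence is rescaled, converted to a fraction-melted curve, and T50 is read off by interpolation. The sigmoidal toy data are invented.

```python
# Toy melt-curve analysis: normalise fluorescence and interpolate T50.
import numpy as np

def t50_from_melt(temps, fluorescence):
    """temps: increasing temperatures (deg C); fluorescence: raw melt signal."""
    f = np.asarray(fluorescence, dtype=float)
    frac_ds = (f - f.min()) / (f.max() - f.min())   # crude normalisation to [0, 1]
    frac_melted = 1.0 - frac_ds                     # fraction melted rises as fluorescence falls
    return float(np.interp(0.5, frac_melted, temps))

T = np.linspace(70.0, 95.0, 251)
F = 1.0 / (1.0 + np.exp((T - 82.0) / 0.8))          # synthetic melt centred near 82 deg C
print(f"T50 ~ {t50_from_melt(T, F):.1f} deg C")
```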

  15. Development of quantitative structure-activity relationship (QSAR) models to predict the carcinogenic potency of chemicals

    International Nuclear Information System (INIS)

    Determining the carcinogenicity and carcinogenic potency of new chemicals is both a labor-intensive and time-consuming process. In order to expedite the screening process, there is a need to identify alternative toxicity measures that may be used as surrogates for carcinogenic potency. Alternative toxicity measures for carcinogenic potency currently being used in the literature include the lethal dose (the dose that kills 50% of a study population [LD50]), the lowest-observed-adverse-effect level (LOAEL) and the maximum tolerated dose (MTD). The purpose of this study was to investigate the correlation between tumor dose (TD50) and three alternative toxicity measures as an estimator of carcinogenic potency. A second aim of this study was to develop a Classification and Regression Tree (CART) between TD50 and estimated/experimental predictor variables to predict the carcinogenic potency of new chemicals. Rat TD50s of 590 structurally diverse chemicals were obtained from the Cancer Potency Database, and the three alternative toxicity measures considered in this study were estimated using TOPKAT, a toxicity estimation software package. Though poor correlations were obtained between carcinogenic potency and the three alternative toxicity measures (both experimental and TOPKAT-estimated) for the CPDB chemicals, a CART developed using experimental data with no missing values as predictor variables provided reasonable estimates of TD50 for nine chemicals that were part of an external validation set. However, if experimental values for the three alternative measures, mutagenicity and logP are not available in the literature, then either the CART developed using missing experimental values or estimated values may be used for making a prediction
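
    The CART idea can be sketched with an off-the-shelf regression tree. The features (log LD50, LOAEL, MTD) and all values below are synthetic placeholders, not the CPDB or TOPKAT data used in the study.

```python
# Regression-tree sketch for predicting log10(TD50) from alternative toxicity measures.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(2.5, 0.8, n),   # log10 LD50 (placeholder)
    rng.normal(1.0, 0.6, n),   # log10 LOAEL (placeholder)
    rng.normal(1.8, 0.7, n),   # log10 MTD (placeholder)
])
y = 0.6 * X[:, 2] + 0.2 * X[:, 1] + rng.normal(0.0, 0.3, n)   # synthetic log10 TD50

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10).fit(X, y)
print("predicted log10 TD50:", tree.predict([[2.1, 0.8, 1.5]])[0])
```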

  16. An isotope-labeled chemical derivatization method for the quantitation of short-chain fatty acids in human feces by liquid chromatography–tandem mass spectrometry

    International Nuclear Information System (INIS)

    Highlights: • 3-Nitrophenylhydrazine was used to derivatize short-chain fatty acids (SCFAs) for LC-MS/MS. • 13C6 analogues were produced for use as isotope-labeled internal standards. • Isotope-labeled standards compensate for ESI matrix effects in LC-MS/MS. • Femtomolar sensitivities and 93–108% quantitation accuracy were achieved for human fecal SCFAs. - Abstract: Short-chain fatty acids (SCFAs) are produced by anaerobic gut microbiota in the large bowel. Qualitative and quantitative measurements of SCFAs in the intestinal tract and the fecal samples are important to understand the complex interplay between diet, gut microbiota and host metabolism homeostasis. To develop a new LC-MS/MS method for sensitive and reliable analysis of SCFAs in human fecal samples, 3-nitrophenylhydrazine (3NPH) was employed for pre-analytical derivatization to convert ten C2–C6 SCFAs to their 3-nitrophenylhydrazones under a single set of optimized reaction conditions and without the need of reaction quenching. The derivatives showed excellent in-solution chemical stability. They were separated on a reversed-phase C18 column and quantitated by negative-ion electrospray ionization – multiple-reaction monitoring (MRM)/MS. To achieve accurate quantitation, the stable isotope-labeled versions of the derivatives were synthesized in a single reaction vessel from 13C6-3NPH, and were used as internal standard to compensate for the matrix effects in ESI. Method validation showed on-column limits of detection and quantitation over the range from low to high femtomoles for the ten SCFAs, and the intra-day and inter-day precision for determination of nine of the ten SCFAs in human fecal samples was ≤8.8% (n = 6). The quantitation accuracy ranged from 93.1% to 108.4% (CVs ≤ 4.6%, n = 6). This method was used to determine the SCFA concentrations and compositions in six human fecal samples. One of the six samples, which was collected from a clinically diagnosed type 2 diabetes

  17. An isotope-labeled chemical derivatization method for the quantitation of short-chain fatty acids in human feces by liquid chromatography–tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Han, Jun; Lin, Karen; Sequeira, Carita [University of Victoria – Genome BC Proteomics Centre, University of Victoria, Vancouver Island Technology Park, 3101–4464 Markham Street, Victoria, BC V8Z 7X8 (Canada); Borchers, Christoph H., E-mail: christoph@proteincentre.com [University of Victoria – Genome BC Proteomics Centre, University of Victoria, Vancouver Island Technology Park, 3101–4464 Markham Street, Victoria, BC V8Z 7X8 (Canada); Department of Biochemistry and Microbiology, University of Victoria, Petch Building Room 207, 3800 Finnerty Road, Victoria, BC V8P 5C2 (Canada)

    2015-01-07

    Highlights: • 3-Nitrophenylhydrazine was used to derivatize short-chain fatty acids (SCFAs) for LC-MS/MS. • 13C6 analogues were produced for use as isotope-labeled internal standards. • Isotope-labeled standards compensate for ESI matrix effects in LC-MS/MS. • Femtomolar sensitivities and 93–108% quantitation accuracy were achieved for human fecal SCFAs. - Abstract: Short-chain fatty acids (SCFAs) are produced by anaerobic gut microbiota in the large bowel. Qualitative and quantitative measurements of SCFAs in the intestinal tract and the fecal samples are important to understand the complex interplay between diet, gut microbiota and host metabolism homeostasis. To develop a new LC-MS/MS method for sensitive and reliable analysis of SCFAs in human fecal samples, 3-nitrophenylhydrazine (3NPH) was employed for pre-analytical derivatization to convert ten C2–C6 SCFAs to their 3-nitrophenylhydrazones under a single set of optimized reaction conditions and without the need of reaction quenching. The derivatives showed excellent in-solution chemical stability. They were separated on a reversed-phase C18 column and quantitated by negative-ion electrospray ionization – multiple-reaction monitoring (MRM)/MS. To achieve accurate quantitation, the stable isotope-labeled versions of the derivatives were synthesized in a single reaction vessel from 13C6-3NPH, and were used as internal standard to compensate for the matrix effects in ESI. Method validation showed on-column limits of detection and quantitation over the range from low to high femtomoles for the ten SCFAs, and the intra-day and inter-day precision for determination of nine of the ten SCFAs in human fecal samples was ≤8.8% (n = 6). The quantitation accuracy ranged from 93.1% to 108.4% (CVs ≤ 4.6%, n = 6). This method was used to determine the SCFA concentrations and compositions in six human fecal samples. One of the six samples, which was collected from a

  18. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms. I

  19. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    Science.gov (United States)

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis. PMID:25592482

  20. 3D Nanoscale Imaging and Quantitative Analysis of Zeolite Catalysts

    NARCIS (Netherlands)

    Zecevic, J.

    2013-01-01

    Zeolites are crystalline microporous aluminosilicates, one of the most versatile and widely used class of materials.The unique physico-chemical properties of zeolites are found to be irreplaceable in many industrial processes such as separation, adsorption and catalysis. To exploit their full potent

  1. Qualitative and quantitative CT analysis of acute pulmonary failure

    International Nuclear Information System (INIS)

    Since its first application in patients with acute lung injury 25 years ago, computed tomography (CT) has significantly influenced the understanding of the pathophysiology, diagnosis and management of acute lung injury and has become an important diagnostic modality for these patients. The aim of this article is to review important disease-specific aspects of CT acquisition and qualitative and quantitative analyses of CT data. Morphological changes seen on CT and associated functional alterations are discussed. Methods used for the quantification of lung aeration are described and their limitations outlined. (orig.)

  2. Cross-bridge model of muscle contraction. Quantitative analysis.

    OpenAIRE

    Eisenberg, E.; Hill, T L; Chen, Y.

    1980-01-01

    We recently presented, in a qualitative manner, a cross-bridge model of muscle contraction which was based on a biochemical kinetic cycle for the actomyosin ATPase activity. This cross-bridge model consisted of two cross-bridge states detached from actin and two cross-bridge states attached to actin. In the present paper, we attempt to fit this model quantitatively to both biochemical and physiological data. We find that the resulting complete cross-bridge model is able to account reasonably ...

  3. Quantitative analysis of heavy water by NMR spectroscopy

    International Nuclear Information System (INIS)

    Nuclear Magnetic Resonance has been applied to a wide variety of quantitative problems; a typical example is the determination of isotopic composition. In this paper two different analytical methods for the determination of water in deuterium oxide are described. The first employs acetonitrile as an internal standard compound, and in the second a calibration curve of signal integral versus amount of D2O is constructed. Both methods give results comparable to those of mass spectrometry or IR spectroscopy. (Author) 5 refs
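
    The internal-standard calculation behind the first method is simple enough to sketch: the analyte amount follows from the proton-normalised ratio of integrals and the known amount of standard added. The integrals and the 0.50 mmol of acetonitrile below are invented, not the paper's data.

```python
# Generic 1H NMR internal-standard quantitation (illustrative numbers).
def analyte_moles(I_analyte, n_H_analyte, I_std, n_H_std, moles_std):
    """Moles of analyte from integrals normalised by the number of contributing protons."""
    return (I_analyte / n_H_analyte) / (I_std / n_H_std) * moles_std

# residual-water signal (2 protons) against the CH3 of acetonitrile (3 protons)
print(analyte_moles(I_analyte=1.35, n_H_analyte=2,
                    I_std=1.00, n_H_std=3, moles_std=0.50), "mmol H2O")
```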

  4. Quantitative analysis of cell-free DNA in ovarian cancer

    OpenAIRE

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; SHI, Wei; HAO, TIANBO; JU, SHAOQING

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ova...

  5. Effects of gamma rays and chemical mutagens on quantitative characteristics of hemp

    Energy Technology Data Exchange (ETDEWEB)

    Zhatov, A.I.

    1979-01-01

    Observations of hemp plants produced by seeds irradiated with gamma rays (1 or 15 kR) or treated with chemical mutagens (soaking in 0.01% ethylenimine for 18 h or in 0.05% ethylmethanesulfonate for 12 h) showed the appearance of both positive and negative hereditary characteristics. Combination of the wider spectrum of mutant plants with those from appropriate breeding practices could lead to the development of new varieties of hemp with superior commercial characteristics.

  6. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    Science.gov (United States)

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
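
    The "dose iterator" decoupling mentioned above can be sketched as follows; the class and function names are illustrative and are not the RTToolbox API.

```python
# Sketch of the dose-iterator idea: algorithms consume (volume, dose) pairs
# through a common interface and stay independent of how doses are stored.
import numpy as np

class GridDoseIterator:
    """Iterates over (voxel_volume_cm3, dose_Gy) pairs of a 3D dose grid."""
    def __init__(self, dose_grid, voxel_volume):
        self._doses = np.asarray(dose_grid, dtype=float).ravel()
        self._volume = float(voxel_volume)

    def __iter__(self):
        for d in self._doses:
            yield self._volume, d

def mean_dose(dose_iterator):
    """Works with any iterator yielding (volume, dose) pairs."""
    total_v = total_vd = 0.0
    for v, d in dose_iterator:
        total_v += v
        total_vd += v * d
    return total_vd / total_v

grid = np.random.default_rng(1).uniform(0.0, 60.0, size=(4, 4, 4))   # toy dose grid (Gy)
print(f"mean dose: {mean_dose(GridDoseIterator(grid, voxel_volume=0.027)):.2f} Gy")
```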

  7. Chemical and Environmental Technology.

    Science.gov (United States)

    Sheather, Harry

    The two-year curriculum in chemical technology presented in the document is designed to prepare high school graduates for technical positions in the chemical industry. Course outlines are given for general chemistry, chemical calculations, quantitative analysis, environmental chemistry, organic chemistry 1 and 2, instrumental analysis, and…

  8. Energy security in China: A quantitative analysis and policy implications

    International Nuclear Information System (INIS)

    This study aims to examine how China's energy security has changed over 30 years of reform and the opening period. It constructs a 4-As quantitative evaluation framework—the availability of energy resources, the applicability of technology, the acceptability by society, and the affordability of energy resources. The quantitative results show that China's energy security was at its best during the sixth FYP period (1981–1985), but then deteriorated until it hit higher levels between 1995 and 2005. However, it was still lower than the level reached during the sixth FYP period. During the eleventh FYP period (2006–2010), the energy security situation deteriorated again. Differences in policy priority over the study period appear to affect the country's energy security status. This study suggests that China needs to develop renewable energy resources on a large scale and pay more attention to emissions control to reverse the downward trend in energy security. - Highlights: • This study establishes a comprehensive and quantifiable energy security concept. • China's energy security situation appears not to improve over its reform period. • Domestic policies and reforms attributed to the energy security in China. • Policy implications of what China implemented and needs to implement are drawn

  9. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    Science.gov (United States)

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 × 10^2 to 1 × 10^8 gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
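
    For context, absolute copy-number quantitation of this kind typically rests on a standard-curve conversion from Ct values. The Ct values below are invented for illustration and are not the authors' calibration data.

```python
# Standard-curve conversion from Ct to gene copies (invented Ct values).
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8])   # known standards
std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3, 13.0])

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 corresponds to 100% PCR efficiency

def copies_from_ct(ct):
    return 10 ** ((ct - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {efficiency:.2f}")
print(f"sample at Ct 24.5 ~ {copies_from_ct(24.5):.2e} copies")
```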

  10. Quantitative Analysis of Moisture Effect on Black Soil Reflectance

    Institute of Scientific and Technical Information of China (English)

    LIU Huan-Jun; ZHANG Yuan-Zhi; ZHANG Xin-Le; ZHANG Bai; SONG Kai-Shan; WANG Zong-Ming; TANG Na

    2009-01-01

    Several studies have demonstrated that soil reflectance decreases with increasing soil moisture content, or increases when the soil moisture reaches a certain content; however, there are few analyses on the quantitative relationship between soil reflectance and its moisture, especially in the case of black soils in northeast China. A new moisture adjusting method was developed to obtain soil reflectance with a smaller moisture interval to describe the quantitative relationship between soil reflectance and moisture. For the soil samples with moisture contents ranging from air-dry to saturated, the changes in soil reflectance with soil moisture can be depicted using a cubic equation. Both moisture threshold (MT) and moisture inflexion (MI) of soil reflectance can also be determined by the equation. When the moisture range was smaller than MT, soil reflectance can be simulated with a linear model. However, for samples with different soil organic matter (OM), the parameters of the linear model varied regularly with the OM content. Based on their relationship, the soil moisture can be estimated from soil reflectance in the black soil region.
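
    The cubic fit and the derived MT and MI points can be sketched with a standard polynomial fit; the coefficients and noise below are invented, not the measured black-soil data.

```python
# Cubic reflectance-moisture fit with threshold and inflexion points (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
moisture = np.linspace(0.02, 0.55, 25)             # g/g, air-dry to saturated
reflectance = (0.28 - 0.9 * moisture + 1.8 * moisture**2 - 1.1 * moisture**3
               + rng.normal(0.0, 0.003, moisture.size))

c3, c2, c1, c0 = np.polyfit(moisture, reflectance, 3)   # cubic fit, highest power first
stationary = np.roots([3 * c3, 2 * c2, c1])             # dR/dm = 0: candidate moisture threshold (MT)
inflexion = -c2 / (3 * c3)                              # d2R/dm2 = 0: moisture inflexion (MI)
print("stationary moisture contents:", stationary)
print("inflexion moisture content:", inflexion)
```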

  11. Quantitative analysis of Babesia ovis infection in sheep and ticks.

    Science.gov (United States)

    Erster, Oran; Roth, Asael; Wollkomirsky, Ricardo; Leibovich, Benjamin; Savitzky, Igor; Zamir, Shmuel; Molad, Thea; Shkap, Varda

    2016-05-15

    A quantitative PCR, based on the gene encoding Babesia ovis Surface Protein D (BoSPD) was developed and applied to investigate the presence of Babesia ovis (B. ovis) in its principal vector, the tick Rhipicephalus bursa (R. bursa), and in the ovine host. Quantification of B. ovis in experimentally-infected lambs showed a sharp increase in parasitemia at 10-11 days in blood-inoculated and adult tick-infested lambs, and at 24 days in a larvae-infested lamb. A gradual decrease of parasitemia was observed in the following months, with parasites detectable 6-12 months post-infection. Examination of the parasite load in adult R. bursa during the post-molting period using the quantitative PCR assay revealed a low parasite load during days 2-7 post-molting, followed by a sharp increase, until day 11, which corresponded to the completion of the pre-feeding period. The assay was then used to detect B. ovis in naturally-infected sheep and ticks. Examination of samples from 8 sheep and 2 goats from infected flocks detected B. ovis in both goats and in 7 out of the 8 sheep. Additionally, B. ovis was detected in 9 tick pools (5 ticks in each pool) and two individual ticks removed from sheep in infected flocks. PMID:27084469

  12. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  13. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  14. Quantitative magnetometry analysis and structural characterization of multisegmented cobalt–nickel nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Cantu-Valle, Jesus [Department of Physics and Astronomy, University of Texas at San Antonio, One UTSA Circle, San Antonio, TX 78249 (United States); Díaz Barriga-Castro, Enrique [Centro de Investigación de Ciencias Físico Matemáticas/Facultad de Ciencias Físico Matemáticas, Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Vega, Víctor; García, Javier [Departamento de Física, Universidad de Oviedo, Calvo Sotelo s/n, Oviedo 33007 (Spain); Mendoza-Reséndez, Raquel [Facultad de Ingeniería Mecánica y Eléctrica. Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Luna, Carlos [Centro de Investigación de Ciencias Físico Matemáticas/Facultad de Ciencias Físico Matemáticas, Universidad Autónoma de Nuevo León, Pedro de Alba s/n, San Nicolás de Los Garza, Nuevo León 66450 (Mexico); Manuel Prida, Víctor [Departamento de Física, Universidad de Oviedo, Calvo Sotelo s/n, Oviedo 33007 (Spain); and others

    2015-04-01

    Understanding and measuring the magnetic properties of an individual nanowire, and their relationship with its crystalline structure and geometry, are of great scientific and technological interest. In this work, we report a localized study of the magnetic flux distribution and the undisturbed magnetization of a single ferromagnetic nanowire that possesses a bar-code-like structure, using off-axis electron holography (EH) under Lorentz conditions. The nanowires were grown by template-assisted electrodeposition using AAO templates. Electron holography allows the magnetic flux distribution within and around the nanowire to be visualized and quantified. The magnetic analysis performed on individual nanowires was correlated with the chemical composition and crystalline orientation of the nanowires. - Highlights: • The structure-magnetic property relationship of CoNi nanowires is determined. • Off-axis electron holography is used for the analysis of the magnetic nanowires. • The magnetization is quantitatively obtained from the retrieved phase images. • These results lead to a better comprehension of the magneto-crystalline phenomena.

  15. Quantitative magnetometry analysis and structural characterization of multisegmented cobalt–nickel nanowires

    International Nuclear Information System (INIS)

    Understanding and measuring the magnetic properties of an individual nanowire, and their relationship with its crystalline structure and geometry, are of great scientific and technological interest. In this work, we report a localized study of the magnetic flux distribution and the undisturbed magnetization of a single ferromagnetic nanowire that possesses a bar-code-like structure, using off-axis electron holography (EH) under Lorentz conditions. The nanowires were grown by template-assisted electrodeposition using AAO templates. Electron holography allows the magnetic flux distribution within and around the nanowire to be visualized and quantified. The magnetic analysis performed on individual nanowires was correlated with the chemical composition and crystalline orientation of the nanowires. - Highlights: • The structure-magnetic property relationship of CoNi nanowires is determined. • Off-axis electron holography is used for the analysis of the magnetic nanowires. • The magnetization is quantitatively obtained from the retrieved phase images. • These results lead to a better comprehension of the magneto-crystalline phenomena.

  16. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Science.gov (United States)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
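
    At its core the DA projection is a per-pixel matrix multiplication of the measured spectrum by a pre-computed DA matrix. The sketch below only illustrates the shapes involved; the matrix and event data are random placeholders, not a real GeoPIXE DA matrix.

```python
# Shape-level sketch of projecting per-pixel spectra through a DA matrix.
import numpy as np

n_channels, n_elements = 1024, 8
rng = np.random.default_rng(3)
da_matrix = rng.normal(0.0, 1e-3, (n_elements, n_channels))   # placeholder DA matrix
pixel_spectra = rng.poisson(2.0, (64, 64, n_channels))        # toy per-pixel event histograms

element_images = pixel_spectra @ da_matrix.T   # one matrix product per pixel
print(element_images.shape)                    # (64, 64, 8): one quantitative image per element
```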

  17. 3D Nanoscale Imaging and Quantitative Analysis of Zeolite Catalysts

    OpenAIRE

    Zecevic, J.

    2013-01-01

    Zeolites are crystalline microporous aluminosilicates, one of the most versatile and widely used class of materials.The unique physico-chemical properties of zeolites are found to be irreplaceable in many industrial processes such as separation, adsorption and catalysis. To exploit their full potential and optimize their properties for specific applications, zeolites are often subjected to several post-synthesis modifications. The work presented in this thesis aims to provide a deeper underst...

  18. Bridging the gaps for global sustainable development: a quantitative analysis.

    Science.gov (United States)

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment as driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots respectively. These challenges are explored in this paper as global sustainable development (SD) quantitatively; in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD with very many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus, are higher on their hierarchy of needs than nations such as Nigeria, Vietnam, Mexico and other

  19. Exploring the Decision Making Process in Statistical Data Analysis: A Qualitative Study of Quantitative Researchers

    OpenAIRE

    Ho, Timothy

    2015-01-01

    Quantitative data analysis is a cognitively demanding process. Inferences from quantitative analyses are often used to inform matters of public policy and to learn about social phenomena. However, as statistical analysis is typically conducted behind closed office doors, little is known about how analysts decide on the final statistical model that important policy decisions rely upon for determining the effectiveness of programs and policies. As social programming becomes increasingly reliant...

  20. Determination of correction coefficients for quantitative analysis by mass spectrometry. Application to uranium impurities analysis

    International Nuclear Information System (INIS)

    Some of the basic principles of spark-source mass spectrometry are recalled, and it is shown how the method can yield quantitative analysis when attention is paid to certain theoretical aspects. Assuming a constant relation between the analysed solid sample and the ion beam it produces, we determined experimental relative sensitivity factors for impurities in a uranium matrix. As the results are in fairly good agreement with a simple theory of ionization yield in the spark source, the use of theoretically derived relative sensitivity factors in a uranium matrix has been developed. (author)
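
    Applying a relative sensitivity factor is a one-line correction; the sketch below uses invented RSFs and measured values purely to show the bookkeeping.

```python
# RSF correction of measured impurity concentrations (invented numbers).
def corrected_concentration(measured_ppm, rsf):
    """Concentration corrected with an experimentally determined RSF (relative to the matrix)."""
    return measured_ppm / rsf

impurities = {"Fe": (120.0, 1.8), "Ni": (45.0, 1.3), "B": (3.2, 0.6)}   # (measured ppm, RSF)
for element, (measured, rsf) in impurities.items():
    print(f"{element}: {corrected_concentration(measured, rsf):.1f} ppm in the uranium matrix")
```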

  1. EDXRF for non-destructive chemical analysis

    International Nuclear Information System (INIS)

    One of the non-destructive methods used for the identification and verification of metals is the energy-dispersive X-ray fluorescence (EDXRF) technique. EDXRF analysis offers several important advantages, such as simultaneous determination of the elements present, the ability to analyse a very wide concentration range, and fast analysis with no sample preparation. The paper shows how this technique has been developed and applied to the identification and verification of different grades of stainless steel and also to the analysis of precious metals. (Author)

  2. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    Directory of Open Access Journals (Sweden)

    Xinsheng Peng

    2014-01-01

    Full Text Available A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystal nanoparticles. The chromatographic method is carried out using an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50 : 50 : 0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector set at a detection wavelength of 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x - 2959 (R2 = 1.0). The average recovery is 101.7% (RSD = 2.22%, n = 9). This method provides a simple and accurate strategy for determining matrine in liquid crystalline nanoparticles.
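
    Back-calculating a concentration from the reported regression equation y = 10706x - 2959 is straightforward; the sketch below assumes, as is conventional, that y is the peak area and x the concentration in μg/mL, and the example peak area is invented.

```python
# Concentration from the reported calibration line (assumed peak-area response).
def matrine_conc(peak_area, slope=10706.0, intercept=-2959.0):
    return (peak_area - intercept) / slope

print(f"{matrine_conc(532000):.1f} ug/mL")   # ~50 ug/mL for this illustrative peak area
```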

  3. Copper in silicon: Quantitative analysis of internal and proximity gettering

    Energy Technology Data Exchange (ETDEWEB)

    McHugo, S.A. [Lawrence Berkeley National Lab., CA (United States); Flink, C.; Weber, E.R. [Univ. of California, Berkeley, CA (United States)] [and others

    1997-08-01

    The behavior of copper in the presence of a proximity gettering mechanism and a standard internal gettering mechanism in silicon was studied. He implantation-induced cavities in the near surface region were used as a proximity gettering mechanism and oxygen precipitates in the bulk of the material provided internal gettering sites. Moderate levels of copper contamination were introduced by ion implantation such that the copper was not supersaturated during the anneals, thus providing realistic copper contamination/gettering conditions. Copper concentrations at cavities and internal gettering sites were quantitatively measured after the annealings. In this manner, the gettering effectiveness of cavities was measured when in direct competition with internal gettering sites. The cavities were found to be the dominant gettering mechanism with only a small amount of copper gettered at the internal gettering sites. These results reveal the benefits of a segregation-type gettering mechanism for typical contamination conditions.

  4. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

    Given the task of designing controllers for mobile robots in swarms, one might wonder which distributed control paradigms should be selected. Until now, paradigms of robot controllers have fallen within either behaviour based control or neural network based control, which have been recognized as two mainstreams of controller design for mobile robots. However, in swarm robotics, it is not clear how to determine control paradigms. In this paper we study the two control paradigms with various experiments of swarm aggregation. First, we introduce the two control paradigms for mobile robots. Second, we describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour based and neural network based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two...

  5. Quantitative analysis on electric dipole energy in Rashba band splitting

    Science.gov (United States)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.

  6. Quantitative comparison of low radiation doses and doses of a genotoxic industrial chemical: Ethylene oxide

    International Nuclear Information System (INIS)

    It can be stated with confidence that there are a number of factors involved in the etiology of cancer, genotoxic ''pollution'' being one of them. Ionizing radiation is one of the factors involved, but the important role played by various chemical products must not be forgotten. Ethylene oxide (EO) is particularly noteworthy in this connection as an alkylating agent, a mutagen and probably also a carcinogen. This gas is used very widely in the chemical industry and also in cold sterilization and disinfection processes. Measurements of the atmospheric concentration of EO have been carried out systematically over short or long periods in four sterilization plants of different capacities. A work study was conducted on 27 persons exposed to the gas every day. In conjunction with atmospheric data and the rad-equivalence principle, the information obtained from the study was used to evaluate their annual occupational exposure, the level of which proved to be high. Biological surveillance of the subjects exposed offers a possible method of checking this evaluation and of monitoring personnel. The alkylation rate of various haemoglobin amino acids can be measured in this way, but here difficulties arise in collecting the necessary blood samples. (author)

  7. Combined Micro-chemical and Micro-structural Analysis of New Minerals Representing Extreme Conditions

    Science.gov (United States)

    Ma, C.; Tschauner, O. D.

    2015-12-01

    Recent improvements in micro-chemical analysis, in combination with novel tools for micrometer-scale structural analysis of minerals from synchrotron X-ray diffraction, open a pathway towards studies of mineral paragenesis that were previously barely or not at all accessible. Often mineral assemblies that represent extreme conditions also pose extreme challenges to analysis: very small size scale, complex matrix, minor amounts of material. Examples of such extreme, but also quite relevant environments are: a) high-pressure shock-metamorphic minerals in meteorites and terrestrial impact sites, b) inclusions in diamonds from the deep mantle, c) ultrarefractory phases in Ca-Al-inclusions from the solar nebula, d) presolar condensates. We show how a combination of synchrotron-based structural and semi-quantitative chemical techniques with electron-microscopy based high-resolution imaging, fully quantitative chemical analysis and qualitative structural identification establishes a powerful tool for discovery and characterization of important and interesting new minerals on the micron to submicron size scale.

  8. Wavelet prism decomposition analysis applied to CARS spectroscopy: a tool for accurate and quantitative extraction of resonant vibrational responses.

    Science.gov (United States)

    Kan, Yelena; Lensu, Lasse; Hehl, Gregor; Volkmer, Andreas; Vartiainen, Erik M

    2016-05-30

    We propose an approach, based on wavelet prism decomposition analysis, for correcting experimental artefacts in a coherent anti-Stokes Raman scattering (CARS) spectrum. This method allows estimating and eliminating a slowly varying modulation error function in the measured normalized CARS spectrum and yields a corrected CARS line-shape. The main advantage of the approach is that the spectral phase and amplitude corrections are avoided in the retrieved Raman line-shape spectrum, thus significantly simplifying the quantitative reconstruction of the sample's Raman response from a normalized CARS spectrum in the presence of experimental artefacts. Moreover, the approach obviates the need for assumptions about the modulation error distribution and the chemical composition of the specimens under study. The method is quantitatively validated on normalized CARS spectra recorded for equimolar aqueous solutions of D-fructose, D-glucose, and their disaccharide combination sucrose. PMID:27410113

  9. Review times in peer review: quantitative analysis of editorial workflows

    CERN Document Server

    Mrowinski, Maciej J; Fronczak, Piotr; Nedic, Olgica; Ausloos, Marcel

    2015-01-01

    We examine selected aspects of peer review and suggest possible improvements. To this end, we analyse a dataset containing information about 300 papers submitted to the Biochemistry and Biotechnology section of the Journal of the Serbian Chemical Society. After separating the peer review process into stages that each review has to go through, we use a weighted directed graph to describe it in a probabilistic manner and test the impact of some modifications of the editorial policy on the efficiency of the whole process.

  10. Chemical Diversity, Origin, and Analysis of Phycotoxins.

    Science.gov (United States)

    Rasmussen, Silas Anselm; Andersen, Aaron John Christian; Andersen, Nikolaj Gedsted; Nielsen, Kristian Fog; Hansen, Per Juel; Larsen, Thomas Ostenfeld

    2016-03-25

    Microalgae, particularly those from the lineage Dinoflagellata, are very well-known for their ability to produce phycotoxins that may accumulate in the marine food chain and eventually cause poisoning in humans. This includes toxins accumulating in shellfish, such as saxitoxin, okadaic acid, yessotoxins, azaspiracids, brevetoxins, and pinnatoxins. Other toxins, such as ciguatoxins and maitotoxins, accumulate in fish, where, as is the case for the latter compounds, they can be metabolized to even more toxic metabolites. On the other hand, much less is known about the chemical nature of compounds that are toxic to fish, the so-called ichthyotoxins. Despite numerous reports of algal blooms causing massive fish kills worldwide, only a few types of compounds, such as the karlotoxins, have been proven to be true ichthyotoxins. This review will highlight marine microalgae as the source of some of the most complex natural compounds known to mankind, with chemical structures that show no resemblance to what has been characterized from plants, fungi, or bacteria. In addition, it will summarize algal species known to be related to fish-killing blooms, but from which ichthyotoxins are yet to be characterized. PMID:26901085

  11. A quantitative model of water radiolysis and chemical production rates near radionuclide-containing solids

    International Nuclear Information System (INIS)

    We present a mathematical model that quantifies the rate of water radiolysis near radionuclide-containing solids. Our model incorporates the radioactivity of the solid along with the energies and attenuation properties for alpha (α), beta (β), and gamma (γ) radiation to calculate volume normalized dose rate profiles. In the model, these dose rate profiles are then used to calculate radiolytic hydrogen (H2) and hydrogen peroxide (H2O2) production rates as a function of distance from the solid–water interface. It expands on previous water radiolysis models by incorporating planar or cylindrical solid–water interfaces and by explicitly including γ radiation in dose rate calculations. To illustrate our model's utility, we quantify radiolytic H2 and H2O2 production rates surrounding spent nuclear fuel under different conditions (at 20 years and 1000 years of storage, as well as before and after barrier failure). These examples demonstrate the extent to which α, β and γ radiation contributes to total absorbed dose rate and radiolytic production rates. The different cases also illustrate how H2 and H2O2 yields depend on initial composition, shielding and age of the solid. In this way, the examples demonstrate the importance of including all three types of radiation in a general model of total radiolytic production rates.
    Highlights:
    • Our model quantifies radiolytic chemical production near solid–water interfaces.
    • The model accounts for chemical production by α, β and γ radiation.
    • The model is applicable to both planar and curved surfaces.
    • Relative production by α, β and γ radiation strongly depends on solid composition.
    • We apply the model to young and old spent nuclear fuel, with and without cladding.
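    A back-of-the-envelope sketch of the kind of calculation involved is given below (it is not the authors' model): an exponentially attenuated dose-rate profile away from a planar solid-water interface is converted into a radiolytic H2 production rate through a G-value. The attenuation lengths, surface dose rates and G-values are placeholder numbers chosen only to show the unit handling.

```python
# Minimal sketch, not the published model: exponential dose-rate attenuation away
# from a planar solid-water interface, converted to radiolytic H2 production with
# a G-value. All numerical parameters below are illustrative placeholders.
import numpy as np

AVOGADRO = 6.022e23
RHO_WATER = 1000.0           # kg m^-3
EV_PER_JOULE = 1.0 / 1.602e-19

def h2_production(x_m, d0_gy_s, atten_m, g_h2_per_100ev):
    """H2 production rate (mol m^-3 s^-1) at distance x_m from the interface."""
    dose_rate = d0_gy_s * np.exp(-x_m / atten_m)          # Gy/s = J kg^-1 s^-1
    ev_per_kg_s = dose_rate * EV_PER_JOULE                # eV kg^-1 s^-1
    molecules = ev_per_kg_s * (g_h2_per_100ev / 100.0)    # molecules kg^-1 s^-1
    return molecules / AVOGADRO * RHO_WATER               # mol m^-3 s^-1

x = np.linspace(0.0, 100e-6, 200)   # 0-100 micrometres from the surface
# Separate short-range (alpha-like) and long-range (gamma-like) contributions:
rate_alpha = h2_production(x, d0_gy_s=10.0, atten_m=35e-6, g_h2_per_100ev=1.3)
rate_gamma = h2_production(x, d0_gy_s=0.05, atten_m=0.2,   g_h2_per_100ev=0.45)
total_rate = rate_alpha + rate_gamma
```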

  12. Quantitative topographic analysis as a guide to rover-based research on Mars

    Science.gov (United States)

    Palucis, M. C.; Dietrich, W. E.; Parker, T. J.; Sumner, D. Y.; Williams, R. M. E.; Hayes, A.; Mangold, N.; Lewis, K. W.

    2014-12-01

    Satellite imagery of Mars now provides remarkable topographic data, often better than that on Earth in many countries. For decades, researchers have identified landforms on Mars that indicated the presence of gullies, rivers, deltas, fans, and lakes, pointing to the presence of surface waters, and the apparent necessity of an active hydrologic cycle involving rain or snow. Quantitative topographic analysis has provided a means to estimate volumes of runoff, sediment transport rates, and peak flow discharges, first using orbital imagery alone and then using laser altimetry coverage and higher resolution HiRISE (1 m/px), CTX (20 m/px) and HRSC (50 m/px) topography. Our detailed topographic analysis of the Peace Vallis fan near the Curiosity rover landing site in Gale Crater (Mars) suggested that the fan entered into a pre-existing enclosed basin that would likely contain lake sediments; sedimentary, mineralogical, and chemical analysis of this region, now named Yellowknife Bay, later found this to be the case, though debate remains on the exact origin and history of the deposit. The rover is currently heading to a 5 km high sedimentary mound (Aeolis Mons) with mineral signatures hypothesized to be the result of planet-wide changes in climate. Topographic features on the mound, which correspond in elevation with other large depositional features around the crater, suggest that a succession of lakes developed post-Noachian. Within Gale, we are in a unique position to determine the extent to which topography can tell us the evolutionary history of a place on another planet, since our hypotheses can actually be tested as the Curiosity rover makes its ascent up Aeolis Mons. Along the rover's traverse, we propose, based on the geomorphic record, that the sediments being examined were water soaked, perhaps several times under deep lakes, and that the rover will cross shorelines that may not be well-preserved, but are worth searching for. A quantitative topographic analysis

  13. A CGE analysis for quantitative evaluation of electricity market changes

    International Nuclear Information System (INIS)

    Risk and uncertainty entailed by electricity industry privatization impose a heavy burden on political decision-making. In this sense, ex ante analyses are important in order to investigate the economic effects of privatization or liberalization in the electricity industry. For the purpose of carrying out these quantitative analyses, a novel approach is developed, incorporating a top-down and bottom-up model that takes into account economic effects and technological constraints simultaneously. This study also examines various counterfactual scenarios after Korean electricity industry reform through the integrated framework. Simulation results imply that authorities should prepare an improved regulatory system and policy measures, such as forward contracts, before industry reform, in order to promote competition in the distribution sector as well as the generation sector.
    Highlights:
    • A novel approach is proposed for incorporating a top-down and bottom-up model.
    • This study examines various counterfactual scenarios after Korean electricity industry reform.
    • An improved regulatory system and policy measures are required before the reform.

  14. Quantitative analysis of ultrasound images for computer-aided diagnosis.

    Science.gov (United States)

    Wu, Jie Ying; Tuomi, Adam; Beland, Michael D; Konrad, Joseph; Glidden, David; Grand, David; Merck, Derek

    2016-01-01

    We propose an adaptable framework for analyzing ultrasound (US) images quantitatively to provide computer-aided diagnosis using machine learning. Our preliminary clinical targets are hepatic steatosis, adenomyosis, and craniosynostosis. For steatosis and adenomyosis, we collected US studies from 288 and 88 patients, respectively, as well as their biopsy or magnetic resonance-confirmed diagnosis. Radiologists identified a region of interest (ROI) on each image. We filtered the US images for various texture responses and used the pixel intensity distribution within each ROI as feature parameterizations. Our craniosynostosis dataset consisted of 22 CT-confirmed cases and 22 age-matched controls. One physician manually measured the vectors from the center of the skull to the outer cortex at every 10 deg for each image, and we used the principal directions as shape features for parameterization. These parameters and the known diagnosis were used to train classifiers. Testing with cross-validation, we obtained 72.74% accuracy and 0.71 area under the receiver operating characteristic curve for steatosis ([Formula: see text]), 77.27% and 0.77 for adenomyosis ([Formula: see text]), and 88.63% and 0.89 for craniosynostosis ([Formula: see text]). Our framework is able to detect a variety of diseases with high accuracy. We hope to include it as a routinely available support system in the clinic. PMID:26835502
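    The overall shape of such a pipeline can be sketched with SciPy and scikit-learn: filter responses inside each ROI are summarized by intensity-distribution statistics and fed to a cross-validated classifier. The particular filters, summary statistics and classifier below are assumptions for illustration, not the authors' implementation.

```python
# Illustrative ROI-feature pipeline (assumed filters and classifier, not the
# authors' exact parameterization), using SciPy and scikit-learn.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def roi_features(image, roi_mask):
    """Summary statistics of several texture-filter responses within an ROI."""
    responses = [
        image,
        ndimage.gaussian_filter(image, sigma=2),
        ndimage.sobel(image),
        ndimage.laplace(ndimage.gaussian_filter(image, sigma=1)),
    ]
    feats = []
    for r in responses:
        vals = r[roi_mask]
        feats += [vals.mean(), vals.std(),
                  np.percentile(vals, 25), np.percentile(vals, 75)]
    return np.array(feats)

def build_dataset(images, masks, labels):
    """images/masks: per-study arrays; labels: confirmed diagnoses (0/1)."""
    X = np.stack([roi_features(im, m) for im, m in zip(images, masks)])
    return X, np.asarray(labels)

# Evaluation step (data loading omitted; images, masks and labels are assumed):
# X, y = build_dataset(images, masks, labels)
# scores = cross_val_score(RandomForestClassifier(n_estimators=200), X, y, cv=5)
# print(scores.mean())
```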

  15. Quantitative analysis of electroluminescence images from polymer solar cells

    Science.gov (United States)

    Seeland, Marco; Rösch, Roland; Hoppe, Harald

    2012-01-01

    We introduce the micro-diode-model (MDM) based on a discrete network of interconnected diodes, which allows for quantitative description of lateral electroluminescence emission images obtained from organic bulk heterojunction solar cells. Besides the distributed solar cell description, the equivalent circuit, respectively, network model considers interface and bulk resistances as well as the sheet resistance of the semitransparent electrode. The application of this model allows direct calculation of the lateral current and voltage distribution within the solar cell and thus accounts well for effects known as current crowding. In addition, network parameters such as internal resistances and the sheet-resistance of the higher resistive electrode can be determined. Furthermore, upon introduction of current sources the micro-diode-model also is able to describe and predict current-voltage characteristics for solar cell devices under illumination. The local nature of this description yields important conclusions concerning the geometry dependent performance and the validity of classical models and equivalent circuits describing thin film solar cells.
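    A heavily reduced, one-dimensional toy version of such a distributed diode network is sketched below: nodes along a resistive electrode each carry a diode, and Kirchhoff's current law is solved at every node with SciPy. It is not the micro-diode model itself, and all parameter values are illustrative; the falling local diode current away from the contact illustrates the "current crowding" effect mentioned above.

```python
# Toy 1-D distributed diode network with a resistive electrode (illustrative
# parameters; not the micro-diode model of the paper).
import numpy as np
from scipy.optimize import fsolve

N = 30               # nodes along the cell; node 0 sits at the metal contact
R_SHEET = 2.0        # ohms of lumped electrode resistance between adjacent nodes
I0 = 1e-12           # diode saturation current per node (A)
N_IDEAL = 1.5        # diode ideality factor
VT = 0.02585         # thermal voltage at room temperature (V)
V_APPLIED = 0.9      # forward bias applied at the contact (V)

def diode_current(v):
    return I0 * (np.exp(v / (N_IDEAL * VT)) - 1.0)

def kcl(v_inner):
    """Kirchhoff current law residuals at nodes 1..N-1 (node 0 is the contact)."""
    v = np.concatenate(([V_APPLIED], v_inner))
    res = np.empty(N - 1)
    for i in range(1, N):
        i_in = (v[i - 1] - v[i]) / R_SHEET
        i_out = (v[i] - v[i + 1]) / R_SHEET if i < N - 1 else 0.0
        res[i - 1] = i_in - i_out - diode_current(v[i])
    return res

# A decaying initial guess helps the solver with the stiff exponential.
guess = np.linspace(V_APPLIED, 0.7 * V_APPLIED, N - 1)
v_nodes = np.concatenate(([V_APPLIED], fsolve(kcl, guess)))
local_emission = diode_current(v_nodes)   # proxy for local EL intensity
```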

  16. Quantitative Dopant/Impurity Analysis for ICF Targets

    Science.gov (United States)

    Huang, Haibo; Nikroo, Abbas; Stephens, Richard; Eddinger, Samual; Xu, Hongwei; Chen, K. C.; Moreno, Kari

    2008-11-01

    We developed a number of new or improved metrology techniques to measure the spatial distributions of multiple elements in ICF ablator capsules to tight NIF specifications (0.5±0.1 at% Cu, 0.25±0.10 at% Ar, 0.4±0.4 at% O). The elements are either the graded dopants for shock timing, such as Cu in Be, or process-induced impurities, such as Ar and O. Their low concentration, high spatial variation and simultaneous presence make the measurement very difficult. We solved this metrology challenge by combining several techniques: Cu and Ar profiles can be nondestructively measured by operating Contact Radiography (CR) in a differential mode. The result, as well as the O profile, can be checked destructively by a quantitative Energy Dispersive Spectroscopy (EDS) method. Non-spatially resolved methods, such as absorption edge spectroscopy (and to a lesser accuracy, x-ray fluorescence) can calibrate the Ar and Cu measurement in EDS and CR. In addition, oxygen pick-up during mandrel removal can be validated by before-and-after CR and by density change. Use of all these methods gives multiple checks on the reported profiles.

  17. The Design of Everyday Hate: A Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Katherine Aumer-Ryan

    2007-12-01

    Full Text Available Throughout history artists, poets, and writers have been interested in the nature of hate. Scientists from a variety of disciplines have also attempted to unravel its mysteries. Yet in spite of abundant theorizing and research, most modern scholars still complain that little is known about this complex emotion. In this study, a new approach has been taken. Following Heider's (1958) observation that scientists can often learn a great deal by exploring people's “common-sense” or “naïve psychologies,” students at the University of Texas and participants from a number of Internet sites were interviewed regarding their perceptions of the nature of emotion. Using grounded theory and employing mixed-method analyses (qualitative and quantitative), four questions were explored: (1) What do people mean by hate? (2) Whom do they hate? (3) Why do people hate the people they do? (4) How do people attempt to deal with such feelings? From participants' answers, a theory concerning everyday hate was generated.

  18. Quantitative analysis of cholesteatoma using high resolution computed tomography

    International Nuclear Information System (INIS)

    Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral plane). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size in comparison with pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and lateral wall of the attic than with COM. In contrast, the distance between the malleus and medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, there were significantly larger distances than with COM at the following sites: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these distances were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto demonstrated qualitative impressions of bone destruction in cholesteatoma were quantitatively verified in detail using high resolution computed tomography. (author)

  19. Quantitative analysis of in-air output ratio

    International Nuclear Information System (INIS)

    The output factor (Scp) is one of the important factors required to calculate monitor units (MU), and is divided into two components: the phantom scatter factor (Sp) and the in-air output ratio (Sc). Generally, Sc for an arbitrary field is calculated by one of several methods based on Sc determined from absorbed dose measurements for several square fields. However, calculation errors arise when the treatment field has a large aspect ratio and the openings of the upper and lower collimators are exchanged. To determine Sc accurately, scattered photons from the treatment head and particles backscattered into the monitor chamber must be analyzed individually. In this report, a simulation model that agreed well with measured Sc was constructed, and the dose variation caused by scattered photons from the treatment head and by particles backscattered into the monitor chamber was analyzed quantitatively. The results showed that the contribution of scattered photons from the primary collimator was larger than that of the flattening filter, and that backscattered particles were affected not only by the upper jaw but also by the lower jaw. In future work, a new Sc determination algorithm based on the results of this report will be proposed.
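    For reference, the factorization the abstract relies on can be written out explicitly; this is the standard textbook relation (field-size arguments suppressed), not a result of the report:

```latex
% Output factor as the product of the in-air output ratio and the phantom
% scatter factor (standard relation; field-size arguments suppressed).
\[
  S_{cp} = S_c \, S_p
  \qquad\Longrightarrow\qquad
  S_p = \frac{S_{cp}}{S_c},
\]
% so any error in determining S_c propagates directly into the monitor-unit
% calculation through S_{cp}.
```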

  20. Quantitative analysis of synaptic release at the photoreceptor synapse.

    Science.gov (United States)

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B

    2010-05-19

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca(2+) and exhibits an unusually shallow dependence on presynaptic Ca(2+). To provide a quantitative description of the photoreceptor Ca(2+) sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca(2+)-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca(2+): exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca(2+) binding sites on the rod Ca(2+) sensor rather than the typical four or five. For most models, the on-rates for Ca(2+) binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca(2+) unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca(2+) sensor, slow Ca(2+) unbinding may support the fusion of vesicles located at a distance from Ca(2+) channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
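    A generic sequential two-site Ca(2+)-binding scheme of the kind tested in such studies can be written as a small system of ODEs and integrated with SciPy, as sketched below; the rate constants, the fusion rate and the Ca(2+) step are illustrative placeholders, not the fitted values of the paper.

```python
# Generic two-site sequential Ca2+-binding scheme driving vesicle fusion
# (illustrative rate constants, not the study's fitted values).
import numpy as np
from scipy.integrate import solve_ivp

KON = 3e7      # M^-1 s^-1, per-site on-rate (assumed)
KOFF = 50.0    # s^-1, per-site off-rate (assumed slow unbinding)
GAMMA = 2000.0 # s^-1, fusion rate from the fully bound state (assumed)

def rhs(t, y, ca):
    v0, v1, v2, fused = y            # vesicles with 0, 1, 2 bound Ca2+, and fused
    dv0 = -2 * KON * ca * v0 + KOFF * v1
    dv1 =  2 * KON * ca * v0 - (KOFF + KON * ca) * v1 + 2 * KOFF * v2
    dv2 =  KON * ca * v1 - (2 * KOFF + GAMMA) * v2
    return [dv0, dv1, dv2, GAMMA * v2]

ca_step = 1e-6   # 1 uM Ca2+ step, mimicking flash photolysis of caged Ca2+
sol = solve_ivp(rhs, (0.0, 0.2), [1.0, 0.0, 0.0, 0.0], args=(ca_step,),
                max_step=1e-4)
release_time_course = sol.y[3]       # cumulative fraction of vesicles fused
```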

  1. Quantitative analysis of impact measurements using dynamic load cells

    Directory of Open Access Journals (Sweden)

    Brent J. Maranzano

    2016-03-01

    Full Text Available A mathematical model is used to estimate material properties from a short-duration transient impact force measured by dropping spheres onto rectangular coupons fixed to a dynamic load cell. The contact stress between the dynamic load cell surface and the projectile is modeled using Hertzian contact mechanics. Due to the short impact time relative to the load cell dynamics, an additional Kelvin–Voigt element is included in the model to account for the finite response time of the piezoelectric crystal. Calculations with and without the Kelvin–Voigt element are compared to experimental data collected from combinations of polymeric spheres and polymeric and metallic surfaces. The results illustrate that the inclusion of the Kelvin–Voigt element qualitatively captures the post-impact resonance and non-linear behavior of the load cell signal and quantitatively improves the estimation of the Young's modulus and Poisson's ratio. Mathematically, the additional KV element couples one additional differential equation to the Hertzian spring-dashpot equation. The model can be numerically integrated in seconds using standard numerical techniques, allowing its use as a rapid technique for the estimation of material properties.
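    The Hertzian part of such a model is straightforward to sketch: a sphere impacting a flat surface with contact force F = k·δ^(3/2), integrated with SciPy. The material values below are illustrative, and the load-cell (Kelvin–Voigt) stage discussed in the abstract is deliberately omitted.

```python
# Hertzian impact of a sphere on a flat coupon, F = k * delta**1.5 (illustrative
# material values; the Kelvin-Voigt load-cell stage of the paper is omitted).
import numpy as np
from scipy.integrate import solve_ivp

R = 5e-3                   # sphere radius (m)
M = 0.6e-3                 # sphere mass (kg)
E1, NU1 = 3.0e9, 0.35      # sphere: Young's modulus (Pa), Poisson's ratio
E2, NU2 = 200e9, 0.30      # coupon: Young's modulus (Pa), Poisson's ratio
E_STAR = 1.0 / ((1 - NU1**2) / E1 + (1 - NU2**2) / E2)
K_HERTZ = (4.0 / 3.0) * E_STAR * np.sqrt(R)
V0 = 1.0                   # impact velocity (m/s)

def rhs(t, y):
    delta, v = y                               # indentation depth and its rate
    force = K_HERTZ * max(delta, 0.0) ** 1.5   # no tensile contact force
    return [v, -force / M]

sol = solve_ivp(rhs, (0.0, 2e-4), [0.0, V0], max_step=1e-7)
contact_force = K_HERTZ * np.clip(sol.y[0], 0.0, None) ** 1.5
peak_force = contact_force.max()               # compare against the load-cell peak
```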

  2. Quantitative analysis of complexes in electron irradiated CZ silicon

    International Nuclear Information System (INIS)

    Complexes in helium or electron irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. The carbon concentration (1×10¹⁵–1×10¹⁷ cm⁻³) and the helium dose (5×10¹²–5×10¹³ cm⁻²) or electron dose (1×10¹⁵–1×10¹⁷ cm⁻²) are varied by two orders of magnitude in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in low carbon concentration silicon of commercial grade with low electron dose can be detected clearly. The concentration of these complexes is estimated. It is clarified that the complex configuration and thermal behavior in low carbon and low dose samples are simple and almost confined within the individual complex family compared to those in high concentration and high dose samples. The well-established complex behavior in the electron-irradiated samples is compared to that in He-irradiated samples, obtained by deep level transient spectroscopy (DLTS) or cathodoluminescence (CL), which has a close relation to Si power device performance.

  3. Analysis of composition complicated binary mixture by quantitative SEC

    Institute of Scientific and Technical Information of China (English)

    Zhengnian CHEN; Hongfeng XIE; Hu YANG; Zhiliu WANG; Rongshi CHENG

    2008-01-01

    Analysis of the composition of a binary mixture of two complicated industrial materials is of great practical importance for formulation. The present paper provides a quantitative size exclusion chromatography (SEC) method, based on the principle of absolute quantification in SEC, to solve this problem. The conventional data treatment of the differential refractive index (DRI) signal of SEC, H(V), is first improved by dividing it by the injected sample weight, which leads to a newly defined weight-normalized distribution Hw(V) and its integral Iw(V). These two distributions reflect the response constant of the sample in addition to the conventional normalized distribution F(V). The difference between the average response constants of the two components provides a sensitive means of computing the composition of their mixture from its Hw(V) or Iw(V). The method was applied to mixtures of an industrial asphalt and paraffin diluents as an example, and successful results were obtained.

  4. Quantitative analysis of microstructure of carbon materials by HRTEM

    Institute of Scientific and Technical Information of China (English)

    YANG Jun-he; CHENG Shu-hui; WANG Xia; ZHANG Zhuo; LIU Xiao-rong; TANG Guo-hua

    2006-01-01

    The main object of the present research is to make a quantitative evaluation of the microstructure of carbon materials in terms of microcrystals. Digitized images acquired from finely pulverized carbon materials under HRTEM at high magnification were processed with image processing software so as to extract the (002) lattice fringes of the graphite crystal from the background image; an FFT-IFFT filtering operation was performed, followed by binarization of the image and skeletonization of the fringes. A set of geometrical parameters including position, length and orientation was established for every lattice fringe by calculation on the binarized image. The fringe parameters thus obtained were then fed into an algorithm, developed especially for such fringe images, to find fringes that could be regarded as belonging to one single graphite microcrystal. Each fringe was sequentially compared with every other fringe with respect to parallelism, relative position and spacing, and the comparisons were repeated until the last fringe. Eventually, the microcrystal size, the stacking number, and the distribution of the microcrystals in the whole sample, as well as other related structural information on such microcrystals in carbon materials, were statistically calculated. Such microstructure information at the nanometer level may contribute greatly to the interpretation of the properties of carbon materials and to a better correlation with the macrostructure.
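    The processing chain described above can be sketched with scikit-image as follows: an FFT band-pass filter selects spatial frequencies around the (002) fringe spacing, the result is binarized and skeletonized, and labelled regions yield per-fringe position, length and orientation. The filter radii and threshold choice are illustrative assumptions, not the authors' settings.

```python
# Sketch of the fringe-extraction chain with scikit-image (illustrative filter
# radii and threshold; not the authors' software).
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops
from skimage.morphology import skeletonize

def fft_bandpass(img, r_min, r_max):
    """Keep spatial frequencies whose radius (in FFT-grid pixels) lies in [r_min, r_max]."""
    f = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    yy, xx = np.mgrid[-(ny // 2): ny - ny // 2, -(nx // 2): nx - nx // 2]
    mask = (np.hypot(yy, xx) >= r_min) & (np.hypot(yy, xx) <= r_max)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

def fringe_parameters(hrtem_img, r_min=20, r_max=60):
    filtered = fft_bandpass(hrtem_img, r_min, r_max)
    skeleton = skeletonize(filtered > threshold_otsu(filtered))
    fringes = []
    for region in regionprops(label(skeleton)):
        y, x = region.centroid
        fringes.append({
            "position": (x, y),
            "length": region.major_axis_length,   # fringe length proxy (pixels)
            "orientation": region.orientation,    # radians w.r.t. the image axes
        })
    return fringes
```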

  5. Space-to-Ground Communication for Columbus: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Uhlig

    2015-01-01

    Full Text Available The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers of efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions.

  6. Quantitative Safety and Security Analysis from a Communication Perspective

    OpenAIRE

    Boris Malinowsky; Hans-Peter Schwefel; Oliver Jung

    2015-01-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and...

  7. Reviewing peer review: a quantitative analysis of peer review

    OpenAIRE

    Casati, Fabio; Marchese, Maurizio; Mirylenka, Katsiaryna; Ragone, Azzurra

    2010-01-01

    In this paper we focus on the analysis of peer reviews and reviewers behavior in a number of different review processes. More specifically, we report on the development, definition and rationale of a theoretical model for peer review processes to support the identification of appropriate metrics to assess the processes main properties. We then apply the proposed model and analysis framework to data sets from conference evaluation processes and we discuss the results implications and their eve...

  8. Chemical equilibrium analysis of dry hydrogen combustion

    International Nuclear Information System (INIS)

    The present work is based on a thermo-chemical equilibrium model for studying the effect of combustion of hydrogen during postulated accident scenarios in nuclear reactor containments. This model is based on the method of element potentials which seeks to minimize the free energy of the system. The condition on internal energy balance is imposed as a constraint during the minimization process. Another simplified model purely based on the internal energy balance has also been implemented to investigate the isolated impact of free energy and the conditions under which it becomes dominant. The two models have been used to extract final pressures for a wide range of initial conditions and mixture compositions that are typically found during accident scenarios. In the absence of hydrogen combustion experimental data, such models will become important for laying down a first estimate on the possible outcomes. (author)
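    To make the idea concrete, the toy sketch below minimizes the ideal-gas Gibbs free energy of an H2/O2/H2O(g) mixture subject to element balances, using SciPy's SLSQP solver. The report uses the method of element potentials; this direct constrained minimization is only an equivalent-in-spirit illustration, and the temperature and the standard Gibbs energy of formation used for H2O are rough placeholder values.

```python
# Toy chemical-equilibrium sketch: constrained Gibbs-energy minimization for an
# ideal-gas H2/O2/H2O mixture. Placeholder thermodynamic data; not the authors' code.
import numpy as np
from scipy.optimize import minimize

R = 8.314            # J mol^-1 K^-1
T = 1500.0           # K (illustrative)
P = 1.0              # bar
SPECIES = ["H2", "O2", "H2O"]
G0 = np.array([0.0, 0.0, -164_000.0])   # approx. Gibbs energies of formation at T (J/mol)
A = np.array([[2, 0, 2],                # element-balance matrix: rows H, O
              [0, 2, 1]], dtype=float)
n_initial = np.array([2.0, 1.0, 0.0])   # stoichiometric H2 + O2, no product yet
b = A @ n_initial                       # conserved element totals

def gibbs(n):
    n = np.clip(n, 1e-12, None)
    return float(np.sum(n * (G0 + R * T * np.log(n / n.sum() * P))))

result = minimize(gibbs, x0=np.array([0.7, 0.3, 1.2]),
                  constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
                  bounds=[(1e-12, None)] * len(SPECIES), method="SLSQP")
print(dict(zip(SPECIES, result.x)))     # equilibrium composition in moles
```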

  9. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.
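    The segmentation step at the heart of MSQ analysis can be sketched very simply: split the photon-count record into segments, compute Mandel's Q for each, and average. The conversion from the mean Q-value to molecular brightness involves an instrument-dependent point-spread-function shape factor, shown here only as a placeholder constant; the sketch does not reproduce the photobleaching corrections or axial-scan analysis described in the abstract.

```python
# Bare-bones sketch of a mean segmented Q-value (placeholder PSF shape factor;
# no photobleaching or axial-scan corrections).
import numpy as np

def mean_segmented_q(counts, segment_length):
    """Average Mandel Q over non-overlapping segments of a photon-count record."""
    counts = np.asarray(counts, dtype=float)
    n_seg = len(counts) // segment_length
    segs = counts[: n_seg * segment_length].reshape(n_seg, segment_length)
    mean = segs.mean(axis=1)
    var = segs.var(axis=1, ddof=1)
    return np.mean((var - mean) / mean)      # Mandel's Q, averaged over segments

GAMMA_2 = 0.35   # assumed shape factor of the observation volume

def apparent_brightness(counts, segment_length):
    """Counts per molecule per sampling time, up to the assumed shape factor."""
    return mean_segmented_q(counts, segment_length) / GAMMA_2
```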

  10. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era

    Energy Technology Data Exchange (ETDEWEB)

    Chiu, Weihsueh A., E-mail: chiu.weihsueh@epa.gov [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington DC, 20460 (United States); Euling, Susan Y.; Scott, Cheryl Siegel; Subramaniam, Ravi P. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington DC, 20460 (United States)

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA) — i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on “augmentation” of weight of evidence — using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards “integration” of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for “expansion” of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual “reorientation” of QRA towards approaches that more directly link environmental exposures to human outcomes.

  11. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era

    International Nuclear Information System (INIS)

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA) — i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on “augmentation” of weight of evidence — using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards “integration” of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for “expansion” of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual “reorientation” of QRA towards approaches that more directly link environmental exposures to human outcomes

  12. Quantitative analysis of PMR-15 polyimide resin by HPLC

    Science.gov (United States)

    Roberts, Gary D.; Lauver, Richard W.

    1987-01-01

    The concentration of individual components and of total solids of 50 wt pct PMR-15 resin solutions was determined using reverse-phase HPLC to within ±8 percent accuracy. Acid impurities, the major source of impurities in 3,3',4,4'-benzophenonetetracarboxylic acid (BTDE), were eliminated by recrystallizing the BTDE prior to esterification. Triester formation was not a problem because of the high rate of esterification of the anhydride relative to that of the carboxylic acid. Aging of PMR-15 resin solutions resulted in gradual formation of the mononadimide and bisnadimide of 4,4'-methylenedianiline, with the BTDE concentration remaining constant. Similar chemical reactions occurred at a reduced rate in dried films of PMR-15 resin.

  13. Quantitative analysis of cesium in synthetic lithium molten salts

    International Nuclear Information System (INIS)

    An analytical technique for fission products in lithium molten salts of spent PWR (Pressurized Water Reactor) fuels has been studied for the establishment of an optimum chemical engineering process and the evaluation of the process material balance in developing a Direct Oxide Reduction Process with lithium metal. As part of the basic research, synthetic dissolver solutions of lithium chloride containing trace amounts of fission product elements (La, Ce, Pr, Nd, Sm, Eu, Gd, Y, Cs, Ru, Rh, Pd, Mo, Zr, Cd, Ba, Sr, Te and Se) were prepared and used in establishing a technique for the selective separation of cesium from the lithium chloride matrix using cation exchange chromatography. Its recovery was measured by flame atomic absorption spectrometry and the reliability of the technique was evaluated.

  14. Quantitative data analysis with SPSS release 8 for Windows a guide for social scientists

    CERN Document Server

    Bryman, Alan

    2002-01-01

    The latest edition of this best-selling introduction to Quantitative Data Analysis through the use of a computer package has been completely updated to accommodate the needs of users of SPSS Release 8 for Windows. Like its predecessor, it provides a non-technical approach to quantitative data analysis and a user-friendly introduction to the widely used SPSS for Windows. It assumes no previous familiarity with either statistics or computing but takes the reader step-by-step through the techniques, reinforced by exercises for further practice. Techniques explained in Quantitative Data Analysis with SPSS Release 8 for Windows include: * correlation * simple and multiple regression * multivariate analysis of variance and covariance * factor analysis. The book also covers issues such as sampling, statistical significance, conceptualization and measurement and the selection of appropriate tests. For further information or to download the book's datasets, please visit the website: http://www.routledge.com/textbooks/...

  15. Quantitative, chemical, and mineralogical characterization of flue gas desulfurization by-products.

    Science.gov (United States)

    Laperche, Valérie; Bigham, Jerry M

    2002-01-01

    The objective of this study was to demonstrate that simple fractionation and selective dissolution techniques can be used to provide detailed chemical and mineralogical analyses of flue gas desulfurization by-products. The material studied was a mine grout prepared as a 1:1 mixture (wt./wt.) of fly ash (FA) and filter cake (FC) with hydrated lime (50 g kg⁻¹) added to improve handling. The hydrated lime was composed mostly of calcite (CaCO3), portlandite [Ca(OH)2], lime (CaO), and brucite [Mg(OH)2] (515, 321, 55, and 35 g kg⁻¹, respectively) and had low (hydrated lime. The FA contained both magnetic (222 g kg⁻¹) and nonmagnetic (778 g kg⁻¹) fractions. The former was composed mostly of hematite (Fe2O3), magnetite (Fe3O4), and glass (272, 293, and 287 g kg⁻¹, respectively), whereas the latter was enriched in glass, quartz, and mullite (Al6Si2O13) (515, 243, and 140 g kg⁻¹, respectively). Etching with 1% HF showed that 60 to 100% of trace elements were concentrated in the glass, although some metals (Co, Cr, and Mn) were clearly enriched in the magnetic phase. The aged grout contained 147 g kg⁻¹ ettringite [Ca6Al2(SO4)3(OH)12·26H2O] in addition to 314 g kg⁻¹ hannebachite and 537 g kg⁻¹ insoluble phases (mullite, quartz, hematite, magnetite, and glass). PMID:12026103

  16. Quantitative bacterial examination and chemical evaluation of Diet, Club, and Ice-cream Sodas, Soft Drinks

    International Nuclear Information System (INIS)

    Diet, club, and ice-cream sodas are flavored soft drinks consumed throughout the world, especially in the summer season. This study was undertaken to monitor the bacterial and chemical contamination of these national and international branded drinks procured from local markets. The isolated coliforms and microbes were E. coli, Salmonella spp., Klebsiella spp., Enterobacter spp., Shigella spp., and Bacillus cereus. Diet and club sodas were less contaminated with microorganisms than were ice-cream sodas. Fifteen trace and toxic elements were identified with an atomic absorption spectrophotometer following the improved ash digestion method. The values of nickel (Ni) (0.15 mg/L), lead (Pb) (0.28 mg/L), cadmium (Cd) (0.13 mg/L) and aluminum (Al) (0.76 mg/L) were higher than the WHO recommended limits. The concentrations of Na, Fe, Pb and chromium (Cr) were higher in club sodas than in diet and ice-cream sodas, and the concentrations of calcium (Ca) and manganese (Mn) in ice-cream sodas were also higher than in diet and club sodas. Overall, the ice-cream sodas did not conform to the WHO standards for safe ingestion of micro- and macro-metals in various drinks. (author)

  17. Synchrotron radiation microprobe quantitative analysis method for biomedical specimens

    International Nuclear Information System (INIS)

    Relative changes of trace element content in biomedical specimens are obtained easily by means of synchrotron radiation X-ray fluorescence microprobe analysis (SXRFM). However, the accurate assignment of concentration on a g/g basis is difficult, because it is necessary to know both the trace element content and the specimen mass in the irradiated volume simultaneously; the specimen mass is a function of spatial position and cannot be weighed. It is possible to measure the specimen mass indirectly from the intensity of the Compton scattered peak in normal XRF analysis using an X-ray tube with a Mo anode, provided the matrix consists of light elements and the specimen is thin. The Compton peak is not present in the fluorescence spectrum for white-light SXRFM analysis; the continuous background in the spectrum results from Compton scattering with a linearly polarized X-ray source. Biomedical specimens for SXRFM analysis, for example biological sections and human hair, are always thin samples for high-energy X-rays, and they consist of light elements such as H, C, N and O, which implies a linear relationship between the specimen mass and the Compton scattering background in the high-energy region of the spectrum. In this way, it is possible to carry out concentration measurements for SXRFM analysis.

  18. Quantitative assessment of human motion using video motion analysis

    Science.gov (United States)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  19. Quantitative analysis of some volatile components in Mimusops elengi L.

    Directory of Open Access Journals (Sweden)

    Chantana Aromdee

    2009-08-01

    Full Text Available Dried pikul flower (Mimusops elengi L., Sapotaceae) is used in many recipes of Thai traditional medicine, e.g. as a cardiotonic and stomachic. In this study, fresh and dried pikul flowers were investigated. The odour of the pikul flower, even when dried, is very strong and characteristic. The constituents of the volatile oils in fresh and dried pikul flowers extracted with ether were analysed by gas chromatography-mass spectrometry. 2-Phenylethanol, 4-hydroxybenzenemethanol and cinnamyl alcohol were mainly found in the fresh flowers (10.49, 8.69 and 6.17%, respectively), whereas those mainly found in the dried flowers were a long-chain carboxylic acid ester and (Z)-9-octadecenoic acid (5.37 and 4.71% of the ether extract, respectively). An analytical method for simultaneously determining benzyl alcohol, 2-phenylethanol and methyl paraben was developed using GC-FID. The percent recoveries were 91.66, 104.59 and 105.28%, respectively. The intraday variations (% RSD) were 7.22, 6.67 and 1.86%, and the interday variations were 3.12, 2.52 and 3.55%, respectively. Detection limits were 0.005, 0.014 and 0.001 ppm, and quantitation limits were 0.015, 0.048 and 0.003 ppm, respectively. The benzyl alcohol, 2-phenylethanol and methyl paraben contents of dried flowers (9 samples from various drug stores in Thailand and one sample from China) were 6.40-13.46, 17.57-196.57 and 27.35-355.53 ppm, respectively.

  20. Automated quantitative analysis of ventilation-perfusion lung scintigrams

    International Nuclear Information System (INIS)

    An automated computer analysis of ventilation (Kr-81m) and perfusion (Tc-99m) lung images has been devised that produces a graphical image of the distribution of ventilation and perfusion, and of ventilation-perfusion ratios. The analysis has overcome the following problems: the identification of the midline between two lungs and the lung boundaries, the exclusion of extrapulmonary radioactivity, the superimposition of lung images of different sizes, and the format for presentation of the data. Therefore, lung images of different sizes and shapes may be compared with each other. The analysis has been used to develop normal ranges from 55 volunteers. Comparison of younger and older age groups of men and women show small but significant differences in the distribution of ventilation and perfusion, but no differences in ventilation-perfusion ratios

  1. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but moves it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive, economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
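    At its core, the capacity-demand calculation reduces to estimating P(C < D) for random capacity C and demand D; a minimal Monte Carlo sketch is shown below with illustrative lognormal parameters that are not taken from the study.

```python
# Minimal capacity-demand sketch: probability of failure P(C < D) by Monte Carlo
# (illustrative lognormal parameters, not values from the study).
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
capacity = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=N)   # e.g. strength, MPa
demand = rng.lognormal(mean=np.log(180.0), sigma=0.30, size=N)     # e.g. acting stress, MPa

p_failure = np.mean(capacity < demand)
print(f"estimated probability of failure: {p_failure:.2e}")
```

    The appeal of the approach described above is that judgment enters only through the choice of these distributions (material variability, load uncertainty), not through a direct guess at the failure probability itself.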

  2. Virtual unfolding of light sheet fluorescence microscopy dataset for quantitative analysis of the mouse intestine

    Science.gov (United States)

    Candeo, Alessia; Sana, Ilenia; Ferrari, Eleonora; Maiuri, Luigi; D'Andrea, Cosimo; Valentini, Gianluca; Bassi, Andrea

    2016-05-01

    Light sheet fluorescence microscopy has proven to be a powerful tool to image fixed and chemically cleared samples, providing in depth and high resolution reconstructions of intact mouse organs. We applied light sheet microscopy to image the mouse intestine. We found that large portions of the sample can be readily visualized, assessing the organ status and highlighting the presence of regions with impaired morphology. Yet, three-dimensional (3-D) sectioning of the intestine leads to a large dataset that produces unnecessary storage and processing overload. We developed a routine that extracts the relevant information from a large image stack and provides quantitative analysis of the intestine morphology. This result was achieved by a three step procedure consisting of: (1) virtually unfold the 3-D reconstruction of the intestine; (2) observe it layer-by-layer; and (3) identify distinct villi and statistically analyze multiple samples belonging to different intestinal regions. Even if the procedure has been developed for the murine intestine, most of the underlying concepts have a general applicability.

  3. Morphologic characterization and quantitative analysis on in vitro bacteria by nuclear techniques of measurement

    International Nuclear Information System (INIS)

    The great difficulty in identifying microorganisms (bacteria) from infectious processes is related to the time necessary to obtain a reliable result, about 72 hours. The purpose of this work is to establish a faster method to characterize bacterial morphologies through the use of neutron radiography, which can take about 5 hours. The samples containing the microorganisms, bacteria with different morphologies, were, after the appropriate microbiologic procedures, incubated with ¹⁰B for 30 minutes and soon after deposited on a plate of a solid state nuclear track detector (SSNTD), denominated CR-39. To obtain the images relative to the bacteria, the detector was submitted to a flux of thermal neutrons of the order of 2.2×10⁵ n/cm²·s from the J-9 channel of the Reactor Argonauta (IEN/CNEN). To observe the images from bacteria in each sample under an optical microscope, the sheets were chemically developed. The analysis of the images revealed morphologic differences among the genera (Gram-positive from Gram-negative and coccus from bacillus), in samples containing either isolated or mixed bacteria. We thus verified the viability of the technique for the morphological characterization of different microorganisms. A quantitative approach also seemed to be feasible with the technique. The whole process took about 2 hours. (author)

  4. Perfecting of a computer program for PIXE quantitative analysis

    International Nuclear Information System (INIS)

    The PIXE technique is used to measure elemental abundances in geological and archaeological samples, in medicine, etc. In this thesis, the author recalls the theoretical bases of this analysis method, gives the calculation method for element ratios that he has used for thick samples, describes the computer programs (XMONO and PIXCO) and then experimentally verifies the results obtained with the PIXCO program. The comparative studies show that the PIXCO program gives good results for elements with 20 < Z < 40 but ignores secondary X-ray emission. The PIXCO program is useful for the analysis of subsequent samples in series.

  5. Chemical aspects of nuclear methods of analysis

    International Nuclear Information System (INIS)

    This final report includes papers which fall into three general areas: development of practical pre-analysis separation techniques, uranium/thorium separation from other elements for analytical and processing operations, and theory and mechanism of separation techniques. A separate abstract was prepared for each of the 9 papers

  6. Arrays in biological and chemical analysis

    DEFF Research Database (Denmark)

    Christensen, Claus Bo Vöge

    2002-01-01

    Recently, a dramatic change has taken place in biological and biochemical analysis. Originally developed as an academic tool for massively parallel screening, the microarray format has now been taken up by industry as well for performing all kinds of assays. From food manufacturers over water supply...

  7. Application of physico-chemical procedures in the analysis of urinary calculi

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A.L.

    1985-01-01

    All physico-chemical techniques used in the analysis of urinary calculi have inherent advantages and limitations. Although X-ray powder diffraction can identify constituents unambiguously, certain minor components can be missed. Infrared spectroscopy is more sensitive, but band assignment at low concentrations is difficult. Scanning electron microscopy together with energy dispersive X-ray analysis permits the simultaneous investigation of morphology and chemical microstructure. With the electron microprobe, minor constituents can be detected, but tedious sample preparation procedures are required. Transmission electron microscopy is extremely useful in determining constituent inter-relationships and ultrastructure, but ultramicrotomy is very difficult. Thermal gravimetric analysis gives quantitative information easily but does not satisfactorily distinguish between struvite and brushite. In an attempt to assess the accuracy of chemical analyses, 62 calculi were investigated applying several chemical tests. Those for Mg, total PO4, NH4 and uric acid proved highly reliable, while that for Ca often yielded an incorrect result. The test for oxalate was totally unsatisfactory. Investigators of stone composition and structure should include X-ray diffraction (or infrared spectroscopy) and scanning electron microscopy as their methods of first choice. In addition, chemical or thermogravimetric analyses should be utilized in an auxiliary capacity.

  8. Field Deployable Chemical Redox Probe for Quantitative Characterization of Carboxymethylcellulose Modified Nano Zerovalent Iron.

    Science.gov (United States)

    Fan, Dimin; Chen, Shengwen; Johnson, Richard L; Tratnyek, Paul G

    2015-09-01

    Nano zerovalent iron synthesized with carboxymethylcelluose (CMC-nZVI) is among the leading formulations of nZVI currently used for in situ groundwater remediation. The main advantage of CMC-nZVI is that it forms stable suspensions, which are relatively mobile in porous media. Rapid contaminant reduction by CMC-nZVI is well documented, but the fate of the CMC-nZVI (including "aging" and "reductant demand") is not well characterized. Improved understanding of CMC-nZVI fate requires methods with greater specificity for Fe(0), less vulnerability to sampling/recovery artifacts, and more practical application in the field. These criteria can be met with a simple and specific colorimetric approach using indigo-5,5'-disulfonate (I2S) as a chemical redox probe (CRP). The measured stoichiometric ratio for reaction between I2S and nZVI is 1.45 ± 0.03, suggesting complete oxidation of nZVI to Fe(III) species. However, near pH 7, reduction of I2S is diagnostic for Fe(0), because aqueous Fe(II) reduces I2S much more slowly than Fe(0). At that pH, adding Fe(II) increased I2S reduction rates by Fe(0), consistent with depassivation of nZVI, but did not affect the stoichiometry. Using the I2S assay to quantify changes in the Fe(0) content of CMC-nZVI, the rate of Fe(0) oxidation by water was found to be orders of magnitude faster than previously reported values for other types of nZVI. PMID:26218836

  9. On-site semi-quantitative analysis for ammonium nitrate detection using digital image colourimetry.

    Science.gov (United States)

    Choodum, Aree; Boonsamran, Pichapat; NicDaeid, Niamh; Wongniramaikul, Worawit

    2015-12-01

    Digital image colourimetry was successfully applied in the semi-quantitative analysis of ammonium nitrate using Griess's test with zinc reduction. A custom-built detection box was developed to enable reproducible lighting of samples, and was used with the built-in webcams of a netbook and an ultrabook for on-site detection. The webcams were used for colour imaging of chemical reaction products in the samples, while the netbook was used for on-site colour analysis. The analytical performance was compared to a commercial external webcam and a digital single-lens reflex (DSLR) camera. The relationship between Red-Green-Blue intensities and ammonium nitrate concentration was investigated. The green channel intensity (IG) was the most sensitive for the pink-violet products from ammonium nitrate that revealed a spectrometric absorption peak at 546 nm. A wide linear range (5 to 250 mgL⁻¹) with a high sensitivity was obtained with the built-in webcam of the ultrabook. A considerably lower detection limit (1.34 ± 0.05mgL⁻¹) was also obtained using the ultrabook, in comparison with the netbook (2.6 ± 0.2 mgL⁻¹), the external web cam (3.4 ± 0.1 mgL⁻¹) and the DSLR (8.0 ± 0.5 mgL⁻¹). The best inter-day precision (over 3 days) was obtained with the external webcam (0.40 to 1.34%RSD), while the netbook and the ultrabook had 0.52 to 3.62% and 1.25 to 4.99% RSDs, respectively. The relative errors were +3.6, +5.6 and -7.1%, on analysing standard ammonium nitrate solutions of known concentration using IG, for the ultrabook, the external webcam, and the netbook, respectively, while the DSLR gave -4.4% relative error. However, the IG of the pink-violet reaction product suffers from interference by soil, so that blank subtraction (|IG-IGblank| or |AG-AGblank|) is recommended for soil sample analysis. This method also gave very good accuracies of -0.11 to -5.61% for spiked soil samples and the results presented for five seized samples showed good correlations between
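    The calibration and blank-subtraction steps described above reduce to a simple linear regression of the (blank-subtracted) green-channel intensity against standard concentrations, as sketched below; the file names, ROI handling and standard concentrations are hypothetical.

```python
# Sketch of a green-channel calibration for digital image colourimetry
# (hypothetical file names and standard concentrations).
import numpy as np
from PIL import Image

def mean_green(path, roi=None):
    """Mean green-channel intensity of an image, optionally within a rectangular ROI."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    if roi is not None:
        y0, y1, x0, x1 = roi
        img = img[y0:y1, x0:x1]
    return img[..., 1].mean()

conc = np.array([5.0, 25.0, 50.0, 100.0, 250.0])       # standards, mg/L
ig_blank = mean_green("blank.png")
ig = np.array([abs(mean_green(f"std_{int(c)}.png") - ig_blank) for c in conc])
slope, intercept = np.polyfit(conc, ig, 1)              # linear calibration curve

def predict(path):
    """Estimate ammonium nitrate concentration (mg/L) of an unknown sample image."""
    return (abs(mean_green(path) - ig_blank) - intercept) / slope

# print(predict("seized_sample.png"))
```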

  10. Quantitative Sulfur Analysis using Stand-off Laser-Induced Breakdown Spectroscopy

    Science.gov (United States)

    Dyar, M. D.; Tucker, J. M.; Clegg, S. M.; Barefield, J. E.; Wiens, R. C.

    2008-12-01

    The laser-induced breakdown spectrometer (LIBS) in the ChemCam instrument on Mars Science Laboratory has the capability to produce robust, quantitative analyses not only for major elements, but also for a large range of light elements and trace elements that are of great interest to geochemists. However, sulfur presents a particular challenge because it reacts easily with oxygen in the plasma and because the brightest S emission lines lie outside ChemCam's spectral range. This work was undertaken within the context of our larger effort to identify and compensate for matrix effects, which are chemical properties of the material that influence the ratio of a given emission line to the abundance of the element producing that line. Samples for this study include two suites of rocks: a suite of 12 samples that are mixtures of sulfate minerals and host rocks, generally with high S contents (0.1-26.0 wt% S), and a large suite of 118 igneous rocks from varying parageneses with S contents in the 0-2 wt% range. These compositions provide several different types of matrices to challenge our calibration procedures. Samples were analyzed under ChemCam-like conditions: a Nd:YAG laser producing 17 mJ per 10 ns pulse was directed onto samples positioned 5-9 m away from the laser and telescope. The samples were placed in a vacuum chamber filled with 7 Torr CO2 to replicate the Martian surface pressure, as the atmospheric pressure influences the LIBS plasma. Some of the LIBS plasma emission is collected with a telescope and transmitted through a 1 m, 300 µm, 0.22 NA optical fiber connected to a commercial Ocean Optics spectrometer. We are testing and comparing three different strategies to evaluate sulfur contents. 1) We have calculated regression lines comparing the intensity at each channel to the S content. This analysis shows that there are dozens of S emission lines in the ChemCam wavelength range that are suitable for use in quantitative analysis, even in the presence of Fe. 2
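    The first strategy above (channel-wise regression of intensity against known S content) is sketched below; the array shapes and variable names are assumptions, and the multivariate refinements a full analysis would add are omitted.

```python
# Channel-wise regression of LIBS intensities against known S content
# (assumed array shapes; a sketch of strategy (1) only).
import numpy as np

def channelwise_regression(spectra, s_content):
    """spectra: (n_samples, n_channels) intensities; s_content: (n_samples,) wt% S."""
    x = np.asarray(s_content, dtype=float)
    y = np.asarray(spectra, dtype=float)
    x_c = x - x.mean()
    y_c = y - y.mean(axis=0)
    slope = (x_c @ y_c) / (x_c @ x_c)                       # per-channel slope
    r = (x_c @ y_c) / (np.linalg.norm(x_c) * np.linalg.norm(y_c, axis=0) + 1e-12)
    return slope, r

# slope, r = channelwise_regression(spectra, s_wt_percent)
# candidate_s_lines = np.argsort(r**2)[::-1][:20]   # channels most correlated with S
```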

  11. Quantitative electron microscope autoradiography: application of multiple linear regression analysis

    International Nuclear Information System (INIS)

    A new method for the analysis of high resolution EM autoradiographs is described. It identifies labelled cell organelle profiles in sections on a strictly statistical basis and provides accurate estimates for their radioactivity without the need to make any assumptions about their size, shape and spatial arrangement. (author)

  12. Automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, Marcel; Spreeuwers, Luuk; Quist, Marcel

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the

  13. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, L.J.; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of th

  14. Concentration Analysis: A Quantitative Assessment of Student States.

    Science.gov (United States)

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)
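
    The abstract does not state the concentration measure itself; a commonly cited form of the concentration factor for an item with m answer choices, answered by N students of whom n_i choose option i, is C = (sqrt(m)/(sqrt(m)-1)) * (sqrt(sum n_i^2)/N - 1/sqrt(m)), which is 0 for an even spread of responses and 1 when every student picks the same option. The sketch below uses that form as an assumption, not as a quotation from the paper.

```python
# Sketch of a concentration measure for one multiple-choice item.
# The formula is a commonly cited form of the concentration factor
# (0 = responses spread evenly, 1 = all students choose the same option);
# it is used here as an assumption, not taken from the abstract itself.
import math

def concentration(counts):
    """counts[i] = number of students selecting option i."""
    m = len(counts)                     # number of answer choices
    n = sum(counts)                     # number of students
    root_sum_sq = math.sqrt(sum(c * c for c in counts))
    return (math.sqrt(m) / (math.sqrt(m) - 1)) * (root_sum_sq / n - 1 / math.sqrt(m))

print(concentration([25, 0, 0, 0, 0]))   # -> 1.0 (everyone picks option A)
print(concentration([5, 5, 5, 5, 5]))    # -> 0.0 (uniform spread of answers)
print(concentration([15, 5, 3, 1, 1]))   # intermediate concentration
```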

  15. Quantitative analysis of natural resource management options at different scales

    NARCIS (Netherlands)

    Keulen, van H.

    2007-01-01

    Natural capital (land, water, air) consists of many resources, each with its own quality, dynamics and renewability, but with strong interactions. The increasing competition for the natural resources, especially land and water, calls for a basic redirection in the analysis of land use. In this paper

  16. Quantitative analysis of the clinical data on leukemia, 5

    International Nuclear Information System (INIS)

    In order to determine the necessity of chromosome analysis required for the evaluation of 8;21 translocation, multiple logistic discriminant analysis was made on 124 patients with acute non-lymphocytic leukemia experienced in the authors' institution. Variables which showed positive correlation with the presence of 8;21 translocation were the presence of Auer body and granular abnormality of the cells, numbers of peripheral promyelocytes, myelocytes and metamyelocytes, and bone marrow promyelocytes, myelocytes, and the sum of rods and segments. Those which showed negative correlation with 8;21 translocation were peripheral platelet count, neutrocyte alkaline phosphatase (N-AP) score, numbers of eosinocytes, monocytes and erythroblasts, and erythroblasts on myelogram. Auer body, four peripheral hematological features (platelet count, N-AP score, metamyelocytes and monocytes), and three myelogram features (myelocytes, reticular cells and granulocytes/eosinocytes) were used for the multiple logistic discriminant analysis. By the analysis, 2 of the 22 patients (9.1%) with translocation were judged not to have 8;21 translocation and 3 of the 102 patients (2.9%) without translocation were judged to have it. Therefore, this multiple logistic discriminant method has proved to be simple and useful in clinically evaluating acute non-lymphocytic leukemia. (Namekawa, K.)

  17. From POOSL to UPPAAL: Transformation and Quantitative Analysis

    NARCIS (Netherlands)

    Xing, Jiansheng; Theelen, B.D.; Langerak, Rom; Pol, van de Jaco; Tretmans, Jan; Voeten, J.P.M.; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    2010-01-01

    POOSL (Parallel Object-Oriented Specification Language) is a powerful general purpose system-level modeling language. In research on design space exploration of motion control systems, POOSL has been used to construct models for performance analysis. The considered motion control algorithms are char

  18. Quantitative analysis and purity evaluation of medroxyprogesterone acetate by HPLC.

    Science.gov (United States)

    Cavina, G; Valvo, L; Alimenti, R

    1985-01-01

    A reversed-phase high-performance liquid chromatographic method was developed for the assay of medroxyprogesterone acetate and for the detection and determination of related steroids present as impurities in the drug. The method was compared with the normal-phase technique of the USP XX and was also applied to the analysis of tablets and injectable suspensions. PMID:16867645

  19. Mass spectrometry for real-time quantitative breath analysis

    Czech Academy of Sciences Publication Activity Database

    Smith, D.; Španěl, Patrik; Herbig, J.; Beauchamp, J.

    2014-01-01

    Vol. 8, No. 2 (2014), 027101. ISSN 1752-7155. Institutional support: RVO:61388955. Keywords: breath analysis * proton transfer reaction mass spectrometry * selected ion flow tube mass spectrometry. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 4.631, year: 2014

  20. Analysis of blood spots for polyfluoroalkyl chemicals

    International Nuclear Information System (INIS)

    Polyfluoroalkyl chemicals (PFCs) have been detected in humans, in the environment, and in ecosystems around the world. The potential for developmental and reproductive toxicities of some PFCs is of concern especially to children's health. In the United States, a sample of a baby's blood, called a 'dried blood spot' (DBS), is obtained from a heel stick within 48 h of a child's birth. DBS could be useful for assessing prenatal exposure to PFCs. We developed a method based on online solid phase extraction coupled with high performance liquid chromatography-isotope dilution tandem mass spectrometry for measuring four PFCs in DBS, perfluorooctane sulfonate (PFOS), perfluorohexane sulfonate, perfluorooctanoate (PFOA), and perfluorononanoate. The analytical limits of detection using one whole DBS (∼75 μL of blood) were -1. To validate the method, we analyzed 98 DBS collected in May 2007 in the United States. PFOS and PFOA were detected in all DBS at concentrations in the low ng mL-1 range. These data suggest that DBS may be a suitable matrix for assessing perinatal exposure to PFCs, but additional information related to sampling and specimen storage is needed to demonstrate the utility of these measures for assessing exposure.

  1. Isotopes in chemical analysis for water management

    International Nuclear Information System (INIS)

    Surface or underground water circulations and interactions are more and more often studied with the help of geochemistry and more particularly by using isotopic tracers. These isotopic tracer techniques make it possible, in particular, to define for each system under study the natural or anthropic origin of the chemical elements, their behaviour, their transport in the different compartments, the circulation schemes of deep fluids and their interaction with the surrounding rocks. This article presents: 1 - the isotopes: definition, measurements and uses (stable and unstable isotopes, measurement means, some examples: stable isotopes of the water molecule, boron isotopes, sulfur and oxygen isotopes of sulfates, strontium isotopes, nitrogen isotopes of nitrates); 2 - isotopes and water cycle: atmospheric tracing (rainfall signal at the drainage basin scale and at the country scale, aerosols characterization in urban areas), management of alluvial aquifers, underground waters and origin of nitrogenous contaminations, underground and surface waters in the context of aquifer floods: the case of the Somme basin, underground waters at the basin scale: heterogeneities, interactions and management processes (stable isotopes of the water molecule, S and O isotopes of dissolved sulfates, strontium isotopes); 3 - conclusion. (J.S.)

  2. A quantitative analysis of contractility in active cytoskeletal protein networks.

    Science.gov (United States)

    Bendix, Poul M; Koenderink, Gijsje H; Cuvelier, Damien; Dogic, Zvonimir; Koeleman, Bernard N; Brieher, William M; Field, Christine M; Mahadevan, L; Weitz, David A

    2008-04-15

    Cells actively produce contractile forces for a variety of processes including cytokinesis and motility. Contractility is known to rely on myosin II motors which convert chemical energy from ATP hydrolysis into forces on actin filaments. However, the basic physical principles of cell contractility remain poorly understood. We reconstitute contractility in a simplified model system of purified F-actin, muscle myosin II motors, and alpha-actinin cross-linkers. We show that contractility occurs above a threshold motor concentration and within a window of cross-linker concentrations. We also quantify the pore size of the bundled networks and find contractility to occur at a critical distance between the bundles. We propose a simple mechanism of contraction based on myosin filaments pulling neighboring bundles together into an aggregated structure. Observations of this reconstituted system in both bulk and low-dimensional geometries show that the contracting gels pull on and deform their surface with a contractile force of approximately 1 microN, or approximately 100 pN per F-actin bundle. Cytoplasmic extracts contracting in identical environments show a similar behavior and dependence on myosin as the reconstituted system. Our results suggest that cellular contractility can be sensitively regulated by tuning the (local) activity of molecular motors and the cross-linker density and binding affinity. PMID:18192374

  3. Quantitative analysis of the nephron during human fetal kidney development

    Directory of Open Access Journals (Sweden)

    Daković-Bjelaković Marija Z.

    2005-01-01

    end of the intrauterine development (LM X), when corpuscles occupied 16.73% of the cortical volume. The volume density of the developing nephrons (corpuscular and tubular portion) showed a significant positive correlation (r = 0.85; p < 0.01) with gestational age. Conclusion. The present study was one of the few quantitative studies of the developing human nephron. Knowledge about the normal development of the human kidney should be important for future medical practice.

  4. Quantitative analysis of cell-free DNA in ovarian cancer

    Science.gov (United States)

    SHAO, XUEFENG; He, YAN; JI, MIN; CHEN, XIAOFANG; QI, JING; SHI, WEI; HAO, TIANBO; JU, SHAOQING

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases, and 19 healthy, non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which was higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients exhibiting ovarian cancer, and bDNA techniques are more useful for detecting cf-DNA than other factors. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
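
    The reported figures (area under the ROC curve and sensitivity) can be reproduced from raw cf-DNA values with a standard ROC analysis. The sketch below shows one way to do this with scikit-learn; the cf-DNA values, group sizes and cut-off rule are illustrative assumptions, not the study's data.

```python
# Sketch of the ROC analysis reported in the abstract (AUC and sensitivity),
# using scikit-learn on simulated cf-DNA values.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
cfdna = np.concatenate([
    rng.normal(60, 15, 36),     # ovarian cancer group: higher cf-DNA (illustrative)
    rng.normal(30, 10, 41),     # benign tumour + healthy control groups
])
labels = np.array([1] * 36 + [0] * 41)   # 1 = cancer, 0 = control

auc = roc_auc_score(labels, cfdna)
fpr, tpr, thresholds = roc_curve(labels, cfdna)

# One common way to pick a cut-off: maximise Youden's J = sensitivity + specificity - 1.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.3f}, sensitivity = {tpr[best]:.1%} at cut-off {thresholds[best]:.1f}")
```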

  5. Analysis of MRI and RI defecogram for quantitative evaluation of defecation

    Energy Technology Data Exchange (ETDEWEB)

    Tsukahara, Yuuki; Ikawa, Hiromichi; Okamoto, Shinya; Masuyama, Hiroaki; Taniuchi, Mayumi; Okajima, Hideaki; Konuma, Kunio; Kohno, Miyuki [Kanazawa Medical Univ., Uchinada, Ishikawa (Japan)

    2001-12-01

    For quantitative analysis of rectal emptying, RI defecogram was performed. The subjects were 9 patients with Hirschsprung's disease and 6 patients with anorectal anomaly. The evacuation times of the Hirschsprung's disease group and the anorectal anomaly group were significantly prolonged compared with that of the normal group. RI defecogram revealed that rectal evacuation was impaired in both the Hirschsprung's disease group and the anorectal anomaly group. RI defecogram is useful for quantitative evaluation of defecation. (author)

  6. The Brain Network for Deductive Reasoning: A Quantitative Meta-analysis of 28 Neuroimaging Studies

    OpenAIRE

    Prado, Jérôme; Chadha, Angad; Booth, James R.

    2011-01-01

    Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that neuroimaging findings are inconsistent is not based on any quantitative evidence. Here, we report the results of a quantitative meta-analysis of 28 neuroima...

  7. Reliability centered maintenance (RCM): quantitative analysis of an induction electric furnace

    OpenAIRE

    Diego Santos Cerveira; Miguel Afonso Sellitto

    2015-01-01

    The purpose of this article is to define a maintenance strategy for an electric induction furnace installed in a special steels foundry. The research method was quantitative modeling. The proposed method is based on Reliability-Centered Maintenance (RCM), applied to industrial equipment. Quantitative analyses of reliability, availability and maintainability were used to support the definition of the maintenance strategy for the equipment. For the research, historical data were collected from ...

  8. Quantitative Phosphoproteomics Analysis of Nitric Oxide–Responsive Phosphoproteins in Cotton Leaf

    OpenAIRE

    Fan, Shuli; Meng, Yanyan; Song, Meizhen; Pang, Chaoyou; Wei, Hengling; Liu, Ji; Zhan, Xianjin; Lan, Jiayang; Feng, Changhui; Zhang, Shengxi; Yu, Shuxun

    2014-01-01

    Knowledge of phosphorylation events and their regulation is crucial to understanding the functional biology of plant proteins, but very little is currently known about nitric oxide–responsive phosphorylation in plants. Here, we report the first large-scale, quantitative phosphoproteome analysis of cotton (Gossypium hirsutum) treated with sodium nitroprusside (nitric oxide donor) by utilizing the isobaric tag for relative and absolute quantitation (iTRAQ) method. A total of 1315 unique phospho...

  9. Uncertainty and sensitivity analysis in quantitative pest risk assessments; practical rules for risk assessors

    OpenAIRE

    David Makowski

    2013-01-01

    Quantitative models have several advantages compared to qualitative methods for pest risk assessments (PRA). Quantitative models do not require the definition of categorical ratings and can be used to compute numerical probabilities of entry and establishment, and to quantify spread and impact. These models are powerful tools, but they include several sources of uncertainty that need to be taken into account by risk assessors and communicated to decision makers. Uncertainty analysis (UA) and ...

  10. Volume-Rendering-Based Interactive 3D Measurement for Quantitative Analysis of 3D Medical Images

    OpenAIRE

    Yakang Dai; Jian Zheng; Yuetao Yang; Duojie Kuai; Xiaodong Yang

    2013-01-01

    3D medical images are widely used to assist diagnosis and surgical planning in clinical applications, where quantitative measurement of interesting objects in the image is of great importance. Volume rendering is widely used for qualitative visualization of 3D medical images. In this paper, we introduce a volume-rendering-based interactive 3D measurement framework for quantitative analysis of 3D medical images. In the framework, 3D widgets and volume clipping are integrated with volume render...

  11. The Impact of Remittances on Human Development: A Quantitative Analysis and Policy Implications.

    OpenAIRE

    Üstübici, Ayşen; Irdam, Darja

    2012-01-01

    Abstract: This paper contributes to the discussions on the nexus between migration and development by assessing the effects of remittances on human development. We do so first through a quantitative approach, and second, by elaborating the findings of our quantitative analysis within a broader theoretical and policy framework. By using OLS, we measure the impact of remittances on human development and compare it with the effect of foreign direct investment (FDI) and official development assis...

  12. A quantitative method for the characterisation of karst aquifers based on spring hydrograph analysis

    OpenAIRE

    Kovács, Attila; Perrochet, Pierre; Király, László; Jeannin, Pierre-Yves

    2016-01-01

    This paper presents a method for characterizing flow systems in karst aquifers by acquiring quantitative information about the geometric and hydraulic aquifer parameters from spring hydrograph analysis. Numerical sensitivity analyses identified two fundamentally different flow domains, depending on the overall configuration of aquifer parameters. These two domains have been quantitatively characterized by deducing analytical solutions for the global hydraulic response of simple two-dimensiona...

  13. Geothermal Power Plant Maintenance: Evaluating Maintenance System Needs Using Quantitative Kano Analysis

    OpenAIRE

    Reynir S. Atlason; Gudmundur V. Oddsson; Runar Unnthorsson

    2014-01-01

    A quantitative Kano model is used in this study to identify which features are preferred by top-level maintenance engineers within Icelandic geothermal power plants to be implemented in a maintenance tool or software. Visits were conducted to the largest Icelandic energy companies operating geothermal power plants. Thorough interviews with chiefs of operations and maintenance were used as a basis for a quantitative Kano analysis. Thirty seven percent of all maintenance engineers at Reykjavik ...

  14. Quantitatively driven visualization and analysis on emerging architectures

    International Nuclear Information System (INIS)

    We live in a world of ever-increasing amounts of information that is not only dynamically changing but also dramatically changing in complexity. This trend of 'information overload' has quickly overwhelmed our capabilities to explore, hypothesize, and thus fully interpret the underlying details in these data. To further complicate matters, the computer architectures that have traditionally provided improved performance are undergoing a revolutionary change as manufacturers transition to building multi- and many-core processors. While these trends have the potential to lead to new scientific breakthroughs via simulation and modeling, they will do so in a disruptive manner, potentially placing a significant strain on software development activities including the overall data analysis process. In this paper we explore an approach that exploits these emerging architectures to provide an integrated environment for high-performance data analysis and visualization

  15. Quantitative Safety and Security Analysis from a Communication Perspective

    Directory of Open Access Journals (Sweden)

    Boris Malinowsky

    2015-12-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective on the communication protocols. The results are obtained using the network simulator ns-3.

  16. Quantitative Safety and Security Analysis from a Communication Perspective

    DEFF Research Database (Denmark)

    Malinowsky, Boris; Schwefel, Hans-Peter; Jung, Oliver

    2014-01-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective on the...

  17. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    Science.gov (United States)

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus using confocal laser scanning microscope images of fixed, immunofluorescence stained cells as data source. The ImageJ-based process includes the segmentation of the nucleus, furthermore measurements of total fluorescence intensities, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as primary ROI, and the nuclear periphery as secondary ROI. PMID:27576710
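
    A rough Python equivalent of the three ROI measurements named above (integrated intensity, staining heterogeneity and frequency of the brightest pixels) is sketched below. The protocol itself is ImageJ-based; the image, masks and bright-pixel fraction here are assumptions for illustration.

```python
# Sketch of the ROI measurements named above (total intensity, heterogeneity,
# frequency of the brightest pixels), written in Python/numpy rather than the
# ImageJ pipeline the protocol actually uses; the image and masks are synthetic.
import numpy as np

def roi_statistics(intensity, roi_mask, bright_fraction=0.05):
    """Summarise a histone-PTM channel inside a binary ROI mask."""
    values = intensity[roi_mask]
    total = values.sum()                                  # integrated fluorescence
    heterogeneity = values.std() / values.mean()          # coefficient of variation
    cutoff = np.quantile(intensity[intensity > 0], 1 - bright_fraction)
    brightest_freq = (values >= cutoff).mean()            # share of top-intensity pixels in ROI
    return total, heterogeneity, brightest_freq

# Hypothetical 12-bit confocal frame and two nested ROIs.
img = np.random.default_rng(0).integers(0, 4096, (512, 512)).astype(float)
perinucleolar = np.zeros_like(img, dtype=bool); perinucleolar[200:260, 200:260] = True
periphery     = np.zeros_like(img, dtype=bool); periphery[:20, :] = True

for name, mask in [("perinucleolar", perinucleolar), ("periphery", periphery)]:
    print(name, roi_statistics(img, mask))
```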

  18. Quantitative analysis of structural neuroimaging of mesial temporal lobe epilepsy

    OpenAIRE

    Memarian, N; Thompson, PM; Engel, J.; Staba, RJ

    2013-01-01

    Mesial temporal lobe epilepsy (MTLE) is the most common of the surgically remediable drug-resistant epilepsies. MRI is the primary diagnostic tool to detect anatomical abnormalities and, when combined with EEG, can more accurately identify an epileptogenic lesion, which is often hippocampal sclerosis in cases of MTLE. As structural imaging technology has advanced the surgical treatment of MTLE and other lesional epilepsies, so too have the analysis techniques that are used to measure differen...

  19. Towards a quantitative analysis of magnetic force microscopy data matrices

    Energy Technology Data Exchange (ETDEWEB)

    Chiolerio, A., E-mail: alessandro.chiolerio@iit.it [Applied Science and Technology Department, Politecnico di Torino, Corso Duca degli Abruzzi 24, IT-10129 Torino (Italy); IIT-Italian Institute of Technology at POLITO, Center for Space Human Robotics, Corso Trento 21, IT-10129 Torino (Italy); Allia, P. [Applied Science and Technology Department, Politecnico di Torino, Corso Duca degli Abruzzi 24, IT-10129 Torino (Italy)

    2012-08-15

    Fast and efficient software tools previously developed in image processing were adapted to the analysis of raw datasets consisting of multiple stacks of images taken on a sample interacting with a measuring instrument and submitted to the effect of an external parameter. Magnetic force microscopy (MFM), a follow-up of atomic force microscopy (AFM), was selected as a first testbed example. In MFM, a specifically developed ferromagnetic scanning tip probes the stray magnetic field generated from a ferromagnetic specimen. Raw scanning probe images taken on soft patterned magnetic materials and continuous thin films were used, together with synthetic patterns exploited to assess the absolute performance ability of the proposed texture analysis tools. In this case, the parameter affecting the sample-instrument interaction is the applied magnetic field. The application discussed here is just one among the many possible, including, e.g., real-time microscopy images (both optical and electronic) taken during heat treatments, phase transformations and so on. Basically any image exhibiting a texture with a characteristic spatial or angular dependence could be processed by the proposed method. Standard imaging tools such as texture mapping and novel data representation schemes such as texture analysis, feature extraction and classification are discussed. A magnetic texture stability diagram will be presented as an original output of the entropic analysis on MFM datasets. - Highlights: ► Texture mapping was used to combine MFM maps with AFM ones. ► A new representation scheme, called texture trajectory diagram (TTD), was proposed. ► A single variable with radial resolution, called consistency, was introduced. ► A magnetic texture stability diagram based on entropy was proposed. ► These numeric instruments were used to evaluate 2-D features of
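
    As an illustration of the entropy-based texture measures mentioned above, the sketch below computes the grey-level entropy of a synthetic MFM-like frame as a function of applied field; it is not the texture trajectory diagram or consistency variable of the paper.

```python
# Sketch of an entropy-type texture measure computed per frame of an image stack
# acquired at different applied fields. The frames below are synthetic noise maps.
import numpy as np

def image_entropy(image, bins=64, value_range=(-5.0, 5.0)):
    """Shannon entropy (bits) of the grey-level histogram; fixed range keeps frames comparable."""
    hist, _ = np.histogram(image, bins=bins, range=value_range)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic stack: one MFM-like frame per value of the applied magnetic field.
rng = np.random.default_rng(3)
fields_mT = np.linspace(-50, 50, 11)
stack = [rng.normal(0.0, 0.5 + abs(h) / 50, (256, 256)) for h in fields_mT]

for h, frame in zip(fields_mT, stack):
    print(f"H = {h:+6.1f} mT   entropy = {image_entropy(frame):.3f} bits")
```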

  20. Flow of funds analysis : BOJ quantitative monetary policy examined

    OpenAIRE

    辻村, 和佑; Mizoshita, Masako

    2001-01-01

    Flow of funds (FOF) analysis stems from "Social Accounting for Money flows" authored by Morris Copeland in 1949. Since then it has developed as an accounting system describing the inter-sectoral financial transactions between the economic actors. FOF accounts were included in the System of National Accounts in 1968 along with National Income Accounts, National Balance Sheet, Balance of International Payments Accounts and Input-Output Tables. FOF Accounts consist of balance sheets of the ec...

  1. Quantitative Analysis of Criteria in University Building Maintenance in Malaysia

    OpenAIRE

    Olanrewaju Ashola Abdul-Lateef

    2010-01-01

    University buildings are a significant part of university assets and considerable resources are committed to their design, construction and maintenance. The core of maintenance management is to optimize productivity and user satisfaction with optimum resources. An important segment in the maintenance management system is the analysis of criteria that influence building maintenance. Therefore, this paper aims to identify, quantify, rank and discuss the criteria that influence maintenance costs,...

  2. Quantitative Analysis of Gray and White Matter in Williams Syndrome

    OpenAIRE

    Faria, Andreia Vasconcellos; Landau, Barbara; O’Hearn, Kirsten M.; Li, Xin; Jiang, Hangyi; Oishi, Kenichi; Zhang, Jiangyang; Mori, Susumu

    2012-01-01

    Williams Syndrome is a developmental disorder with a genetic basis, which results in an uneven cognitive profile with relatively strong language skills and severely impaired visuospatial abilities. To better understand the brain structure underlying this profile, we compared individuals with Williams Syndrome to controls using multimodal neuroimaging data and new analytic methods (diffeomorphic mapping and atlas-based analysis). People with Williams Syndrome had basal ganglia atrophy, while t...

  3. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Qualitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies were presented. Different ways of sample preparation were tested and, based on these results, a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  4. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    Science.gov (United States)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For quantitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Qualitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies were presented. Different ways of sample preparation were tested and, based on these results, a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.
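
    The classical quantitative-metallography route referred to in both records reduces, at its simplest, to measuring phase area fractions on a selectively etched micrograph. A minimal thresholding sketch is given below; the micrograph and threshold are synthetic assumptions.

```python
# Sketch of the classical quantitative-metallography step: estimating phase (area)
# fractions of a selectively etched duplex microstructure by grey-level thresholding.
# The "micrograph" is synthetic; a real analysis would load a calibrated image and
# pick the threshold from its grey-level histogram.
import numpy as np

rng = np.random.default_rng(1)
micrograph = rng.normal(180, 15, (512, 512))                 # bright matrix phase
micrograph[:, 200:320] = rng.normal(60, 15, (512, 120))      # dark, selectively etched band

threshold = 128                                              # assumed from the histogram
dark_phase = micrograph < threshold

print(f"dark (etched) phase area fraction:  {dark_phase.mean():.1%}")
print(f"bright phase area fraction:         {1 - dark_phase.mean():.1%}")
```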

  5. Quantitative Study and Analysis of Physical Education Teaching Based on Origin

    OpenAIRE

    XingZhi Li

    2013-01-01

    Using Origin, this study randomly selects four student curricula and conducts descriptive, correlation and regression analyses. The purpose is to better understand quantitative data analysis in physical education and, from this perspective, to review and improve the teaching program. Finally, based on the main results, several recommendations are proposed that complement and extend work in the physical education area.

  6. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    Science.gov (United States)

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  7. Visualisation and quantitative analysis of flat continuous water jet structure

    Czech Academy of Sciences Publication Activity Database

    Ščučka, Jiří; Zeleňák, Michal; Foldyna, Josef; Lehocká, D.; Votavová, H.

    Ostrava: Ústav geoniky AV ČR, v.v.i., 2015 - (Sitek, L.; Klichová, D.), pp. 195-205. ISBN 978-80-86407-56-2. [Vodní paprsek 2015 (Water Jet 2015 - research, development, applications). Velké Losiny (CZ), 06.10.2015-08.10.2015]. R&D Projects: GA MŠk ED2.1.00/03.0082; GA MŠk(CZ) LO1406. Institutional support: RVO:68145535. Keywords: descaling * flat continuous water jet * visualisation * shadowgraph technique * image analysis. Subject RIV: JQ - Machines; Tools

  8. Quantitative analysis of mouse corpus callosum from electron microscopy images

    Directory of Open Access Journals (Sweden)

    Kathryn L. West

    2015-12-01

    This article provides a morphometric analysis of 72 electron microscopy images from control (n=4) and hypomyelinated (n=2) mouse corpus callosum. Measures of axon diameter and g-ratio were tabulated across all brains from two regions of the corpus callosum, and a non-linear relationship between axon diameter and g-ratio was observed. These data are related to the accompanying research article comparing multiple methods of measuring the g-ratio, entitled ‘A revised model for estimating g-ratio from MRI’ (West et al., NeuroImage, 2015).
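
    The g-ratio tabulated in this dataset is conventionally defined as the inner (axon) diameter divided by the outer (axon plus myelin sheath) diameter. A minimal sketch with hypothetical measurements:

```python
# Sketch of the g-ratio tabulated in this dataset: g = inner (axon) diameter /
# outer (axon + myelin) diameter. The example measurements are hypothetical.
import numpy as np

axon_diameter_um  = np.array([0.6, 0.8, 1.1, 1.6, 2.3])   # inner diameter
fiber_diameter_um = np.array([0.9, 1.1, 1.5, 2.0, 2.8])   # outer diameter incl. myelin

g_ratio = axon_diameter_um / fiber_diameter_um
print(np.round(g_ratio, 3))

# Quick look at the trend between axon diameter and g-ratio (reported as non-linear).
print("r =", np.corrcoef(axon_diameter_um, g_ratio)[0, 1])
```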

  9. Interleukin-2 signaling pathway analysis by quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Osinalde, Nerea; Moss, Helle; Arrizabalaga, Onetsine;

    2011-01-01

    by IL-2, we aimed to define the global tyrosine-phosphoproteome of IL-2 pathway in human T cell line Kit225 using high resolution mass spectrometry combined with phosphotyrosine immunoprecipitation and SILAC. The molecular snapshot at 5min of IL-2 stimulation resulted in identification of 172...... proteins among which 79 were found with increased abundance in the tyrosine-phosphorylated complexes, including several previously not reported IL-2 downstream effectors. Combinatorial site-specific phosphoproteomic analysis resulted in identification of 99 phosphorylated sites mapping to the identified...

  10. Chemical form analysis method of particulate nickel compounds

    International Nuclear Information System (INIS)

    Under PWR primary chemistry conditions, nickel is present as metallic nickel, nickel oxide and nickel ferrite. The distribution among these chemical forms depends on the Ni/Fe ratio and the chemistry conditions, especially the dissolved hydrogen concentration. Nickel is the parent element of Co-58, and its chemical form is therefore important for Co-58 generation. A method for chemical form analysis of nickel has been developed that exploits the different dissolution characteristics of the nickel compounds: metallic nickel is first separated from the other compounds with nitric acid, and the remainder is then divided into nickel oxide and nickel ferrite with oxalic acid. Crud samples from the primary coolant of a PWR were analyzed using this method. The method is not complex and can be carried out in the chemical laboratory of a nuclear power plant. (author)

  11. Histopathological image analysis of chemical-induced hepatocellular hypertrophy in mice.

    Science.gov (United States)

    Asaoka, Yoshiji; Togashi, Yuko; Mutsuga, Mayu; Imura, Naoko; Miyoshi, Tomoya; Miyamoto, Yohei

    2016-04-01

    Chemical-induced hepatocellular hypertrophy is frequently observed in rodents, and is mostly caused by the induction of phase I and phase II drug metabolic enzymes and peroxisomal lipid metabolic enzymes. Liver weight is a sensitive and commonly used marker for detecting hepatocellular hypertrophy, but is also increased by a number of other factors. Histopathological observations subjectively detect changes such as hepatocellular hypertrophy based on the size of a hepatocyte. Therefore, quantitative microscopic observations are required to evaluate histopathological alterations objectively. In the present study, we developed a novel quantitative method for an image analysis of hepatocellular hypertrophy using liver sections stained with hematoxylin and eosin, and demonstrated its usefulness for evaluating hepatocellular hypertrophy induced by phenobarbital (a phase I and phase II enzyme inducer) and clofibrate (a peroxisomal enzyme inducer) in mice. The algorithm of this imaging analysis was designed to recognize an individual hepatocyte through a combination of pixel-based and object-based analyses. Hepatocellular nuclei and the surrounding non-hepatocellular cells were recognized by the pixel-based analysis, while the areas of the recognized hepatocellular nuclei were then expanded until they ran against their expanding neighboring hepatocytes and surrounding non-hepatocellular cells by the object-based analysis. The expanded area of each hepatocellular nucleus was regarded as the size of an individual hepatocyte. The results of this imaging analysis showed that changes in the sizes of hepatocytes corresponded with histopathological observations in phenobarbital and clofibrate-treated mice, and revealed a correlation between hepatocyte size and liver weight. In conclusion, our novel image analysis method is very useful for quantitative evaluations of chemical-induced hepatocellular hypertrophy. PMID:26776450
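
    The seeded-expansion idea described above (growing each nucleus until it meets its expanding neighbours) can be approximated with a marker-based watershed, as sketched below with scikit-image; this is a stand-in for the authors' pipeline, and the nucleus and tissue masks are synthetic.

```python
# Sketch of the seeded-expansion idea: each detected hepatocyte nucleus is grown
# until it meets its expanding neighbours, and the resulting territory is taken as
# the cell size. Implemented with a marker-based watershed as a stand-in for the
# proprietary pipeline; nucleus and tissue masks are synthetic.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# Synthetic masks: True where hepatocyte nuclei / hepatocyte tissue were detected.
rng = np.random.default_rng(0)
nuclei_mask = np.zeros((200, 200), dtype=bool)
for y, x in rng.integers(10, 190, (30, 2)):
    nuclei_mask[y - 2:y + 3, x - 2:x + 3] = True
tissue_mask = np.ones_like(nuclei_mask)          # non-hepatocellular regions would be False

# Label each nucleus, then expand the labels outwards until regions collide.
markers, n_nuclei = ndi.label(nuclei_mask)
distance = ndi.distance_transform_edt(~nuclei_mask)
cells = watershed(distance, markers, mask=tissue_mask)

# Area of each expanded region ~ size of an individual hepatocyte.
areas = np.bincount(cells.ravel())[1:]           # skip background label 0
print(f"{n_nuclei} cells, mean area = {areas.mean():.1f} px")
```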

  12. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Selection of parental materials and of a good mating design are keys to a successful conventional plant breeding programme. However, several factors affect the choice of mating design. A mating design is the procedure by which the progenies are produced; in plant breeding and genetics, breeders and geneticists use different forms of mating designs and arrangements, both theoretically and practically, for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost and other biological limitations. In all mating designs, individuals are taken randomly and crossed to produce progenies that are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the components of variance. Therefore, this review aims at highlighting the mating designs most used in plant breeding and genetics studies. It provides an easy and quick insight into the different forms of mating designs and some statistical components for successful plant breeding.
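
    For the simplest case mentioned above, a balanced half-sib design analysed by one-way analysis of variance, the sire variance component follows directly from the mean squares, and the additive genetic variance is commonly approximated as four times that component. A sketch on simulated records:

```python
# Sketch of variance-component estimation for a balanced half-sib design analysed
# by one-way ANOVA. Under standard assumptions the additive genetic variance is
# roughly 4 x the sire component. The phenotypic records below are simulated.
import numpy as np

rng = np.random.default_rng(42)
n_sires, n_progeny = 20, 15                       # balanced design (assumed)
sire_effects = rng.normal(0, 2.0, n_sires)        # true family effects
records = sire_effects[:, None] + rng.normal(0, 6.0, (n_sires, n_progeny))

family_means = records.mean(axis=1)
grand_mean = records.mean()

ms_between = n_progeny * ((family_means - grand_mean) ** 2).sum() / (n_sires - 1)
ms_within = ((records - family_means[:, None]) ** 2).sum() / (n_sires * (n_progeny - 1))

var_sire = (ms_between - ms_within) / n_progeny   # sire variance component
var_additive = 4 * var_sire                       # half-sib covariance = 1/4 of V_A
heritability = var_additive / (var_sire + ms_within)

print(f"V_sire = {var_sire:.2f}, V_A ~ {var_additive:.2f}, h^2 ~ {heritability:.2f}")
```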

  13. A quantitative comparison of functional MRI cluster analysis.

    Science.gov (United States)

    Dimitriadou, Evgenia; Barth, Markus; Windischberger, Christian; Hornik, Kurt; Moser, Ewald

    2004-05-01

    The aim of this work is to compare the efficiency and power of several cluster analysis techniques on fully artificial (mathematical) and synthesized (hybrid) functional magnetic resonance imaging (fMRI) data sets. The clustering algorithms used are hierarchical, crisp (neural gas, self-organizing maps, hard competitive learning, k-means, maximin-distance, CLARA) and fuzzy (c-means, fuzzy competitive learning). To compare these methods we use two performance measures, namely the correlation coefficient and the weighted Jaccard coefficient (wJC). Both performance coefficients (PCs) clearly show that the neural gas and the k-means algorithm perform significantly better than all the other methods using our setup. For the hierarchical methods the ward linkage algorithm performs best under our simulation design. In conclusion, the neural gas method seems to be the best choice for fMRI cluster analysis, given its correct classification of activated pixels (true positives (TPs)) whilst minimizing the misclassification of inactivated pixels (false positives (FPs)), and in the stability of the results achieved. PMID:15182847
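
    The exact weighting used in the paper's weighted Jaccard coefficient is not given in the abstract; the sketch below shows the plain Jaccard coefficient between a cluster-derived activation map and the known truth map, which the weighted variant refines.

```python
# Sketch of a Jaccard-type performance measure for scoring cluster results against
# the known activation map of synthetic fMRI data. This is the plain (unweighted)
# Jaccard coefficient; the weighting scheme of the study is not reproduced here.
import numpy as np

def jaccard(detected, truth):
    """TP / (TP + FP + FN) for two binary activation maps."""
    detected, truth = detected.astype(bool), truth.astype(bool)
    tp = np.logical_and(detected, truth).sum()
    fp = np.logical_and(detected, ~truth).sum()
    fn = np.logical_and(~detected, truth).sum()
    return tp / (tp + fp + fn)

# Toy 1-D "activation maps": 1 = pixel assigned to the activated cluster.
truth    = np.array([0, 0, 1, 1, 1, 1, 0, 0, 0, 0])
detected = np.array([0, 0, 1, 1, 1, 0, 1, 0, 0, 0])
print(f"Jaccard coefficient = {jaccard(detected, truth):.2f}")   # 3 / (3 + 1 + 1) = 0.60
```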

  14. Quantitative and qualitative analysis of hydrosoluble organic matter in bitumen leachates

    OpenAIRE

    Walczak, Isabelle; Libert, Marie-Françoise; Camaro, Sylvie; Blanchard, Jean-Marie

    2001-01-01

    Bitumen is currently used as an embedding matrix for low and intermediate level radioactive waste disposal in deep sediments. High impermeability and a great resistance to most chemicals are the two main properties sought. Indeed, the generation of water-soluble organic complexing agents could affect the integrity of the wasteform due to an increase of radionuclide solubility. Therefore, the aim of this study is the quantitative and qualitative characterisation of so...

  15. Quantitative Analysis of Bloggers Collective Behavior Powered by Emotions

    CERN Document Server

    Mitrović, Marija; Tadić, Bosiljka

    2010-01-01

    Large-scale data resulting from users online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study emergence of the emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite network of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion-classifier developed for this type of texts. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and tempo...

  16. QUANTITATIVE ANALYSIS OF BANDED STRUCTURES IN DUAL-PHASE STEELS

    Directory of Open Access Journals (Sweden)

    Benoit Krebs

    2011-05-01

    Dual-Phase (DP) steels are composed of martensite islands dispersed in a ductile ferrite matrix, which provides a good balance between strength and ductility. Current processing conditions (continuous casting followed by hot and cold rolling) generate 'banded structures', i.e., irregular, parallel and alternating bands of ferrite and martensite, which are detrimental to mechanical properties and especially to in-use properties. We present an original and simple method to quantify the intensity and wavelength of these bands. This method, based on the analysis of the covariance function of binary images, is first tested on model images. It is compared with the ASTM E-1268 standard and appears to be more robust. It is then applied to real DP steel microstructures and proves to be sufficiently sensitive to discriminate samples resulting from different thermo-mechanical routes.
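
    The covariance-function analysis described above can be illustrated on a synthetic banded image: the two-point covariance along the direction normal to the bands oscillates with a period equal to the band wavelength. A minimal sketch (the microstructure and band geometry are invented for the example):

```python
# Sketch of a covariance-function analysis of a banded binary microstructure
# (martensite = 1). C(h) along the direction normal to the bands oscillates with
# a period equal to the band wavelength. The banded image is synthetic.
import numpy as np

height, width, period = 200, 200, 20
rows = np.arange(height)[:, None]
binary = (((rows % period) < 8) & np.ones((1, width), dtype=bool)).astype(float)

def covariance(img, max_shift=60):
    """C(h) = probability that a pixel and the pixel h rows below are both martensite."""
    return np.array([(img[: height - h] * img[h:]).mean() for h in range(max_shift)])

c = covariance(binary)

# First local maximum of C(h) for h > 0 estimates the band wavelength.
interior = (c[1:-1] > c[:-2]) & (c[1:-1] > c[2:])
first_peak = np.argmax(interior) + 1
print("estimated band wavelength:", first_peak, "pixels (true value:", period, ")")
```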

  17. Quantitative analysis of overlapping XPS peaks by spectrum reconstruction

    DEFF Research Database (Denmark)

    Graat, Peter C.J.; Somers, Marcel A. J.

    1998-01-01

    The composition and thickness of thin iron oxide films were evaluated from Fe 2p spectra as measured by X-ray photoelectron spectroscopy. To this end the experimental spectra were reconstructed from reference spectra of the constituents Fe⁰, Fe²⁺ and Fe³⁺. The background contributions in the spectra owing to inelastic scattering of signal electrons were calculated from the depth distributions of these constituents and their reference spectra. In the reconstruction procedure the film thickness and the concentrations of Fe²⁺ and Fe³⁺ in the oxide film were used as fit parameters. The values obtained for the oxide film thickness were compared with thickness values determined from the intensity of the corresponding O 1s spectra and with thickness values resulting from ellipsometric analysis. The sensitivity of the reconstruction procedure with regard to film thickness and...
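
    The reconstruction idea, fitting a measured Fe 2p spectrum as a non-negative combination of Fe(0), Fe(2+) and Fe(3+) reference spectra, is sketched below with synthetic Gaussian references; the inelastic-background modelling of the actual procedure is omitted.

```python
# Sketch of spectrum reconstruction: fit a measured Fe 2p spectrum as a
# non-negative linear combination of Fe(0), Fe(2+) and Fe(3+) reference spectra.
# References are synthetic Gaussians; the inelastic background model is omitted.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(700, 730, 600)              # binding energy axis (eV), illustrative

def peak(center, width=1.5):
    return np.exp(-0.5 * ((energy - center) / width) ** 2)

references = np.column_stack([
    peak(706.8),     # "Fe(0)" reference (position assumed for illustration)
    peak(709.8),     # "Fe(2+)" reference
    peak(711.0),     # "Fe(3+)" reference
])

# Synthetic "measured" spectrum: mostly Fe(3+) with some Fe(2+) plus noise.
measured = references @ np.array([0.1, 0.3, 0.6])
measured += np.random.default_rng(0).normal(0, 0.01, energy.size)

weights, residual = nnls(references, measured)
weights /= weights.sum()
print("fitted fractions Fe0 / Fe2+ / Fe3+:", np.round(weights, 2))
```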

  18. Quantitative analysis of a fault tree with priority AND gates

    Energy Technology Data Exchange (ETDEWEB)

    Yuge, T. [Department of Electrical and Electronic Engineering, National Defense Academy, 1-10-20 Hashirimizu, Yokosuka 239-8686 (Japan)], E-mail: yuge@nda.ac.jp; Yanagi, S. [Department of Electrical and Electronic Engineering, National Defense Academy, 1-10-20 Hashirimizu, Yokosuka 239-8686 (Japan)], E-mail: shigeru@nda.ac.jp

    2008-11-15

    A method for calculating the exact top event probability of a fault tree with priority AND gates and repeated basic events is proposed when the minimal cut sets are given. A priority AND gate is an AND gate where the input events must occur in a prescribed order for the occurrence of the output event. It is known that the top event probability of such a dynamic fault tree is obtained by converting the tree into an equivalent Markov model. However, this method is not realistic for a complex system model because the number of states which should be considered in the Markov analysis increases explosively as the number of basic events increases. To overcome the shortcomings of the Markov model, we propose an alternative method to obtain the top event probability in this paper. We assume that the basic events occur independently, exponentially distributed, and the component whose failure corresponds to the occurrence of the basic event is non-repairable. First, we obtain the probability of occurrence of the output event of a single priority AND gate by Markov analysis. Then, the top event probability is given by a cut set approach and the inclusion-exclusion formula. An efficient procedure to obtain the probabilities corresponding to logical products in the inclusion-exclusion formula is proposed. The logical product which is composed of two or more priority AND gates having at least one common basic event as their inputs is transformed into the sum of disjoint events which are equivalent to a priority AND gate in the procedure. Numerical examples show that our method works well for complex systems.

  19. Quantitative analysis of a fault tree with priority AND gates

    International Nuclear Information System (INIS)

    A method for calculating the exact top event probability of a fault tree with priority AND gates and repeated basic events is proposed when the minimal cut sets are given. A priority AND gate is an AND gate where the input events must occur in a prescribed order for the occurrence of the output event. It is known that the top event probability of such a dynamic fault tree is obtained by converting the tree into an equivalent Markov model. However, this method is not realistic for a complex system model because the number of states which should be considered in the Markov analysis increases explosively as the number of basic events increases. To overcome the shortcomings of the Markov model, we propose an alternative method to obtain the top event probability in this paper. We assume that the basic events occur independently, exponentially distributed, and the component whose failure corresponds to the occurrence of the basic event is non-repairable. First, we obtain the probability of occurrence of the output event of a single priority AND gate by Markov analysis. Then, the top event probability is given by a cut set approach and the inclusion-exclusion formula. An efficient procedure to obtain the probabilities corresponding to logical products in the inclusion-exclusion formula is proposed. The logical product which is composed of two or more priority AND gates having at least one common basic event as their inputs is transformed into the sum of disjoint events which are equivalent to a priority AND gate in the procedure. Numerical examples show that our method works well for complex systems
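
    For the simplest building block treated in both records, a single priority AND gate whose two inputs A and B are independent, exponentially distributed and non-repairable and must occur in the order A then B, the output probability by time t has the closed form P(t) = 1 - exp(-lambda_B t) - lambda_B/(lambda_A + lambda_B) * (1 - exp(-(lambda_A + lambda_B) t)). The sketch below checks this against Monte Carlo; the paper's general cut-set and inclusion-exclusion machinery is not reproduced.

```python
# Sketch for a single priority AND gate with two independent, exponentially
# distributed, non-repairable inputs that must occur in the order A then B.
# The closed-form output probability is checked by Monte Carlo simulation.
import numpy as np

def priority_and_prob(lam_a, lam_b, t):
    """P(A occurs before B and both have occurred by time t)."""
    return (1 - np.exp(-lam_b * t)
            - lam_b / (lam_a + lam_b) * (1 - np.exp(-(lam_a + lam_b) * t)))

lam_a, lam_b, t = 1e-3, 5e-4, 1000.0     # illustrative failure rates and mission time

rng = np.random.default_rng(0)
ta = rng.exponential(1 / lam_a, 1_000_000)
tb = rng.exponential(1 / lam_b, 1_000_000)
mc = np.mean((ta < tb) & (tb <= t))

print(f"analytical = {priority_and_prob(lam_a, lam_b, t):.4f}, Monte Carlo = {mc:.4f}")
```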

  20. Characterisation of transmission Raman spectroscopy for rapid quantitative analysis of intact multi-component pharmaceutical capsules.

    Science.gov (United States)

    Hargreaves, Michael D; Macleod, Neil A; Smith, Mark R; Andrews, Darren; Hammond, Stephen V; Matousek, Pavel

    2011-02-20

    A detailed characterisation of the performance of transmission Raman spectroscopy was performed from the standpoint of rapid quantitative analysis of pharmaceutical capsules using production-relevant formulations comprising active pharmaceutical ingredient (API) and 3 common pharmaceutical excipients. This research builds on our earlier studies that identified the unique benefits of transmission Raman spectroscopy compared to conventional Raman spectroscopy. These include the ability to provide bulk information of the content of capsules, thus avoiding the sub-sampling problem, and the suppression of interference from the capsule shell. This study demonstrates, for the first time, the technique's insensitivity to the amount of material held within the capsules. Different capsule sizes with different overall fill weights (100-400 mg) and capsule shell colours were assayed with a single calibration model developed using only one weight and size sample set (100 mg) to a relative error of typically 5s acquisition time. Models built using the same calibration set also predicted the 3 low-level excipients with relative errors of 5-15%. The quantity of API was also predicted (with a relative error within ∼3%) using the same model for capsules prepared with different generations of API (i.e. API manufactured via different processes). The study provides further foundation blocks for the establishment of this emerging technique as a routine pharmaceutical analysis tool, capitalising on the inherently high chemical specificity of Raman spectroscopy and the non-invasive nature of the measurement. Ultimately, this technique has significant promise as a Process Analytical Technology (PAT) tool for online production application. PMID:20947277
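
    A multivariate calibration of the kind used for transmission Raman assays can be sketched with a partial least squares (PLS) regression from whole spectra to API content, as below. The spectra, component count and reference values are simulated placeholders; the study's actual preprocessing and model settings are not reproduced.

```python
# Sketch of a multivariate calibration for transmission Raman assay of capsules:
# a PLS regression mapping whole spectra to API content. All data are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_capsules, n_wavenumbers = 60, 800
api_content = rng.uniform(20, 40, n_capsules)               # % w/w reference assay values

# Simulated spectra: an API band scaling with content plus excipient background and noise.
axis = np.linspace(0, 1, n_wavenumbers)
api_band = np.exp(-0.5 * ((axis - 0.4) / 0.02) ** 2)
background = np.exp(-0.5 * ((axis - 0.7) / 0.1) ** 2)
spectra = (api_content[:, None] * api_band
           + 30 * background
           + rng.normal(0, 0.5, (n_capsules, n_wavenumbers)))

pls = PLSRegression(n_components=3)                          # component count assumed
predicted = cross_val_predict(pls, spectra, api_content, cv=10).ravel()

rel_error = np.abs(predicted - api_content) / api_content * 100
print(f"mean relative error = {rel_error.mean():.1f}%")
```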