WorldWideScience

Sample records for sources including quantitative

  1. Spectro-refractometry of individual microscopic objects using swept-source quantitative phase imaging.

    Science.gov (United States)

    Jung, Jae-Hwang; Jang, Jaeduck; Park, Yongkeun

    2013-11-05

    We present a novel spectroscopic quantitative phase imaging technique with a wavelength swept-source, referred to as swept-source diffraction phase microscopy (ssDPM), for quantifying the optical dispersion of microscopic individual samples. Employing the swept-source and the principle of common-path interferometry, ssDPM measures the multispectral full-field quantitative phase imaging and spectroscopic microrefractometry of transparent microscopic samples in the visible spectrum with a wavelength range of 450-750 nm and a spectral resolution of less than 8 nm. With unprecedented precision and sensitivity, we demonstrate the quantitative spectroscopic microrefractometry of individual polystyrene beads, 30% bovine serum albumin solution, and healthy human red blood cells.
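
    For orientation, a minimal Python sketch of the standard quantitative-phase relation underlying such spectro-refractometry (phase delay = 2π·h·Δn/λ) is given below. The function name, bead thickness, phase values and water dispersion numbers are illustrative assumptions, not data from the record above.

```python
import numpy as np

def refractive_index_from_phase(phase_rad, wavelength_m, thickness_m, n_medium):
    """Recover the sample refractive index from a measured optical phase delay,
    assuming the usual QPI relation phase = 2*pi*thickness*(n_sample - n_medium)/wavelength.
    All inputs here are illustrative, not values from the ssDPM paper."""
    return n_medium + phase_rad * wavelength_m / (2.0 * np.pi * thickness_m)

# Hypothetical example: a 10 um bead in water measured at four wavelengths.
wavelengths = np.array([450e-9, 550e-9, 650e-9, 750e-9])   # m
phases = np.array([28.0, 22.5, 18.9, 16.3])                # rad (made-up)
n_water = np.array([1.340, 1.335, 1.332, 1.330])           # approximate water dispersion
print(refractive_index_from_phase(phases, wavelengths, 10e-6, n_water))
```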

  2. Synchrotron radiation as a source for quantitative XPS: advantages and consequences

    International Nuclear Information System (INIS)

    Rosseel, T.M.; Carlson, T.A.; Negri, R.E.; Beall, C.E.; Taylor, J.W.

    1986-01-01

    Synchrotron radiation (SR) has a variety of properties which make it an attractive source for quantitative x-ray photoelectron spectroscopy (XPS). Among the most significant are high intensity and tunability. In addition, the intensity of the dispersed radiation is comparable to laboratory line sources. Synchrotron radiation is also a clean source, i.e., it will not contaminate the sample, because it operates under ultra-high vacuum conditions. We have used these properties to demonstrate the advantages of SR as a source for quantitative XPS. We have also found several consequences associated with this source which can either limit its use or provide unique opportunities for analysis and research. Using the tunability of SR, we have measured the energy dependence of the 3p photoionization cross sections of Ti, Cr, and Mn from 50 to 150 eV above threshold at the University of Wisconsin's Tantalus electron-storage ring

  3. Quantitative characterization of urban sources of organic aerosol by high-resolution gas chromatography

    International Nuclear Information System (INIS)

    Hildemann, L.M.; Mazurek, M.A.; Cass, G.R.; Simoneit, B.R.T.

    1991-01-01

    Fine aerosol emissions have been collected from a variety of urban combustion sources, including an industrial boiler, a fireplace, automobiles, diesel trucks, gas-fired home appliances, and meat cooking operations, by use of a dilution sampling system. Other sampling techniques have been utilized to collect fine aerosol samples of paved road dust, brake wear, tire wear, cigarette smoke, tar pot emissions, and vegetative detritus. The organic matter contained in each of these samples has been analyzed via high-resolution gas chromatography. By use of a simple computational approach, a quantitative, 50-parameter characterization of the elutable fine organic aerosol emitted from each source type has been determined. The organic mass distribution fingerprints obtained by this approach are shown to differ significantly from each other for most of the source types tested, using hierarchical cluster analysis
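
    The fingerprint comparison step lends itself to a short sketch: hierarchical clustering of per-source mass-distribution fingerprints. This only illustrates the kind of cluster analysis mentioned above, with random placeholder data rather than the measured 50-parameter profiles.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = emission source types, columns = fingerprint parameters
# (mass fractions per elution window).  Random placeholders, not real data.
rng = np.random.default_rng(0)
fingerprints = rng.random((6, 50))
fingerprints /= fingerprints.sum(axis=1, keepdims=True)   # normalise each fingerprint

distances = pdist(fingerprints, metric="euclidean")       # pairwise distances
tree = linkage(distances, method="average")               # agglomerative clustering
labels = fcluster(tree, t=3, criterion="maxclust")        # cut into at most 3 clusters
print(labels)                                             # cluster membership per source type
```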

  4. Portable instrumentation for quantitatively measuring radioactive surface contaminations, including 90Sr

    International Nuclear Information System (INIS)

    Brodzinski, R.L.

    1983-10-01

    In order to measure the effectiveness of decontamination efforts, a quantitative analysis of the radiocontamination is necessary, both before and after decontamination. Since it is desirable to release the decontaminated material for unrestricted use or disposal, the assay equipment must provide adequate sensitivity to measure the radioactivity at or below the release limit. In addition, the instrumentation must be capable of measuring all kinds of radiocontaminants including fission products, activation products, and transuranic materials. Finally, the survey instrumentation must be extremely versatile in order to assay the wide variety of contaminated surfaces in many environments, some of which may be extremely hostile or remote. This communication describes the development and application of portable instrumentation capable of quantitatively measuring most transuranics, activation products, and fission products, including 90Sr, on almost any contaminated surface in nearly any location

  5. Quantitative phase imaging of living cells with a swept laser source

    Science.gov (United States)

    Chen, Shichao; Zhu, Yizheng

    2016-03-01

    Digital holographic phase microscopy is a well-established quantitative phase imaging technique. However, interference artifacts from inside the system, typically induced by elements whose optical thicknesses are within the source coherence length, limit the imaging quality as well as sensitivity. In this paper, a swept laser source based technique is presented. Spectra acquired at a number of wavelengths, after Fourier Transform, can be used to identify the sources of the interference artifacts. With proper tuning of the optical pathlength difference between sample and reference arms, it is possible to avoid these artifacts and achieve sensitivity below 0.3 nm. Performance of the proposed technique is examined in live cell imaging.

  6. A quantitative PGNAA study for use in aqueous solution measurements using Am–Be neutron source and BGO scintillation detector

    Energy Technology Data Exchange (ETDEWEB)

    Ghal-Eh, N., E-mail: ghal-eh@du.ac.ir [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Ahmadi, P. [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Doost-Mohammadi, V. [Nuclear Science and Technology Research Center, AEOI, P.O. Box 11365-8486, Tehran (Iran, Islamic Republic of)

    2016-02-01

    A prompt gamma neutron activation analysis (PGNAA) system including an Am–Be neutron source and a BGO scintillation detector is used for quantitative analysis of bulk samples. Both Monte Carlo-simulated and experimental data are considered as input data libraries for two different procedures based on neural network and least-squares methods. The results confirm the feasibility and precision of the proposed methods.

  7. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    Science.gov (United States)

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well-established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drift, thereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. During method validation, the quantitative performance of the original stainless steel ion source was compared with that of a modified ion source. Ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source, however, a more stable signal was observed, resulting in acceptable linearity and precision. Moreover, it was also found that sensitivity improved compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.
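
    The internal-standard observation above boils down to the relative standard deviation of the analyte/internal-standard response factor. A small hedged sketch (made-up peak areas, not the paper's data) shows how differing drift of the two signals keeps the response-factor RSD high:

```python
import numpy as np

def response_factor_rsd(analyte_areas, istd_areas):
    """RSD (%) of the analyte/internal-standard response factor over repeated injections."""
    rf = np.asarray(analyte_areas, float) / np.asarray(istd_areas, float)
    return 100.0 * rf.std(ddof=1) / rf.mean()

# Hypothetical repeated injections (arbitrary peak-area units).
analyte = [1050, 980, 910, 860, 815, 790]       # analyte signal drifting downwards
istd = [2000, 1995, 1990, 2005, 1998, 2002]     # internal standard barely drifting
print(f"RSD of response factor: {response_factor_rsd(analyte, istd):.1f}%")
```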

  8. SU-D-210-03: Limited-View Multi-Source Quantitative Photoacoustic Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Feng, J; Gao, H [Shanghai Jiao Tong University, Shanghai, Shanghai (China)

    2015-06-15

    Purpose: This work investigates a novel limited-view multi-source acquisition scheme for the direct and simultaneous reconstruction of optical coefficients in quantitative photoacoustic tomography (QPAT), which has potentially improved signal-to-noise ratio and reduced data acquisition time. Methods: Conventional QPAT is often considered in two steps: first to reconstruct the initial acoustic pressure from the full-view ultrasonic data after each optical illumination, and then to quantitatively reconstruct optical coefficients (e.g., absorption and scattering coefficients) from the initial acoustic pressure, using a multi-source or multi-wavelength scheme. With the novel limited-view multi-source scheme considered here, the optical coefficients have to be reconstructed directly from the ultrasonic data, since the initial acoustic pressure can no longer be reconstructed as an intermediate variable due to the incomplete acoustic data in the proposed limited-view scheme. In this work, based on a coupled photo-acoustic forward model combining the diffusion approximation and the wave equation, we develop a limited-memory Quasi-Newton method (LBFGS) for image reconstruction that utilizes the adjoint forward problem for fast computation of gradients. Furthermore, tensor framelet sparsity is utilized to improve the image reconstruction, which is solved by the Alternating Direction Method of Multipliers (ADMM). Results: The simulation was performed on a modified Shepp-Logan phantom to validate the feasibility of the proposed limited-view scheme and its corresponding image reconstruction algorithms. Conclusion: A limited-view multi-source QPAT scheme is proposed, i.e., partial-view acoustic data acquisition accompanying each optical illumination, followed by simultaneous rotation of both optical sources and ultrasonic detectors for the next optical illumination. Moreover, LBFGS and ADMM algorithms are developed for the direct reconstruction of optical coefficients from the

  9. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of the disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for image and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open source library that implements conversion of the data stored in commonly used research formats into the standard DICOM representation. dcmqi source code is distributed under BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 AACR.

  10. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    Science.gov (United States)

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to GitHub, and (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
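
    The QIFE itself is MATLAB; as a language-neutral illustration of the architecture described above (swappable input / pre-processing / feature-computation / output stages plus object-level parallelism), a Python sketch with placeholder stage implementations might look like this:

```python
from multiprocessing import Pool

def load_object(path):          # input stage (placeholder)
    return {"path": path, "voxels": None}

def preprocess(obj):            # pre-processing stage (placeholder, e.g. resampling)
    return obj

def compute_features(obj):      # feature-computation stage (placeholder)
    return {"path": obj["path"], "volume_mm3": 0.0, "sphericity": 0.0}

def run_one(path):
    return compute_features(preprocess(load_object(path)))

if __name__ == "__main__":
    tumor_paths = [f"tumor_{i:03d}.nii" for i in range(108)]
    with Pool(processes=4) as pool:            # object-level parallelization
        results = pool.map(run_one, tumor_paths)
    print(len(results), "objects processed")
```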

  11. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  12. Energy-Water Nexus Relevant to Baseload Electricity Source Including Mini/Micro Hydropower Generation

    Science.gov (United States)

    Fujii, M.; Tanabe, S.; Yamada, M.

    2014-12-01

    Water, food and energy are three sacred treasures necessary for human beings. However, factors such as population growth and the rapid increase in energy consumption have generated conflicts between water and energy. For example, enhanced energy use has caused conflicts between hydropower generation and riverine ecosystems and service water, between shale gas and ground water, and between geothermal energy and hot spring water. This study aims to provide quantitative guidelines necessary for capacity building among various stakeholders to minimize water-energy conflicts in enhancing energy use. Among various kinds of renewable energy sources, we target baseload sources, especially renewable energy whose installation is required socially not only to reduce CO2 and other greenhouse gas emissions but also to stimulate the local economy. Such renewable energy sources include micro/mini hydropower and geothermal. Three municipalities in Japan, Beppu City, Obama City and Otsuchi Town, are selected as the primary sites of this study. Based on the calculated potential supply and demand of micro/mini hydropower generation in Beppu City, for example, we estimate that the electricity demand of tens to hundreds of households could be covered by installing new micro/mini hydropower generation plants along each river. However, this result is based on the existing infrastructure such as roads and electric lines, which means that a larger potential can be expected if the local society chooses options that enhance the infrastructure to accommodate more micro/mini hydropower generation plants. In addition, further capacity building in the local society is necessary. In Japan, for example, regulations under the river law and irrigation rights restrict new entry by actors to the river. Possible influences on riverine ecosystems when installing new micro/mini hydropower generation plants should also be well taken into account. Deregulation of the existing laws relevant to rivers and

  13. Quantitative determination of minor and trace elements in rocks and soils by spark source mass spectrometry

    International Nuclear Information System (INIS)

    Ure, A.M.; Bacon, J.R.

    1978-01-01

    Experimental details are given of the quantitative determination of minor and trace elements in rocks and soils by spark source mass spectrometry. The effects of interfering species, and corrections that can be applied, are discussed. (U.K.)

  14. Auralization of airborne sound insulation including the influence of source room

    DEFF Research Database (Denmark)

    Rindel, Jens Holger

    2006-01-01

    The paper describes a simple and acoustically accurate method for the auralization of airborne sound insulation between two rooms by means of room acoustic simulation software (ODEON). The method makes use of a frequency-independent transparency of the transmitting surface combined ... with a frequency-dependent power setting of the source in the source room. The acoustic properties in terms of volume and reverberation time as well as the area of the transmitting surface are all included in the simulation. The user only has to select the position of the source in the source room and the receiver ... of the transmitting surface is used for the simulation of sound transmission. Also the reduced clarity of the auralization due to the reverberance of the source room is inherent in the method. Currently the method is restricted to transmission loss data in octave bands ...

  15. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    Science.gov (United States)

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to ...). CD3 quantitation with the customized open source algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
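
    The custom thresholding/positive-pixel-counting measure is easy to illustrate. The sketch below is a NumPy analogue of that idea (the published workflow used ImageJ); the stain channel, tissue mask and threshold are placeholders.

```python
import numpy as np

def positive_pixel_percent(stain_intensity, tissue_mask, threshold=0.3):
    """Percentage of tissue pixels whose stain intensity exceeds a threshold
    (an analogue of the custom CD3+% measure; the threshold is illustrative)."""
    positive = (stain_intensity > threshold) & tissue_mask
    return 100.0 * positive.sum() / tissue_mask.sum()

rng = np.random.default_rng(1)
stain = rng.random((512, 512))           # stand-in for a colour-deconvolved CD3 channel
mask = np.ones_like(stain, dtype=bool)   # stand-in tissue mask
print(f"CD3+ area: {positive_pixel_percent(stain, mask):.1f}%")
```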

  16. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    Full Text Available The high sensitivity and advanced onboard calibration on the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enables accurate measurements of low light radiances which lead to enhanced quantitative applications at night. The finer spatial resolution of DNB also allows users to examine social economic activities at urban scales. Given the growing interest in the use of the DNB data, there is a pressing need for better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low light calibration accuracy was previously estimated at a moderate 15% using extended sources while the long-term stability has yet to be characterized. There are also several science related questions to be answered, for example, how the Earth’s atmosphere and surface variability contribute to the stability of the DNB measured radiances; how to separate them from instrument calibration stability; whether or not SI (International System of Units) traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be

  17. 77 FR 6463 - Revisions to Labeling Requirements for Blood and Blood Components, Including Source Plasma...

    Science.gov (United States)

    2012-02-08

    ... Blood Components, Including Source Plasma; Correction AGENCY: Food and Drug Administration, HHS. ACTION..., Including Source Plasma,'' which provided incorrect publication information regarding a 60-day notice that...

  18. Space-time quantitative source apportionment of soil heavy metal concentration increments.

    Science.gov (United States)

    Yang, Yong; Christakos, George; Guo, Mingwu; Xiao, Lu; Huang, Wei

    2017-04-01

    Assessing the space-time trends and detecting the sources of heavy metal accumulation in soils have important consequences in the prevention and treatment of soil heavy metal pollution. In this study, we collected soil samples in the eastern part of the Qingshan district, Wuhan city, Hubei Province, China, during the period 2010-2014. The Cd, Cu, Pb and Zn concentrations in soils exhibited a significant accumulation during 2010-2014. The spatiotemporal Kriging (STK) technique, based on a quantitative characterization of soil heavy metal concentration variations in terms of non-separable variogram models, was employed to estimate the spatiotemporal soil heavy metal distribution in the study region. Our findings showed that the Cd, Cu, and Zn concentrations have an obvious incremental tendency from the southwestern to the central part of the study region. However, the Pb concentrations exhibited an obvious tendency from the northern part to the central part of the region. Then, spatial overlay analysis (SOA) was used to obtain absolute and relative concentration increments of adjacent 1- or 5-year periods during 2010-2014. The spatial distribution of soil heavy metal concentration increments showed that the larger increments occurred in the center of the study region. Lastly, principal component analysis combined with multiple linear regression (PCA-MLR) was employed to quantify the source apportionment of the soil heavy metal concentration increments in the region. Our results led to the conclusion that the sources of soil heavy metal concentration increments should be ascribed to industry, agriculture and traffic. In particular, 82.5% of the soil heavy metal concentration increment during 2010-2014 was ascribed to industrial/agricultural sources. STK and SOA were used to obtain the spatial distribution of heavy metal concentration increments in soils, and PCA-MLR was used to quantify their source apportionment. Copyright © 2017
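
    A simplified sketch of the PCA-plus-regression apportionment idea (absolute principal component scores regressed against each metal) can be written with scikit-learn. The data, the number of retained components and the factor labels below are placeholders, not the study's values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
metals = rng.random((200, 4))                    # Cd, Cu, Pb, Zn increments (placeholders)

scaler = StandardScaler().fit(metals)
pca = PCA(n_components=2).fit(scaler.transform(metals))
scores = pca.transform(scaler.transform(metals))

# Absolute scores: subtract the score of an artificial zero-concentration sample.
score_zero = pca.transform(scaler.transform(np.zeros((1, 4))))
apcs = scores - score_zero

# Regress each metal's increment on the absolute scores; the fitted terms apportion
# the increment between the factors, the intercept being the unexplained part.
for j, name in enumerate(["Cd", "Cu", "Pb", "Zn"]):
    reg = LinearRegression().fit(apcs, metals[:, j])
    contributions = reg.coef_ * apcs.mean(axis=0)
    print(name, "factor contributions:", contributions.round(3),
          "intercept:", round(float(reg.intercept_), 3))
```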

  19. Perceived relevance and information needs regarding food topics and preferred information sources among Dutch adults: results of a quantitative consumer study

    NARCIS (Netherlands)

    Dillen, van S.M.E.; Hiddink, G.J.; Koelen, M.A.; Graaf, de C.; Woerkum, van C.M.J.

    2004-01-01

    Objective: For more effective nutrition communication, it is crucial to identify sources from which consumers seek information. Our purpose was to assess perceived relevance and information needs regarding food topics, and preferred information sources by means of quantitative consumer research.

  20. 76 FR 62451 - Avon Products, Inc., Including On-Site Leased Workers From Spherion/Source Right, Springdale...

    Science.gov (United States)

    2011-10-07

    ...., Including On-Site Leased Workers From Spherion/Source Right, Springdale, Ohio; Amended Certification... workers of the subject firm. The company reports that workers leased from Spherion/Source Right were...., including on-site leased workers from Spherion/Source Right, Springdale, Ohio, who became totally or...

  1. Quantitative spark-source analysis of UO2-PuO2 for rare earths and tantalum, tungsten

    International Nuclear Information System (INIS)

    Alkire, G.J.

    A quantitative analytical technique good to 20% for the determination of Sm, Eu, Gd, Dy, Ta, and W in Pu-U mixed oxides by spark source mass spectrography has been developed. The technique uses La as an added internal standard and an electronic integrator to measure peak areas of each line photometered on the densitometer. 3 tables

  2. Visible light scatter as quantitative information source on milk constituents

    DEFF Research Database (Denmark)

    Melentiyeva, Anastasiya; Kucheryavskiy, Sergey; Bogomolov, Andrey

    2012-01-01

    Fat and protein are two major milk nutrients that are routinely analyzed in the dairy industry. Growing food quality requirements promote the dissemination of spectroscopic analysis, enabling real... analysis. The main task here is to extract individual quantitative information on milk fat and total protein content from spectral data. This is a particularly challenging problem in the case of raw natural milk, where the fat globule sizes may essentially differ depending on the source. ... A designed set of raw milk samples with simultaneously varying fat, total protein and particle size distribution has been analyzed in the Vis spectral region. The feasibility of raw milk analysis by PLS regression on spectral data has been proved. The root mean-square errors were below 0.10% and 0.04% for fat...
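
    The PLS calibration step described above can be sketched with scikit-learn; the synthetic "spectra" below merely stand in for measured visible-range scatter spectra, and the component count and error figures are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200
fat = rng.uniform(1.0, 6.0, n_samples)           # % fat (placeholder reference values)
protein = rng.uniform(2.5, 4.0, n_samples)       # % total protein (placeholder)
spectra = (np.outer(fat, rng.random(n_wavelengths))
           + np.outer(protein, rng.random(n_wavelengths))
           + 0.01 * rng.standard_normal((n_samples, n_wavelengths)))

reference = np.column_stack([fat, protein])
pls = PLSRegression(n_components=5)
predicted = cross_val_predict(pls, spectra, reference, cv=10)
rmsecv = np.sqrt(((predicted - reference) ** 2).mean(axis=0))
print("RMSECV fat = %.3f %%, protein = %.3f %%" % tuple(rmsecv))
```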

  3. Estimating true human and animal host source contribution in quantitative microbial source tracking using the Monte Carlo method.

    Science.gov (United States)

    Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan

    2010-09-01

    Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are not 100% specific and sensitive for the target sequence in their respective hosts' genome. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. Then, the Monte Carlo method is applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error. It could perform reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). Further improvement on the precision of sample processing and q
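
    A stripped-down version of the Monte Carlo idea (not the authors' exact model) is to sample the uncertain assay parameters and the measurement error, then invert a simple total-probability relation measured = true × sensitivity + false-positive signal. All distributions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 100_000

measured = 10 ** rng.normal(np.log10(5e4), 0.1, n_draws)    # qPCR result with precision error
sensitivity = rng.beta(80, 5, n_draws)                       # fraction of true marker recovered
false_positive = 10 ** rng.normal(2.0, 0.3, n_draws)         # signal from non-target hosts

true_conc = (measured - false_positive) / sensitivity
true_conc = true_conc[true_conc > 0]                         # drop non-physical draws

lo, mid, hi = np.percentile(true_conc, [2.5, 50, 97.5])
print(f"true concentration ~ {mid:.3g} (95% interval {lo:.3g} - {hi:.3g}) copies per volume")
```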

  4. Quantitative identification of nitrate pollution sources and uncertainty analysis based on dual isotope approach in an agricultural watershed.

    Science.gov (United States)

    Ji, Xiaoliang; Xie, Runting; Hao, Yun; Lu, Jun

    2017-10-01

    Quantitative identification of nitrate (NO3⁻-N) sources is critical to the control of nonpoint source nitrogen pollution in an agricultural watershed. Combined with water quality monitoring, we adopted the environmental isotope (δD-H2O, δ18O-H2O, δ15N-NO3⁻, and δ18O-NO3⁻) analysis and the Markov Chain Monte Carlo (MCMC) mixing model to determine the proportions of riverine NO3⁻-N inputs from four potential NO3⁻-N sources, namely, atmospheric deposition (AD), chemical nitrogen fertilizer (NF), soil nitrogen (SN), and manure and sewage (M&S), in the ChangLe River watershed of eastern China. Results showed that NO3⁻-N was the main form of nitrogen in this watershed, accounting for approximately 74% of the total nitrogen concentration. A strong hydraulic interaction existed between the surface and groundwater for NO3⁻-N pollution. The variations of the isotopic composition in NO3⁻-N suggested that microbial nitrification was the dominant nitrogen transformation process in surface water, whereas significant denitrification was observed in groundwater. MCMC mixing model outputs revealed that M&S was the predominant contributor to riverine NO3⁻-N pollution (contributing 41.8% on average), followed by SN (34.0%), NF (21.9%), and AD (2.3%) sources. Finally, we constructed an uncertainty index, UI90, to quantitatively characterize the uncertainties inherent in NO3⁻-N source apportionment and discussed the reasons behind the uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
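
    As a rough illustration of isotope mixing-model logic (a crude accept/reject Monte Carlo, not the authors' MCMC model), one can draw candidate source proportions and keep those whose mixture reproduces the observed river δ15N and δ18O; the end-member signatures, observation and tolerances below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mean (d15N, d18O) signatures of the four end members, in per mil.
sources = np.array([[ 2.0, 60.0],    # atmospheric deposition (AD)
                    [ 0.0, -5.0],    # chemical N fertilizer (NF)
                    [ 5.0,  3.0],    # soil nitrogen (SN)
                    [12.0,  5.0]])   # manure & sewage (M&S)
observed = np.array([7.5, 4.0])      # illustrative river nitrate isotope values
tolerance = np.array([1.0, 2.0])

proportions = rng.dirichlet(np.ones(4), size=200_000)   # candidate source mixes
mixtures = proportions @ sources
keep = np.all(np.abs(mixtures - observed) < tolerance, axis=1)
posterior = proportions[keep]
print("accepted draws:", keep.sum())
print("mean contributions (AD, NF, SN, M&S):", posterior.mean(axis=0).round(3))
```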

  5. Quantitative Real-Time PCR Fecal Source Identification in the ...

    Science.gov (United States)

    Rivers in the Tillamook Basin play a vital role in supporting a thriving dairy and cheese-making industry, as well as providing a safe water resource for local human and wildlife populations. Historical concentrations of fecal bacteria in these waters are at times too high to allow for safe use, leading to economic loss, endangerment of local wildlife, and poor conditions for recreational use. In this study, we employ host-associated qPCR methods for human (HF183/BacR287 and HumM2), ruminant (Rum2Bac), cattle (CowM2 and CowM3), canine (DG3 and DG37), and avian (GFD) fecal pollution combined with high-resolution geographic information system (GIS) land use data and general indicator bacteria measurements to elucidate water quality spatial and temporal trends. Water samples (n=584) were collected over a 1-year period at 29 sites along the Trask, Kilchis, and Tillamook rivers and tributaries (Tillamook Basin, OR). A total of 16.6% of samples (n=97) yielded E. coli levels considered impaired based on Oregon Department of Environmental Quality bacteria criteria (406 MPN/100 mL). Host-associated genetic indicators were detected at frequencies of 39.2% (HF183/BacR287), 16.3% (HumM2), 74.6% (Rum2Bac), 13.0% (CowM2), 26.7% (CowM3), 19.8% (DG3), 3.2% (DG37), and 53.4% (GFD) across all water samples (n=584). Seasonal trends in avian, cattle, and human fecal pollution sources were evident over the study area. On a sample site basis, quantitative fecal source identification and

  6. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems, using annular sources, are widely used in view of their simplicity, wide availability, relatively low price for the complete system and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation and both the incident and take-off angles are not trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and prediction of the response of inhomogeneous samples. A study of the sensitivity of the results due to sample characteristics and a comparison of the results with experimentally determined values for incident and take-off angles is also presented. A flexible and user-friendly computer program was developed in order to perform efficiently the lengthy calculation involved. (author). 14 refs. 5 figs
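
    The effective-angle idea can be sketched by Monte Carlo averaging of the incidence angle over random source and sample points; the annulus radii and source-to-sample distance below are arbitrary, and solid-angle weighting is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Random emission points on an annular source 20 mm above the sample (illustrative geometry).
r_src = np.sqrt(rng.uniform(10.0**2, 15.0**2, n))       # annulus radii 10-15 mm, area-uniform
phi_src = rng.uniform(0, 2 * np.pi, n)
src = np.column_stack([r_src * np.cos(phi_src), r_src * np.sin(phi_src), np.full(n, 20.0)])

# Random points on a 5 mm-radius sample spot in the z = 0 plane.
r_smp = np.sqrt(rng.uniform(0, 5.0**2, n))
phi_smp = rng.uniform(0, 2 * np.pi, n)
smp = np.column_stack([r_smp * np.cos(phi_smp), r_smp * np.sin(phi_smp), np.zeros(n)])

d = smp - src
incidence = np.degrees(np.arccos(np.abs(d[:, 2]) / np.linalg.norm(d, axis=1)))
print(f"effective incidence angle ~ {incidence.mean():.1f} degrees from the surface normal")
```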

  7. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H. [Joint Graduate Group in Bioengineering, University of California, San Francisco and University of California, Berkeley; Department of Radiology, University of California]

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the
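
    The attenuation losses quoted above follow from the exponential attenuation law exp(-μd). A back-of-envelope sketch with approximate textbook attenuation coefficients for water (assumed values, not taken from the paper) reproduces the roughly 50% and 25% figures for a centrally located source in a rat-sized cylinder:

```python
import numpy as np

# Approximate linear attenuation coefficients of water (assumed textbook values).
mu_per_cm = {"I-125 (~30 keV)": 0.38, "Tc-99m (140 keV)": 0.15}
path_cm = 2.0   # source at the centre of a ~4 cm diameter water cylinder (illustrative)

for isotope, mu in mu_per_cm.items():
    surviving = np.exp(-mu * path_cm)              # Beer-Lambert attenuation
    print(f"{isotope}: about {100 * (1 - surviving):.0f}% of photons attenuated")
```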

  8. 76 FR 62452 - Avon Products, Inc. Including On-Site Leased Workers From Spherion/Source Right, Springdale, OH...

    Science.gov (United States)

    2011-10-07

    .... Including On-Site Leased Workers From Spherion/Source Right, Springdale, OH; Amended Certification Regarding... workers of the subject firm. The company reports that workers leased from Spherion/Source Right were...., including on-site leased workers from Spherion/Source Right, Springdale, Ohio, who became totally or...

  9. Quantitative identification and source apportionment of anthropogenic heavy metals in marine sediment of Hong Kong

    Science.gov (United States)

    Zhou, Feng; Guo, Huaicheng; Liu, Lei

    2007-10-01

    Based on ten heavy metals collected twice annually at 59 sites from 1998 to 2004, enrichment factors (EFs), principal component analysis (PCA) and multivariate linear regression of absolute principal component scores (MLR-APCS) were used in identification and source apportionment of the anthropogenic heavy metals in marine sediment. EFs with Fe as a normalizer and local background as reference values were properly tested and found suitable in Hong Kong, and Zn, Ni, Pb, Cu, Cd, Hg and Cr mainly originated from anthropogenic sources, while Al, Mn and Fe were derived from rock weathering. Rotated PCA and GIS mapping further identified two types of anthropogenic sources and their impacted regions: (1) electronic industrial pollution, riparian runoff and vehicle exhaust impacted the entire Victoria Harbour, inner Tolo Harbour, Eastern Buffer, inner Deep Bay and Cheung Chau; and (2) discharges from textile factories and paint influenced Tsuen Wan Bay, Kwun Tong typhoon shelter and Rambler Channel. In addition, MLR-APCS was successfully introduced to quantitatively determine the source contributions with uncertainties almost all less than 8%: the first anthropogenic sources were responsible for 50.0, 45.1, 86.6, 78.9 and 87.5% of the Zn, Pb, Cu, Cd and Hg, respectively, whereas 49.9% of the Ni and 58.4% of the Cr came from the second anthropogenic sources.
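
    The enrichment-factor screening used above reduces to a one-line ratio; a hedged sketch with invented concentrations (not Hong Kong data) is shown below.

```python
def enrichment_factor(metal_sample, fe_sample, metal_background, fe_background):
    """EF = (M/Fe)_sample / (M/Fe)_background, with Fe as the conservative normaliser;
    EF values well above 1 suggest an anthropogenic contribution."""
    return (metal_sample / fe_sample) / (metal_background / fe_background)

# Illustrative sediment concentrations in mg/kg (made-up numbers).
print(round(enrichment_factor(metal_sample=120.0, fe_sample=35000.0,
                              metal_background=25.0, fe_background=30000.0), 2))  # ~4.11
```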

  10. Do the enigmatic "Infrared-Faint Radio Sources" include pulsars?

    Science.gov (United States)

    Hobbs, George; Middelberg, Enno; Norris, Ray; Keith, Michael; Mao, Minnie; Champion, David

    2009-04-01

    The Australia Telescope Large Area Survey (ATLAS) team have surveyed seven square degrees of sky at 1.4 GHz. During processing, some unexpected infrared-faint radio sources (IFRS sources) were discovered. The nature of these sources is not understood, but it is possible that some of these sources may be pulsars within our own galaxy. We propose to observe the IFRS sources with steep spectral indices using standard search techniques to determine whether or not they are pulsars. A pulsar detection would 1) remove a subset of the IFRS sources from the ATLAS sample so they would not need to be observed with large optical/IR telescopes to find their hosts and 2) be intrinsically interesting as the pulsar would be a millisecond pulsar and/or have an extreme spatial velocity.

  11. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Science.gov (United States)

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.
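
    The reporting side of such software is a likelihood ratio LR = P(E|Hp) / P(E|Hd). The sketch below only combines per-locus LRs into an overall log10 LR under an independence assumption; the per-locus values are placeholders, and computing them from peak heights under a continuous model is what Kongoh itself does.

```python
import math

def combined_log10_lr(per_locus_lr):
    """Combine per-locus likelihood ratios by multiplication (loci assumed independent)
    and return the overall log10(LR)."""
    return sum(math.log10(lr) for lr in per_locus_lr)

# Placeholder per-locus LRs for a 15-locus STR profile.
per_locus = [3.2, 1.8, 0.9, 5.4, 2.1, 1.2, 4.0, 0.7, 2.6, 1.5, 3.3, 1.1, 2.2, 1.9, 2.8]
print(f"log10(LR) = {combined_log10_lr(per_locus):.2f}")   # > 0 supports the POI's contribution
```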

  12. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Directory of Open Access Journals (Sweden)

    Sho Manabe

    Full Text Available In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  13. Pseudodynamic Source Characterization for Strike-Slip Faulting Including Stress Heterogeneity and Super-Shear Ruptures

    KAUST Repository

    Mena, B.

    2012-08-08

    Reliable ground‐motion prediction for future earthquakes depends on the ability to simulate realistic earthquake source models. Though dynamic rupture calculations have recently become more popular, they are still computationally demanding. An alternative is to invoke the framework of pseudodynamic (PD) source characterizations that use simple relationships between kinematic and dynamic source parameters to build physically self‐consistent kinematic models. Based on the PD approach of Guatteri et al. (2004), we propose new relationships for PD models for moderate‐to‐large strike‐slip earthquakes that include local supershear rupture speed due to stress heterogeneities. We conduct dynamic rupture simulations using stochastic initial stress distributions to generate a suite of source models in the magnitude range Mw 6–8. This set of models shows that local supershear rupture speed prevails for all earthquake sizes, and that the local rise‐time distribution is not controlled by the overall fault geometry, but rather by local stress changes on the faults. Based on these findings, we derive a new set of relations for the proposed PD source characterization that accounts for earthquake size, buried and surface ruptures, and includes local rise‐time variations and supershear rupture speed. By applying the proposed PD source characterization to several well‐recorded past earthquakes, we verify that significant improvements in fitting synthetic ground motions to observed ones are achieved when comparing our new approach with the model of Guatteri et al. (2004). The proposed PD methodology can be implemented into ground‐motion simulation tools for more physically reliable prediction of shaking in future earthquakes.

  14. Automatisation of reading and interpreting photographically recorded spark source mass spectra for the quantitative analysis in solids

    International Nuclear Information System (INIS)

    Naudin, Guy.

    1976-01-01

    Quantitative analysis in solids by spark source mass spectrometry involves the study of photographic plates by means of a microdensitometer. After a graphic treatment of data from the plate, a scientific program is used to calculate the concentrations of isotopes. The automatisation of these three steps has been realised by using a computer program. This program has been written in the laboratory for a small computer (Multi 8, Intertechnique) [fr]

  15. A practical algorithm for distribution state estimation including renewable energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher [Electronic and Electrical Department, Shiraz University of Technology, Modares Blvd., P.O. 71555-313, Shiraz (Iran); Firouzi, Bahman Bahmani [Islamic Azad University Marvdasht Branch, Marvdasht (Iran)

    2009-11-15

    Renewable energy is energy that is in continuous supply over time. These kinds of energy sources are divided into five principal renewable sources of energy: the sun, the wind, flowing water, biomass and heat from within the earth. According to some studies carried out by research institutes, about 25% of the new generation will be generated by Renewable Energy Sources (RESs) in the near future. Therefore, it is necessary to study the impact of RESs on the power systems, especially on the distribution networks. This paper presents a practical Distribution State Estimation (DSE) including RESs and some practical considerations. The proposed algorithm is based on the combination of Nelder-Mead simplex search and Particle Swarm Optimization (PSO) algorithms, called PSO-NM. The proposed algorithm can estimate load and RES output values by a Weighted Least-Squares (WLS) approach. Some practical considerations are var compensators, Voltage Regulators (VRs), Under Load Tap Changer (ULTC) transformer modeling, which usually have nonlinear and discrete characteristics, and unbalanced three-phase power flow equations. The comparison results with other evolutionary optimization algorithms such as original PSO, Honey Bee Mating Optimization (HBMO), Neural Networks (NNs), Ant Colony Optimization (ACO), and Genetic Algorithm (GA) for a test system demonstrate that PSO-NM is extremely effective and efficient for the DSE problems. (author)
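
    At its core the estimator minimises a weighted least-squares objective over the unknown states. A toy sketch (a two-state made-up measurement model, solved here with SciPy's Nelder-Mead as a stand-in for the PSO-NM hybrid) illustrates the structure:

```python
import numpy as np
from scipy.optimize import minimize

true_state = np.array([1.02, -0.05])            # e.g. a voltage magnitude and angle (toy values)

def h(x):
    """Made-up nonlinear measurement functions of the state."""
    return np.array([x[0], x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

rng = np.random.default_rng(0)
sigma = np.array([0.01, 0.02, 0.02])            # measurement standard deviations
z = h(true_state) + rng.normal(0, sigma)        # noisy measurements

def wls_objective(x):
    r = (z - h(x)) / sigma
    return float(r @ r)                         # sum of weighted squared residuals

result = minimize(wls_objective, x0=np.array([1.0, 0.0]), method="Nelder-Mead")
print("estimated state:", result.x)
```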

  16. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    International Nuclear Information System (INIS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-01-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature

  17. Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source

    International Nuclear Information System (INIS)

    Anizan, N; Wahl, R L; Frey, E C; Wang, H; Zhou, X C

    2015-01-01

    Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1–2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum
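
    The quantity being tracked is a planar calibration factor (net count rate per unit of decay-corrected activity). A short sketch, with invented counts and an approximate Ba-133 half-life of about 10.5 years, shows the computation:

```python
def calibration_factor(roi_counts, background_counts, acq_time_s,
                       activity_mbq_ref, days_since_ref, half_life_days):
    """Planar calibration factor in counts/s per MBq from a sealed-source acquisition:
    background-subtracted ROI count rate divided by the decay-corrected activity."""
    activity_now = activity_mbq_ref * 0.5 ** (days_since_ref / half_life_days)
    net_rate = (roi_counts - background_counts) / acq_time_s
    return net_rate / activity_now

# Illustrative numbers only (not measured data).
cf = calibration_factor(roi_counts=1.2e6, background_counts=3.0e4, acq_time_s=600,
                        activity_mbq_ref=10.0, days_since_ref=400,
                        half_life_days=3854)    # Ba-133 half-life ~10.5 years
print(f"calibration factor = {cf:.1f} cps/MBq")
```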

  18. Controlled Carbon Source Addition to an Alternating Nitrification-Denitrification Wastewater Treatment Process Including Biological P Removal

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Henze, Mogens

    1995-01-01

    The paper investigates the effect of adding an external carbon source on the rate of denitrification in an alternating activated sludge process including biological P removal. Two carbon sources were examined, acetate and hydrolysate derived from biologically hydrolyzed sludge. Preliminary batch ...

  19. Qualitative and Quantitative Evaluation of Multi-source Piroxicam ...

    African Journals Online (AJOL)

    The qualitative and quantitative evaluation of eleven brands of piroxicam capsules marketed in Nigeria is presented. The disintegration time, dissolution rate and absolute drug content were determined in simulated intestinal fluid (SIF) and simulated gastric fluid (SGF) without enzymes. Weight uniformity test was also ...

  20. Track-a-worm, an open-source system for quantitative assessment of C. elegans locomotory and bending behavior.

    Directory of Open Access Journals (Sweden)

    Sijie Jason Wang

    Full Text Available A major challenge of neuroscience is to understand the circuit and gene bases of behavior. C. elegans is commonly used as a model system to investigate how various gene products function at specific tissue, cellular, and synaptic foci to produce complicated locomotory and bending behavior. The investigation generally requires quantitative behavioral analyses using an automated single-worm tracker, which constantly records and analyzes the position and body shape of a freely moving worm at a high magnification. Many single-worm trackers have been developed to meet lab-specific needs, but none has been widely implemented for various reasons, such as hardware difficult to assemble, and software lacking sufficient functionality, having closed source code, or using a programming language that is not broadly accessible. The lack of a versatile system convenient for wide implementation makes data comparisons difficult and compels other labs to develop new worm trackers. Here we describe Track-A-Worm, a system rich in functionality, open in source code, and easy to use. The system includes plug-and-play hardware (a stereomicroscope, a digital camera and a motorized stage, custom software written to run with Matlab in Windows 7, and a detailed user manual. Grayscale images are automatically converted to binary images followed by head identification and placement of 13 markers along a deduced spline. The software can extract and quantify a variety of parameters, including distance traveled, average speed, distance/time/speed of forward and backward locomotion, frequency and amplitude of dominant bends, overall bending activities measured as root mean square, and sum of all bends. It also plots worm travel path, bend trace, and bend frequency spectrum. All functionality is performed through graphical user interfaces and data is exported to clearly-annotated and documented Excel files. These features make Track-A-Worm a good candidate for implementation in

  1. Quantitative evaluation of SIMS spectra including spectrum interpretation and Saha-Eggert correction

    International Nuclear Information System (INIS)

    Steiger, W.; Ruedenauer, F.G.

    1978-01-01

    A spectrum identification program is described, using a computer algorithm which solely relies on the natural isotopic abundances for identification of elemental, molecular and cluster ions. The thermodynamic approach to the quantitative interpretation of SIMS spectra, through the use of the Saha-Eggert equation, is discussed, and a computer program is outlined. (U.K.)
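
    For reference, the Saha-Eggert relation used in such local-thermal-equilibrium treatments of SIMS spectra has the standard textbook form below; the notation is assumed here rather than quoted from the paper.

```latex
\frac{n_{i}\, n_{e}}{n_{a}} \;=\;
\frac{2\, Z_{i}(T)}{Z_{a}(T)}
\left(\frac{2\pi m_{e} k T}{h^{2}}\right)^{3/2}
\exp\!\left(-\frac{E_{\mathrm{ion}}}{kT}\right)
```

    Here n_a, n_i and n_e are the neutral, ion and electron densities, Z_a and Z_i the corresponding partition functions, E_ion the ionization energy and T the effective plasma temperature; the measured secondary-ion currents are corrected for the degree of ionization this relation predicts.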

  2. Quantitative analysis of directional spontaneous emission spectra from light sources in photonic crystals

    International Nuclear Information System (INIS)

    Nikolaev, Ivan S.; Lodahl, Peter; Vos, Willem L.

    2005-01-01

    We have performed angle-resolved measurements of spontaneous-emission spectra from laser dyes and quantum dots in opal and inverse opal photonic crystals. Pronounced directional dependencies of the emission spectra are observed: angular ranges of strongly reduced emission adjoin with angular ranges of enhanced emission. It appears that emission from embedded light sources is affected both by the periodicity and by the structural imperfections of the crystals: the photons are Bragg diffracted by lattice planes and scattered by unavoidable structural disorder. Using a model comprising diffuse light transport and photonic band structure, we quantitatively explain the directional emission spectra. This work provides detailed understanding of the transport of spontaneously emitted light in real photonic crystals, which is essential in the interpretation of quantum optics in photonic-band-gap crystals and for applications wherein directional emission and total emission power are controlled

  3. Performance of two quantitative PCR methods for microbial source tracking of human sewage and implications for microbial risk assessment in recreational waters

    Science.gov (United States)

    Before new, rapid quantitative PCR (qPCR) methods for recreational water quality assessment and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been...

  4. Interlaboratory comparison of three microbial source tracking quantitative polymerase chain reaction (qPCR) assays from fecal-source and environmental samples

    Science.gov (United States)

    Stelzer, Erin A.; Strickler, Kriston M.; Schill, William B.

    2012-01-01

    During summer and early fall 2010, 15 river samples and 6 fecal-source samples were collected in West Virginia. These samples were analyzed by three laboratories for three microbial source tracking (MST) markers: AllBac, a general fecal indicator; BacHum, a human-associated fecal indicator; and BoBac, a ruminant-associated fecal indicator. MST markers were analyzed by means of the quantitative polymerase chain reaction (qPCR) method. The aim was to assess interlaboratory precision when the three laboratories used the same MST marker and shared deoxyribonucleic acid (DNA) extracts of the samples, but different equipment, reagents, and analyst experience levels. The term assay refers to both the markers and the procedure differences listed above. Interlaboratory precision was best for all three MST assays when using the geometric mean absolute relative percent difference (ARPD) and Friedman's statistical test as a measure of interlaboratory precision. Adjustment factors (one for each MST assay) were calculated using results from fecal-source samples analyzed by all three laboratories and applied retrospectively to sample concentrations to account for differences in qPCR results among labs using different standards and procedures. Following the application of adjustment factors to qPCR results, ARPDs were lower; however, statistically significant differences between labs were still observed for the BacHum and BoBac assays. This was a small study and two of the MST assays had 52 percent of samples with concentrations at or below the limit of accurate quantification; hence, more testing could be done to determine if the adjustment factors would work better if the majority of sample concentrations were above the quantification limit.
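
    As a reading aid, the (absolute) relative percent difference between two laboratories' concentration estimates a and b is conventionally computed as below, and the study's summary statistic is the geometric mean of these values over samples; the exact formula used in the report is assumed, not quoted.

```latex
\mathrm{RPD} \;=\; \frac{\lvert a - b \rvert}{(a + b)/2} \times 100\%
```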

  5. Alternative Energy Sources

    CERN Document Server

    Michaelides, Efstathios E (Stathis)

    2012-01-01

    Alternative Energy Sources is designed to give the reader a clear view of the role each form of alternative energy may play in supplying the energy needs of the human society in the near and intermediate future (20-50 years). The first two chapters, on energy demand and supply and environmental effects, set the tone as to why the widespread use of alternative energy is essential for the future of human society. The third chapter exposes the reader to the laws of energy conversion processes, as well as the limitations of converting one energy form to another. The sections on exergy give a succinct, quantitative background on the capability/potential of each energy source to produce power on a global scale. The fourth, fifth and sixth chapters are expositions of fission and fusion nuclear energy. The following five chapters (seventh to eleventh) include detailed descriptions of the most common renewable energy sources – wind, solar, geothermal, biomass, hydroelectric – and some of the less common sources...

  6. Human fecal source identification with real-time quantitative PCR

    Science.gov (United States)

    Waterborne diseases represent a significant public health risk worldwide, and can originate from contact with water contaminated with human fecal material. We describe a real-time quantitative PCR (qPCR) method that targets a Bacteroides dori human-associated genetic marker for...

  7. Quantitative EEG and Current Source Density Analysis of Combined Antiepileptic Drugs and Dopaminergic Agents in Genetic Epilepsy: Two Case Studies.

    Science.gov (United States)

    Emory, Hamlin; Wells, Christopher; Mizrahi, Neptune

    2015-07-01

    Two adolescent females with absence epilepsy were classified, one as attention deficit and the other as bipolar disorder. Physical and cognitive exams identified hypotension, bradycardia, and cognitive dysfunction. Their initial electroencephalograms (EEGs) were considered slightly slow, but within normal limits. Quantitative EEG (QEEG) data included relative theta excess and low alpha mean frequencies. A combined treatment of antiepileptic drugs with a catecholamine agonist/reuptake inhibitor was sequentially used. Both patients' physical and cognitive functions improved and they have remained seizure free. The clinical outcomes were correlated with statistically significant changes in QEEG measures toward normal Z-scores in both anterior and posterior regions. In addition, low resolution electromagnetic tomography (LORETA) Z-scored source correlation analyses of the initial and treated QEEG data showed normalized patterns, supporting a neuroanatomic resolution. This study presents preliminary evidence for a neurophysiologic approach to patients with absence epilepsy and comorbid disorders and may provide a method for further research. © EEG and Clinical Neuroscience Society (ECNS) 2014.

  8. Intravenous streptokinase therapy in acute myocardial infarction: Assessment of therapy effects by quantitative 201Tl myocardial imaging (including SPECT) and radionuclide ventriculography

    International Nuclear Information System (INIS)

    Koehn, H.; Bialonczyk, C.; Mostbeck, A.; Frohner, K.; Unger, G.; Steinbach, K.

    1984-01-01

    To evaluate a potential beneficial effect of systemic streptokinase therapy in acute myocardial infarction, 36 patients treated with streptokinase intravenously were assessed by radionuclide ventriculography and quantitative 201 Tl myocardial imaging (including SPECT) in comparison with 18 conventionally treated patients. Patients after thrombolysis had significantly higher EF, PFR, and PER as well as fewer wall motion abnormalities compared with controls. These differences were also observed in the subset of patients with anterior wall infarction (AMI), but not in patients with inferior wall infarction (IMI). Quantitative 201 Tl imaging demonstrated significantly smaller percent myocardial defects and fewer pathological stress segments in patients with thrombolysis compared with controls. The same differences were also found in both AMI and IMI patients. Our data suggest a favorable effect of intravenous streptokinase on recovery of left ventricular function and myocardial salvage. Radionuclide ventriculography and quantitative 201 Tl myocardial imaging seem to be reliable tools for objective assessment of therapy effects. (orig.)

  9. Combining emission inventory and isotope ratio analyses for quantitative source apportionment of heavy metals in agricultural soil.

    Science.gov (United States)

    Chen, Lian; Zhou, Shenglu; Wu, Shaohua; Wang, Chunhui; Li, Baojie; Li, Yan; Wang, Junxiao

    2018-08-01

    Two quantitative methods (emission inventory and isotope ratio analysis) were combined to apportion source contributions of heavy metals entering agricultural soils in the Lihe River watershed (Taihu region, east China). Source apportionment based on the emission inventory method indicated that for Cd, Cr, Cu, Pb, and Zn, the mean percentage input from atmospheric deposition was highest (62-85%), followed by irrigation (12-27%) and fertilization (1-14%). Thus, the heavy metals were derived mainly from industrial activities and traffic emissions. For Ni the combined percentage input from irrigation and fertilization was approximately 20% higher than that from atmospheric deposition, indicating that Ni was mainly derived from agricultural activities. Based on isotope ratio analysis, atmospheric deposition accounted for 57-93% of Pb entering soil, with a mean value of 69.3%, which indicates that this was the major source of Pb entering soil in the study area. The mean contributions of irrigation and fertilization to Pb pollution of soil ranged from 0% to 10%, indicating that they played only a marginally important role. Overall, the results obtained using the two methods were similar. This study provides a reliable approach for source apportionment of heavy metals entering agricultural soils in the study area, and clearly has potential application for future studies in other regions. Copyright © 2018 Elsevier Ltd. All rights reserved.
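
    The isotope-ratio part of such an apportionment reduces, in its simplest two-end-member form, to a linear mixing calculation of the kind sketched below; the 206Pb/207Pb values are hypothetical and are not taken from the study.

```python
def binary_mixing_fraction(r_sample, r_end_a, r_end_b):
    """Fraction of end-member A in a sample from a single isotope ratio,
    assuming two-end-member mixing and comparable Pb concentrations in
    both end-members (a common simplification)."""
    return (r_sample - r_end_b) / (r_end_a - r_end_b)

# Hypothetical ratios: soil input 1.165, atmospheric deposition 1.150, fertilizer 1.190
f_atmospheric = binary_mixing_fraction(1.165, 1.150, 1.190)
print(f"atmospheric contribution = {f_atmospheric:.0%}")
```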

  10. Artificial intelligence methods applied for quantitative analysis of natural radioactive sources

    International Nuclear Information System (INIS)

    Medhat, M.E.

    2012-01-01

    Highlights: ► Basic description of artificial neural networks. ► Natural gamma-ray sources and the problem of their detection. ► Application of a neural network for peak detection and activity determination. - Abstract: The artificial neural network (ANN) is one of the artificial intelligence methods used for modeling and uncertainty estimation in different applications. The objective of the proposed work was to apply ANN to identify isotopes and to predict the uncertainties of their activities for some natural radioactive sources. The method was tested for analyzing gamma-ray spectra emitted from natural radionuclides in soil samples, detected by high-resolution gamma-ray spectrometry based on HPGe (high-purity germanium). The principle of the suggested method is described, including the definition of relevant input parameters, input data scaling and network training. It is clear that there is satisfactory agreement between the obtained and predicted results using the neural network.
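
    As an illustration of the general idea (not the authors' network architecture or training data), a small multilayer-perceptron regression from toy gamma-ray spectra to activities can be set up with scikit-learn as follows.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Toy spectra: counts per channel, with three "photopeak" windows whose areas
# determine three activities; real inputs would be measured HPGe spectra.
rng = np.random.default_rng(0)
X = rng.poisson(lam=50, size=(300, 128)).astype(float)
windows = [(20, 30), (60, 70), (100, 110)]
y = np.column_stack([X[:, a:b].sum(axis=1) for a, b in windows]) * 0.1  # "activities"

X_train, X_test, y_train, y_test = train_test_split(X / 100.0, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)                       # input scaling and training
print("R^2 on held-out spectra:", round(ann.score(X_test, y_test), 2))
```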

  11. Multi-source quantitative photoacoustic tomography in a diffusive regime

    International Nuclear Information System (INIS)

    Bal, Guillaume; Ren, Kui

    2011-01-01

    Photoacoustic tomography (PAT) is a novel hybrid medical imaging technique that aims to combine the large contrast of optical coefficients with the high-resolution capabilities of ultrasound. We assume that the first step of PAT, namely the reconstruction of a map of absorbed radiation from ultrasound boundary measurement, has been done. We focus on quantitative photoacoustic tomography, which aims at quantitatively reconstructing the optical coefficients from knowledge of the absorbed radiation map. We present a non-iterative procedure to reconstruct such optical coefficients, namely the diffusion and absorption coefficients, and the Grüneisen coefficient when the propagation of radiation is modeled by a second-order elliptic equation. We show that PAT measurements allow us to uniquely reconstruct only two out of the above three coefficients, even when data are collected using an arbitrary number of radiation illuminations. We present uniqueness and stability results for the reconstructions of two such parameters and demonstrate the accuracy of the reconstruction algorithm with numerical reconstructions from two-dimensional synthetic data
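
    The second-order elliptic model referred to above is commonly written as follows in quantitative PAT; the notation is the standard one and is assumed rather than quoted from the paper.

```latex
-\nabla\!\cdot\!\big(D(x)\,\nabla u(x)\big) + \sigma_a(x)\, u(x) = 0 \ \ \text{in } \Omega,
\qquad u = g \ \ \text{on } \partial\Omega,
\qquad H(x) = \Gamma(x)\,\sigma_a(x)\, u(x),
```

    where u is the radiation density produced by the boundary illumination g, D the diffusion coefficient, \sigma_a the absorption coefficient, \Gamma the Grüneisen coefficient and H the absorbed-radiation map recovered from the ultrasound data; the non-uniqueness result cited above says that only two of (D, \sigma_a, \Gamma) can be recovered from such measurements.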

  12. Qualitative and quantitative analysis of Dibenzofuran, Alkyldibenzofurans, and Benzo[b]naphthofurans in crude oils and source rock extracts

    Science.gov (United States)

    Li, Meijun; Ellis, Geoffrey S.

    2015-01-01

    Dibenzofuran (DBF), its alkylated homologues, and benzo[b]naphthofurans (BNFs) are common oxygen-heterocyclic aromatic compounds in crude oils and source rock extracts. A series of positional isomers of alkyldibenzofuran and benzo[b]naphthofuran were identified in mass chromatograms by comparison with internal standards and standard retention indices. The response factors of dibenzofuran in relation to internal standards were obtained by gas chromatography-mass spectrometry analyses of a set of mixed solutions with different concentration ratios. Perdeuterated dibenzofuran and dibenzothiophene are optimal internal standards for quantitative analyses of furan compounds in crude oils and source rock extracts. The average concentration of the total DBFs in oils derived from siliciclastic lacustrine rock extracts from the Beibuwan Basin, South China Sea, was 518 μg/g, which is about 5 times that observed in the oils from carbonate source rocks in the Tarim Basin, Northwest China. The BNFs occur ubiquitously in source rock extracts and related oils of various origins. The results of this work suggest that the relative abundance of benzo[b]naphthofuran isomers, that is, the benzo[b]naphtho[2,1-d]furan/{benzo[b]naphtho[2,1-d]furan + benzo[b]naphtho[1,2-d]furan} ratio, may be a potential molecular geochemical parameter to indicate oil migration pathways and distances.
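
    Internal-standard quantitation of the kind described above typically follows the relation sketched below; the function and the example numbers are hypothetical, not values from the paper.

```python
def concentration_ug_per_g(area_analyte, area_istd, mass_istd_ug, mass_oil_g,
                           response_factor=1.0):
    """Internal-standard GC-MS quantitation:
    C = (A_analyte / A_istd) * (m_istd / RF) / m_oil,
    where RF is the analyte response relative to the internal standard,
    determined beforehand from mixed standard solutions."""
    return (area_analyte / area_istd) * (mass_istd_ug / response_factor) / mass_oil_g

# Hypothetical run: DBF peak area 4.2e6 vs perdeuterated-DBF area 3.5e6,
# 10 ug internal standard spiked into 0.05 g of oil, RF = 0.95
print(round(concentration_ug_per_g(4.2e6, 3.5e6, 10.0, 0.05, 0.95), 1), "ug/g")
```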

  13. The mzqLibrary--An open source Java library supporting the HUPO-PSI quantitative proteomics standard.

    Science.gov (United States)

    Qi, Da; Zhang, Huaizhong; Fan, Jun; Perkins, Simon; Pisconti, Addolorata; Simpson, Deborah M; Bessant, Conrad; Hubbard, Simon; Jones, Andrew R

    2015-09-01

    The mzQuantML standard has been developed by the Proteomics Standards Initiative for capturing, archiving and exchanging quantitative proteomic data, derived from mass spectrometry. It is a rich XML-based format, capable of representing data about two-dimensional features from LC-MS data, and peptides, proteins or groups of proteins that have been quantified from multiple samples. In this article we report the development of an open source Java-based library of routines for mzQuantML, called the mzqLibrary, and associated software for visualising data called the mzqViewer. The mzqLibrary contains routines for mapping (peptide) identifications on quantified features, inference of protein (group)-level quantification values from peptide-level values, normalisation and basic statistics for differential expression. These routines can be accessed via the command line, via a Java programming interface access or a basic graphical user interface. The mzqLibrary also contains several file format converters, including import converters (to mzQuantML) from OpenMS, Progenesis LC-MS and MaxQuant, and exporters (from mzQuantML) to other standards or useful formats (mzTab, HTML, csv). The mzqViewer contains in-built routines for viewing the tables of data (about features, peptides or proteins), and connects to the R statistical library for more advanced plotting options. The mzqLibrary and mzqViewer packages are available from https://code.google.com/p/mzq-lib/. © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantitative identification of moisture sources over the Tibetan Plateau and the relationship between thermal forcing and moisture transport

    Science.gov (United States)

    Pan, Chen; Zhu, Bin; Gao, Jinhui; Kang, Hanqing; Zhu, Tong

    2018-02-01

    Despite the importance of the Tibetan Plateau (TP) to the surrounding water cycle, the moisture sources of the TP remain uncertain. In this study, the moisture sources of the TP are quantitatively identified based on a 33-year simulation with a horizontal resolution of 1.9° × 2.5° using the Community Atmosphere Model version 5.1 (CAM5.1), in which atmospheric water tracer technology is incorporated. Results demonstrate that the major moisture sources differ over the southern TP (STP) and northern TP (NTP). During the winter, Africa, the TP, and India are the dominant source regions, contributing nearly half of the water vapour over the STP. During the summer, the tropical Indian Ocean (TIO) supplies 28.5 ± 3.6% of the water vapour over the STP and becomes the dominant source region. The dominant moisture source regions of the water vapour over the NTP are Africa (19.0 ± 2.8%) during the winter and the TP (25.8 ± 2.4%) during the summer. The overall relative contribution of each source region to the precipitation is similar to the contribution to the water vapour over the TP. Like most models, CAM5.1 generally overestimates the precipitation over the TP, yielding uncertainty in the absolute contributions to the precipitation. Composite analyses exhibit significant variations in the TIO-supplied moisture transport and precipitation over the STP during the summer alongside anomalous TP heating. This relationship between moisture transport from the TIO and the TP heating primarily involves the dynamic change in the TIO-supplied moisture flux, which further controls the variation in the TIO-contributed precipitation over the STP.

  15. Free and Open Source Chemistry Software in Research of Quantitative Structure-Toxicity Relationship of Pesticides

    Directory of Open Access Journals (Sweden)

    Rastija Vesna

    2017-01-01

    Full Text Available Pesticides are toxic chemicals aimed at destroying pests on crops. Numerous data provide evidence of the toxicity of pesticides to aquatic organisms. Since pesticides with similar properties tend to have similar biological activities, toxicity may be predicted from structure. Their structural features and properties are encoded by means of molecular descriptors. Molecular descriptors can capture quite simple two-dimensional (2D) chemical structures as well as highly complex three-dimensional (3D) chemical structures. The quantitative structure-toxicity relationship (QSTR) method uses linear regression analysis to correlate the toxicity of chemicals with their structural features via molecular descriptors. Molecular descriptors were calculated using the open source software PaDEL and an in-house built PyMOL plugin (PyDescriptor). PyDescriptor is a new script implemented with the commonly used visualization software PyMOL for calculation of a large and diverse set of easily interpretable molecular descriptors encoding pharmacophoric patterns and atomic fragments. PyDescriptor has several advantages: it is free and open source, and can work on all major platforms (Windows, Linux, MacOS). The QSTR method allows prediction of the toxicity of pesticides without experimental assay. In the present work, a QSTR analysis of toxicity has been performed for a dataset comprising five classes of pesticides.
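
    A minimal sketch of a descriptor-based QSTR regression, with synthetic data standing in for the PaDEL/PyDescriptor output, could look like this.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Toy data: rows are pesticides, columns are molecular descriptors
# (e.g. logP, molecular weight, pharmacophoric fragment counts).
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))
true_coef = np.array([0.9, -0.4, 0.0, 0.3, 0.0])
y = X @ true_coef + rng.normal(scale=0.2, size=40)   # toxicity, e.g. log(1/LC50)

qstr = LinearRegression().fit(X, y)
print("descriptor coefficients:", np.round(qstr.coef_, 2))
print("cross-validated R^2:", round(cross_val_score(qstr, X, y, cv=5).mean(), 2))
```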

  16. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    Science.gov (United States)

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.

  17. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    Directory of Open Access Journals (Sweden)

    David Bulczak

    2017-12-01

    Full Text Available In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.

  18. Subjective Response to Foot-Fall Noise, Including Localization of the Source Position

    DEFF Research Database (Denmark)

    Brunskog, Jonas; Hwang, Ha Dong; Jeong, Cheol-Ho

    2011-01-01

    annoyance, using simulated binaural room impulse responses, with sources being a moving point source or a nonmoving surface source, and rooms being a room with a reverberation time of 0.5 s or an anechoic room. The paper concludes that no strong effect of the source localization on the annoyance can...

  19. A dozen useful tips on how to minimise the influence of sources of error in quantitative electron paramagnetic resonance (EPR) spectroscopy-A review

    International Nuclear Information System (INIS)

    Mazur, Milan

    2006-01-01

    The principal and the most important error sources in quantitative electron paramagnetic resonance (EPR) measurements arising from sample-associated factors are the influence of the variation of the sample material (dielectric constant), sample size and shape, sample tube wall thickness, and sample orientation and positioning within the microwave cavity on the EPR signal intensity. Variation in these parameters can cause significant and serious errors in the primary phase of quantitative EPR analysis (i.e., data acquisition). The primary aim of this review is to provide useful suggestions, recommendations and simple procedures to minimise the influence of such primary error sources in quantitative EPR measurements. According to the literature, as well as results obtained in our EPR laboratory, the following are recommendations for samples, which are compared in quantitative EPR studies: (i) the shape of all samples should be identical; (ii) the position of the sample/reference in the cavity should be identical; (iii) a special alignment procedure for precise sample positioning within the cavity should be adopted; (iv) a special/consistent procedure for sample packing for a powder material should be used; (v) the wall thickness of sample tubes should be identical; (vi) the shape and wall thickness of quartz Dewars, where used, should be identical; (vii) where possible a double TE 104 cavity should be used in quantitative EPR spectroscopy; (viii) the dielectric properties of unknown and standard samples should be as close as possible; (ix) sample length less than double the cavity length should be used; (x) the optimised sample geometry for the X-band cavity is a 30 mm-length capillary with i.d. less than 1.5 mm; (xi) use of commercially distributed software for post-recording spectra manipulation is a basic necessity; and (xii) the sample and laboratory temperature should be kept constant during measurements. When the above recommendations and procedures were used

  20. Novel Method To Identify Source-Associated Phylogenetic Clustering Shows that Listeria monocytogenes Includes Niche-Adapted Clonal Groups with Distinct Ecological Preferences

    DEFF Research Database (Denmark)

    Nightingale, K. K.; Lyles, K.; Ayodele, M.

    2006-01-01

    population are identified (TreeStats test). Analysis of sequence data for 120 L. monocytogenes isolates revealed evidence of clustering between isolates from the same source, based on the phylogenies inferred from actA and inlA (P = 0.02 and P = 0.07, respectively; SourceCluster test). Overall, the Tree...... are biologically valid. Overall, our data show that (i) the SourceCluster and TreeStats tests can identify biologically meaningful source-associated phylogenetic clusters and (ii) L. monocytogenes includes clonal groups that have adapted to infect specific host species or colonize nonhost environments......., including humans, animals, and food. If the null hypothesis that the genetic distances for isolates within and between source populations are identical can be rejected (SourceCluster test), then particular clades in the phylogenetic tree with significant overrepresentation of sequences from a given source...

  1. Quantitative and comparative visualization applied to cosmological simulations

    International Nuclear Information System (INIS)

    Ahrens, James; Heitmann, Katrin; Habib, Salman; Ankeny, Lee; McCormick, Patrick; Inman, Jeff; Armstrong, Ryan; Ma, Kwan-Liu

    2006-01-01

    Cosmological simulations follow the formation of nonlinear structure in dark and luminous matter. The associated simulation volumes and dynamic range are very large, making visualization both a necessary and challenging aspect of the analysis of these datasets. Our goal is to understand sources of inconsistency between different simulation codes that are started from the same initial conditions. Quantitative visualization supports the definition and reasoning about analytically defined features of interest. Comparative visualization supports the ability to visually study, side by side, multiple related visualizations of these simulations. For instance, a scientist can visually distinguish that there are fewer halos (localized lumps of tracer particles) in low-density regions for one simulation code out of a collection. This qualitative result will enable the scientist to develop a hypothesis, such as loss of halos in low-density regions due to limited resolution, to explain the inconsistency between the different simulations. Quantitative support then allows one to confirm or reject the hypothesis. If the hypothesis is rejected, this step may lead to new insights and a new hypothesis, not available from the purely qualitative analysis. We will present methods to significantly improve the Scientific analysis process by incorporating quantitative analysis as the driver for visualization. Aspects of this work are included as part of two visualization tools, ParaView, an open-source large data visualization tool, and Scout, an analysis-language based, hardware-accelerated visualization tool

  2. A study on the quantitative evaluation for the software included in digital systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Kang, H. G.; Lee, K. Y.; Park, J. K. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    In general, probabilistic safety analysis (PSA) has been used as one of the most important methods to evaluate the safety of NPPs. Because most NPPs have installed and used analog I and C systems, PSA has been performed from a hardware perspective. In addition, since digital I and C systems including software are increasingly used instead of analog I and C systems, the need for quantitative evaluation methods to perform PSA is also increasing. Nevertheless, several factors, such as the fact that software does not age and that software failure rates are difficult to estimate because of their non-linearity, make the performance of PSA difficult. In this study, in order to perform PSA including software more efficiently, test-based software reliability estimation methods are reviewed to suggest a preliminary procedure that can provide reasonable guidance for quantifying software failure rates. In addition, requisite activities to enhance the applicability of the suggested procedure are also discussed. 67 refs., 11 figs., 5 tabs. (Author)

  3. GC/MS quantitation of diamondoid compounds in crude oils and petroleum products

    International Nuclear Information System (INIS)

    Yang, C.; Wang, Z.D.; Hollebone, B.P.; Fingas, M.; Peng, X.; Landriault, M.

    2006-01-01

    Diamondoids are a class of saturated hydrocarbons that consist of 3-dimensionally fused cyclohexane rings. Diamondoid compounds in petroleum are the result of carbonium ion rearrangements of cyclic precursors on clay superacids in the source rock during oil generation. They are considered to be a problem due to their deposition during production of reservoir fluids and transportation of natural gas, gas condensates and light crude oils. At high concentrations, and with changes in pressure and temperature, diamondoid compounds can segregate out of reservoir fluids during production. Environmental scientists have considered fingerprinting the diamondoid hydrocarbons as a forensic method for oil spill studies. Since diamondoid compounds are thermodynamically stable, they have potential applications in oil-source correlation and differentiation for cases where traditional biomarker terpanes and steranes are absent because of environmental weathering or refining of petroleum products. Although there is increased awareness of possible use of diamondoid compounds for source identification, there is no systematic approach for using these compounds. Quantitative surveys of the abundances of diamondoids are not available. Therefore, this study developed a reliable analytical method for quantitative diamondoid analysis. Gas chromatography/mass spectrometry (GC/MS) was used to quantitatively determine diamondoid compounds (adamantane, diamantane and their alkylated homologues) in 14 fresh crude oils and 23 refined petroleum products, including light and mid-range distillate fuels, residual fuels and lubricating oils collected from different sources. Results were compared with 2 types of biomarker compounds in oil saturated hydrocarbon fractions. Several diagnostic ratios of diamondoids were developed based on their concentrations. Their potential use for forensic oil spill source identification was evaluated. 24 refs., 8 tabs., 4 figs

  4. Semi-quantitative and simulation analyses of effects of {gamma} rays on determination of calibration factors of PET scanners with point-like {sup 22}Na sources

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Tomoyuki [School of Allied Health Sciences, Kitasato University, 1-15-1, Kitasato, Minamiku, Sagamihara, Kanagawa, 252-0373 (Japan); Sato, Yasushi [National Institute of Advanced Industrial Science and Technology, 1-1-1, Umezono, Tsukuba, Ibaraki, 305-8568 (Japan); Oda, Keiichi [Tokyo Metropolitan Institute of Gerontology, 1-1, Nakamachi, Itabashi, Tokyo, 173-0022 (Japan); Wada, Yasuhiro [RIKEN Center for Molecular Imaging Science, 6-7-3, Minamimachi, Minatoshima, Chuo, Kobe, Hyogo, 650-0047 (Japan); Murayama, Hideo [National Institute of Radiological Sciences, 4-9-1, Anagawa, Inage, Chiba, 263-8555 (Japan); Yamada, Takahiro, E-mail: hasegawa@kitasato-u.ac.jp [Japan Radioisotope Association, 2-28-45, Komagome, Bunkyo-ku, Tokyo, 113-8941 (Japan)

    2011-09-21

    The uncertainty of radioactivity concentrations measured with positron emission tomography (PET) scanners ultimately depends on the uncertainty of the calibration factors. A new practical calibration scheme using point-like {sup 22}Na radioactive sources has been developed. The purpose of this study is to theoretically investigate the effects of the associated 1.275 MeV {gamma} rays on the calibration factors. The physical processes affecting the coincidence data were categorized in order to derive approximate semi-quantitative formulae. Assuming the design parameters of some typical commercial PET scanners, the effects of the {gamma} rays as relative deviations in the calibration factors were evaluated by semi-quantitative formulae and a Monte Carlo simulation. The relative deviations in the calibration factors were less than 4%, depending on the details of the PET scanners. The event losses due to rejecting multiple coincidence events of scattered {gamma} rays had the strongest effect. The results from the semi-quantitative formulae and the Monte Carlo simulation were consistent and were useful in understanding the underlying mechanisms. The deviations are considered small enough to correct on the basis of precise Monte Carlo simulation. This study thus offers an important theoretical basis for the validity of the calibration method using point-like {sup 22}Na radioactive sources.

  5. The application of IBA techniques to air pollution source fingerprinting and source apportionment

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D., E-mail: dcz@ansto.gov.au; Stelcer, E.; Atanacio, A.; Crawford, J.

    2014-01-01

    IBA techniques have been used to measure elemental concentrations of more than 20 different elements found in fine particle (PM2.5) air pollution. These data together with their errors and minimum detectable limits were used in Positive Matrix Factorisation (PMF) analyses to quantitatively determine source fingerprints and their contributions to the total measured fine mass. Wind speed and direction back trajectory data from the global HYSPLIT codes were then linked to these PMF fingerprints to quantitatively identify the location of the sources.
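
    PMF weights every data point by its measurement uncertainty; the unweighted non-negative factorisation below is only meant to illustrate how a samples-by-elements matrix is decomposed into source contributions and source fingerprints.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy matrix: rows are PM2.5 samples, columns are elemental concentrations
# measured by IBA (e.g. Al, Si, S, K, Ca, Fe, Zn, Pb).
rng = np.random.default_rng(2)
true_fingerprints = rng.random((3, 8))            # 3 sources x 8 elements
true_contributions = rng.random((100, 3)) * 10.0  # per-sample source strengths
X = true_contributions @ true_fingerprints + rng.random((100, 8)) * 0.05

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)     # source contributions for each sample
F = model.components_          # source fingerprints (element profiles)
print("reconstruction error:", round(model.reconstruction_err_, 3))
```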

  6. Teaching Children How to Include the Inversion Principle in Their Reasoning about Quantitative Relations

    Science.gov (United States)

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Bell, Daniel; Barros, Rossana

    2012-01-01

    The basis of this intervention study is a distinction between numerical calculus and relational calculus. The former refers to numerical calculations and the latter to the analysis of the quantitative relations in mathematical problems. The inverse relation between addition and subtraction is relevant to both kinds of calculus, but so far research…

  7. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV x rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  8. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron capture in 163 Ho as a method for determining the mass of the electron neutrino. The 163 Ho sources were produced with the 164 Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163 Ho atoms in the source. Proton beams of 3 MeV and a method of "external standard" were employed for nondestructive analysis of the 163 Ho source, as well as an additional method of "internal standard". (author)

  9. Hybrid Design of Electric Power Generation Systems Including Renewable Sources of Energy

    Science.gov (United States)

    Wang, Lingfeng; Singh, Chanan

    2008-01-01

    With the stricter environmental regulations and diminishing fossil-fuel reserves, there is now higher emphasis on exploiting various renewable sources of energy. These alternative sources of energy are usually environmentally friendly and emit no pollutants. However, the capital investments for those renewable sources of energy are normally high,…

  10. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. The relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
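
    One plausible reading of the relative method described above is the normalization below; the notation is introduced here for illustration and is not taken from the paper.

```latex
C_{x}^{\mathrm{sam}} \;=\; C_{x}^{\mathrm{std}}\;
\frac{R_{x}^{\mathrm{sam}} \big/ \big(\phi_{\mathrm{th}}^{\mathrm{sam}}\, f_{\gamma}^{\mathrm{sam}}\big)}
     {R_{x}^{\mathrm{std}} \big/ \big(\phi_{\mathrm{th}}^{\mathrm{std}}\, f_{\gamma}^{\mathrm{std}}\big)}
```

    Here R is the net prompt gamma count rate of the element, \phi_th the thermal neutron flux measured with the He-3 detector, and f_\gamma the gamma attenuation factor calculated with MCNP-5, for the sample and the standard respectively.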

  11. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    Toxicity of heavy metals from industrialization poses critical concern, and analysis of sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in its sub regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller scale grids and their spatial codes are used to identify the contribution of various sources of pollution to each sub region (larger grid) and to assess the health risks posed by each source for each sub region. The results of the case study show that, for children (sensitive populations, taking school and residential areas as the major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions and agricultural activities. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Maths meets myths quantitative approaches to ancient narratives

    CERN Document Server

    MacCarron, Máirín; MacCarron, Pádraig

    2017-01-01

    With an emphasis on exploring measurable aspects of ancient narratives, Maths Meets Myths sets out to investigate age-old material with new techniques. This book collects, for the first time, novel quantitative approaches to studying sources from the past, such as chronicles, epics, folktales, and myths. It contributes significantly to recent efforts in bringing together natural scientists and humanities scholars in investigations aimed at achieving greater understanding of our cultural inheritance. Accordingly, each contribution reports on a modern quantitative approach applicable to narrative sources from the past, or describes those which would be amenable to such treatment and why they are important. This volume is a unique state-of-the-art compendium on an emerging research field which also addresses anyone with interests in quantitative approaches to humanities.

  13. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity. But the dependency of human operators on I and C systems and the dependency of I and C systems on human operators are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. But, unfortunately, it seems that we do not have appropriate methods for incorporating the interdependency between I and C systems and human operators in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and the modeling of the interdependency using conventional event tree analysis and fault tree analysis seems to be, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while being as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of Ulchin 5 and 6 nuclear power plants located in the Republic of Korea. The latest version of the fault tree for DPPS, which was developed by the Integrated Safety Assessment team at the Korea Atomic Energy Research Institute (KAERI), consists of 64

  14. A GATE evaluation of the sources of error in quantitative 90Y PET.

    Science.gov (United States)

    Strydhorst, Jared; Carlier, Thomas; Dieudonné, Arnaud; Conti, Maurizio; Buvat, Irène

    2016-10-01

    Accurate reconstruction of the dose delivered by 90Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose-response relationships for treatment of hepatocellular carcinoma with 90Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of 90Y PET. PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either 90Y or 18F were simulated using GATE. Simulated projections were created with subsets of the simulation data allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. The quantitative accuracy of the 90Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated 90Y data was slightly poorer than that for simulated 18F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Simulations of 90Y PET confirm that quantitative 90Y is achievable with the same approach as that used for 18F, and that there is likely very little margin for improvement by attempting to model aspects unique to 90Y, such as the much higher random fraction or the presence of bremsstrahlung in the singles data. © 2016 American Association of Physicists in Medicine.

  15. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  16. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives, solar in particular, that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative, especially with nonscience students or the general public, is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.
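
    The "10,000 times" figure is itself a nice back-of-the-envelope exercise; with round numbers (assumed here, not the speaker's) the arithmetic works out as follows.

```python
# Approximate figures: solar constant and Earth's cross-section vs. ~18 TW primary power
solar_constant_w_m2 = 1361.0          # W/m^2 at the top of the atmosphere
earth_radius_m = 6.371e6
solar_input_w = solar_constant_w_m2 * 3.14159 * earth_radius_m**2   # intercepted power
human_consumption_w = 18e12

print(f"solar input  ~ {solar_input_w:.1e} W")
print(f"human demand ~ {human_consumption_w:.1e} W")
print(f"ratio        ~ {solar_input_w / human_consumption_w:,.0f}")  # roughly 10,000
```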

  17. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  18. Quantitative diagnosis of moisture sources and transport pathways for summer precipitation over the mid-lower Yangtze River Basin

    Science.gov (United States)

    Wang, Ning; Zeng, Xin-Min; Guo, Wei-Dong; Chen, Chaohui; You, Wei; Zheng, Yiqun; Zhu, Jian

    2018-04-01

    Using a moisture tracking model with 32-year reanalysis data and station precipitation observations, we diagnosed the sources of moisture for summer (June 1-August 31) precipitation in mid-lower reaches of the Yangtze River Basin (YRB). Results indicate the dominant role of oceanic evaporation compared to terrestrial evapotranspiration, and the previously overlooked southern Indian Ocean, as a source region, is found to contribute more moisture than the well-known Arabian Sea or Bay of Bengal. Terrestrial evapotranspiration appears to be important for summer precipitation, especially in early June when moisture contribution is more than 50%. The terrestrial contribution then decreases and is generally less than 40% after late June. The Indian Ocean is the most important oceanic source before mid-July, with its largest contribution during the period of heavy precipitation, while the Pacific Ocean becomes the more important oceanic source after mid-July. To quantitatively analyze paths of moisture transport to YRB, we proposed the Trajectory Frequency Method. The most intense branch of water vapor transport to YRB stretches from the Arabian Sea through the Bay of Bengal, the Indochina Peninsula, the South China Sea, and South China. The other main transport branches are westerly moisture fluxes to the south of the Tibetan Plateau, cross-equatorial flows north of Australia, and separate branches located in the north and equatorial Pacific Ocean. Significant intraseasonal variability for these branches is presented. Additionally, the importance of the South China Sea for moisture transport to YRB, especially from the sea areas, is emphasized.

  19. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coil and EOG. Among those available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods which can capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 test subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
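
    As an illustration of one simple quantitative feature that can be pulled from such recordings, the sketch below applies a velocity-threshold saccade detector to a synthetic EOG trace; the threshold, units and signal are illustrative only and do not reproduce the authors' analysis.

```python
import numpy as np

def detect_saccades(eog_uv, fs_hz, vel_threshold_uv_per_s=2000.0):
    """Return (start, end) sample indices of segments whose EOG velocity exceeds
    the threshold; assumes the trace starts and ends below the threshold."""
    velocity = np.gradient(eog_uv) * fs_hz                 # crude derivative, uV/s
    fast = np.abs(velocity) > vel_threshold_uv_per_s
    edges = np.flatnonzero(np.diff(fast.astype(int)))
    return list(zip(edges[::2], edges[1::2]))

# Synthetic trace: slow pursuit-like ramp plus one saccade-like step at t = 2 s
fs = 250.0
t = np.arange(0.0, 4.0, 1.0 / fs)
trace = 20.0 * t + 150.0 * (t > 2.0)                       # microvolts
print(detect_saccades(trace, fs))
```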

  20. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies in the interface of biology, chemistry, and informatics. Most of the currently used drugs are small molecules that interact with proteins. Understanding protein-ligand interaction is therefore central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand-interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship modeling (QMSPR) approach are to exploit sparse/incomplete information sources and to obtain more general models covering larger parts of the protein-ligand space, than traditional approaches that focuses mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand-interaction modeling. An overview of the modeling process is presented including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning-model induction and validation. Concerns and issues specific for each step in this kind of data-driven modeling will be discussed. © 2011 Bentham Science Publishers

  1. Inversion of Atmospheric Tracer Measurements, Localization of Sources

    Science.gov (United States)

    Issartel, J.-P.; Cabrit, B.; Hourdin, F.; Idelkadi, A.

    When abnormal concentrations of a pollutant are observed in the atmosphere, the question of its origin arises immediately. The radioactivity from Chernobyl was detected in Sweden before the accident was announced. This situation emphasizes the psychological, political and medical stakes of a rapid identification of sources. In technical terms, most industrial sources can be modeled as a fixed point at ground level with undetermined duration. The classical method of identification involves the calculation of a backtrajectory departing from the detector with an upstream integration of the wind field. We were first involved in such questions as we evaluated the efficiency of the international monitoring network planned in the frame of the Comprehensive Test Ban Treaty. We propose a new approach to backtracking based upon the use of retroplumes associated with available measurements. Firstly, the retroplume is related to inverse transport processes, describing quantitatively how the air in a sample originates from regions that are all the more extended and diffuse as we go back far in the past. Secondly, it clarifies the sensitivity of the measurement with respect to all potential sources. It is therefore calculated by adjoint equations including, of course, diffusive processes. Thirdly, the statistical interpretation, valid as far as single particles are concerned, should not be used to investigate the position and date of a macroscopic source. In that case, the retroplume rather induces a straightforward constraint between the intensity of the source and its position. When more than one measurement is available, including zero valued measurements, the source satisfies the same number of linear relations tightly related to the retroplumes. This system of linear relations can be handled through the simplex algorithm in order to make the above intensity-position correlation more restrictive. This method makes it possible to manage in a quantitative manner the

  2. A GATE evaluation of the sources of error in quantitative {sup 90}Y PET

    Energy Technology Data Exchange (ETDEWEB)

    Strydhorst, Jared, E-mail: jared.strydhorst@gmail.com; Buvat, Irène [IMIV, U1023 Inserm/CEA/Université Paris-Sud and ERL 9218 CNRS, Université Paris-Saclay, CEA/SHFJ, Orsay 91401 (France); Carlier, Thomas [Department of Nuclear Medicine, Centre Hospitalier Universitaire de Nantes and CRCNA, Inserm U892, Nantes 44000 (France); Dieudonné, Arnaud [Department of Nuclear Medicine, Hôpital Beaujon, HUPNVS, APHP and Inserm U1149, Clichy 92110 (France); Conti, Maurizio [Siemens Healthcare Molecular Imaging, Knoxville, Tennessee, 37932 (United States)

    2016-10-15

    Purpose: Accurate reconstruction of the dose delivered by {sup 90}Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose–response relationships for treatment of hepatocellular carcinoma with {sup 90}Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of {sup 90}Y PET. Methods: PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either {sup 90}Y or {sup 18}F were simulated using GATE. Simulated projections were created with subsets of the simulation data, allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. Results: The quantitative accuracy of the {sup 90}Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated {sup 90}Y data was slightly poorer than that for simulated {sup 18}F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Conclusions: Simulations of {sup 90}Y PET confirm that quantitative {sup 90}Y imaging is achievable with the same approach as that used for {sup 18}F, and that there is likely very little margin for improvement by attempting to model aspects unique to {sup 90}Y, such as the much higher random fraction or the presence of

  3. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    Science.gov (United States)

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher
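
    The sensitivity and specificity figures quoted above follow from the standard 2x2 comparison of each culture technique against the reference diagnosis of CR-BSI. A minimal sketch of that arithmetic (the counts below are hypothetical and are not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from a 2x2 diagnostic table."""
    sensitivity = tp / (tp + fn)  # proportion of confirmed CR-BSI cases detected
    specificity = tn / (tn + fp)  # proportion of non-CR-BSI cases correctly ruled out
    return sensitivity, specificity

# Hypothetical counts for illustration only.
sens, spec = sensitivity_specificity(tp=16, fn=6, tn=530, fp=24)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```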

  4. Quantitative assessment of pure aortic valve regurgitation with dual-source CT

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z., E-mail: lzlcd01@126.com [Department of Radiology, West China Hospital, Sichuan University, 37 Guo Xue Xiang, Chengdu, Sichuan 610041 (China); Huang, L.; Chen, X.; Xia, C.; Yuan, Y.; Shuai, T. [Department of Radiology, West China Hospital, Sichuan University, 37 Guo Xue Xiang, Chengdu, Sichuan 610041 (China)

    2012-07-15

    Aim: To assess the severity of pure aortic regurgitation by measuring regurgitation volumes (RV) and fractions (RF) with dual-source computed tomography (DSCT) as compared to magnetic resonance imaging (MRI) and echocardiography. Materials and methods: Thirty-eight patients (15 men, 23 women; mean age 46 {+-} 11 years) with isolated aortic valve regurgitation underwent retrospectively electrocardiogram (ECG)-gated DSCT, echocardiography, and MRI. Stroke volumes of the left and right ventricles were measured at DSCT and MRI. Thus, RVs and RFs were calculated and compared. The agreement between DSCT and MRI was tested by intraclass correlation coefficient and Bland-Altman analyses. Spearman's rank order correlation and weighted {kappa} tests were used for testing correlations of AR severity between DSCT results and corresponding echocardiographic grades. Results: The RV and RF measured by DSCT were not significantly different from those measured using MRI (p = 0.71 and 0.79). DSCT correlated well with MRI for the measurement of RV (r{sub I} = 0.86, p<0.001) and calculation of the RF (r{sub I} =0.90, p<0.001). Good agreement between the techniques was obtained by using Bland-Altman analyses. The severity of regurgitation estimated by echocardiography correlated well with DSCT (r{sub s} = 0.95, p<0.001) and MRI (r{sub s} = 0.95, p<0.001). Inter-technique agreement between DSCT and two-dimensional transthoracic echocardiography (2DTTE) regarding the grading of the severity of AR was excellent ({kappa} = 0.90), and good agreement was also obtained between MRI and 2DTTE assessments of the severity of AR ({kappa} = 0.87). Conclusion: DSCT using a volume approach can be used to quantitatively determine the severity of pure aortic regurgitation when compared with MRI and echocardiography.
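
    With isolated aortic regurgitation and no shunt, the volumetric approach described above reduces to simple arithmetic on the two ventricular stroke volumes: the regurgitant volume is their difference and the regurgitant fraction is that difference divided by the left ventricular stroke volume. A short sketch (the stroke volumes are hypothetical values, not study data):

```python
def regurgitation_indices(lv_stroke_volume_ml, rv_stroke_volume_ml):
    """Regurgitant volume (mL) and regurgitant fraction for isolated aortic
    regurgitation, assuming the right ventricular stroke volume approximates
    the effective forward flow (no other valve lesion or shunt)."""
    rv_ml = lv_stroke_volume_ml - rv_stroke_volume_ml
    rf = rv_ml / lv_stroke_volume_ml
    return rv_ml, rf

# Hypothetical ventricular volumetry values for illustration.
rvol, rfrac = regurgitation_indices(lv_stroke_volume_ml=110.0, rv_stroke_volume_ml=65.0)
print(f"RV = {rvol:.0f} mL, RF = {rfrac:.0%}")
```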

  5. A quantitative assessment method for the NPP operators' diagnosis of accidents

    International Nuclear Information System (INIS)

    Kim, M. C.; Seong, P. H.

    2003-01-01

    In this research, we developed a quantitative model of the operators' diagnosis of the accident situation when an accident occurs in a nuclear power plant. After identifying the occurrence probabilities of accidents, the unavailabilities of various information sources, and the causal relationships between accidents and information sources, a Bayesian network is used to analyze how the occurrence probabilities of accidents change as the operators receive information about the status of the plant. The developed method is applied to a simple example case, and it turns out to be a systematic quantitative analysis method that can cope with the complex relationships between accidents and information sources and with variables such as accident occurrence probabilities and the unavailabilities of various information sources
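
    The core of the approach is Bayesian updating of the accident hypotheses as plant signals arrive. The paper uses a full Bayesian network; the sketch below shows only the single-evidence special case, with entirely hypothetical accident types, priors and likelihoods, to illustrate the direction of the computation:

```python
def update_accident_probabilities(priors, likelihoods, observation):
    """Bayes update of accident-hypothesis probabilities given one binary signal.

    priors      : dict accident -> prior occurrence probability
    likelihoods : dict accident -> P(signal present | accident), assumed to
                  already fold in the information source's unavailability
    observation : True if the signal is observed, False otherwise
    """
    posterior = {}
    for accident, prior in priors.items():
        p_signal = likelihoods[accident]
        posterior[accident] = prior * (p_signal if observation else 1.0 - p_signal)
    norm = sum(posterior.values())
    return {a: p / norm for a, p in posterior.items()}

# Hypothetical numbers for illustration only.
priors = {"LOCA": 0.02, "SGTR": 0.01, "no_accident": 0.97}
likelihoods = {"LOCA": 0.90, "SGTR": 0.30, "no_accident": 0.05}
print(update_accident_probabilities(priors, likelihoods, observation=True))
```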

  6. Quantitative assessment of source contributions to PM2.5 on the west coast of Peninsular Malaysia to determine the burden of Indonesian peatland fire

    Science.gov (United States)

    Fujii, Yusuke; Tohno, Susumu; Amil, Norhaniza; Latif, Mohd Talib

    2017-12-01

    Almost every dry season, peatland fires occur on the islands of Sumatra and Kalimantan. Dense smoke haze from Indonesian peatland fires (IPFs) causes impacts on health, visibility, transport and regional climate in Southeast Asian countries such as Indonesia, Malaysia, and Singapore. Quantitative knowledge of the IPF source contribution to ambient aerosols in Southeast Asia (SEA) is essential for making appropriate suggestions to policy makers to mitigate IPF-induced haze pollution. However, this quantitative contribution remains unclear. In this study, the source contributions to PM2.5 were determined with the Positive Matrix Factorization (PMF) model using annual comprehensive observation data at Petaling Jaya on the west coast of Peninsular Malaysia, which is downwind of the IPF areas on Sumatra Island during the dry (southwest monsoon: June-September) season. The average PM2.5 mass concentration over the whole sampling period (Aug 2011-Jul 2012), based on the PMF and chemical mass closure models, was determined as 20-21 μg m-3. Throughout the sampling period, IPF contributed (on average) 6.1-7.0 μg m-3 to the PM2.5, or ∼30% of the retrieved PM2.5 concentration. In particular, the PM2.5 was dominantly sourced from IPF during the southwest monsoon season (51-55% of the total PM2.5 concentration on average). Thus, reducing the IPF burden in the PM2.5 levels would drastically improve the air quality (especially during the southwest monsoon season) around the west coast of Peninsular Malaysia.
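
    PMF factorizes the sample-by-species concentration matrix into non-negative source contributions and source profiles, weighting the fit by measurement uncertainties. As a rough stand-in for that computation, the sketch below uses plain non-negative matrix factorization (which omits the uncertainty weighting); the data, species count and factor count are illustrative only:

```python
import numpy as np
from sklearn.decomposition import NMF

# Rows = PM2.5 samples, columns = measured chemical species (illustrative random data).
rng = np.random.default_rng(0)
X = rng.random((50, 8))

# Plain NMF as a stand-in for PMF; PMF additionally weights residuals by
# per-measurement uncertainties, which NMF does not.
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
contributions = model.fit_transform(X)  # sample-by-factor source contributions
profiles = model.components_            # factor-by-species source profiles

print(contributions.mean(axis=0))       # average contribution of each factor
```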

  7. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  8. Improvement of quantitation in SPECT: Attenuation and scatter correction using non-uniform attenuation data

    International Nuclear Information System (INIS)

    Mukai, T.; Torizuka, K.; Douglass, K.H.; Wagner, H.N.

    1985-01-01

    Quantitative assessment of tracer distribution with single photon emission computed tomography (SPECT) is difficult because of attenuation and scattering of gamma rays within the object. A method considering the source geometry was developed, and the effects of attenuation and scatter on SPECT quantitation were studied using phantoms with non-uniform attenuation. The distribution of attenuation coefficients (μ) within the source was obtained by transmission CT. The attenuation correction was performed by an iterative reprojection technique. The scatter correction was done by convolving the attenuation-corrected image with an appropriate filter derived from line-source studies. The filter characteristics depended on μ and the SPECT measurement at each pixel. The SPECT images obtained by this method were more reasonable than those reconstructed by other methods. The scatter correction could completely compensate for a 28% scatter component from a long line source, and a 61% component from a thick, extended source. Consideration of source geometries was necessary for effective corrections. The present method is expected to be valuable for the quantitative assessment of regional tracer activity
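
    The scatter correction outlined above is a convolution-subtraction scheme: the attenuation-corrected image is convolved with a filter derived from line-source measurements and the resulting scatter estimate is subtracted. A minimal sketch, with a Gaussian kernel and a fixed scatter fraction standing in for the measured, μ-dependent filter:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def convolution_subtraction_scatter_correction(image, scatter_fraction=0.28,
                                               kernel_sigma_px=4.0):
    """Subtract a scatter estimate obtained by convolving the attenuation-corrected
    image with a smoothing kernel. The scatter fraction and kernel width are
    placeholders for values that would come from line-source studies."""
    scatter_estimate = scatter_fraction * gaussian_filter(image, kernel_sigma_px)
    return np.clip(image - scatter_estimate, 0.0, None)  # keep counts non-negative

# Toy image: uniform background with a hot spot.
img = np.ones((64, 64))
img[30:34, 30:34] = 10.0
print(convolution_subtraction_scatter_correction(img).max())
```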

  9. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
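
    The APEX abundance estimate itself is a short computation once the Oi values have been predicted: each protein's observed spectral count is divided by its Oi, and the corrected counts are normalized across the proteome. A minimal sketch of that step (protein names, counts, Oi values and the per-cell scaling constant are all illustrative assumptions):

```python
def apex_abundances(spectral_counts, oi_values, total_molecules=2e6):
    """Relative APEX-style abundance estimates.

    spectral_counts : dict protein -> observed MS/MS spectral count
    oi_values       : dict protein -> predicted spectral count per molecule (Oi)
    total_molecules : assumed total protein molecules per cell, used only to
                      scale normalized ratios to absolute-style numbers
    """
    corrected = {p: spectral_counts[p] / oi_values[p] for p in spectral_counts}
    norm = sum(corrected.values())
    return {p: total_molecules * v / norm for p, v in corrected.items()}

# Hypothetical counts and Oi predictions for three proteins.
counts = {"P1": 120, "P2": 30, "P3": 30}
oi = {"P1": 4.0, "P2": 1.0, "P3": 3.0}
print(apex_abundances(counts, oi))
```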

  10. New generation quantitative x-ray microscopy encompassing phase-contrast

    International Nuclear Information System (INIS)

    Wilkins, S.W.; Mayo, S.C.; Gureyev, T.E.; Miller, P.R.; Pogany, A.; Stevenson, A.W.; Gao, D.; Davis, T.J.; Parry, D.J.; Paganin, D.

    2000-01-01

    Full text: We briefly outline a new approach to X-ray ultramicroscopy using projection imaging in a scanning electron microscope (SEM). Compared to earlier approaches, the new approach offers spatial resolution of ≤0.1 micron and includes novel features such as: i) phase contrast to give additional sample information over a wide energy range, and ii) rapid phase/amplitude extraction algorithms to enable new real-time modes of microscopic imaging. Widespread applications are envisaged in fields such as materials science, biomedical research, and microelectronics device inspection. Some illustrative examples are presented. The quantitative methods described here are also very relevant to X-ray projection microscopy using synchrotron sources

  11. Swept source optical coherence tomography for quantitative and qualitative assessment of dental composite restorations

    Science.gov (United States)

    Sadr, Alireza; Shimada, Yasushi; Mayoral, Juan Ricardo; Hariri, Ilnaz; Bakhsh, Turki A.; Sumi, Yasunori; Tagami, Junji

    2011-03-01

    The aim of this work was to explore the utility of swept-source optical coherence tomography (SS-OCT) for quantitative evaluation of dental composite restorations. The system (Santec, Japan) with a center wavelength of around 1300 nm and axial resolution of 12 μm was used to record data during and after placement of light-cured composites. The Fresnel phenomenon at the interfacial defects resulted in brighter areas indicating gaps as small as a few micrometers. The gap extension at the interface was quantified and compared to the observation by confocal laser scanning microscope after trimming the specimen to the same cross-section. Also, video imaging of the composite during polymerization could provide information about real-time kinetics of contraction stress and resulting gaps, distinguishing them from those gaps resulting from poor adaptation of composite to the cavity prior to polymerization. Some samples were also subjected to a high resolution microfocus X-ray computed tomography (μCT) assessment; it was found that differentiation of smaller gaps from the radiolucent bonding layer was difficult with 3D μCT. Finally, a clinical imaging example using a newly developed dental SS-OCT system with an intra-oral scanning probe (Panasonic Healthcare, Japan) is presented. SS-OCT is a unique tool for clinical assessment and laboratory research on resin-based dental restorations. Supported by GCOE at TMDU and NCGG.

  12. Sources of Free and Open Source Spatial Data for Natural Disasters and Principles for Use in Developing Country Contexts

    Science.gov (United States)

    Taylor, Faith E.; Malamud, Bruce D.; Millington, James D. A.

    2016-04-01

    Access to reliable spatial and quantitative datasets (e.g., infrastructure maps, historical observations, environmental variables) at regional and site specific scales can be a limiting factor for understanding hazards and risks in developing country settings. Here we present a 'living database' of >75 freely available data sources relevant to hazard and risk in Africa (and more globally). Data sources include national scientific foundations, non-governmental bodies, crowd-sourced efforts, academic projects, special interest groups and others. The database is available at http://tinyurl.com/africa-datasets and is continually being updated, particularly in the context of broader natural hazards research we are doing in the context of Malawi and Kenya. For each data source, we review the spatiotemporal resolution and extent and make our own assessments of reliability and usability of datasets. Although such freely available datasets are sometimes presented as a panacea to improving our understanding of hazards and risk in developing countries, there are both pitfalls and opportunities unique to using this type of freely available data. These include factors such as resolution, homogeneity, uncertainty, access to metadata and training for usage. Based on our experience, use in the field and grey/peer-review literature, we present a suggested set of guidelines for using these free and open source data in developing country contexts.

  13. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    Science.gov (United States)

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
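
    The ADC values behind these parametric maps come from fitting the mono-exponential diffusion model S(b) = S0 * exp(-b * ADC) to the diffusion-weighted signal; the two-b-value log-ratio estimate below is the simplest case and is shown only as an illustration, independent of any vendor tooling or of the DICOM conversion software mentioned above:

```python
import numpy as np

def adc_two_point(s_low, s_high, b_low=0.0, b_high=800.0):
    """Two-b-value ADC estimate (mm^2/s) from the mono-exponential model
    S(b) = S0 * exp(-b * ADC). Inputs may be scalars or NumPy arrays."""
    s_low = np.asarray(s_low, dtype=float)
    s_high = np.asarray(s_high, dtype=float)
    return np.log(s_low / s_high) / (b_high - b_low)

# Illustrative signals consistent with ADC = 1.1e-3 mm^2/s.
s0 = 1000.0
print(adc_two_point(s_low=s0, s_high=s0 * np.exp(-800 * 1.1e-3)))
```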

  14. Problems of standardized handling and quantitative evaluation of autoradiograms

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1985-01-01

    In the last years autoradiography has gained increasing importance as a quantitative method of measuring radioactivity or element concentration. Mostly relative measurements are carried out. The optical density of the photographic emulsion produced by a calibrated radiation source is compared with that produced by a sample. The influences of different parameters, such as beta particle energy, backscattering, fading of the latent image, developing conditions, matrix effects and others on the results are described and the errors of the quantitative evaluation of autoradiograms are assessed. The performance of the method is demonstrated taking the quantitative determination of gold in silicon as an example

  15. Early Quantitative Assessment of Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormandjieva, O.

    2007-01-01

    Non-functional requirements (NFRs) of software systems are a well known source of uncertainty in effort estimation. Yet, quantitatively approaching NFRs early in a project is hard. This paper makes a step towards reducing the impact of uncertainty due to NFRs. It offers a solution that incorporates

  16. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  17. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  18. Multi-photon absorption limits to heralded single photon sources

    Science.gov (United States)

    Husko, Chad A.; Clark, Alex S.; Collins, Matthew J.; De Rossi, Alfredo; Combrié, Sylvain; Lehoucq, Gaëlle; Rey, Isabella H.; Krauss, Thomas F.; Xiong, Chunle; Eggleton, Benjamin J.

    2013-01-01

    Single photons are of paramount importance to future quantum technologies, including quantum communication and computation. Nonlinear photonic devices using parametric processes offer a straightforward route to generating photons; however, additional nonlinear processes may come into play and interfere with these sources. Here we analyse spontaneous four-wave mixing (SFWM) sources in the presence of multi-photon processes. We conduct experiments in silicon and gallium indium phosphide photonic crystal waveguides, which display inherently different nonlinear absorption processes, namely two-photon (TPA) and three-photon absorption (ThPA), respectively. We develop a novel model capturing these diverse effects which is in excellent quantitative agreement with measurements of brightness, coincidence-to-accidental ratio (CAR) and second-order correlation function g(2)(0), showing that TPA imposes an intrinsic limit on heralded single photon sources. We build on these observations to devise a new metric, the quantum utility (QMU), enabling further optimisation of single photon sources. PMID:24186400

  19. Study on the quantitative relationship between Agricultural water and fertilization process and non-point source pollution based on field experiments

    Science.gov (United States)

    Wang, H.; Chen, K.; Wu, Z.; Guan, X.

    2017-12-01

    In recent years, with the growing prominence of water environment problems and the relative advance of point source pollution governance, agricultural non-point source pollution caused by the extensive use of fertilizers and pesticides has increasingly aroused people's concern and attention. In order to reveal the quantitative relationship between agricultural water and fertilizer use and non-point source pollution, the relationships between agricultural irrigation water, fertilization schemes and non-point source pollution were analyzed and calculated with a field emission intensity index, on the basis of the elm field experiment combined with an agricultural drainage irrigation model. The results show that the field displacement varies greatly under different irrigation conditions. When the irrigation water increased from 22 cm to 42 cm, i.e., by 20 cm, the field displacement increased by 11.92 cm, about 66.22% of the added irrigation water. When the irrigation water then increased from 42 cm to 68 cm, i.e., by 26 cm, the field displacement increased by 22.48 cm, accounting for 86.46% of the added irrigation water. There is thus an "inflection point" in the relation between the irrigation water amount and the field displacement amount. The load intensity increases with the increase of irrigation water and shows a significant power correlation. Under different irrigation conditions, the increase in load intensity with increasing irrigation water differs: when the irrigation water is smaller, the load intensity increases relatively little, and when the irrigation water is increased to about 42 cm, the load intensity increases considerably. In addition, there was a positive correlation between fertilization and load intensity. The load intensity differed markedly among fertilization modes even at the same fertilization level, in which the fertilizer field unit load intensity
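
    The "significant power correlation" reported above corresponds to fitting a model of the form load = a * W^b, which is commonly done by linear regression in log-log space. A sketch with made-up observations (the numbers are illustrative, not the field data):

```python
import numpy as np

def fit_power_law(irrigation_cm, load_intensity):
    """Fit load = a * irrigation**b by linear regression on log-transformed values."""
    x = np.log(np.asarray(irrigation_cm, dtype=float))
    y = np.log(np.asarray(load_intensity, dtype=float))
    b, log_a = np.polyfit(x, y, 1)  # slope = exponent, intercept = log(a)
    return np.exp(log_a), b

# Hypothetical field observations: irrigation depth (cm) vs. load intensity (kg/ha).
a, b = fit_power_law([22, 32, 42, 55, 68], [1.1, 2.0, 3.4, 6.2, 9.5])
print(f"load ~= {a:.3f} * W^{b:.2f}")
```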

  20. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as correction for in vivo metabolism of the tracer and for the radioactivity contribution from blood volume within the ROI, and estimation of the nondisplaceable ligand concentration, are also reviewed briefly

  1. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz , Teresa

    2014-01-01

    4 pages. Spanish version: "Software libre, software de código abierto, licencias", in which a procedure for the distribution of research software and data is proposed. The main goal of this document is to help the research community understand the basic concepts of software distribution: free software, open source software, and licenses. This document also includes a procedure for research software and data dissemination.

  2. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  3. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  4. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Directory of Open Access Journals (Sweden)

    Rígel Licier

    2016-10-01

    Full Text Available The proper handling of samples to be analyzed by mass spectrometry (MS can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  5. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  6. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective games or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  7. Material-specific Conversion Factors for Different Solid Phantoms Used in the Dosimetry of Different Brachytherapy Sources

    Directory of Open Access Journals (Sweden)

    Sedigheh Sina

    2015-07-01

    Full Text Available Introduction Based on Task Group No. 43 (TG-43U1) recommendations, a water phantom is proposed as the reference phantom for the dosimetry of brachytherapy sources. The experimental determination of TG-43 parameters is usually performed in water-equivalent solid phantoms. The purpose of this study was to determine the conversion factors for equalizing solid phantoms to water. Materials and Methods TG-43 parameters of low- and high-energy brachytherapy sources (i.e., Pd-103, I-125 and Cs-137) were obtained in different phantoms, using Monte Carlo simulations. The brachytherapy sources were simulated at the center of different phantoms including water, solid water, poly(methyl methacrylate), polystyrene and polyethylene. Dosimetric parameters such as the dose rate constant, radial dose function and anisotropy function of each source were compared in different phantoms. Then, conversion factors were obtained to make phantom parameters equivalent to those of water. Results Polynomial coefficients of conversion factors were obtained for all sources to quantitatively relate g(r) values in different phantom materials to the radial dose function in water. Conclusion Polynomial coefficients of conversion factors were obtained for all sources to quantitatively relate g(r) values in different phantom materials to the radial dose function in water.
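
    The conversion factors described above are essentially polynomial fits to the ratio of the radial dose function in water to that in the solid phantom, CF(r) = g_water(r)/g_phantom(r). A sketch of deriving such coefficients from tabulated g(r) values (the radii and g(r) numbers below are illustrative, not the published data):

```python
import numpy as np

def conversion_factor_coeffs(radii_cm, g_water, g_phantom, degree=3):
    """Polynomial coefficients (highest power first) of CF(r) = g_water(r)/g_phantom(r)."""
    ratio = np.asarray(g_water, dtype=float) / np.asarray(g_phantom, dtype=float)
    return np.polyfit(np.asarray(radii_cm, dtype=float), ratio, degree)

# Hypothetical radial dose function values for a low-energy source.
r = [0.5, 1.0, 2.0, 3.0, 5.0, 7.0]
g_w = [1.05, 1.00, 0.84, 0.69, 0.45, 0.29]
g_p = [1.04, 1.00, 0.86, 0.72, 0.49, 0.33]
print(np.poly1d(conversion_factor_coeffs(r, g_w, g_p)))
```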

  8. Quantitative analysis of Internet television and video (WebTV): A study of formats, content, and source

    Directory of Open Access Journals (Sweden)

    José Borja ARJONA MARTÍN

    2014-07-01

    Full Text Available Due to the significant increase in the last five years of audiovisual content distribution over the web, this paper is focused on a study aimed at the description and classification of a wide sample of audiovisual initiatives whose access is carried out by means of the World Wide Web. The purpose of this study is to promote the debate concerning the different names of these incipient media, as well as their categorization and description, so that an organised universe of the WebTV phenomenon can be provided. An analysis of formats and content is carried out on the basis of quantitative techniques in order to propose a categorization typology. These formats and content will be studied under three key variables: "Content", "Source" and "Domain .tv". "Content" will help us define the programmatic lines of our study sample; "Source" refers to the origin of a particular item of study (native WebTV or WebTV representative of a conventional medium); and "Domain .tv" will specify the proportion of case studies hosted under the .tv domain. The results obtained in this study will offer researchers and professionals a comprehensive description of the models currently adopted in the field of video and television on the net.

  9. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    Science.gov (United States)

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the crossplatform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.

  10. Monte Carlo calculations and neutron spectrometry in quantitative prompt gamma neutron activation analysis (PGNAA) of bulk samples using an isotopic neutron source

    International Nuclear Information System (INIS)

    Spyrou, N.M.; Awotwi-Pratt, J.B.; Williams, A.M.

    2004-01-01

    An activation analysis facility based on an isotopic neutron source (185 GBq 241 Am/Be) which can perform both prompt and cyclic activation analysis on bulk samples, has been used for more than 20 years in many applications including 'in vivo' activation analysis and the determination of the composition of bio-environmental samples, such as, landfill waste and coal. Although the comparator method is often employed, because of the variety in shape, size and elemental composition of these bulk samples, it is often difficult and time consuming to construct appropriate comparator samples for reference. One of the obvious problems is the distribution and energy of the neutron flux in these bulk and comparator samples. In recent years, it was attempted to adopt the absolute method based on a monostandard and to make calculations using a Monte Carlo code (MCNP4C2) to explore this further. In particular, a model of the irradiation facility has been made using the MCNP4C2 code in order to investigate the factors contributing to the quantitative determination of the elemental concentrations through prompt gamma neutron activation analysis (PGNAA) and most importantly, to estimate how the neutron energy spectrum and neutron dose vary with penetration depth into the sample. This simulation is compared against the scattered and transmitted neutron energy spectra that are experimentally and empirically determined using a portable neutron spectrometry system. (author)

  11. In vivo regional quantitation of intrathoracic /sup 99m/Tc using SPECT: concise communication

    International Nuclear Information System (INIS)

    Osborne, D.; Jaszczak, R.; Coleman, R.E.; Greer, K.; Lischko, M.

    1982-01-01

    A whole-body single-photon emission computed tomographic system (SPECT) was used to quantitate the activities of a series of /sup 99m/Tc point sources in the dog's thorax and to evaluate attenuation of a uniform esophageal line source containing a known concentration of /sup 99m/Tc. A first-order attenuation correction and an empirically derived attenuation coefficient of 0.09 cm-1 were used in the SPECT analyses of the intrathoracic point sources. The relationship between SPECT measurements of multiple point-source activities and the same sources measured in air was linear over a range of 100 to 1000 muCi (slope 1.08; R2 coefficient of determination 0.97). These data are sufficiently accurate to allow an estimate of the regional activity of radiopharmaceutical in the dog's thorax and justify their use in experimental quantitation of regional pulmonary perfusion
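
    The first-order correction quoted above amounts to multiplying the measured counts of a point source by exp(mu * d) for its depth d in tissue, then converting counts to activity with a linear calibration against the in-air measurements. A sketch of that arithmetic with hypothetical calibration numbers (only mu = 0.09 cm-1 is taken from the abstract):

```python
import math

def attenuation_corrected_activity(measured_counts, depth_cm,
                                   mu_per_cm=0.09, counts_per_uci=1.0):
    """First-order attenuation correction for a point source at a known depth.

    Multiplies measured counts by exp(mu * depth) to undo attenuation, then
    converts to activity (uCi) with a calibration factor that would be
    determined from in-air measurements (hypothetical value here).
    """
    corrected_counts = measured_counts * math.exp(mu_per_cm * depth_cm)
    return corrected_counts / counts_per_uci

# Illustration: 5 cm of tissue attenuates counts by exp(-0.45), about 0.64.
print(attenuation_corrected_activity(measured_counts=640.0, depth_cm=5.0))
```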

  12. A Madison-Numeracy Citation Index (2008-2015): Implementing a Vision for a Quantitatively Literate World

    Directory of Open Access Journals (Sweden)

    Nathan D. Grawe

    2017-01-01

    Full Text Available This editorial recognizes the contributions made by Bernard Madison to the field of quantitative literacy with a bibliographic index of his papers, edited volumes, and works contained therein that were cited in the first eight volumes (2008-2015) of Numeracy. In total, 61 citing papers ("sources") cite 42 Madison works ("citations") a total of 218 times. The source and citation indexes provided in the appendix at the end of this editorial make it easy to see the direct contribution of Madison's work to the arguments and debates contained in the founding years of the journal. For those who are new to the field of quantitative reasoning, the citation index also provides an essential reading list. Most of the citations and sources are open-access and links within the indexes aid easy access to Madison's important contribution to Numeracy and the quantitative literacy movement.

  13. Quantitation of mandibular symphysis volume as a source of bone grafting.

    Science.gov (United States)

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2010-06-01

    Autogenous intramembranous bone grafts present several advantages, such as minimal resorption and a high concentration of bone morphogenetic proteins. A method for measuring the amount of bone that can be harvested from the symphysis area has not been reported in real patients. The aim of the present study was to intrasurgically quantitate the volume of the symphysis bone graft that can be safely harvested in live patients and compare it with AutoCAD (version 16.0, Autodesk, Inc., San Rafael, CA, USA) tomographic calculations. The AutoCAD software program quantitated the symphysis bone graft in 40 patients using computerized tomographies. Direct intrasurgical measurements were recorded thereafter and compared with the AutoCAD data. The bone volume was measured at the recipient sites of a subgroup of 10 patients, 6 months post sinus augmentation. The volume of bone graft measured by AutoCAD averaged 1.4 mL (SD 0.6 mL, range: 0.5-2.7 mL). The volume of bone graft measured intrasurgically averaged 2.3 mL (SD 0.4 mL, range 1.7-2.8 mL). The statistical difference between the two measurement methods was significant. The bone volume measured at the recipient sites 6 months post sinus augmentation averaged 1.9 mL (SD 0.3 mL, range 1.3-2.6 mL) with a mean loss of 0.4 mL. AutoCAD did not overestimate the volume of bone that can be safely harvested from the mandibular symphysis. The use of the design software program may improve surgical treatment planning prior to sinus augmentation.

  14. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the fast development in biology research. In this work, we discuss the progress in the stable isotope labeling methods for quantitative proteomics including relative and absolute quantitative proteomics, and then give our opinions on the outlook of proteome quantification methods.

  15. The History of Cartographic Sources Development

    Directory of Open Access Journals (Sweden)

    L. Volkotrub

    2016-07-01

    Full Text Available Cartographic sources are a variety of descriptive sources. They include historical and geographical maps and circuit maps. Map images are a special kind of model of real phenomena, conveying their quantitative and qualitative characteristics, structure, interconnections and dynamics in graphic form. The prototypes of maps appeared as a way of transmitting information about the world, and people began to use this means of communication long before the appearance of writing. The quality of map images kept pace with the evolution of mapping and publishing techniques and methods. The general development of cartographic sources is determined primarily by three factors: the development of science and technology, society's needs for different cartographic works, and the political and economic situation of the country. Given this, a map is a self-sufficient phenomenon, and its source study is based on an understanding of the invariance of its perception. Modern theoretical concepts show us the invariance of maps. Specifically, a map is viewed in the following aspects: 1) it is one of the universal models of the land and of existing natural and social processes; 2) it is one of the tools of research and forecasting; 3) it is a specific language formation; 4) it is a method of transferring information. As a source, a map may contain important information about the physical geography, geology, hydrology, political-administrative division, population, flora and fauna of a particular area in a particular period. Mostly, cartographic sources are complex, because they contain a great deal of cognitive and historical information.

  16. Krakow conference on low emissions sources: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, B.L.; Butcher, T.A. [eds.]

    1995-12-31

    The Krakow Conference on Low Emission Sources presented the information produced and analytical tools developed in the first phase of the Krakow Clean Fossil Fuels and Energy Efficiency Program. This phase included: field testing to provide quantitative data on emissions and efficiencies as well as on opportunities for building energy conservation; engineering analysis to determine the costs of implementing pollution control; and incentives analysis to identify actions required to create a market for equipment, fuels, and services needed to reduce pollution. Collectively, these Proceedings contain reports that summarize the above phase one information, present the status of energy system management in Krakow, provide information on financing pollution control projects in Krakow and elsewhere, and highlight the capabilities and technologies of Polish and American companies that are working to reduce pollution from low emission sources. It is intended that the US reader will find in these Proceedings useful results and plans for control of pollution from low emission sources that are representative of heating systems in central and Eastern Europe. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  17. Optimization of atomic beam sources for polarization experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gaisser, Martin; Nass, Alexander; Stroeher, Hans [IKP, Forschungszentrum Juelich (Germany)

    2013-07-01

    For experiments with spin-polarized protons and neutrons a dense target is required. In current atomic beam sources an atomic hydrogen or deuterium beam is expanded through a cold nozzle and a system of sextupole magnets and RF-transition units selects a certain hyperfine state. The achievable flux seems to be limited to about 10{sup 17} particles per second with a high nuclear polarization. A lot of experimental and theoretical effort has been undertaken to understand all effects and to increase the flux. However, improvements have remained marginal. Now, a Monte Carlo simulation based on the DSMC part of the open source C++ library OpenFOAM is set up in order to get a better understanding of the flow and to optimize the various elements. It is intended to include important effects like deflection from magnetic fields, recombination on the walls and spin exchange collisions in the simulation and make quantitative predictions of changes in the experimental setup. The goal is to get a tool that helps to further increase the output of an atomic beam source. So far, a new binary collision model, magnetic fields, RF-transition units and a tool to measure the collision age are included. The next step will be to couple the whole simulation with an optimization algorithm implementing Adaptive Simulated Annealing (ASA) in order to automatically optimize the atomic beam source.

  18. A trial fabrication of activity standard surface sources and positional standard surface sources for an imaging plate system

    International Nuclear Information System (INIS)

    Sato, Yasushi; Hino, Yoshio; Yamada, Takahiro; Matsumoto, Mikio

    2003-01-01

    An imaging plate system can detect low-level activity, but quantitative analysis is difficult because there are no adequate standard surface sources. A new fabrication method was developed for standard surface sources by printing on a sheet of paper using an ink-jet printer with inks in which a radioactive material was mixed. The fabricated standard surface sources had high uniformity, high positional resolution, arbitrary shapes and a broad intensity range. The standard sources were used for measurement of surface activity as an application. (H. Yokoo)

  19. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

    Full Text Available The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to incur to low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without need for recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small stressed plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  20. Allometric Trajectories and "Stress": A Quantitative Approach.

    Science.gov (United States)

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between sources and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to incur to low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.

  1. Nonradioactive Environmental Emissions Chemical Source Term for the Double-Shell Tank (DST) Vapor Space During Waste Retrieval Operations

    International Nuclear Information System (INIS)

    MAY, T.H.

    2000-01-01

    A nonradioactive chemical vapor space source term for tanks on the Phase 1 and the extended Phase 1 delivery, storage, and disposal mission was determined. Operations modeled included mixer pump operation and DST waste transfers. Concentrations of ammonia, specific volatile organic compounds, and quantitative volumes of aerosols were estimated

  2. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract Background Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software. This process has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness. They also included patient specific spreadsheets for tracking of target populations and for evaluation of the impact of interventions. Conclusions The study demonstrated that quantitative tools including the development of definitions of readmissions, estimation of the risk of readmission, and patient specific spreadsheets could contribute to the improvement of patient outcomes in hospitals.
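
    The risk-estimation component described above amounts, at its simplest, to stratified readmission rates. A minimal pandas sketch with hypothetical column names and discharge records (not the Potentially Preventable Readmissions software or its definitions):

```python
import pandas as pd

# Hypothetical discharge-level data; the real software uses its own definitions.
df = pd.DataFrame({
    "age_band":   ["18-44", "45-64", "65+", "65+", "45-64", "18-44"],
    "severity":   [1, 2, 3, 4, 2, 1],
    "readmitted": [0, 1, 1, 1, 0, 0],   # 1 = potentially preventable readmission
})

# Stratified readmission rates: a crude risk estimate for each stratum.
risk = df.groupby(["age_band", "severity"])["readmitted"].agg(["mean", "size"])
risk = risk.rename(columns={"mean": "readmission_rate", "size": "n_discharges"})
print(risk)
```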

  3. Quantitative trait loci associated with anthracnose resistance in sorghum

    Science.gov (United States)

    With an aim to develop a durable resistance to the fungal disease anthracnose, two unique genetic sources of resistance were selected to create genetic mapping populations to identify regions of the sorghum genome that encode anthracnose resistance. A series of quantitative trait loci were identifi...

  4. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    Science.gov (United States)

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. Method parameters, including sample extraction, concentration, and digestion, were systematically evaluated and optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  5. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    Science.gov (United States)

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

    Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, it was difficult to evaluate hyphal fusion efficiency due to the low frequency in Aspergillus oryzae in spite of its industrial significance. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae with mixed culture of two different auxotrophic strains, where the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that hyphal fusion efficiency of A. oryzae was increased by depleting nitrogen source, including large amounts of carbon source, and adjusting pH to 7.0.

  6. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One of the assumptions frequently made is that of a ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but a quantitative examination of the effect of this assumption on the results of a consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of difference between the results of these studies was the different effective source release height assumed by each study. This underlines the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and its influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose calculated assuming ground-level release is compared with the doses calculated assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of ground-level release fundamentally precludes a detailed analysis that includes diffusion of the plume from the effective plume height down to the ground, even though this influence is relatively small at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites with low surface roughness, such as the Barakah site in
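
    The height sensitivity examined in this record can be reproduced in outline with a textbook Gaussian plume estimate of the ground-level, centreline air concentration. The sketch below is a simple illustration with assumed power-law dispersion coefficients and invented source parameters, not the HotSpot model:

```python
import numpy as np

def ground_level_centerline(Q, u, H, x):
    """Ground-level, centreline air concentration of a Gaussian plume.

    Q : source strength (Bq/s), u : wind speed (m/s),
    H : effective release height (m), x : downwind distance (m).
    Dispersion coefficients below are illustrative power laws, not HotSpot's.
    """
    sigma_y = 0.08 * x**0.9
    sigma_z = 0.06 * x**0.9
    return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2.0 * sigma_z**2))

Q, u = 1.0e9, 3.0                      # hypothetical release rate and wind speed
for x in (500.0, 2000.0, 10000.0):     # downwind distances in metres
    c_ground = ground_level_centerline(Q, u, H=0.0, x=x)
    c_stack = ground_level_centerline(Q, u, H=50.0, x=x)
    print(f"x = {x:7.0f} m   ground/elevated concentration ratio = {c_ground / c_stack:6.2f}")
```

    Under these assumptions the ground-level release assumption inflates near-field concentrations strongly and converges toward the elevated-release result only far downwind, which is the qualitative behaviour the record discusses.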

  7. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments that include heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.

  8. Laser Gas-Analyser for Monitoring a Source of Gas Pollution

    Directory of Open Access Journals (Sweden)

    V. A. Gorodnichev

    2015-01-01

    Full Text Available The problem of growing air pollution of the Earth is currently of great relevance. Many countries have taken measures to protect the environment in order to limit negative anthropogenic impacts. In this situation, objective information on the actual content of pollutants in the atmosphere is important. For operational inspection of pollutant concentrations and for monitoring pollution sources, it is necessary to create high-speed, high-sensitivity gas analysers. Laser meters are the most effective instruments for operational remote and local inspection of gas pollution of the Earth's atmosphere. A laser meter for routine gas analysis should conduct operational analysis of the gas mixture (air); this requires the development of appropriate information support. Such information support should include a database of absorption coefficients of pollutants (specific to potential sources of pollution) at the possible measuring wavelengths (holding data for a particular emitter of the laser meter) and efficient algorithms to select the measuring wavelengths and to conduct a quantitative analysis of gas mixtures. At present, the practically important issues related to the development of information support for a laser gas analyser performing routine measurements remain open. In this paper we develop an algorithm for the operational selection of the measuring wavelengths of a laser gas analyser and an algorithm to quantitatively recover the gaseous component concentrations of the monitored gas mixture from multi-spectral laser measurements; both take into account a priori information about the monitored source of gas pollution and do not require a large amount of computation. Mathematical simulation demonstrates the effectiveness of the described algorithms both for the selection of measuring wavelengths and for the quantitative analysis of gas releases.
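
    For optically thin mixtures, the quantitative recovery step described above reduces to inverting a Beer-Lambert system: each measured absorbance is a linear combination of component concentrations weighted by absorption coefficients at that wavelength. A minimal least-squares sketch with invented coefficients and absorbances:

```python
import numpy as np

# Rows = measuring wavelengths, columns = gases; entries are absorption
# coefficients k_ij (hypothetical values, units 1/(ppm*m)).
K = np.array([[2.1e-4, 0.3e-4, 0.1e-4],
              [0.2e-4, 1.8e-4, 0.2e-4],
              [0.1e-4, 0.4e-4, 2.5e-4],
              [0.9e-4, 0.8e-4, 0.7e-4]])

L = 100.0                                   # optical path length, m
A = np.array([0.043, 0.031, 0.052, 0.046])  # measured absorbances at the 4 wavelengths

# Beer-Lambert: A = (K @ c) * L  ->  recover concentrations c (ppm) by least squares.
c, *_ = np.linalg.lstsq(K * L, A, rcond=None)
print("recovered concentrations (ppm):", np.round(c, 2))
```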

  9. Absolute quantitative total-body small-animal SPECT with focusing pinholes

    NARCIS (Netherlands)

    Wu, C.; Van der Have, F.; Vastenhouw, B.; Dierckx, R.A.J.O.; Paans, A.M.J.; Beekman, F.J.

    2010-01-01

    Purpose: In pinhole SPECT, attenuation of the photon flux on trajectories between source and pinholes affects quantitative accuracy of reconstructed images. Previously we introduced iterative methods that compensate for image degrading effects of detector and pinhole blurring, pinhole sensitivity

  10. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

    'Full-text:' In a radiograph, the value of each pixel is related to the thickness of material crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of an object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process introduces several biases which prevent the quantitative information from being obtained: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In a first section, we propose a complete model of the radiographic image formation process taking these biases into account. In a second section, we present an inversion scheme of this model for a single-material object, which makes it possible to obtain the thickness map of the object crossed by the x-rays. (author)
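
    The offset-and-gain correction mentioned above is, for a locally linear detector, a flat-field step; combined with the Beer-Lambert law it gives a thickness map for a single-material object. A minimal sketch with hypothetical images and attenuation coefficient (ignoring the beam-hardening and scatter terms the paper models separately):

```python
import numpy as np

def thickness_map(raw, dark, flat, mu):
    """Estimate material thickness per pixel from a radiograph.

    raw  : radiograph of the object, dark : offset (no-beam) image,
    flat : gain (open-beam) image, mu : linear attenuation coefficient (1/cm).
    Assumes a locally linear detector and a single material, with no
    beam-hardening or scatter correction.
    """
    transmission = (raw - dark) / (flat - dark)
    transmission = np.clip(transmission, 1e-6, 1.0)     # guard against log(0)
    return -np.log(transmission) / mu                   # thickness in cm

rng = np.random.default_rng(0)
dark = rng.normal(100.0, 1.0, (4, 4))
flat = rng.normal(4000.0, 10.0, (4, 4))
raw = dark + (flat - dark) * np.exp(-0.2 * 1.5)         # uniform 1.5 cm object, mu = 0.2/cm
print(np.round(thickness_map(raw, dark, flat, mu=0.2), 2))
```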

  11. A source classification framework supporting pollutant source mapping, pollutant release prediction, transport and load forecasting, and source control planning for urban environments

    DEFF Research Database (Denmark)

    Lützhøft, Hans-Christian Holten; Donner, Erica; Wickman, Tonie

    2012-01-01

    for this purpose. Methods Existing source classification systems were examined by a multidisciplinary research team, and an optimised SCF was developed. The performance and usability of the SCF were tested using a selection of 25 chemicals listed as priority pollutants in Europe. Results The SCF is structured...... in the form of a relational database and incorporates both qualitative and quantitative source classification and release data. The system supports a wide range of pollution monitoring and management applications. The SCF functioned well in the performance test, which also revealed important gaps in priority...

  12. A Study on Conjugate Heat Transfer Analysis of Reactor Vessel including Irradiated Structural Heat Source

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kunwoo; Cho, Hyuksu; Im, Inyoung; Kim, Eunkee [KEPCO EnC, Daejeon (Korea, Republic of)

    2015-10-15

    Although material reliability programs (MRPs) are intended to provide evaluation and management methodologies for the operating RVI, similar evaluation methodologies can be applied to the APR1400 fleet in the design stage for the evaluation of neutron irradiation effects. The purposes of this study are: to predict the thermal behavior with and without the irradiated structural heat source; and to evaluate the effective thermal conductivity (ETC) in relation to isotropic and anisotropic conductivity of porous media for the APR1400 reactor vessel. CFD simulations are performed to evaluate the thermal behavior with and without the irradiated structural heat source and the effective thermal conductivity for the APR1400 reactor vessel. With the irradiated structural heat source applied, the maximum temperatures of the fluid and the core shroud for isotropic ETC are 325.8 °C and 341.5 °C, respectively. The total irradiated structural heat source amounts to about 5.41 MWth and does not significantly affect the fluid temperature.

  13. quanTLC, an online open-source solution for videodensitometric quantification.

    Science.gov (United States)

    Fichou, Dimitri; Morlock, Gertrud E

    2018-07-27

    The image is the key feature of planar chromatography. Videodensitometry by digital image conversion is the fastest way to evaluate it. Instead of scanning single sample tracks one after the other, only a few clicks are needed to convert all tracks at one go. A minimalistic software package was newly developed, termed quanTLC, that allows the quantitative evaluation of samples in a few minutes. quanTLC combines important assets such as being open-source, online, free of charge, intuitive to use and tailored to planar chromatography, as none of the nine existing software packages for image evaluation covers all of these aspects. quanTLC supports common image file formats for chromatogram upload. All necessary steps were included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as a user manual. The software capabilities are shown on the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source so that it can be exploited even further. The software itself is usable online without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
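
    The first processing step, videodensitogram extraction, essentially collapses each sample track of the chromatogram image into a one-dimensional profile along the development direction. A minimal sketch of that idea (not the quanTLC code, which is available at the URL above):

```python
import numpy as np

def videodensitogram(image, track_slice):
    """Collapse one chromatogram track into a densitogram.

    image       : 2-D grayscale array (rows = migration direction).
    track_slice : slice selecting the columns of one sample track.
    Returns a background-subtracted profile in which bands appear as positive peaks.
    """
    profile = image[:, track_slice].mean(axis=1)   # average across the track width
    baseline = np.percentile(profile, 95)          # bright plate background
    return baseline - profile                      # absorbing bands become positive peaks

# Hypothetical 100 x 60 pixel plate image with one dark band around row 40.
plate = np.full((100, 60), 220.0)
plate[38:43, 10:20] -= 80.0
signal = videodensitogram(plate, slice(10, 20))
print("peak position (pixel row):", int(signal.argmax()))
```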

  14. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  15. Indoor air quality environmental information handbook: Combustion sources

    Energy Technology Data Exchange (ETDEWEB)

    1990-06-01

    This environmental information handbook was prepared to assist both the non-technical reader (i.e., homeowner) and technical persons (such as researchers, policy analysts, and builders/designers) in understanding the current state of knowledge regarding combustion sources of indoor air pollution. Quantitative and descriptive data addressing the emissions, indoor concentrations, factors influencing indoor concentrations, and health effects of combustion-generated pollutants are provided. In addition, a review of the models, controls, and standards applicable to indoor air pollution from combustion sources is presented. The emphasis is on the residential environment. The data presented here have been compiled from government and privately-funded research results, conference proceedings, technical journals, and recent publications. It is intended to provide the technical reader with a comprehensive overview and reference source on the major indoor air quality aspects relating to indoor combustion activities, including tobacco smoking. In addition, techniques for determining potential concentrations of pollutants in residential settings are presented. This is an update of a 1985 study documenting the state of knowledge of combustion-generated pollutants in the indoor environment. 191 refs., 51 figs., 71 tabs.

  16. Optimization of atomic beam sources for polarization experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gaisser, Martin; Nass, Alexander; Stroeher, Hans [IKP, Forschungszentrum Juelich (Germany)

    2012-07-01

    For experiments with spin-polarized protons and neutrons a dense target is required. In current atomic beam sources an atomic hydrogen or deuterium beam is expanded through a cold nozzle, and a system of sextupole magnets and RF-transition units selects a certain hyperfine state. The achievable flux seems to be limited to about 10{sup 17} particles per second with a high nuclear polarization. A lot of experimental and theoretical effort has been undertaken to understand all effects and to increase the flux. However, improvements have remained marginal. Now, a Monte Carlo simulation based on the DSMC part of the open source C++ library OpenFOAM is being set up in order to get a better understanding of the flow and to optimize the various elements. The simulation is intended to include important effects like deflection by a magnetic field, recombination on the walls and spin exchange collisions, and to make quantitative predictions of changes in the experimental setup. The goal is to obtain a tool that helps to further increase the output of an atomic beam source.

  17. Calibration of Ge(Li) semiconductor detector by method using agar volume source

    International Nuclear Information System (INIS)

    Yanase, Nobuyuki; Kasai, Atsushi

    1979-12-01

    The Ge(Li) semiconductor detector was calibrated for measurements of environmental samples. The radioisotopes used for the standard sources are 22 Na, 51 Cr, 56 Co, 57 Co, 133 Ba, 137 Cs, 144 Ce and 241 Am. These are mixed with a hot agar aqueous solution and fixed uniformly in a cylindrical plastic case on cooling. The agar volume source is easier to handle than a liquid aqueous source. The prepared cylindrical standard sources have diameters of 6 and 8 cm and thicknesses of 1, 5, 10, 20, 30 and 40 mm (the last only for the 8 cm diameter). The radioactivities of the prepared standard sources are between 0.03 μCi and 0.2 μCi. The calibration takes only a week, excluding data processing. The full energy peak efficiency curves obtained include a 5 - 10% error due to the preparation of the agar source, the reference radioactivity data of the purchased standard solutions, the reference data on gamma-ray branching ratios, and the summing effect. The efficiency curves, however, are sufficient for quantitative analysis of environmental samples. (author)
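
    The full-energy peak efficiency these agar sources are used to determine is simply the net peak count rate divided by the gamma-ray emission rate of the source. A minimal sketch with illustrative counts and a 0.1 μCi reference activity (not values from the report):

```python
# Hypothetical net peak area from a 137Cs agar source counted for 10,000 s live time.
net_counts = 52000.0
live_time_s = 10000.0

activity_bq = 0.1 * 3.7e4        # 0.1 microcurie reference activity in becquerel
branching_ratio = 0.851          # 661.7 keV gamma emission probability of 137Cs

emission_rate = activity_bq * branching_ratio            # gammas emitted per second
efficiency = (net_counts / live_time_s) / emission_rate  # full-energy peak efficiency
print(f"full-energy peak efficiency: {efficiency:.4f}")
```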

  18. Sensitivity of a search for cosmic ray sources including magnetic field effects

    Energy Technology Data Exchange (ETDEWEB)

    Urban, Martin; Erdmann, Martin; Mueller, Gero [III. Physikalisches Institut A, RWTH Aachen University (Germany)

    2016-07-01

    We analyze the sensitivity of a new method investigating correlations between ultra-high energy cosmic rays and extragalactic sources taking into account deflections in the galactic magnetic field. In comparisons of expected and simulated arrival directions of cosmic rays we evaluate the directional characteristics and magnitude of the field. We show that our method is capable of detecting anisotropy in data sets with a low signal fraction.

  19. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    Science.gov (United States)

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
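
    The statistical steps named above (global normalization of peptide ratios, aggregation to protein level, and t-test statistics) can be sketched as follows; the data layout and numbers are hypothetical and this is not the Condenser code itself:

```python
import numpy as np
from scipy import stats

# Hypothetical log2 heavy/light peptide ratios (rows = peptides, cols = 3 replicates).
peptide_ratios = {
    "P12345": np.array([[0.9, 1.1, 0.8], [1.2, 0.9, 1.0], [1.0, 1.3, 0.9]]),
    "Q67890": np.array([[0.1, -0.2, 0.0], [-0.1, 0.2, 0.1]]),
}

# Global normalization: centre each replicate on the median ratio of ALL peptides,
# so that an unchanged proteome averages to a log2 ratio of zero.
all_peptides = np.vstack(list(peptide_ratios.values()))
replicate_median = np.median(all_peptides, axis=0)

for protein, ratios in peptide_ratios.items():
    protein_ratio = (ratios - replicate_median).mean(axis=0)  # one value per replicate
    # Test whether the protein's log ratio differs from 0 (no abundance change).
    t_stat, p_value = stats.ttest_1samp(protein_ratio, popmean=0.0)
    print(f"{protein}: mean log2 ratio = {protein_ratio.mean():+.2f}, p = {p_value:.3f}")
```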

  20. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability of the quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
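
    Combining the uncertainty elements "mathematically", as described above, usually means summing relative standard uncertainties in quadrature. A minimal sketch with illustrative component values (not the ones from the study):

```python
import math

# Illustrative relative standard uncertainties (fractions) of a plate-count result.
components = {
    "microorganism/analyst reading": 0.20,
    "pharmaceutical product matrix": 0.15,
    "dilution and plating":          0.10,
    "counting statistics":           0.12,
}

# Combined relative uncertainty: root sum of squares of the components.
u_combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined relative uncertainty: {u_combined:.0%}")   # below the ~35% ceiling reported
```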

  1. Quantitative analysis of 39 polybrominated diphenyl ethers by isotope dilution GC/low-resolution MS.

    Science.gov (United States)

    Ackerman, Luke K; Wilson, Glenn R; Simonich, Staci L

    2005-04-01

    A GC/low-resolution MS method for the quantitative isotope dilution analysis of 39 mono- to heptabrominated diphenyl ethers was developed. The effects of two different ionization sources, electron impact (EI) and electron capture negative ionization (ECNI), and the effects of their parameters on production of high-mass fragment ions [M - xH - yBr](-) specific to PBDEs were investigated. Electron energy, emission current, source temperature, ECNI system pressure, and choice of ECNI reagent gases were optimized. Previously unidentified enhancement of PBDE high-mass fragment ion [M - xH - yBr](-) abundance was achieved. Electron energy had the largest impact on PBDE high-mass fragment ion abundance for both the ECNI and EI sources. By monitoring high-mass fragment ions of PBDEs under optimized ECNI source conditions, quantitative isotope dilution analysis of 39 PBDEs was conducted using nine (13)C(12) labeled PBDEs on a low-resolution MS with low picogram to femtogram instrument detection limits.

  2. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  3. Spectral confocal reflection microscopy using a white light source

    Science.gov (United States)

    Booth, M.; Juškaitis, R.; Wilson, T.

    2008-08-01

    We present a reflection confocal microscope incorporating a white light supercontinuum source and spectral detection. The microscope provides images resolved spatially in three dimensions, in addition to spectral resolution covering the wavelength range 450-650 nm. Images and reflection spectra of artificial and natural specimens are presented, showing features that are not normally revealed in conventional microscopes or confocal microscopes using discrete line lasers. The specimens include thin film structures on semiconductor chips, iridescent structures in Papilio blumei butterfly scales, nacre from abalone shells and opal gemstones. Quantitative size and refractive index measurements of transparent beads are derived from spectral interference bands.

  4. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties, with applications ranging from fundamental physics to applied engineering research. Neutron radiography and tomography, on the other hand, are useful tools for the non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This, together with the unprecedented intensities available at spallation sources and improvements in computing power, allows for a re-assessment of the transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fractions. Spatial and time resolution (for dynamic experiments) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition
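
    For reference, the physics behind a Bragg edge is compact enough to state directly (standard diffraction theory, not text from the record): Bragg's law restricts coherent scattering from a lattice plane family (hkl) to wavelengths

```latex
\lambda = 2\, d_{hkl} \sin\theta \;\le\; 2\, d_{hkl},
```

    so the transmitted intensity rises abruptly at the edge wavelength equal to twice the lattice spacing; the positions and heights of these edges are what the pattern-decomposition approach fits to extract phase information.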

  5. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with a variety of modeled PSF kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF
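
    The figures of merit mentioned above, the contrast recovery coefficient and the noise measures, are straightforward region statistics. A minimal sketch over hypothetical reconstructed SUV values (not the XCAT/OS-EM pipeline itself):

```python
import numpy as np

def quantitation_metrics(tumour_roi, background_roi, true_contrast):
    """Contrast recovery and noise metrics for one reconstructed image.

    tumour_roi, background_roi : 1-D arrays of voxel SUVs in the two regions.
    true_contrast              : true tumour-to-background ratio of the phantom.
    """
    measured_contrast = tumour_roi.mean() / background_roi.mean()
    crc = (measured_contrast - 1.0) / (true_contrast - 1.0)   # contrast recovery coefficient
    roughness = background_roi.std() / background_roi.mean()  # image roughness (noise)
    return crc, roughness

rng = np.random.default_rng(1)
tumour = rng.normal(3.4, 0.5, 200)        # hypothetical SUVs inside a tumour
background = rng.normal(1.0, 0.12, 2000)  # hypothetical liver background SUVs
crc, roughness = quantitation_metrics(tumour, background, true_contrast=4.0)
print(f"CRC = {crc:.2f}, image roughness = {roughness:.2f}")
```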

  6. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  7. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choices for a radioisotope neutron source are 252 Cf and 241 Am-Be. In this study, 252 Cf with a neutron flux of 6.3x10^6 n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained using an in-house built radioisotopic neutron spectrometric system equipped with a 3 He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ∼0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei.
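
    The multivariate calibration step can be reproduced in outline with any PLS implementation: the predictors are the pulse-height spectra and the response is the known hydrogen content (represented here by polyethylene thickness). A minimal sketch with entirely synthetic spectra; scikit-learn's PLSRegression stands in for the authors' own calibration code:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Simulated pulse-height spectra (20 calibration runs x 128 channels) whose shape
# drifts with polyethylene thickness; entirely synthetic stand-in data.
thickness = np.linspace(1.0, 10.0, 20)                        # cm of polyethylene
channels = np.arange(128)
spectra = np.array([np.exp(-channels / (20.0 + 3.0 * t)) for t in thickness])
spectra += rng.normal(0.0, 0.01, spectra.shape)               # counting noise

pls = PLSRegression(n_components=3)
pls.fit(spectra, thickness)

# Predict the thickness (i.e. hydrogen content) of an "unknown" spectrum.
unknown = np.exp(-channels / (20.0 + 3.0 * 5.5)) + rng.normal(0.0, 0.01, 128)
print(f"predicted thickness: {pls.predict(unknown.reshape(1, -1))[0, 0]:.2f} cm")
```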

  8. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  9. A radiochemical separation of spallogenic 88Zr in the carrier-free state for radioisotopic photoneutron sources

    International Nuclear Information System (INIS)

    Whipple, R.E.; Grant, P.M.; Daniels, R.J.; Daniels, W.R.; O'Brien, H.A.Jr.

    1976-01-01

    As the precursor of its 88 Y daughter, 88 Zr could be advantageously included in the active component of the 88 Y-Be photoneutron source for several reasons. Mo targets were spallated with medium-energy protons at LAMPF, and a procedure has been developed to separate radiozirconium from the target material and various spallogenic impurities. 88 Zr can consequently be obtained carrier-free and in quantitative yield. (author)

  10. Data for iTRAQ secretomic analysis of Aspergillus fumigatus in response to different carbon sources

    Directory of Open Access Journals (Sweden)

    Sunil S. Adav

    2015-06-01

    Full Text Available Here, we provide data related to the research article entitled “Quantitative proteomics study of Aspergillus fumigatus secretome revealed deamidation of secretory enzymes” by Adav et al. (J. Proteomics, 2015) [1]. Aspergillus sp. plays an important role in lignocellulosic biomass recycling. To explore the biomass hydrolyzing enzymes of A. fumigatus, we profiled its secretome under different carbon sources such as glucose, cellulose, xylan and starch by high throughput quantitative proteomics using isobaric tags for relative and absolute quantification (iTRAQ). The data presented here represent the detailed comparative abundances of diverse groups of biomass hydrolyzing enzymes, including cellulases, hemicellulases, lignin degrading enzymes, and peptidases and proteases, and their post-translational modifications such as deamidation.

  11. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  12. Quantitative analysis of biological responses to low dose-rate γ-radiation, including dose, irradiation time, and dose-rate

    International Nuclear Information System (INIS)

    Magae, J.; Furukawa, C.; Kawakami, Y.; Hoshi, Y.; Ogata, H.

    2003-01-01

    Full text: Because biological responses to radiation are complex processes dependent on irradiation time as well as total dose, it is necessary to consider dose, dose-rate and irradiation time simultaneously to predict the risk of low dose-rate irradiation. In this study, we analyzed the quantitative relationship among dose, irradiation time and dose-rate, using chromosomal breakage and proliferation inhibition of human cells. For the evaluation of chromosome breakage we assessed micronuclei induced by radiation. U2OS cells, a human osteosarcoma cell line, were exposed to gamma-rays in an irradiation room bearing 50,000 Ci of 60 Co. After the irradiation, they were cultured for 24 h in the presence of cytochalasin B to block cytokinesis, the cytoplasm and nucleus were stained with DAPI and propidium iodide, and the number of binuclear cells bearing micronuclei was determined by fluorescence microscopy. For proliferation inhibition, cells were cultured for 48 h after the irradiation and [3H] thymidine was pulsed for 4 h before harvesting. The dose-rate in the irradiation room was measured with a photoluminescence dosimeter. While irradiation times of less than 24 h did not affect the dose-response curves for either biological response, the curves were markedly attenuated as the exposure time increased to more than 7 days. These biological responses were dependent on dose-rate rather than dose when cells were irradiated for 30 days. Moreover, the percentage of micronucleus-forming cells cultured continuously for more than 60 days at a constant dose-rate gradually decreased in spite of the total dose accumulation. These results suggest that biological responses at low dose-rate are markedly affected by exposure time, that they are dependent on dose-rate rather than total dose in the case of long-term irradiation, and that cells become resistant to radiation after continuous irradiation for 2 months. It is necessary to sufficiently include the effects of irradiation time and dose-rate to evaluate risk

  13. Using ensemble models to identify and apportion heavy metal pollution sources in agricultural soils on a local scale

    International Nuclear Information System (INIS)

    Wang, Qi; Xie, Zhiyi; Li, Fangbai

    2015-01-01

    This study aims to identify and apportion multi-source and multi-phase heavy metal pollution from natural and anthropogenic inputs using ensemble models that include stochastic gradient boosting (SGB) and random forest (RF) in agricultural soils on the local scale. The heavy metal pollution sources were quantitatively assessed, and the results illustrated the suitability of the ensemble models for the assessment of multi-source and multi-phase heavy metal pollution in agricultural soils on the local scale. The results of SGB and RF consistently demonstrated that anthropogenic sources contributed the most to the concentrations of Pb and Cd in agricultural soils in the study region and that SGB performed better than RF. - Highlights: • Ensemble models including stochastic gradient boosting and random forest are used. • The models were verified by cross-validation and SGB performed better than RF. • Heavy metal pollution sources on a local scale are identified and apportioned. • Models illustrate good suitability in assessing sources in local-scale agricultural soils. • Anthropogenic sources contributed most to soil Pb and Cd pollution in our case. - Multi-source and multi-phase pollution by heavy metals in agricultural soils on a local scale were identified and apportioned.
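
    Both ensemble learners named in this record have standard implementations. A hedged sketch of regressing a heavy-metal concentration on soil covariates and comparing the two models, with invented features and data (not the study's dataset):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Invented covariates for 200 sampling sites, e.g. distance to road, distance to
# industry, soil pH, organic matter (all rescaled to 0-1 for illustration).
X = rng.uniform(0, 1, size=(200, 4))
# Invented Pb concentration (mg/kg) dominated by the two "anthropogenic" covariates.
y = 40 + 80 * (1 - X[:, 0]) + 60 * (1 - X[:, 1]) + 10 * X[:, 2] + rng.normal(0, 5, 200)

sgb = GradientBoostingRegressor(subsample=0.5, random_state=0)  # subsample < 1 -> stochastic
rf = RandomForestRegressor(random_state=0)

for name, model in [("SGB", sgb), ("RF", rf)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    importances = model.fit(X, y).feature_importances_   # basis for source apportionment
    print(name, "cross-validated R2:", round(r2, 2), "importances:", np.round(importances, 2))
```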

  14. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part II: Data Sources from Specific Library Applications

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the second part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  15. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  16. MSQuant, an Open Source Platform for Mass Spectrometry-Based Quantitative Proteomics

    DEFF Research Database (Denmark)

    Mortensen, Peter; Gouw, Joost W; Olsen, Jesper V

    2010-01-01

    Mass spectrometry-based proteomics critically depends on algorithms for data interpretation. A current bottleneck in the rapid advance of proteomics technology is the closed nature and slow development cycle of vendor-supplied software solutions. We have created an open source software environment...

  17. Quantitative myocardial perfusion by O-15-water PET

    DEFF Research Database (Denmark)

    Thomassen, Anders; Petersen, Henrik; Johansen, Allan

    2015-01-01

    AIMS: Reporting of quantitative myocardial blood flow (MBF) is typically performed in standard coronary territories. However, coronary anatomy and myocardial vascular territories vary among individuals, and a coronary artery may erroneously be deemed stenosed or not if territorial demarcation...... disease (CAD). METHODS AND RESULTS: Forty-four patients with suspected CAD were included prospectively and underwent coronary CT-angiography and quantitative MBF assessment with O-15-water PET followed by invasive, quantitative coronary angiography, which served as reference. MBF was calculated...

  18. Nurse exposure to physical and nonphysical violence, bullying, and sexual harassment: a quantitative review.

    Science.gov (United States)

    Spector, Paul E; Zhou, Zhiqing E; Che, Xin Xuan

    2014-01-01

    This paper provides a quantitative review that estimates exposure rates by type of violence, setting, source, and world region. A quantitative review of the nursing violence literature was summarized. A literature search was conducted using the CINAHL, Medline and PsycInfo databases. Studies included had to report empirical results using a nursing sample, and include data on bullying, sexual harassment, and/or violence exposure rates. A total of 136 articles provided data on 151,347 nurses from 160 samples. Articles were identified through a database search and by consulting reference lists of review articles that were located. Relevant data were coded by the three authors. Categories depended on the availability of at least five studies. Exposure rates were coded as percentages of nurses in the sample who reported a given type of violence. Five types of violence were physical, nonphysical, bullying, sexual harassment, and combined (type of violence was not indicated). Setting, timeframe, country, and source of violence were coded. Overall violence exposure rates were 36.4% for physical violence, 66.9% for nonphysical violence, 39.7% for bullying, and 25% for sexual harassment, with 32.7% of nurses reporting having been physically injured in an assault. Rates of exposure varied by world region (Anglo, Asia, Europe and Middle East), with the highest rates for physical violence and sexual harassment in the Anglo region, and the highest rates of nonphysical violence and bullying in the Middle East. Regions also varied in the source of violence, with patients accounting for most of it in the Anglo and European regions, whereas patients' families/friends were the most common source in the Middle East. About a third of nurses worldwide indicated exposure to physical violence and bullying, about a third reported injury, about a quarter experienced sexual harassment, and about two-thirds indicated nonphysical violence. Physical violence was most prevalent in emergency

  19. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope using the transport of intensity equation method based on a smartphone. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its advantages of accuracy, high contrast, cost-effectiveness and portability, the portable smartphone based quantitative phase microscope is a promising tool which can in the future be adopted in remote healthcare and medical diagnosis.
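
    The transport of intensity equation that the app solves links the through-focus intensity derivative to the sample phase; for nearly uniform illumination it can be inverted with a single regularized division in Fourier space. A minimal sketch of that inversion (the regularization and geometry are assumptions, not the self-developed Android application's actual algorithm):

```python
import numpy as np

def tie_phase(I_minus, I_plus, I_focus, dz, wavelength, pixel, reg=1e-3):
    """Recover phase from a defocused image pair via the transport of intensity equation.

    Assumes nearly uniform in-focus intensity, so
        dI/dz = -(wavelength / (2*pi)) * I_focus * laplacian(phase),
    and inverts the Laplacian with a regularized division in Fourier space.
    """
    dIdz = (I_plus - I_minus) / (2.0 * dz)
    ny, nx = dIdz.shape
    fy = np.fft.fftfreq(ny, d=pixel)
    fx = np.fft.fftfreq(nx, d=pixel)
    k2 = np.add.outer(fy**2, fx**2)                      # squared spatial frequency
    denom = 2.0 * np.pi * wavelength * I_focus.mean() * (k2 + reg * k2.max())
    phase_ft = np.fft.fft2(dIdz) / denom
    phase_ft[0, 0] = 0.0                                 # piston term is undefined
    return np.real(np.fft.ifft2(phase_ft))

# Hypothetical 256 x 256 defocus stack captured with the smartphone add-on.
shape = (256, 256)
I_minus, I_focus, I_plus = (np.ones(shape) for _ in range(3))
phase = tie_phase(I_minus, I_plus, I_focus, dz=2e-6, wavelength=530e-9, pixel=1.1e-6)
print(phase.shape)
```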

  20. Application of an image processing software for quantitative autoradiography

    International Nuclear Information System (INIS)

    Sobeslavsky, E.; Bergmann, R.; Kretzschmar, M.; Wenzel, U.

    1993-01-01

    The present communication deals with the utilization of an image processing device for quantitative whole-body autoradiography, cell counting and also for interpretation of chromatograms. It is shown that the system parameters allow an adequate and precise determination of optical density values. Also shown are the main error sources limiting the applicability of the system. (orig.)

  1. A simple, semi-quantitative method for measuring pulsed soft x-rays

    International Nuclear Information System (INIS)

    Takahama, Y.; Du, J.; Yanagidaira, T.; Hirano, K.

    1993-01-01

    A simple semi-quantitative measurement and image processing system for pulsed soft X-rays with a time and spatial resolution is proposed. Performance of the system is examined using a cylindrical soft X-ray source generated with a plasma device. The system consists of commercial facilities which are easily obtained such as a microchannel plate-phosphor screen combination, a CCD camera, an image memory board and a personal computer. To make a quantitative measurement possible, the image processing and observation of the phosphor screen current are used in conjunction. (author)

  2. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One of such innovations is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    Science.gov (United States)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Program summary: Catalogue identifier: AFBT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU GPLv3. No. of lines in distributed program, including test data, etc.: 913552. No. of bytes in distributed program, including test data, etc.: 270876249. Distribution format: tar.gz. Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on user parameters, typically between several gigabytes and several tens of gigabytes. Classification: 6.5, 18. Nature of problem: Speed up of data processing in optical coherence microscopy. Solution method: Utilization of GPU for massively parallel data processing. Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data). Running time: 1.8 s for one B-scan (150 × faster in comparison to the CPU

  4. Lunar neutron source function

    International Nuclear Information System (INIS)

    Kornblum, J.J.

    1974-01-01

    The search for a quantitative neutron source function for the lunar surface region is justified because it contributes to our understanding of the history of the lunar surface and of nuclear processes occurring on the moon since its formation. A knowledge of the neutron source function and neutron flux distribution is important for the interpretation of many experimental measurements. This dissertation uses the available pertinent experimental measurements together with theoretical calculations to obtain an estimate of the lunar neutron source function below 15 MeV. Based upon reasonable assumptions, a lunar neutron source function having adjustable parameters is assumed for neutrons below 15 MeV. The lunar neutron source function is composed of several components resulting from the interaction of cosmic rays with lunar material. A comparison with previous neutron calculations is made and significant differences are discussed. Application of the results to the problem of lunar soil histories is examined using the statistical model for soil development proposed by Fireman. The conclusion is drawn that the moon is losing mass

  5. Using ensemble models to identify and apportion heavy metal pollution sources in agricultural soils on a local scale.

    Science.gov (United States)

    Wang, Qi; Xie, Zhiyi; Li, Fangbai

    2015-11-01

    This study aims to identify and apportion multi-source and multi-phase heavy metal pollution from natural and anthropogenic inputs using ensemble models that include stochastic gradient boosting (SGB) and random forest (RF) in agricultural soils on the local scale. The heavy metal pollution sources were quantitatively assessed, and the results illustrated the suitability of the ensemble models for the assessment of multi-source and multi-phase heavy metal pollution in agricultural soils on the local scale. The results of SGB and RF consistently demonstrated that anthropogenic sources contributed the most to the concentrations of Pb and Cd in agricultural soils in the study region and that SGB performed better than RF. Copyright © 2015 Elsevier Ltd. All rights reserved.
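
    To illustrate the ensemble-model idea in general terms (this sketch is not from the study): gradient-boosting and random-forest regressors can be fitted to metal concentrations against candidate natural and anthropogenic predictors, and their feature importances compared. The predictor names and data below are hypothetical.

```python
# Minimal sketch of the ensemble-model idea: fit stochastic gradient boosting (SGB)
# and random forest (RF) regressors to soil heavy-metal concentrations and compare
# the relative importance of natural vs. anthropogenic predictors.
# Feature names and data here are invented, not from the study.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

rng = np.random.default_rng(0)
n = 200
X = rng.random((n, 4))                      # columns: parent rock, pH, road density, irrigation
y = 0.2 * X[:, 0] + 0.5 * X[:, 2] + 0.3 * X[:, 3] + 0.05 * rng.standard_normal(n)

features = ["parent_rock", "soil_pH", "road_density", "irrigation_water"]
sgb = GradientBoostingRegressor(subsample=0.5, random_state=0).fit(X, y)   # "stochastic" via subsample < 1
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

for name, model in [("SGB", sgb), ("RF", rf)]:
    importance = dict(zip(features, model.feature_importances_))
    print(name, {k: round(v, 2) for k, v in importance.items()})
```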

  6. Data Acceptance Criteria for Standardized Human-Associated Fecal Source Identification Quantitative Real-Time PCR Methods

    Science.gov (United States)

    There is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data q...

  7. A quantitative approach to the loading rate of seismogenic sources in Italy

    Science.gov (United States)

    Caporali, Alessandro; Braitenberg, Carla; Montone, Paola; Rossi, Giuliana; Valensise, Gianluca; Viganò, Alfio; Zurutuza, Joaquin

    2018-03-01

    To investigate the transfer of elastic energy between a regional stress field and a set of localized faults we project the stress rate tensor inferred from the Italian GNSS velocity field onto faults selected from the Database of Individual Seismogenic Sources (DISS 3.2.0). For given Lamé constants and friction coefficient we compute the loading rate on each fault in terms of the Coulomb Failure Function (CFF) rate. By varying the strike, dip and rake angles around the nominal DISS values, we also estimate the geometry of planes that are optimally oriented for maximal CFF rate. Out of 86 Individual Seismogenic Sources (ISSs), all well covered by GNSS data, 78 to 81 (depending on the assumed friction coefficient) load energy at a rate of 0-4 kPa/yr. The faults displaying larger CFF rates (4 to 6 ± 1 kPa/yr) are located in the central Apennines and are all characterized by a significant strike-slip component. We also find that the loading rate of 75 per cent of the examined sources is less than 1 kPa/yr lower than that of optimally oriented faults. We also analyzed the 24 August and 30 October 2016, central Apennines earthquakes (Mw 6.0-6.5 respectively). The strike of their causative faults based on seismological and tectonic data and the geodetically inferred strike differ by < 30°. Some sources exhibit a strike oblique to the direction of maximum strain rate, suggesting that in some instances the present-day stress acts on inherited faults. The choice of the friction coefficient only marginally affects this result.

  8. A quantitative approach to the loading rate of seismogenic sources in Italy

    Science.gov (United States)

    Caporali, Alessandro; Braitenberg, Carla; Montone, Paola; Rossi, Giuliana; Valensise, Gianluca; Viganò, Alfio; Zurutuza, Joaquin

    2018-06-01

    To investigate the transfer of elastic energy between a regional stress field and a set of localized faults, we project the stress rate tensor inferred from the Italian GNSS (Global Navigation Satellite Systems) velocity field onto faults selected from the Database of Individual Seismogenic Sources (DISS 3.2.0). For given Lamé constants and friction coefficient, we compute the loading rate on each fault in terms of the Coulomb failure function (CFF) rate. By varying the strike, dip and rake angles around the nominal DISS values, we also estimate the geometry of planes that are optimally oriented for maximal CFF rate. Out of 86 Individual Seismogenic Sources (ISSs), all well covered by GNSS data, 78-81 (depending on the assumed friction coefficient) load energy at a rate of 0-4 kPa yr-1. The faults displaying larger CFF rates (4-6 ± 1 kPa yr-1) are located in the central Apennines and are all characterized by a significant strike-slip component. We also find that the loading rate of 75% of the examined sources is less than 1 kPa yr-1 lower than that of optimally oriented faults. We also analysed 2016 August 24 and October 30 central Apennines earthquakes (Mw 6.0-6.5, respectively). The strike of their causative faults based on seismological and tectonic data and the geodetically inferred strike differ by <30°. Some sources exhibit a strike oblique to the direction of maximum strain rate, suggesting that in some instances the present-day stress acts on inherited faults. The choice of the friction coefficient only marginally affects this result.
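
    A minimal sketch of the projection step described in both versions of this record: a stress-rate tensor is resolved onto a fault plane defined by strike, dip and rake, and the Coulomb failure function rate is formed as the shear-stress rate plus friction times the normal-stress rate (a tension-positive convention is assumed here). The tensor, angles and friction value below are illustrative, not DISS values.

```python
# Hedged sketch of CFF_rate = tau_rate + mu * sigma_n_rate for a given fault geometry.
# The stress-rate tensor and fault angles are illustrative values only.
import numpy as np

def fault_vectors(strike_deg, dip_deg, rake_deg):
    """Unit normal and slip vectors in a north-east-down frame (Aki & Richards convention)."""
    s, d, r = np.radians([strike_deg, dip_deg, rake_deg])
    n = np.array([-np.sin(d) * np.sin(s),  np.sin(d) * np.cos(s), -np.cos(d)])   # fault normal
    u = np.array([ np.cos(r) * np.cos(s) + np.cos(d) * np.sin(r) * np.sin(s),
                   np.cos(r) * np.sin(s) - np.cos(d) * np.sin(r) * np.cos(s),
                  -np.sin(r) * np.sin(d)])                                        # slip direction
    return n, u

def cff_rate(stress_rate, strike, dip, rake, mu=0.4):
    n, u = fault_vectors(strike, dip, rake)
    traction = stress_rate @ n
    sigma_n_rate = traction @ n          # normal stress rate (tension positive)
    tau_rate = traction @ u              # shear stress rate resolved in the rake direction
    return tau_rate + mu * sigma_n_rate

# Example: a hypothetical stress-rate tensor (kPa/yr) and a normal-fault geometry.
sigma_dot = np.diag([1.0, -0.5, -0.5])
print(round(cff_rate(sigma_dot, strike=135, dip=50, rake=-90), 3), "kPa/yr")
```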

  9. Physical and Chemical Barriers in Root Tissues Contribute to Quantitative Resistance to Fusarium oxysporum f. sp. pisi in Pea

    Directory of Open Access Journals (Sweden)

    Moustafa Bani

    2018-02-01

    Fusarium wilt caused by Fusarium oxysporum f. sp. pisi (Fop) is one of the most destructive diseases of pea worldwide. Control of this disease is difficult and is mainly based on the use of resistant cultivars. While monogenic resistance has been successfully used in the field, it is at risk of breakdown by the constant evolution of the pathogen. New sources of quantitative resistance have recently been identified from a wild relative Pisum spp. collection. Here, we characterize histologically the resistance mechanisms occurring in these sources of quantitative resistance. A detailed comparison at the cellular level of the reactions of eight pea accessions with differential responses to Fop race 2 showed that resistant accessions established several barriers at the epidermis, exodermis, cortex, endodermis and vascular stele, efficiently impeding fungal progression. The main components of these different barriers were carbohydrates and phenolic compounds including lignin. We found that these barriers were mainly based on three defense mechanisms: cell wall strengthening, formation of papilla-like structures at penetration sites and accumulation of different substances within and between cells. These defense reactions varied in intensity and localization between resistant accessions. Our results also clarify some steps of the infection process of F. oxysporum in the plant and support the important role of cell wall-degrading enzymes in F. oxysporum pathogenicity.

  10. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene
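
    As a generic illustration of the normalization family mentioned in this record, the sketch below shows plain quantile normalization (not TRAM's "scaled quantile" variant, which additionally handles platforms with different gene counts). The expression matrix is made up for illustration.

```python
# Hedged sketch of plain quantile normalization across samples; values are invented.
import numpy as np

def quantile_normalize(expr):
    """expr: genes x samples matrix; returns a matrix where all columns share one distribution."""
    ranks = expr.argsort(axis=0).argsort(axis=0)          # rank of each gene within its sample
    mean_sorted = np.sort(expr, axis=0).mean(axis=1)      # reference distribution: mean of sorted columns
    return mean_sorted[ranks]

expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```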

  11. Quantitation of images from a multiwire camera for autoradiography of tritium-labelled substances

    International Nuclear Information System (INIS)

    Lockett, S.J.; Ramsden, D.B.; Bradwell, A.R.

    1987-01-01

    It has been shown that tritium-labelled substances in two-dimensional systems can be quantitated using a multiwire camera. Its accuracy has now been improved by correcting results for non-uniformity of response over the detection area. Uniformity was assessed by imaging plates of nominally uniform activity. The results were then used to correct images from plates containing tritium-labelled proteins using a computer program. Errors were reduced from 11.3 (±6.1) to 7.7 (±2.8)% for standard sources and from 6.2 (±1.8) to 1.9 (±0.6)% for a plate containing the labelled proteins. The conducting carbon layer covering the plate absorbed 36 (±3)% of the tritium beta radiation and was estimated to be 85 nm in thickness. Quantitation of the labelled proteins by the camera gave a good correlation with protein content (chi-squared: 30-40%). The activities of the protein samples were measured to an accuracy of 10% by comparison with standard sources. These results indicate useful quantitation of tritiated compounds in two-dimensional media using the multiwire camera. (author)
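
    The non-uniformity correction described above amounts to a flat-field division by a sensitivity map derived from a nominally uniform plate. A minimal sketch with synthetic arrays (not the authors' program) follows.

```python
# Hedged sketch of flat-field (non-uniformity) correction; all arrays are synthetic placeholders.
import numpy as np

def flat_field_correct(sample_img, uniform_img):
    sensitivity = uniform_img / uniform_img.mean()        # relative response of each detector region
    return sample_img / sensitivity

rng = np.random.default_rng(1)
response = 1.0 + 0.1 * rng.standard_normal((64, 64))      # spatially varying detector response
true_activity = np.full((64, 64), 100.0)
measured = true_activity * response
corrected = flat_field_correct(measured, uniform_img=500.0 * response)
print(measured.std(), corrected.std())                    # spread shrinks after correction
```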

  12. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  13. Urban PM in Eastern Germany: Source apportionment and contributions from different spatial scales

    Science.gov (United States)

    van Pinxteren, D.; Fomba, K. W.; Mothes, F.; Spindler, G.; Herrmann, H.

    2017-12-01

    Understanding the contributions of particulate matter (PM) sources and the source areas impacting total PM levels in a city are important requirements for further developing clean air policies and efficient abatement strategies. This presentation reports on two studies in Eastern Germany providing a detailed picture of present-day urban PM sources and discriminating contributions of local, regional and long-range sources. The "Leipzig Aerosol 2013-15" study yielded contributions of 12 sources to coarse, fine, and ultrafine particles, resolved by Positive Matrix Factorization (PMF) from comprehensive chemical speciation of 5-stage Berner impactor samples at 4 different sites in the Leipzig area. Dominant winter-time sources were traffic exhaust and non-exhaust emissions, secondary aerosol formation, and combustion emissions from both biomass and coal burning with different relative importance in different particle size ranges. Local sources dominated PM levels in ultrafine and coarse particles (60% - 80%) while high mass concentrations in accumulation mode particles mainly resulted from regional import into the city (70%). The "PM-East" study compiled PM10 mass and constituents' concentrations at 10 urban and rural sites in Eastern Germany during winter 2016/17, which included a 3-week episode of frequent exceedances of the PM10 limit value. PMF source apportionment is performed for a subset of the sites, including the city of Berlin. Contributions from short-, mid-, and long-range sources, including trans-boundary pollution import from neighbouring countries, are quantitatively assessed by advanced back trajectory statistical methods. Data analysis in PM-East is ongoing and final results will be available by November. Funding is acknowledged from 4 federal states of Germany: Berlin Senate Department for Environment, Transport and Climate Protection; Saxon State Office for Environment, Agriculture and Geology; State Agency for Environment, Nature Conservation and
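
    For readers unfamiliar with PMF, the sketch below shows the underlying factor-analytic idea using a plain non-negative matrix factorization; dedicated PMF implementations (e.g. EPA PMF) additionally weight each data point by its measurement uncertainty, which is omitted here. The species counts, source counts and data are invented.

```python
# Hedged sketch of factor-analytic source apportionment in the spirit of PMF:
# X (samples x species) is factorized into source contributions G and profiles F.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
profiles = rng.random((3, 8))                 # 3 hypothetical sources x 8 chemical species
contributions = rng.random((50, 3))           # 50 samples x 3 sources
X = contributions @ profiles + 0.01 * rng.random((50, 8))

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)                    # estimated source contributions per sample
F = model.components_                         # estimated source profiles
print(G.shape, F.shape)
```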

  14. Classification of quantitative light-induced fluorescence images using convolutional neural network

    NARCIS (Netherlands)

    Imangaliyev, S.; van der Veen, M.H.; Volgenant, C.M.C.; Loos, B.G.; Keijser, B.J.F.; Crielaard, W.; Levin, E.; Lintas, A.; Rovetta, S.; Verschure, P.F.M.J.; Villa, A.E.P.

    2017-01-01

    Images are an important data source for diagnosis of oral diseases. The manual classification of images may lead to suboptimal treatment procedures due to subjective errors. In this paper an image classification algorithm based on a Deep Learning framework is applied to Quantitative Light-induced

  15. In vivo quantitative imaging of point-like bioluminescent and fluorescent sources: Validation studies in phantoms and small animals post mortem

    Science.gov (United States)

    Comsa, Daria Craita

    2008-10-01

    There is a real need for improved small animal imaging techniques to enhance the development of therapies in which animal models of disease are used. Optical methods for imaging have been extensively studied in recent years, due to their high sensitivity and specificity. Methods like bioluminescence and fluorescence tomography report promising results for 3D reconstructions of source distributions in vivo. However, no standard methodology exists for optical tomography, and various groups are pursuing different approaches. In a number of studies on small animals, the bioluminescent or fluorescent sources can be reasonably approximated as point or line sources. Examples include images of bone metastases confined to the bone marrow. Starting with this premise, we propose a simpler, faster, and inexpensive technique to quantify optical images of point-like sources. The technique avoids the computational burden of a tomographic method by using planar images and a mathematical model based on diffusion theory. The model employs in situ optical properties estimated from video reflectometry measurements. Modeled and measured images are compared iteratively using a Levenberg-Marquardt algorithm to improve estimates of the depth and strength of the bioluminescent or fluorescent inclusion. The performance of the technique to quantify bioluminescence images was first evaluated on Monte Carlo simulated data. Simulated data also facilitated a methodical investigation of the effect of errors in tissue optical properties on the retrieved source depth and strength. It was found that, for example, an error of 4 % in the effective attenuation coefficient led to 4 % error in the retrieved depth for source depths of up to 12mm, while the error in the retrieved source strength increased from 5.5 % at 2mm depth, to 18 % at 12mm depth. Experiments conducted on images from homogeneous tissue-simulating phantoms showed that depths up to 10mm could be estimated within 8 %, and the relative
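
    A minimal sketch of the fitting idea described above: a diffusion-theory model of the surface signal from a point source at depth d and strength S is compared with a measured profile and refined with a Levenberg-Marquardt solver. The simple infinite-medium expression and the optical-property values used here are placeholders for the full model and the in situ estimates described in the record.

```python
# Hedged sketch: Levenberg-Marquardt fit of a diffusion-theory point-source model.
# MU_EFF and D are assumed optical properties; the "measured" profile is simulated.
import numpy as np
from scipy.optimize import least_squares

MU_EFF = 0.25   # effective attenuation coefficient, 1/mm (assumed known from reflectometry)
D = 0.33        # diffusion coefficient, mm

def surface_signal(rho, depth, strength):
    r = np.sqrt(rho**2 + depth**2)
    return strength * np.exp(-MU_EFF * r) / (4.0 * np.pi * D * r)

rho = np.linspace(0.0, 20.0, 80)                                  # radial distance on the surface, mm
measured = surface_signal(rho, depth=6.0, strength=1e4)
measured *= 1.0 + 0.02 * np.random.default_rng(3).standard_normal(rho.size)

fit = least_squares(lambda p: surface_signal(rho, *p) - measured,
                    x0=[3.0, 1e3], method="lm")
print("depth = %.2f mm, strength = %.0f" % tuple(fit.x))
```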

  16. Design database for quantitative trait loci (QTL) data warehouse, data mining, and meta-analysis.

    Science.gov (United States)

    Hu, Zhi-Liang; Reecy, James M; Wu, Xiao-Lin

    2012-01-01

    A database can be used to warehouse quantitative trait loci (QTL) data from multiple sources for comparison, genomic data mining, and meta-analysis. A robust database design involves sound data structure logistics, meaningful data transformations, normalization, and proper user interface designs. This chapter starts with a brief review of relational database basics and concentrates on issues associated with curation of QTL data into a relational database, with emphasis on the principles of data normalization and structure optimization. In addition, some simple examples of QTL data mining and meta-analysis are included. These examples are provided to help readers better understand the potential and importance of sound database design.

  17. Source composition of cosmic rays at high energy

    International Nuclear Information System (INIS)

    Juliusson, E.; Cesarsky, C.J.; Meneguzzi, M.; Casse, M.

    1975-01-01

    The source composition of cosmic rays is usually calculated at an energy of a few GeV per nucleon. Recent measurements have, however, indicated that the source composition may be energy dependent. In order to give a quantitative answer to this question, the source composition at 50 GeV/nucleon has been calculated using an exponential distribution of path lengths and in the slab approximation. The results obtained at high energy agree very well with the source composition obtained at lower energies, except for the abundance of carbon, which is significantly lower than the generally accepted value at low energies

  18. Spectrally and Radiometrically Stable, Wideband, Onboard Calibration Source

    Science.gov (United States)

    Coles, James B.; Richardson, Brandon S.; Eastwood, Michael L.; Sarture, Charles M.; Quetin, Gregory R.; Porter, Michael D.; Green, Robert O.; Nolte, Scott H.; Hernandez, Marco A.; Knoll, Linley A.

    2013-01-01

    The Onboard Calibration (OBC) source incorporates a medical/scientific-grade halogen source with a precisely designed fiber coupling system, and a fiber-based intensity-monitoring feedback loop that results in radiometric and spectral stabilities to within less than 0.3 percent over a 15-hour period. The airborne imaging spectrometer systems developed at the Jet Propulsion Laboratory incorporate OBC sources to provide auxiliary in-use system calibration data. The use of the OBC source will provide a significant increase in the quantitative accuracy, reliability, and resulting utility of the spectral data collected from current and future imaging spectrometer instruments.

  19. Geologic sources and concentrations of selenium in the West-Central Denver Basin, including the Toll Gate Creek watershed, Aurora, Colorado, 2003-2007

    Science.gov (United States)

    Paschke, Suzanne S.; Walton-Day, Katherine; Beck, Jennifer A.; Webbers, Ank; Dupree, Jean A.

    2014-01-01

    Toll Gate Creek, in the west-central part of the Denver Basin, is a perennial stream in which concentrations of dissolved selenium have consistently exceeded the Colorado aquatic-life standard of 4.6 micrograms per liter. Recent studies of selenium in Toll Gate Creek identified the Denver lignite zone of the non-marine Cretaceous to Tertiary-aged (Paleocene) Denver Formation underlying the watershed as the geologic source of dissolved selenium to shallow ground-water and surface water. Previous work led to this study by the U.S. Geological Survey, in cooperation with the City of Aurora Utilities Department, which investigated geologic sources of selenium and selenium concentrations in the watershed. This report documents the occurrence of selenium-bearing rocks and groundwater within the Cretaceous- to Tertiary-aged Denver Formation in the west-central part of the Denver Basin, including the Toll Gate Creek watershed. The report presents background information on geochemical processes controlling selenium concentrations in the aquatic environment and possible geologic sources of selenium; the hydrogeologic setting of the watershed; selenium results from groundwater-sampling programs; and chemical analyses of solids samples as evidence that weathering of the Denver Formation is a geologic source of selenium to groundwater and surface water in the west-central part of the Denver Basin, including Toll Gate Creek. Analyses of water samples collected from 61 water-table wells in 2003 and from 19 water-table wells in 2007 indicate dissolved selenium concentrations in groundwater in the west-central Denver Basin frequently exceeded the Colorado aquatic-life standard and in some locations exceeded the primary drinking-water standard of 50 micrograms per liter. The greatest selenium concentrations were associated with oxidized groundwater samples from wells completed in bedrock materials. Selenium analysis of geologic core samples indicates that total selenium

  20. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed is the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  1. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed is the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
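
    A toy illustration of the BBN idea behind this record (not the RASTEP plant model): a few binary nodes with hand-assigned conditional probability tables, and the posterior over a release category computed by enumerating the joint distribution as new evidence arrives. All node names, states and probabilities are invented for this sketch.

```python
# Hedged sketch of Bayesian-belief-network inference with invented CPTs (not RASTEP data).
import numpy as np

p_cd = np.array([0.99, 0.01])                     # P(CoreDamage): [no, yes]
p_cs_given_cd = np.array([[0.9, 0.1],             # P(Sprays | CoreDamage=no):  [on, failed]
                          [0.6, 0.4]])            # P(Sprays | CoreDamage=yes): [on, failed]
p_rel_given_cd_cs = np.array([[[0.999, 0.001],    # P(Release | CD=no,  Sprays=on):     [low, high]
                               [0.95,  0.05 ]],   # P(Release | CD=no,  Sprays=failed)
                              [[0.90,  0.10 ],    # P(Release | CD=yes, Sprays=on)
                               [0.30,  0.70 ]]])  # P(Release | CD=yes, Sprays=failed)

# Joint distribution P(CD, Sprays, Release) via the chain rule, then condition on evidence.
joint = p_cd[:, None, None] * p_cs_given_cd[:, :, None] * p_rel_given_cd_cs

def posterior_release(sprays_failed):
    ev = joint[:, 1 if sprays_failed else 0, :]   # slice on the observed spray state
    return ev.sum(axis=0) / ev.sum()              # marginalize core damage, then normalize

print("P(release = low, high | sprays failed):", posterior_release(True).round(4))
```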

  2. Quantitative and Qualitative Sources of Affect: How Unexpectedness and Valence Relate to Pleasantness and Preference. Technical Report No. 293.

    Science.gov (United States)

    Iran-Nejad, Asghar; Ortony, Andrew

    Optimal-level theories maintain that the quality of affect is a function of a quantitative arousal potential dimension. An alternative view is that the quantitative dimension merely modulates preexisting qualitative properties and is therefore only responsible for changes in the degree of affect. Thus, the quality of affect, whether it is positive…

  3. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of radioisotopes or labeled compounds in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure and the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd), yielding a digital autoradiographic image. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)

  4. Theoretical response of a ZnS(Ag) scintillation detector to alpha-emitting sources and suggested applications

    International Nuclear Information System (INIS)

    Skrable, K.W.; Phoenix, K.A.; Chabot, G.E.; French, C.S.; Jo, M.; Falo, G.A.

    1991-01-01

    The classic problem of alpha absorption is discussed in terms of the quantitative determination of the activity of weightless alpha sources and the specific alpha activity of extended sources accounting for absorption in the source medium and the window of a large area ZnS(Ag) scintillation detector. The relationship for the expected counting rate gamma of a monoenergetic source of active area A, specific alpha activity C, and thickness H that exceeds the effective mass density range Rs of the alpha particle in the source medium can be expressed by a quadratic equation in the window thickness x when this source is placed in direct contact with the window of the ZnS(Ag) detector. This expression also gives the expected counting rate of a finite detector of sensitive area A exposed to an infinite homogeneous source medium. Counting rates y obtained for a source separated from a ZnS(Ag) detector by different thicknesses x of window material can be used to estimate parameter values in the quadratic equation, y = a + bx + cx². The experimental value determined for the coefficient b provides a direct estimation of the specific activity C. This coefficient, which depends on the ratio of the ranges in the source medium and detector window and not the ranges themselves, is essentially independent of the energy of the alpha particle. Although certain experimental precautions must be taken, this method for estimating the specific activity C is essentially an absolute method that does not require the use of standards, special calibrations, or complicated radiochemical procedures. Applications include the quantitative determination of Rn and progeny in air, water, and charcoal, and the measurement of the alpha activity in soil and on air filter samples
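
    In practice, the estimation step described above reduces to fitting counting rates measured at several window thicknesses with a quadratic and reading off the linear coefficient b. A numerical sketch with invented data follows; the factor k linking b to the specific activity C depends on the source/detector geometry and ranges and is simply assumed here.

```python
# Hedged sketch: fit y = a + b*x + c*x^2 to counting rates vs. window thickness and
# use the coefficient b to estimate specific activity C.  Data and the geometry
# factor k are illustrative assumptions, not measured values.
import numpy as np

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                 # added window thickness (mg/cm^2)
y = np.array([410.0, 395.0, 378.0, 360.0, 340.0])        # counting rate (counts/s), hypothetical

c, b, a = np.polyfit(x, y, 2)                            # polyfit returns highest order first
k = -0.05                                                # assumed geometry/range factor linking b to C
C_estimate = b / k
print("b = %.1f counts/s per unit thickness, C = %.0f (arbitrary activity units)" % (b, C_estimate))
```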

  5. Quantitative breast tissue characterization using grating-based x-ray phase-contrast imaging

    Science.gov (United States)

    Willner, M.; Herzen, J.; Grandl, S.; Auweter, S.; Mayr, D.; Hipp, A.; Chabior, M.; Sarapata, A.; Achterhold, K.; Zanette, I.; Weitkamp, T.; Sztrókay, A.; Hellerhoff, K.; Reiser, M.; Pfeiffer, F.

    2014-04-01

    X-ray phase-contrast imaging has received growing interest in recent years due to its high capability in visualizing soft tissue. Breast imaging became the focus of particular attention as it is considered the most promising candidate for a first clinical application of this contrast modality. In this study, we investigate quantitative breast tissue characterization using grating-based phase-contrast computed tomography (CT) at conventional polychromatic x-ray sources. Different breast specimens have been scanned at a laboratory phase-contrast imaging setup and were correlated to histopathology. Ascertained tumor types include phylloides tumor, fibroadenoma and infiltrating lobular carcinoma. Identified tissue types comprising adipose, fibroglandular and tumor tissue have been analyzed in terms of phase-contrast Hounsfield units and are compared to high-quality, high-resolution data obtained with monochromatic synchrotron radiation, as well as calculated values based on tabulated tissue properties. The results give a good impression of the method’s prospects and limitations for potential tumor detection and the associated demands on such a phase-contrast breast CT system. Furthermore, the evaluated quantitative tissue values serve as a reference for simulations and the design of dedicated phantoms for phase-contrast mammography.

  6. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiencies, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements was below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using prefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in two-thirds of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)

  7. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
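
    As a concrete, deliberately simple example of the kind of analysis discussed in this record, the sketch below back-corrects an observed case-control table for nondifferential exposure misclassification given assumed sensitivity and specificity of exposure classification. All counts and bias parameters are illustrative, not from any real study.

```python
# Hedged worked example of simple quantitative bias analysis for exposure misclassification.
def corrected_odds_ratio(a, b, c, d, se, sp):
    """a,b = exposed/unexposed cases; c,d = exposed/unexposed controls."""
    n_cases, n_controls = a + b, c + d
    a_true = (a - (1 - sp) * n_cases) / (se + sp - 1)       # back-correct exposed cases
    c_true = (c - (1 - sp) * n_controls) / (se + sp - 1)    # back-correct exposed controls
    b_true, d_true = n_cases - a_true, n_controls - c_true
    return (a_true * d_true) / (b_true * c_true)

observed_or = (215 * 1449) / (668 * 108)                    # illustrative counts
print("observed OR  = %.2f" % observed_or)
print("corrected OR = %.2f" % corrected_odds_ratio(215, 668, 108, 1449, se=0.9, sp=0.95))
```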

  8. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy to use resource so that any lab with access to a LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  9. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  10. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Monitoring alert and drowsy states by modeling EEG source nonstationarity

    Science.gov (United States)

    Hsu, Sheng-Hsiou; Jung, Tzyy-Ping

    2017-10-01

    Objective. As a human brain performs various cognitive functions within ever-changing environments, states of the brain characterized by recorded brain activities such as electroencephalogram (EEG) are inevitably nonstationary. The challenges of analyzing the nonstationary EEG signals include finding neurocognitive sources that underlie different brain states and using EEG data to quantitatively assess the state changes. Approach. This study hypothesizes that brain activities under different states, e.g. levels of alertness, can be modeled as distinct compositions of statistically independent sources using independent component analysis (ICA). This study presents a framework to quantitatively assess the EEG source nonstationarity and estimate levels of alertness. The framework was tested against EEG data collected from 10 subjects performing a sustained-attention task in a driving simulator. Main results. Empirical results illustrate that EEG signals under alert versus drowsy states, indexed by reaction speeds to driving challenges, can be characterized by distinct ICA models. By quantifying the goodness-of-fit of each ICA model to the EEG data using the model deviation index (MDI), we found that MDIs were significantly correlated with the reaction speeds (r  =  -0.390 with alertness models and r  =  0.449 with drowsiness models) and the opposite correlations indicated that the two models accounted for sources in the alert and drowsy states, respectively. Based on the observed source nonstationarity, this study also proposes an online framework using a subject-specific ICA model trained with an initial (alert) state to track the level of alertness. For classification of alert against drowsy states, the proposed online framework achieved an averaged area-under-curve of 0.745 and compared favorably with a classic power-based approach. Significance. This ICA-based framework provides a new way to study changes of brain states and can be applied to
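
    The sketch below is a simplified stand-in for the model-fit idea in this record: an ICA model is trained on data from one state, and new data windows are scored by their log-likelihood under that model, so that lower scores indicate larger deviation from the trained state. It is not the paper's model deviation index; the data are surrogate arrays and the source prior is an assumed super-Gaussian density.

```python
# Hedged sketch: score data windows against an ICA model learned from one brain state.
# Surrogate data; the logistic (sech^2-like) source prior is an assumption of this sketch.
import numpy as np
from sklearn.decomposition import FastICA

def ica_loglik(window, unmixing):
    """Infomax-style log-likelihood per sample: log|det W| + sum_i log p(w_i^T x), up to a constant."""
    s = unmixing @ window                                    # estimated sources, channels x samples
    log_density = -2.0 * np.log(np.cosh(s))                  # log of an assumed super-Gaussian prior
    return np.log(np.abs(np.linalg.det(unmixing))) + log_density.sum(axis=0).mean()

rng = np.random.default_rng(4)
alert = rng.laplace(size=(8, 5000))                          # surrogate "alert-state" EEG, channels x samples
ica = FastICA(n_components=8, random_state=0).fit(alert.T)
W = ica.components_                                          # unmixing matrix learned from the alert state

new_window = rng.normal(size=(8, 500))                       # a later window with different statistics
print("window from training-state data:", round(ica_loglik(alert[:, :500], W), 1))
print("window with changed statistics :", round(ica_loglik(new_window, W), 1))
```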

  12. THE CHANDRA SOURCE CATALOG

    International Nuclear Information System (INIS)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Harbo, Peter N.; He Xiangqun; Karovska, Margarita; Kashyap, Vinay L.; Davis, John E.; Houck, John C.; Hall, Diane M.

    2010-01-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ∼<30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ∼<1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  13. The Chandra Source Catalog

    Science.gov (United States)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2010-07-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  14. Exploratory analysis of a neutron-rich nuclei source based on photo-fission

    CERN Document Server

    Mirea, M; Clapier, F; Essabaa, S; Groza, L; Ibrahim, F; Kandri-Rody, S; Müller, A C; Pauwels, N; Proust, J

    2003-01-01

    A source of neutron-rich ions can be conceived through the photo-fission process. An exploratory study of such a source is realized. A survey of the radiative electron energy loss theory is reported in order to estimate numerically the bremsstrahlung production of thick targets. The resulting bremsstrahlung angular and energy theoretical distributions delivered from W and UCx thick converters are presented and compared with previous results. Some quantities, such as the number of fission events produced in the fissionable source and the energy loss in the converters, are also reported as a function of the geometry of the combination and the incident electron energy. A comparison with experimental data shows quantitative agreement. This study is focused on initial kinetic energies of the electron beam in the range 30-60 MeV, suitable for the production of large radiative gamma-ray yields able to induce the $^{238}$U fission through the giant dipole resonance. A confrontation with the number of fi...

  15. Photon statistics characterization of a single-photon source

    International Nuclear Information System (INIS)

    Alleaume, R; Treussart, F; Courty, J-M; Roch, J-F

    2004-01-01

    In a recent experiment, we reported the time-domain intensity noise measurement of a single-photon source relying on single-molecule fluorescence control. In this paper, we present data processing starting from photocount timestamps. The theoretical analytical expression of the time-dependent Mandel parameter Q(T) of an intermittent single-photon source is derived from ON↔OFF dynamics. Finally, source intensity noise analysis, using the Mandel parameter, is quantitatively compared with the usual approach relying on the time autocorrelation function, both methods yielding the same molecular dynamical parameters
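
    The Mandel parameter referred to in this record can be computed directly from photocount timestamps by binning the arrival times into windows of duration T and evaluating Q(T) = Var(n)/mean(n) - 1. The minimal sketch below uses simulated Poissonian timestamps as a reference (Q close to 0); a true single-photon source would give Q(T) < 0 at short T.

```python
# Hedged sketch: time-dependent Mandel parameter from (simulated) photocount timestamps.
import numpy as np

def mandel_q(timestamps, T):
    edges = np.arange(timestamps.min(), timestamps.max(), T)
    counts, _ = np.histogram(timestamps, bins=edges)
    return counts.var() / counts.mean() - 1.0

rng = np.random.default_rng(5)
# Poissonian reference: exponential waiting times between detections (seconds).
poisson_stamps = np.cumsum(rng.exponential(scale=1e-6, size=100_000))
for T in (1e-6, 1e-5, 1e-4):
    print(f"T = {T:.0e} s  ->  Q(T) = {mandel_q(poisson_stamps, T):+.3f}")
```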

  16. Mapping Protein-Protein Interactions by Quantitative Proteomics

    DEFF Research Database (Denmark)

    Dengjel, Joern; Kratchmarova, Irina; Blagoev, Blagoy

    2010-01-01

    Proteins exert their function inside a cell generally in multiprotein complexes. These complexes are highly dynamic structures changing their composition over time and cell state. The same protein may thereby fulfill different functions depending on its binding partners. Quantitative mass spectrometry (MS)-based proteomics in combination with affinity purification protocols has become the method of choice to map and track the dynamic changes in protein-protein interactions, including the ones occurring during cellular signaling events. Different quantitative MS strategies have been used to characterize protein interaction networks. In this chapter we describe in detail the use of stable isotope labeling by amino acids in cell culture (SILAC) for the quantitative analysis of stimulus-dependent dynamic protein interactions.

  17. Quantitative Approaches to Group Research: Suggestions for Best Practices

    Science.gov (United States)

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  18. A source term and risk calculations using level 2+PSA methodology

    International Nuclear Information System (INIS)

    Park, S. I.; Jea, M. S.; Jeon, K. D.

    2002-01-01

    The scope of Level 2+ PSA includes the assessment of the dose risk associated with exposure to the radioactive nuclides escaping from nuclear power plants during severe accidents. The establishment of a database for the exposure dose in Korean nuclear power plants may contribute to preparing accident management programs and periodic safety reviews. In this study the ORIGEN, MELCOR and MACCS codes were employed to produce an integrated framework to assess the radiation source term risk. The framework was applied to a reference plant. Using IPE results, the dose rate for the reference plant was calculated quantitatively.

  19. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging

  20. Source term reduction at DAEC (including stellite ball recycling)

    International Nuclear Information System (INIS)

    Smith, R.; Schebler, D.

    1995-01-01

    The Duane Arnold Energy Center was seeking methods to reduce dose rates from the drywell due to Co-60. Duane Arnold is known in the industry to have one of the highest drywell dose rates from the industry-standardized 'BRAC' point survey. A prime method to reduce dose rates due to Co-60 is the accelerated replacement of stellite pins and rollers in control rod blades due to their high stellite (cobalt) content. Usually the cobalt content of stellite alloys is greater than 60% by weight. During the RFO-12 refueling outage at Duane Arnold, all of the remaining cobalt-bearing control rod blades were replaced and new stellite-free control rod blades were installed in the core. This left Duane Arnold with the disposal of highly radioactive stellite pins and rollers. The processing of control rod blades for disposal is a very difficult evolution. First, the velocity limiter (a bottom portion of the component) and the highly radioactive upper stellite control rod blade pins and rollers are separated from the control rod blade. Next, the remainder of the control rod blade is processed (chopped and/or crushed) to aid packaging the waste for disposal. The stellite bearings are then often carefully placed in with the rest of the waste in a burial liner to provide shielding for disposal, or more often are left as 'orphans' in the spent fuel pool because their high specific activity creates shipping and packaging problems. Further investigation by the utility showed that the stellite balls and pins could be recycled to a source manufacturer rather than disposed of in a low-level burial site. The cost savings to the utility were on the order of $200,000, with gross savings of $400,000 in burial site charges. A second advantage of the recycling of the stellite pins and rollers was a reduction in radioactive waste shipments

  1. Towards a Quantitative Framework for Evaluating Vulnerability of Drinking Water Wells to Contamination from Unconventional Oil & Gas Development

    Science.gov (United States)

    Soriano, M., Jr.; Deziel, N. C.; Saiers, J. E.

    2017-12-01

    The rapid expansion of unconventional oil and gas (UO&G) production, made possible by advances in hydraulic fracturing (fracking), has triggered concerns over risks this extraction poses to water resources and public health. Concerns are particularly acute within communities that host UO&G development and rely heavily on shallow aquifers as sources of drinking water. This research aims to develop a quantitative framework to evaluate the vulnerability of drinking water wells to contamination from UO&G activities. The concept of well vulnerability is explored through application of backwards travel time probability modeling to estimate the likelihood that capture zones of drinking water wells circumscribe source locations of UO&G contamination. Sources of UO&G contamination considered in this analysis include gas well pads and documented sites of UO&G wastewater and chemical spills. The modeling approach is illustrated for a portion of Susquehanna County, Pennsylvania, where more than one thousand shale gas wells have been completed since 2005. Data from a network of eight multi-level groundwater monitoring wells installed in the study site in 2015 are used to evaluate the model. The well vulnerability concept is proposed as a physically based quantitative tool for policy-makers dealing with the management of contamination risks of drinking water wells. In particular, the model can be used to identify adequate setback distances of UO&G activities from drinking water wells and other critical receptors.

  2. Quantitative analysis of polyethylene glycol (PEG) and PEGylated proteins in animal tissues by LC-MS/MS coupled with in-source CID.

    Science.gov (United States)

    Gong, Jiachang; Gu, Xiaomei; Achanzar, William E; Chadwick, Kristina D; Gan, Jinping; Brock, Barry J; Kishnani, Narendra S; Humphreys, W Griff; Iyer, Ramaswamy A

    2014-08-05

    The covalent conjugation of polyethylene glycol (PEG, typical MW > 10k) to therapeutic peptides and proteins is a well-established approach to improve their pharmacokinetic properties and diminish the potential for immunogenicity. Even though PEG is generally considered biologically inert and safe in animals and humans, the slow clearance of large PEGs raises concerns about potential adverse effects resulting from PEG accumulation in tissues following chronic administration, particularly in the central nervous system. The key information relevant to the issue is the disposition and fate of the PEG moiety after repeated dosing with PEGylated proteins. Here, we report a novel quantitative method utilizing LC-MS/MS coupled with in-source CID that is highly selective and sensitive to PEG-related materials. Both (40K)PEG and a tool PEGylated protein (ATI-1072) underwent dissociation in the ionization source of mass spectrometer to generate a series of PEG-specific ions, which were subjected to further dissociation through conventional CID. To demonstrate the potential application of the method to assess PEG biodistribution following PEGylated protein administration, a single dose study of ATI-1072 was conducted in rats. Plasma and various tissues were collected, and the concentrations of both (40K)PEG and ATI-1072 were determined using the LC-MS/MS method. The presence of (40k)PEG in plasma and tissue homogenates suggests the degradation of PEGylated proteins after dose administration to rats, given that free PEG was absent in the dosing solution. The method enables further studies for a thorough characterization of disposition and fate of PEGylated proteins.

  3. The Case for Infusing Quantitative Literacy into Introductory Geoscience Courses

    Directory of Open Access Journals (Sweden)

    Jennifer M. Wenner

    2009-01-01

    Full Text Available We present the case for introductory geoscience courses as model venues for increasing the quantitative literacy (QL) of large numbers of the college-educated population. The geosciences provide meaningful context for a number of fundamental mathematical concepts that are revisited several times in a single course. Using some best practices from the mathematics education community surrounding problem solving, calculus reform, pre-college mathematics and five geoscience/math workshops, geoscience and mathematics faculty have identified five pedagogical ideas to increase the QL of the students who populate introductory geoscience courses. These five ideas include techniques such as: place mathematical concepts in context, use multiple representations, use technology appropriately, work in groups, and do multiple-day, in-depth problems that place quantitative skills in multiple contexts. We discuss the pedagogical underpinnings of these five ideas and illustrate some ways that the geosciences represent ideal places to use these techniques. However, the inclusion of QL in introductory courses is often met with resistance at all levels. Faculty who wish to include quantitative content must use creative means to break down barriers of public perception of geoscience as qualitative, administrative worry that enrollments will drop and faculty resistance to change. Novel ways to infuse QL into geoscience classrooms include use of web-based resources, shadow courses, setting clear expectations, and promoting quantitative geoscience to the general public. In order to help faculty increase the QL of geoscience students, a community-built faculty-centered web resource (Teaching Quantitative Skills in the Geosciences) houses multiple examples that implement the five best practices of QL throughout the geoscience curriculum. We direct faculty to three portions of the web resource: Teaching Quantitative Literacy, QL activities, and the 2006 workshop website.

  4. Hard-x-ray phase-difference microscopy with a low-brilliance laboratory x-ray source

    International Nuclear Information System (INIS)

    Kuwabara, Hiroaki; Yashiro, Wataru; Harasse, Sebastien; Momose, Atsushi; Mizutani, Haruo

    2011-01-01

    We have developed a hard-X-ray phase-imaging microscopy method using a low-brilliance X-ray source. The microscope consists of a sample, a Fresnel zone plate, a transmission grating, and a source grating creating an array of mutually incoherent X-ray sources. The microscope generates an image exhibiting twin features of the sample with opposite signs separated by a distance, which is processed to generate a phase image. The method is quantitative even for non-weak-phase objects that are difficult to examine quantitatively with the widely used Zernike phase-contrast microscopy, and it has potentially broad applications in the material and biological science fields. (author)

  5. Analysis of Paralleling Limited Capacity Voltage Sources by Projective Geometry Method

    Directory of Open Access Journals (Sweden)

    Alexandr Penin

    2014-01-01

    Full Text Available The droop current-sharing method for voltage sources of limited capacity is considered. The influence of the equalizing resistors and the load resistor on the uniform distribution of relative current values is investigated for the case where the actual loading corresponds to the capacity of a particular source. Novel concepts for the quantitative representation of the operating regimes of the sources are introduced using the projective geometry method.

  6. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  7. QTest: Quantitative Testing of Theories of Binary Choice

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  8. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Scenario derivation for a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling, and the constructed scenario is analyzed quantitatively in terms of annual effective dose equivalent. The study is conducted sequentially, following the steps of a performance assessment of a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The program module chain VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. Detailed data used in these modules come from experimental data for Korean territory and from default data provided within the modules. Where data needed for code execution are missing, values are estimated using reasonable engineering judgment.

  9. Quantitative fuel motion determination with the CABRI fast neutron hodoscope

    International Nuclear Information System (INIS)

    Baumung, K.; Augier, G.

    1991-01-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized

  10. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    Science.gov (United States)

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
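    As a brief illustration of the package's workflow (the file names, sizes, and thresholds below are hypothetical), a 1D NMRPipe data set can be read, processed with the proc_base utilities, peak picked, and written back out:

```python
import nmrglue as ng

# Read an NMRPipe-format FID (the file name is hypothetical).
dic, data = ng.pipe.read("test.fid")

# Basic 1D processing with nmrglue's proc_base utilities.
data = ng.proc_base.zf_size(data, 4096)       # zero fill to 4096 points
data = ng.proc_base.fft(data)                 # Fourier transform
data = ng.proc_base.ps(data, p0=0.0, p1=0.0)  # phase correction
data = ng.proc_base.di(data)                  # discard imaginaries

# Simple threshold-based peak picking (threshold chosen arbitrarily here).
peaks = ng.peakpick.pick(data, pthres=1e5)
print(f"Found {len(peaks)} peaks")

# Write the processed spectrum back out in NMRPipe format; in a real
# workflow the dictionary would also be updated to reflect the processing.
ng.pipe.write("test.ft", dic, data, overwrite=True)
```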

  11. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Helmus, Jonathan J., E-mail: jjhelmus@gmail.com [Argonne National Laboratory, Environmental Science Division (United States); Jaroniec, Christopher P., E-mail: jaroniec@chemistry.ohio-state.edu [Ohio State University, Department of Chemistry and Biochemistry (United States)

    2013-04-15

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  12. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    International Nuclear Information System (INIS)

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  13. Designing a Quantitative Structure-Activity Relationship for the ...

    Science.gov (United States)

    Toxicokinetic models serve a vital role in risk assessment by bridging the gap between chemical exposure and potentially toxic endpoints. While intrinsic metabolic clearance rates have a strong impact on toxicokinetics, limited data is available for environmentally relevant chemicals including nearly 8000 chemicals tested for in vitro bioactivity in the Tox21 program. To address this gap, a quantitative structure-activity relationship (QSAR) for intrinsic metabolic clearance rate was developed to offer reliable in silico predictions for a diverse array of chemicals. Models were constructed with curated in vitro assay data for both pharmaceutical-like chemicals (ChEMBL database) and environmentally relevant chemicals (ToxCast screening) from human liver microsomes (2176 from ChEMBL) and human hepatocytes (757 from ChEMBL and 332 from ToxCast). Due to variability in the experimental data, a binned approach was utilized to classify metabolic rates. Machine learning algorithms, such as random forest and k-nearest neighbor, were coupled with open source molecular descriptors and fingerprints to provide reasonable estimates of intrinsic metabolic clearance rates. Applicability domains defined the optimal chemical space for predictions, which covered environmental chemicals well. A reduced set of informative descriptors (including relative charge and lipophilicity) and a mixed training set of pharmaceuticals and environmentally relevant chemicals provided the best intr
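    A minimal sketch of the binned-classification approach described above, using scikit-learn with a hypothetical table of precomputed molecular descriptors (the file name, descriptor set, and bin edges are assumptions, not the curated ChEMBL/ToxCast data used in the study):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical descriptor table: one row per chemical, with precomputed
# molecular descriptors and a measured intrinsic clearance rate.
df = pd.read_csv("clearance_descriptors.csv")  # hypothetical file
X = df[["logP", "relative_charge", "mol_weight", "tpsa"]].values
clearance = df["intrinsic_clearance"].values

# Bin the continuous clearance rates into slow / medium / fast classes,
# mirroring the binned approach used to handle experimental variability.
y = np.digitize(clearance, bins=[1.0, 10.0])  # 0=slow, 1=medium, 2=fast

model = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(model, X, y, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```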

  14. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  15. Microfocus x-ray imaging of traceable pointlike {sup 22}Na sources for quality control

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, T.; Oda, K.; Sato, Y.; Ito, H.; Masuda, S.; Yamada, T.; Matsumoto, M.; Murayama, H.; Takei, H. [Allied Health Sciences, Kitasato University Kitasato 1-15-1, Minami-ku, Sagamihara-shi, Kanagawa 252-0373 (Japan); Positron Medical Center, Tokyo Metropolitan Institute of Gerontology Sakaecho 35-2, Itabashi-ku, Tokyo 173-0015 (Japan); Advanced Industrial Science and Technology (AIST) Central 2, Umezono 1-1-1, Tsukuba-shi, Ibaraki 305-8568 (Japan); Kanagawa Industrial Technology Center (KITC) Shimoimazumi 705-1, Ebina-shi, Kanagawa 243-0435 (Japan); Japan Radioisotope Association (JRIA) Komagome 2-28-45, Bunkyo-ku, Tokyo 113-8941 (Japan); Molecular Imaging Center, National Institute of Radiological Sciences Anagawa 4-9-1, Inage, Chiba 263-8555 (Japan); Graduate School of Medical Sciences, Kitasato University Kitasato 1-15-1, Minami-ku, Sagamihara-shi, Kanagawa 252-0373 (Japan)

    2012-07-15

    Purpose: The purpose of this study is to propose a microfocus x-ray imaging technique for observing the internal structure of small radioactive sources and evaluating geometrical errors quantitatively, and to apply this technique to traceable pointlike {sup 22}Na sources, which were designed for positron emission tomography calibration, for the purpose of quality control of the pointlike sources. Methods: A microfocus x-ray imaging system with a focus size of 0.001 mm was used to obtain projection x-ray images and x-ray CT images of five pointlike source samples, which were manufactured during 2009-2012. The obtained projection and tomographic images were used to observe the internal structure and evaluate geometrical errors quantitatively. Monte Carlo simulation was used to evaluate the effect of possible geometrical errors on the intensity and uniformity of 0.511 MeV annihilation photon pairs emitted from the sources. Results: Geometrical errors were evaluated with sufficient precision using projection x-ray images. CT images were used for observing the internal structure intuitively. As a result, four of the five examined samples were within the tolerance to maintain the total uncertainty below {+-}0.5%, given the source radioactivity; however, one sample was found to be defective. Conclusions: This quality control procedure is crucial and offers an important basis for using the pointlike {sup 22}Na source as a basic calibration tool. The microfocus x-ray imaging approach is a promising technique for visual and quantitative evaluation of the internal geometry of small radioactive sources.

  16. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  17. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  18. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
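    As a simple illustration of the pooling step (the per-study counts below are invented, and a full meta-analysis would typically use a bivariate random-effects model rather than naive pooling), per-study true/false positives and negatives can be combined into pooled sensitivity and specificity:

```python
import numpy as np

# Invented per-study 2x2 counts: (TP, FP, TN, FN) for each included study.
studies = np.array([
    (45,  8, 60,  7),
    (30,  5, 40,  6),
    (55, 12, 70,  9),
])
TP, FP, TN, FN = studies.sum(axis=0)

sensitivity = TP / (TP + FN)   # pooled proportion of diseased correctly detected
specificity = TN / (TN + FP)   # pooled proportion of healthy correctly excluded
print(f"Pooled sensitivity: {sensitivity:.2f}")
print(f"Pooled specificity: {specificity:.2f}")
```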

  19. Isotopic and molecular fractionation in combustion; three routes to molecular marker validation, including direct molecular 'dating' (GC/AMS)

    Science.gov (United States)

    Currie, L. A.; Klouda, G. A.; Benner, B. A.; Garrity, K.; Eglinton, T. I.

    The identification of unique isotopic, elemental, and molecular markers for sources of combustion aerosol has growing practical importance because of the potential effects of fine particle aerosol on health, visibility and global climate. It is urgent, therefore, that substantial efforts be directed toward the validation of assumptions involving the use of such tracers for source apportionment. We describe here three independent routes toward carbonaceous aerosol molecular marker identification and validation: (1) tracer regression and multivariate statistical techniques applied to field measurements of mixed source, carbonaceous aerosols; (2) a new development in aerosol 14C metrology: direct, pure compound accelerator mass spectrometry (AMS) by off-line GC/AMS ('molecular dating'); and (3) direct observation of isotopic and molecular source emissions during controlled laboratory combustion of specific fuels. Findings from the combined studies include: independent support for benzo(ghi)perylene as a motor vehicle tracer from the first (statistical) and second (direct 'dating') studies; a new indication, from the third (controlled combustion) study, of a relation between 13C isotopic fractionation and PAH molecular fractionation, also linked with fuel and stage of combustion; and quantitative data showing the influence of both fuel type and combustion conditions on the yields of such species as elemental carbon and PAH, reinforcing the importance of exercising caution when applying presumed conservative elemental or organic tracers to fossil or biomass burning field data as in the first study.

  20. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  1. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin

    2016-04-27

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  2. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin; Schorlemmer, Danijel; Page, Morgan; Ampuero, Jean‐Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran Kumar; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish Chandra; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  3. A custom-built PET phantom design for quantitative imaging of printed distributions

    International Nuclear Information System (INIS)

    Markiewicz, P J; Angelis, G I; Kotasidis, F; Green, M; Matthews, J C; Lionheart, W R; Reader, A J

    2011-01-01

    This note presents a practical approach to a custom-made design of PET phantoms enabling the use of digital radioactive distributions with high quantitative accuracy and spatial resolution. The phantom design allows planar sources of any radioactivity distribution to be imaged in transaxial and axial (sagittal or coronal) planes. Although the design presented here is specially adapted to the high-resolution research tomograph (HRRT), the presented methods can be adapted to almost any PET scanner. Although the presented phantom design has many advantages, a number of practical issues had to be overcome such as positioning of the printed source, calibration, uniformity and reproducibility of printing. A well counter (WC) was used in the calibration procedure to find the nonlinear relationship between digital voxel intensities and the actual measured radioactive concentrations. Repeated printing together with WC measurements and computed radiography (CR) using phosphor imaging plates (IP) were used to evaluate the reproducibility and uniformity of such printing. Results show satisfactory printing uniformity and reproducibility; however, calibration is dependent on the printing mode and the physical state of the cartridge. As a demonstration of the utility of using printed phantoms, the image resolution and quantitative accuracy of reconstructed HRRT images are assessed. There is very good quantitative agreement in the calibration procedure between HRRT, CR and WC measurements. However, the high resolution of CR and its quantitative accuracy supported by WC measurements made it possible to show the degraded resolution of HRRT brain images caused by the partial-volume effect and the limits of iterative image reconstruction. (note)
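    The calibration step, finding the nonlinear relationship between digital voxel intensity and well-counter activity concentration, can be sketched as a nonlinear least-squares fit; the saturating functional form and data points below are assumptions for illustration, not the authors' calibration data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical calibration points: digital voxel intensity (0-255) versus
# well-counter activity concentration (kBq/ml) of the printed source.
intensity = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255], dtype=float)
activity = np.array([0.0, 1.1, 2.0, 2.7, 3.3, 3.7, 4.0, 4.2, 4.3])

def saturating(i, a, b):
    """Assumed saturating response of deposited activity versus intensity."""
    return a * (1.0 - np.exp(-b * i))

params, _ = curve_fit(saturating, intensity, activity, p0=(4.5, 0.01))
print("Fitted parameters a, b:", params)

# Invert the fitted curve to find the intensity for a target concentration.
target = 2.5
needed = -np.log(1.0 - target / params[0]) / params[1]
print(f"Intensity for {target} kBq/ml: {needed:.0f}")
```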

  4. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data...... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how......). These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis....

  5. Quantitative Psychology Research : The 80th Annual Meeting of the Psychometric Society, Beijing, 2015

    NARCIS (Netherlands)

    van der Ark, L.A.; Bolt, D.M.; Wang, W.-C.; Douglas, J.A.; Wiberg, M.

    2016-01-01

    The research articles in this volume cover timely quantitative psychology topics, including new methods in item response theory, computerized adaptive testing, cognitive diagnostic modeling, and psychological scaling. Topics within general quantitative methodology include structural equation

  6. BBN based Quantitative Assessment of Software Design Specification

    International Nuclear Information System (INIS)

    Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol

    2007-01-01

    Probabilistic Safety Assessment (PSA), one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSAs that include safety-critical software either do not consider the reliability of the software or use arbitrary values for it. To address this situation, this paper proposes a method that can produce quantitative reliability information for safety-critical software for PSA by making use of Bayesian Belief Networks (BBNs). BBNs have generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method was constructed using a BBN, which can combine the qualitative and quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software that will be embedded in a reactor protection system. The intermediate verification and validation (V&V) results of the software design specification were used as inputs to the BBN model.
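    The kind of inference described above can be illustrated with a small discrete Bayesian network, here sketched with the pgmpy library (the network structure, node names, and probability tables are invented, not the paper's BBN model):

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy structure: review quality and test coverage both influence SDS quality.
model = BayesianNetwork([("ReviewQuality", "SDSQuality"),
                         ("TestCoverage", "SDSQuality")])

cpd_review = TabularCPD("ReviewQuality", 2, [[0.7], [0.3]])   # 0=good, 1=poor
cpd_test = TabularCPD("TestCoverage", 2, [[0.6], [0.4]])      # 0=high, 1=low
cpd_sds = TabularCPD(
    "SDSQuality", 2,
    # P(SDSQuality | ReviewQuality, TestCoverage); values are illustrative.
    [[0.95, 0.7, 0.6, 0.2],    # good
     [0.05, 0.3, 0.4, 0.8]],   # poor
    evidence=["ReviewQuality", "TestCoverage"],
    evidence_card=[2, 2],
)
model.add_cpds(cpd_review, cpd_test, cpd_sds)

# Combine V&V evidence quantitatively: given good review evidence,
# infer the probability that the design specification is of good quality.
inference = VariableElimination(model)
result = inference.query(["SDSQuality"], evidence={"ReviewQuality": 0})
print(result)
```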

  7. Quantitative schemes in energy dispersive X-ray fluorescence implemented in AXIL

    International Nuclear Information System (INIS)

    Tchantchane, A.; Benamar, M.A.; Tobbeche, S.

    1995-01-01

    E.D.X.R.F. (Energy Dispersive X-ray Fluorescence) has long been used for the quantitative analysis of many types of samples, including environmental samples. The software package AXIL (Analysis of X-ray spectra by Iterative Least squares) is extensively used for spectral analysis and the quantification of X-ray spectra. It includes several quantitative schemes for evaluating element concentrations. We present the general theory behind each scheme implemented in the software package and assess the performance of each of these quantitative schemes. We have also investigated their performance relative to the uncertainties in the experimental parameters and the sample description.

  8. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  9. Radiation-based quantitative bioimaging at the national institute of standards and technology

    Directory of Open Access Journals (Sweden)

    Karam Lisa

    2009-01-01

    Full Text Available Building on a long history of providing physical measurements and standards for medical x rays and nuclear medicine radionuclides, the laboratory has expanded its focus to better support the extensive use of medical physics in the United States today, providing confidence in key results needed for drug and device development and marketing, therapy planning and efficacy and disease screening. In particular, to support more quantitative medical imaging, this laboratory has implemented a program to provide key measurement infrastructure to support radiation-based imaging through developing standard, benchmark phantoms, which contain radioactive sources calibrated to national measurement standards, to allow more quantitative imaging through traceable instrument calibration for clinical trials or patient management. Working closely with colleagues at the National Institutes of Health, Rensselaer Polytechnic Institute, the Food and Drug Administration and Cornell University, this laboratory has taken the initial steps in developing phantoms, and the protocols to use them, for more accurate calibration of positron emission tomography (PET) or single-photon emission computed tomography (SPECT) cameras, including recently standardizing 68Ge. X-ray measurements of the laboratory's recently developed small, resilient and inexpensive length standard phantom have shown the potential usefulness of such a "pocket" phantom for patient-based calibration of computed tomography (alone or with PET) systems. The ability to calibrate diagnostic imaging tools in a way that is traceable to national standards will lead to a more quantitative approach; both physician and patient benefit from increased accuracy in treatment planning, as well as increased safety for the patient.

  10. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status report is given on our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability or normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: tumor volume, histopathologic differentiation, and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  11. The Earthquake‐Source Inversion Validation (SIV) Project

    Science.gov (United States)

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  12. SDAR 1.0: A New Quantitative Toolkit for Analyzing Stratigraphic Data

    Science.gov (United States)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences, and about ancient sedimentary environmental and paleobiological dynamics. Despite the tremendous advances in the way geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot be used readily for any quantitative analysis. Therefore, any attempt to examine the stratigraphic data in an analytical fashion necessarily takes further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses that help users quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale). Finally, given that the SDAR analysis module has been written in the open-source high-level computer language R [R Development Core Team, 2014], it is already loaded with many of the crucial features required to accomplish basic and

  13. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included cocitation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation, the Markov chain Monte Carlo sampling algorithm is adopted and draws samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.

  14. Teaching Quantitative Reasoning: A Better Context for Algebra

    Directory of Open Access Journals (Sweden)

    Eric Gaze

    2014-01-01

    Full Text Available This editorial questions the preeminence of algebra in our mathematics curriculum. The GATC (Geometry, Algebra, Trigonometry, Calculus) sequence abandons the fundamental middle school math topics necessary for quantitative literacy, while the standard super-abundance of algebra taught in the abstract fosters math phobia and supports a culturally acceptable stance that math is not relevant to everyday life. Although GATC is seen as a pipeline to STEM (Science, Technology, Engineering, Mathematics), it is a mistake to think that the objective of producing quantitatively literate citizens is at odds with creating more scientists and engineers. The goal must be to create a curriculum that addresses the quantitative reasoning needs of all students, providing meaningful engagement in mathematics that will simultaneously develop quantitative literacy and spark an interest in STEM fields. In my view, such a curriculum could be based on a foundation of proportional reasoning leading to higher-order quantitative reasoning via modeling (including algebraic reasoning and problem solving) and statistical literacy (through the exploration and study of data).

  15. Quantitative susceptibility mapping of human brain at 3T: a multisite reproducibility study.

    Science.gov (United States)

    Lin, P-Y; Chao, T-C; Wu, M-L

    2015-03-01

    Quantitative susceptibility mapping of the human brain has demonstrated strong potential in examining iron deposition, which may help in investigating possible brain pathology. This study assesses the reproducibility of quantitative susceptibility mapping across different imaging sites. In this study, the susceptibility values of 5 regions of interest in the human brain were measured on 9 healthy subjects following calibration by using phantom experiments. Each of the subjects was imaged 5 times on 1 scanner with the same procedure repeated on 3 different 3T systems so that both within-site and cross-site quantitative susceptibility mapping precision levels could be assessed. Two quantitative susceptibility mapping algorithms, similar in principle, one by using iterative regularization (iterative quantitative susceptibility mapping) and the other with analytic optimal solutions (deterministic quantitative susceptibility mapping), were implemented, and their performances were compared. Results show that while deterministic quantitative susceptibility mapping had nearly 700 times faster computation speed, residual streaking artifacts seem to be more prominent compared with iterative quantitative susceptibility mapping. With quantitative susceptibility mapping, the putamen, globus pallidus, and caudate nucleus showed smaller imprecision on the order of 0.005 ppm, whereas the red nucleus and substantia nigra, closer to the skull base, had a somewhat larger imprecision of approximately 0.01 ppm. Cross-site errors were not significantly larger than within-site errors. Possible sources of estimation errors are discussed. The reproducibility of quantitative susceptibility mapping in the human brain in vivo is regionally dependent, and the precision levels achieved with quantitative susceptibility mapping should allow longitudinal and multisite studies such as aging-related changes in brain tissue magnetic susceptibility. © 2015 by American Journal of Neuroradiology.

  16. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability; probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability, and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering.

  17. Field Measurements of Trace Gases and Aerosols Emitted by Undersampled Combustion Sources Including Wood and Dung Cooking Fires, Garbage and Crop Residue Burning, and Indonesian Peat Fires

    Science.gov (United States)

    Stockwell, C.; Jayarathne, T. S.; Goetz, D.; Simpson, I. J.; Selimovic, V.; Bhave, P.; Blake, D. R.; Cochrane, M. A.; Ryan, K. C.; Putra, E. I.; Saharjo, B.; Stone, E. A.; DeCarlo, P. F.; Yokelson, R. J.

    2017-12-01

    Field measurements were conducted in Nepal and in the Indonesian province of Central Kalimantan to improve characterization of trace gases and aerosols emitted by undersampled combustion sources. The sources targeted included cooking with a variety of stoves, garbage burning, crop residue burning, and authentic peat fires. Trace gas and aerosol emissions were studied using a land-based Fourier transform infrared spectrometer, whole air sampling, photoacoustic extinctiometers (405 and 870nm), and filter samples that were analyzed off-line. These measurements were used to calculate fuel-based emission factors (EFs) for up to 90 gases, PM2.5, and PM2.5 constituents. The aerosol optical data measured included EFs for the scattering and absorption coefficients, the single scattering albedo (at 870 and 405 nm), as well as the absorption Ångström exponent. The emissions varied significantly by source, although light absorption by both brown and black carbon (BrC and BC, respectively) was important for all non-peat sources. For authentic peat combustion, the emissions of BC were negligible and absorption was dominated by organic aerosol. The field results from peat burning were in reasonable agreement with recent lab measurements of smoldering Kalimantan peat and compare well to the limited data available from other field studies. The EFs can be used with estimates of fuel consumption to improve regional emissions inventories and assessments of the climate and health impacts of these undersampled sources.

  18. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  19. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  20. High-performance control of a three-phase voltage-source converter including feedforward compensation of the estimated load current

    International Nuclear Information System (INIS)

    Leon, Andres E.; Solsona, Jorge A.; Busada, Claudio; Chiacchiarini, Hector; Valla, Maria Ines

    2009-01-01

    In this paper a new control strategy for voltage-source converters (VSCs) is introduced. The proposed strategy consists of a nonlinear feedback controller based on feedback linearization plus feedforward compensation of the estimated load current. In our proposal, an energy function and the direct-axis current are taken as outputs in order to avoid the internal dynamics. In this way, full linearization is obtained via nonlinear transformation and feedback. An estimate of the load current is fed forward to improve the performance of the whole system and to reduce the required capacitor size; this estimation also yields a more rugged and cheaper implementation. The estimate is calculated using a nonlinear reduced-order observer. The proposal is validated through several tests, which include performance in the presence of switching-frequency effects, measurement filter delays, parameter uncertainties, and disturbances in the input voltage.

  1. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  2. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  3. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem, and the quantitative Morse lemma.

  4. Assessment of air, water and land-based sources of pollution in the ...

    African Journals Online (AJOL)

    A quantitative assessment of air, water and land-based sources of pollution to the coastal zone of the Accra-Tema Metropolitan Area of Ghana was conducted by making an emission inventory from information on industrial, commercial and domestic activities. Three sources of air pollution were analysed, viz, emission from ...

  5. Pacemaker power sources

    International Nuclear Information System (INIS)

    Greatbatch, W.

    1984-01-01

    Energy sources for cardiac pacing, including radioisotope sources, are considered in a broad conceptual and historical framework. The main guidelines for the future development of energy sources are assessed.

  6. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  7. Quantitative Literacy at Michigan State University, 2: Connection to Financial Literacy

    Directory of Open Access Journals (Sweden)

    Dennis Gilliland

    2011-07-01

    The lack of capability in making financial decisions among the adult United States population has recently been described. A concerted effort to increase awareness of this crisis, to improve education in quantitative and financial literacy, and to simplify financial decision-making processes is critical to the solution. This paper describes a study that was undertaken to explore the relationship between quantitative literacy and financial literacy for entering college freshmen. In summer 2010, incoming freshmen to Michigan State University were assessed. Well-tested financial literacy items and validated quantitative literacy assessment instruments were administered to 531 subjects. Logistic regression models were used to assess the relationship between level of financial literacy and independent variables including quantitative literacy score, ACT mathematics score, and demographic variables including gender. The study establishes a strong positive association between quantitative literacy and financial literacy on top of the effects of the other independent variables. Adding one percent to the performance on a quantitative literacy assessment changes the odds of being at the highest level of financial literacy by a factor estimated to be 1.05. Gender is found to have a large, statistically significant effect as well, with being female changing the odds by a factor estimated to be 0.49.
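
    For readers more familiar with qualitative methods, the odds factors quoted above are simply exponentiated logistic-regression coefficients; in generic notation (the symbols below are illustrative, not taken from the study),

        \[
        \operatorname{logit} P(\text{highest financial literacy level})
          = \beta_0 + \beta_{QL}\,QL + \beta_{ACT}\,ACT + \cdots ,
        \qquad
        e^{\hat\beta_{QL}} \approx 1.05 , \quad e^{\hat\beta_{\text{female}}} \approx 0.49 .
        \]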

  8. Quantitative analyses of the 3D nuclear landscape recorded with super-resolved fluorescence microscopy.

    Science.gov (United States)

    Schmid, Volker J; Cremer, Marion; Cremer, Thomas

    2017-07-01

    Recent advancements of super-resolved fluorescence microscopy have revolutionized microscopic studies of cells, including the exceedingly complex structural organization of cell nuclei in space and time. In this paper we describe and discuss tools for (semi-) automated, quantitative 3D analyses of the spatial nuclear organization. These tools allow the quantitative assessment of highly resolved different chromatin compaction levels in individual cell nuclei, which reflect functionally different regions or sub-compartments of the 3D nuclear landscape, and measurements of absolute distances between sites of different chromatin compaction. In addition, these tools allow 3D mapping of specific DNA/RNA sequences and nuclear proteins relative to the 3D chromatin compaction maps and comparisons of multiple cell nuclei. The tools are available in the free and open source R packages nucim and bioimagetools. We discuss the use of masks for the segmentation of nuclei and the use of DNA stains, such as DAPI, as a proxy for local differences in chromatin compaction. We further discuss the limitations of 3D maps of the nuclear landscape as well as problems of the biological interpretation of such data. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Approaches to quantitative risk assessment with applications to PP

    International Nuclear Information System (INIS)

    Geiger, G.; Schaefer, A.

    2002-01-01

    Experience with accidents such as Goiania in Brazil and indications of a considerable number of orphan sources suggest that improved protection would be desirable for some types of radioactive material in widespread use, such as radiation sources for civil purposes. Given the large potential health and economic consequences (in particular, if terrorist attacks cannot be excluded), the significant costs of preventive actions, and large uncertainties about both the likelihood of occurrence and the potential consequences of PP safety and security incidents, an optimum relationship between preventive and mitigative efforts is likely to be a key issue for successful risk management in this field. Thus, possible violations of physical protection combined with threats of misuse of nuclear materials, including terrorist attack, pose considerable challenges to global security from various perspectives. In view of these challenges, recent advances in applied risk and decision analysis suggest methodological and procedural improvements in quantitative risk assessment, the demarcation of acceptable risk, and risk management. These advances are based on a recently developed model of optimal risky choice suitable for assessing and comparing the cumulative probability distribution functions attached to safety and security risks. Besides quantification of risk (e.g., in economic terms), the standardization of various risk assessment models frequently used in operations research can be approached on this basis. The paper explores possible applications of these improved methods to the safety and security management of nuclear materials, the cost efficiency of risk management measures, and the establishment of international safety and security standards for PP. Examples will be presented that are based on selected scenarios of misuse involving typical radioactive sources. (author)

  10. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    Science.gov (United States)

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  11. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation of the language often employed in communicating climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  12. Localization of the gamma-radiation sources using the gamma-visor

    Directory of Open Access Journals (Sweden)

    Ivanov Kirill E.

    2008-01-01

    The search of the main gamma-radiation sources at the site of the temporary storage of solid radioactive wastes was carried out. The relative absorbed dose rates were measured for some of the gamma-sources before and after the rehabilitation procedures. The effectiveness of the rehabilitation procedures in the years 2006-2007 was evaluated qualitatively and quantitatively. The decrease of radiation background at the site of the temporary storage of the solid radioactive wastes after the rehabilitation procedures allowed localizing the new gamma-source.

  13. Localization of the gamma-radiation sources using the gamma-visor

    International Nuclear Information System (INIS)

    Ivanov, K. E.; Ponomaryev-Stepnoi, N. N.; Stepennov, B. S.; Teterin, Y. A.; Teterin, A. Y.; Kharitonov, V. V.

    2008-01-01

    The search of the main gamma-radiation sources at the site of the temporary storage of solid radioactive wastes was carried out. The relative absorbed dose rates were measured for some of the gamma-sources before and after the rehabilitation procedures. The effectiveness of the rehabilitation procedures in the years 2006-2007 was evaluated qualitatively and quantitatively. The decrease of radiation background at the site of the temporary storage of the solid radioactive wastes after the rehabilitation procedures allowed localizing the new gamma-source. (author)

  14. Liquid-metal-jet anode electron-impact x-ray source

    International Nuclear Information System (INIS)

    Hemberg, O.; Otendal, M.; Hertz, H.M.

    2003-01-01

    We demonstrate an anode concept, based on a liquid-metal jet, for improved brightness in compact electron-impact x-ray sources. The source is demonstrated in a proof-of-principle experiment where a 50 keV, ∼100 W electron beam is focused on a 75 μm liquid-solder jet. The generated x-ray flux and brightness are quantitatively measured in the 7-50 keV spectral region and found to agree with theory. Compared to rotating-anode sources, whose brightness is limited by intrinsic thermal properties, the liquid-jet anode could potentially be scaled to achieve a brightness >100x higher than current state-of-the-art sources. Applications such as mammography, angiography, and diffraction would benefit from such a compact high-brightness source

  15. Quantitative x-ray dark-field computed tomography

    International Nuclear Information System (INIS)

    Bech, M; Pfeiffer, F; Bunk, O; Donath, T; David, C; Feidenhans'l, R

    2010-01-01

    The basic principles of x-ray image formation in radiology have remained essentially unchanged since Roentgen first discovered x-rays over a hundred years ago. The conventional approach relies on x-ray attenuation as the sole source of contrast and draws exclusively on ray or geometrical optics to describe and interpret image formation. Phase-contrast or coherent scatter imaging techniques, which can be understood using wave optics rather than ray optics, offer ways to augment or complement the conventional approach by incorporating the wave-optical interaction of x-rays with the specimen. With a recently developed approach based on x-ray optical gratings, advanced phase-contrast and dark-field scatter imaging modalities are now in reach for routine medical imaging and non-destructive testing applications. To quantitatively assess the new potential of particularly the grating-based dark-field imaging modality, we here introduce a mathematical formalism together with a material-dependent parameter, the so-called linear diffusion coefficient and show that this description can yield quantitative dark-field computed tomography (QDFCT) images of experimental test phantoms.
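
    For context, the quantitative reading of grating-based dark-field data rests on the interference-fringe visibility decaying exponentially with the line integral of the linear diffusion coefficient; in generic notation (symbols are ours, not quoted from the paper),

        \[
        V \;=\; V_0 \exp\!\left(-\int \epsilon(x,y,z)\,\mathrm{d}z\right)
        \quad\Longleftrightarrow\quad
        -\ln\frac{V}{V_0} \;=\; \int \epsilon(x,y,z)\,\mathrm{d}z ,
        \]

    so that the negative log-visibility plays the same role for ε that the attenuation line integral plays for the attenuation coefficient, and standard tomographic reconstruction can be applied to it.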

  16. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterize the examined defect in detail and to accommodate a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. Differences between the two approaches can be associated with these error factors. (Author)

  17. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    Czech Academy of Sciences Publication Activity Database

    Stefan, D.; Di Cesare, F.; Andrasescu, A.; Popa, E.; Lazariev, A.; Vescovo, E.; Štrbák, Oliver; Williams, S.; Starčuk jr., Zenon; Cabanas, M.; van Ormondt, D.; Graveron-Demilly, D.

    2009-01-01

    Vol. 20, No. 10 (2009), 104035:1-9 ISSN 0957-0233 Grant - others: EC 6FP(XE) MRTN-CT-2006-035801 Source of funding: R - EC framework project Keywords: MR spectroscopy * MRS * MRSI * HRMAS-NMR * jMRUI software package * Java * plug-ins * quantitation Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 1.317, year: 2009

  18. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
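
    A minimal sketch of the general idea, not the authors' implementation: differential evolution searches over continuous weights that are thresholded into a wavelength mask, and the fitness is a cross-validated regression error for the mixture concentrations. The library calls are standard scipy/scikit-learn; the synthetic data, threshold and regression model are illustrative assumptions.

        # Hypothetical sketch of DE-based wavelength selection for quantitative THz-TDS.
        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 60))                                 # absorbance spectra: samples x wavelengths
        y = X[:, 10] + 0.5 * X[:, 42] + 0.1 * rng.normal(size=40)     # synthetic mixture concentrations

        def fitness(weights):
            mask = weights > 0.5                                      # decode DE vector into a wavelength subset
            if mask.sum() < 2:
                return 1e6                                            # penalize degenerate subsets
            mse = -cross_val_score(PLSRegression(n_components=2), X[:, mask], y,
                                   cv=5, scoring="neg_mean_squared_error").mean()
            return mse                                                # DE minimizes the cross-validated error

        result = differential_evolution(fitness, bounds=[(0.0, 1.0)] * X.shape[1],
                                        maxiter=10, popsize=6, seed=1, polish=False)
        selected = np.flatnonzero(result.x > 0.5)
        print(len(selected), "wavelengths selected; CV MSE =", round(result.fun, 4))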

  19. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model.

    Science.gov (United States)

    Hiatt, Jessica R; Davis, Stephen D; Rivard, Mark J

    2015-06-01

    The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10(10) histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an (125)I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose function ratio between the current
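
    For orientation, the TG-43 parameters reported here enter the standard AAPM TG-43 dose-rate equation, quoted below in its usual line-source form (the general formalism, not anything specific to this record):

        \[
        \dot D(r,\theta) \;=\; S_K\,\Lambda\,
        \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta),
        \qquad r_0 = 1~\text{cm},\ \theta_0 = 90^{\circ},
        \]

    where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function and F(r,θ) the 2-D anisotropy function.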

  20. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative equational theory whose free algebras correspond to well known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  1. Effects of formic acid hydrolysis on the quantitative analysis of radiation-induced DNA base damage products assayed by gas chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Swarts, S.G.; Smith, G.S.; Miao, L.; Wheeler, K.T.

    1996-01-01

    Gas chromatography/mass spectrometry (GC/MS-SIM) is an excellent technique for performing both qualitative and quantitative analysis of DNA base damage products that are formed by exposure to ionizing radiation or by the interaction of intracellular DNA with activated oxygen species. This technique commonly uses a hot formic acid hydrolysis step to degrade the DNA to individual free bases. However, due to the harsh nature of this degradation procedure, the quantitation of DNA base damage products may be adversely affected. Consequently, we examined the effects of various formic acid hydrolysis procedures on the quantitation of a number of DNA base damage products and identified several factors that can influence this quantitation. These factors included (1) the inherent acid stabilities of both the lesions and the internal standards; (2) the hydrolysis temperature; (3) the source and grade of the formic acid; and (4) the sample mass during hydrolysis. Our data also suggested that the N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) derivatization efficiency can be adversely affected, presumably by trace contaminants either in the formic acid or from the acid-activated surface of the glass derivatization vials. Where adverse effects were noted, modifications were explored in an attempt to improve the quantitation of these DNA lesions. Although experimental steps could be taken to minimize the influence of these factors on the quantitation of some base damage products, no single procedure solved the quantitation problem for all base lesions. However, a significant improvement in the quantitation was achieved if the relative molecular response factor (RMRF) values for these lesions were generated with authentic DNA base damage products that had been treated exactly like the experimental samples. (orig.)
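
    The RMRF-based quantitation mentioned in the final sentence follows the usual internal-standard relation, written here in generic form (symbols are ours):

        \[
        \mathrm{RMRF} \;=\; \frac{A_{\text{analyte}}/m_{\text{analyte}}}{A_{\text{IS}}/m_{\text{IS}}}
        \qquad\Longrightarrow\qquad
        m_{\text{analyte}} \;=\; \frac{A_{\text{analyte}}}{A_{\text{IS}}}\cdot\frac{m_{\text{IS}}}{\mathrm{RMRF}} ,
        \]

    where A is the integrated signal and m the amount of analyte or internal standard (IS); the paper's point is that the calibration RMRF should be derived from authentic base-damage standards hydrolyzed exactly like the samples.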

  2. Bending of electromagnetic beams and head-tail radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Bodo, G; Ferrari, A; Massaglia, S [Consiglio Nazionale delle Ricerche, Turin (Italy). Lab. di Cosmo-Geofisica; Turin Univ. (Italy). Ist. di Fisica]

    1981-08-01

    An interpretation is presented of bridge bending in head-tail radio sources in the framework of an electromagnetic beam model. The physical effect responsible for the structural distortion is proposed to be the refraction of a large-amplitude wave in a medium with a density gradient perpendicular to the wave propagation vector; this gradient is consistently produced by the relative motion of the beam source in the surrounding medium with a velocity higher than the speed of sound. These effects are calculated in some detail and a quantitative fit of model parameters to the typical radio source associated with NGC 1265 is discussed.

  3. 2π absolute measurement research for α-electroplating source covering ZnS(Ag)

    International Nuclear Information System (INIS)

    Zhu Tianxia

    1999-01-01

    2π absolute measurement can be completed after quantitatively depositing (5 ± 1) mg/cm² of ZnS(Ag) on the surface of the alpha electroplating source. The measuring efficiency is 100%. This method is suitable both for electroplating ordinary samples and for electroplating standard (reference) sources

  4. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test scored using the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy with a test following Marzano, and additional feedback with the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  5. Degradation of the Neonicotinoid Pesticides in the Atmospheric Pressure Ionization Source

    Science.gov (United States)

    Chai, Yunfeng; Chen, Hongping; Liu, Xin; Lu, Chengyin

    2018-02-01

    During the analysis of neonicotinoid pesticide standards (thiamethoxam, clothianidin, imidacloprid, acetamiprid, and thiacloprid) by mass spectrometry, the degradation of these pesticides (M-C=N-R is degraded into M-C=O, where M is the skeleton moiety and R is NO2 or CN) was observed in the atmospheric pressure ionization interfaces (ESI and APCI). In APCI, the degradation of all five neonicotinoid pesticides studied took place, and the primary mechanism was an in-source ion/molecule reaction, in which a molecule of water (confirmed by use of H2 18O) attacked the carbon of the imine group, accompanied by loss of NH2R (R=NO2, CN). For the nitroguanidine neonicotinoid pesticides (R=NO2, including thiamethoxam, clothianidin, and imidacloprid), higher auxiliary gas heater temperature also contributed to their degradation in APCI due to in-source pyrolysis. The degradation of the five neonicotinoid pesticides studied in ESI was not significant. In ESI, only the nitroguanidine neonicotinoid pesticides could generate the degradation products, through an in-source fragmentation mechanism. The degradation of cyanoamidine neonicotinoid pesticides (R=CN, including acetamiprid and thiacloprid) in ESI was not observed. The degradation of neonicotinoid pesticides in the ion source of the mass spectrometer leads to some adverse consequences, such as difficulty interpreting the full-scan mass spectrum, reduced sensitivity and accuracy of quantitative analysis, and misleading conclusions about whether these pesticides have degraded in the real samples. Therefore, a clear understanding of these unusual degradation reactions should facilitate the analysis of neonicotinoid pesticides by atmospheric pressure ionization mass spectrometry.

  6. Degradation of the Neonicotinoid Pesticides in the Atmospheric Pressure Ionization Source.

    Science.gov (United States)

    Chai, Yunfeng; Chen, Hongping; Liu, Xin; Lu, Chengyin

    2018-02-01

    During the analysis of neonicotinoid pesticide standards (thiamethoxam, clothianidin, imidacloprid, acetamiprid, and thiacloprid) by mass spectrometry, the degradation of these pesticides (M-C=N-R is degraded into M-C=O, where M is the skeleton moiety and R is NO2 or CN) was observed in the atmospheric pressure ionization interfaces (ESI and APCI). In APCI, the degradation of all five neonicotinoid pesticides studied took place, and the primary mechanism was an in-source ion/molecule reaction, in which a molecule of water (confirmed by use of H2 18O) attacked the carbon of the imine group, accompanied by loss of NH2R (R=NO2, CN). For the nitroguanidine neonicotinoid pesticides (R=NO2, including thiamethoxam, clothianidin, and imidacloprid), higher auxiliary gas heater temperature also contributed to their degradation in APCI due to in-source pyrolysis. The degradation of the five neonicotinoid pesticides studied in ESI was not significant. In ESI, only the nitroguanidine neonicotinoid pesticides could generate the degradation products, through an in-source fragmentation mechanism. The degradation of cyanoamidine neonicotinoid pesticides (R=CN, including acetamiprid and thiacloprid) in ESI was not observed. The degradation of neonicotinoid pesticides in the ion source of the mass spectrometer leads to some adverse consequences, such as difficulty interpreting the full-scan mass spectrum, reduced sensitivity and accuracy of quantitative analysis, and misleading conclusions about whether these pesticides have degraded in the real samples. Therefore, a clear understanding of these unusual degradation reactions should facilitate the analysis of neonicotinoid pesticides by atmospheric pressure ionization mass spectrometry.

  7. Processing of transmission data from an uncollimated single photon source

    International Nuclear Information System (INIS)

    Dikaios, N.; Dinelle, K.; Spinks, T.; Nikita, K.; Thielemans, K.

    2006-01-01

    The EXACT 3D PET scanner uses a Cs-137 single photon rotating point source for the transmission scan. As the source is un-collimated, the transmission data are contaminated by scatter. It has been suggested that segmentation of the reconstructed image can restore the quantitative information in the image. We study here if the results can be further improved by the application of a scale factor for every transaxial plane

  8. Beyond nutrient-based food indices: a data mining approach to search for a quantitative holistic index reflecting the degree of food processing and including physicochemical properties.

    Science.gov (United States)

    Fardet, Anthony; Lakhssassi, Sanaé; Briffaz, Aurélien

    2018-01-24

    Processing has major impacts on both the structure and composition of food and hence on nutritional value. In particular, high consumption of ultra-processed foods (UPFs) is associated with increased risks of obesity and diabetes. Unfortunately, existing food indices only focus on food nutritional content while failing to consider either food structure or the degree of processing. The objectives of this study were thus to link non-nutrient food characteristics (texture, water activity (aw), glycemic and satiety potentials (FF), and shelf life) to the degree of processing; search for associations between these characteristics and nutritional composition; search for a holistic quantitative technological index; and determine quantitative rules for a food to be defined as a UPF using data mining. Among the 280 foods most widely consumed by the elderly in France, 139 solid/semi-solid foods were selected for textural and aw measurements, and classified according to three degrees of processing. Our results showed that minimally processed foods were less hyperglycemic, more satiating, had a better nutrient profile, higher aw, shorter shelf life, lower maximum stress, and higher energy at break than UPFs. Based on 72 food variables, multivariate analyses differentiated foods according to their degree of processing. Then technological indices including food nutritional composition, aw, FF and textural parameters were tested against technological groups. Finally, a LIM score (nutrients to limit) ≥8 per 100 kcal and a number of ingredients/additives >4 are relevant, but not sufficient, rules to define UPFs. We therefore suggest that food health potential should be first defined by its degree of processing.
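
    As a reading aid only, the two quantitative screening rules quoted in the final sentences can be written as a trivial check; the thresholds are those stated above, the function and argument names are illustrative, and (as the abstract stresses) meeting the rules is a relevant indicator, not a sufficient definition of a UPF.

        # Illustrative check of the quoted UPF screening rules (not an official classifier).
        def meets_upf_screening_rules(lim_score_per_100kcal: float,
                                      n_ingredients_and_additives: int) -> bool:
            """True if a food meets both quoted rules: LIM >= 8 per 100 kcal and > 4 ingredients/additives."""
            return lim_score_per_100kcal >= 8 and n_ingredients_and_additives > 4

        print(meets_upf_screening_rules(11.0, 9))   # True: candidate UPF, pending the other criteria
        print(meets_upf_screening_rules(3.0, 2))    # False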

  9. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources of Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (> 5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)

  10. 26 CFR 31.3402(e)-1 - Included and excluded wages.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Included and excluded wages. 31.3402(e)-1... SOURCE Collection of Income Tax at Source § 31.3402(e)-1 Included and excluded wages. (a) If a portion of... not more than 31 consecutive days constitutes wages, and the remainder does not constitute wages, all...

  11. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  12. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    This is the first part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
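
    As a concrete illustration of the kind of raw data element discussed in Part I, the sketch below parses one line of a web server access log in the Common Log Format; the regex and example line are assumptions made for illustration, not material taken from the article.

        # Illustrative parser for one Common Log Format (CLF) access-log line.
        import re

        CLF_PATTERN = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
        )

        line = '203.0.113.7 - - [12/Mar/2007:10:31:02 -0330] "GET /catalogue/record/42 HTTP/1.1" 200 5120'
        hit = CLF_PATTERN.match(line)
        if hit:
            fields = hit.groupdict()
            print(fields["host"], fields["path"], fields["status"])   # 203.0.113.7 /catalogue/record/42 200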

  13. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    Science.gov (United States)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five thematic themes that emerged from this study and were consistent with the existing literature include: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts and topics for further research are presented as well.

  14. Quantitative X-ray microtomography with synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Donath, T. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Materialforschung]

    2007-07-01

    Synchrotron-radiation-based computed microtomography (SRμCT) is an established method for the examination of volume structures. It allows the x-ray attenuation coefficient of a specimen to be measured three-dimensionally with a spatial resolution of about one micrometer. In contrast to conventional x-ray sources (x-ray tubes), the unique properties of synchrotron radiation enable quantitative measurements that do not suffer from beam-hardening artifacts. During this work the capabilities for quantitative SRμCT measurements have been further improved by enhancements made to the SRμCT apparatus and to the reconstruction chain. For high-resolution SRμCT an x-ray camera consisting of a luminescent screen (x-ray phosphor), lens system, and CCD camera was used. A significant suppression of blur caused by reflections inside the luminescent screen could be achieved by application of an absorbing optical coating to the screen surface. It is shown that blur and ring artifacts in the tomographic reconstructions are thereby drastically reduced. Furthermore, a robust and objective method for the determination of the center of rotation in projection data (sinograms) is presented that achieves sub-pixel precision. By implementation of this method into the reconstruction chain, complete automation of the reconstruction process has been achieved. Examples of quantitative SRμCT studies conducted at the Hamburger Synchrotronstrahlungslabor HASYLAB at the Deutsches Elektronen-Synchrotron DESY are presented and used for demonstration of the achieved enhancements. (orig.)
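
    The center-of-rotation determination mentioned above can be illustrated with one common parallel-beam approach (an assumed, generic method, not necessarily the thesis' own algorithm): a projection at 180° is mirrored and cross-correlated with the 0° projection, and the correlation lag equals twice the axis offset; fitting a parabola around the correlation peak would refine the integer lag to sub-pixel precision.

        # Illustrative rotation-axis estimate from a 0/180 degree projection pair.
        import numpy as np

        def rotation_axis_offset(proj_0, proj_180):
            """Offset (pixels) of the rotation axis from the detector centre (N-1)/2."""
            a = proj_0 - proj_0.mean()
            b = proj_180[::-1] - proj_180.mean()           # mirror the 180-degree projection
            corr = np.fft.irfft(np.fft.rfft(a) * np.conj(np.fft.rfft(b)), n=a.size)
            lag = int(np.argmax(corr))
            if lag > a.size // 2:                          # unwrap negative lags
                lag -= a.size
            return lag / 2.0                               # mutual shift is twice the axis offset

        x = np.arange(512)
        axis = 262.0                                       # true axis: 6.5 px from the centre at 255.5
        proj_0 = np.exp(-0.01 * (x - (axis + 40.0)) ** 2)  # same feature seen from opposite sides
        proj_180 = np.exp(-0.01 * (x - (axis - 40.0)) ** 2)
        print(rotation_axis_offset(proj_0, proj_180))      # ~6.5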

  15. Quantitative X-ray microtomography with synchrotron radiation

    International Nuclear Information System (INIS)

    Donath, T.

    2007-01-01

    Synchrotron-radiation-based computed microtomography (SRμCT) is an established method for the examination of volume structures. It allows the x-ray attenuation coefficient of a specimen to be measured three-dimensionally with a spatial resolution of about one micrometer. In contrast to conventional x-ray sources (x-ray tubes), the unique properties of synchrotron radiation enable quantitative measurements that do not suffer from beam-hardening artifacts. During this work the capabilities for quantitative SRμCT measurements have been further improved by enhancements made to the SRμCT apparatus and to the reconstruction chain. For high-resolution SRμCT an x-ray camera consisting of a luminescent screen (x-ray phosphor), lens system, and CCD camera was used. A significant suppression of blur caused by reflections inside the luminescent screen could be achieved by application of an absorbing optical coating to the screen surface. It is shown that blur and ring artifacts in the tomographic reconstructions are thereby drastically reduced. Furthermore, a robust and objective method for the determination of the center of rotation in projection data (sinograms) is presented that achieves sub-pixel precision. By implementation of this method into the reconstruction chain, complete automation of the reconstruction process has been achieved. Examples of quantitative SRμCT studies conducted at the Hamburger Synchrotronstrahlungslabor HASYLAB at the Deutsches Elektronen-Synchrotron DESY are presented and used for demonstration of the achieved enhancements. (orig.)

  16. Pseudo-dynamic source modelling with 1-point and 2-point statistics of earthquake source parameters

    KAUST Repository

    Song, S. G.

    2013-12-24

    Ground motion prediction is an essential element in seismic hazard and risk analysis. Empirical ground motion prediction approaches have been widely used in the community, but efficient simulation-based ground motion prediction methods are needed to complement empirical approaches, especially in the regions with limited data constraints. Recently, dynamic rupture modelling has been successfully adopted in physics-based source and ground motion modelling, but it is still computationally demanding and many input parameters are not well constrained by observational data. Pseudo-dynamic source modelling keeps the form of kinematic modelling with its computational efficiency, but also tries to emulate the physics of source process. In this paper, we develop a statistical framework that governs the finite-fault rupture process with 1-point and 2-point statistics of source parameters in order to quantify the variability of finite source models for future scenario events. We test this method by extracting 1-point and 2-point statistics from dynamically derived source models and simulating a number of rupture scenarios, given target 1-point and 2-point statistics. We propose a new rupture model generator for stochastic source modelling with the covariance matrix constructed from target 2-point statistics, that is, auto- and cross-correlations. Our sensitivity analysis of near-source ground motions to 1-point and 2-point statistics of source parameters provides insights into relations between statistical rupture properties and ground motions. We observe that larger standard deviation and stronger correlation produce stronger peak ground motions in general. The proposed new source modelling approach will contribute to understanding the effect of earthquake source on near-source ground motion characteristics in a more quantitative and systematic way.
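
    A minimal sketch of the covariance-based generator idea, simplified to one dimension; the correlation model, correlation length and moments below are illustrative assumptions, not the paper's values.

        # Illustrative 1-D stochastic slip generator with prescribed 2-point statistics.
        import numpy as np

        n, dx, corr_len = 200, 0.5, 5.0                    # grid points, spacing (km), correlation length (km)
        x = np.arange(n) * dx
        lags = np.abs(x[:, None] - x[None, :])
        cov = np.exp(-lags / corr_len)                     # target auto-correlation (exponential model)

        L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))    # factor the covariance matrix
        rng = np.random.default_rng(42)
        field = L @ rng.standard_normal(n)                 # zero-mean field with the target correlation

        mean_slip, cv = 1.5, 0.5                           # illustrative 1-point statistics (mean in m, coeff. of variation)
        slip = np.clip(mean_slip * (1.0 + cv * field), 0.0, None)   # approximate 1-point moments, non-negative
        print(round(slip.mean(), 3), round(slip.std(), 3))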

  17. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.

  18. Potential Impacts of Food Production on Freshwater Availability Considering Water Sources

    Directory of Open Access Journals (Sweden)

    Shinjiro Yano

    2016-04-01

    We quantify the potential impacts of global food production on freshwater availability (water scarcity footprint; WSF) by applying the water unavailability factor (fwua) as a characterization factor and a global water resource model based on life cycle impact assessment (LCIA). Each water source, including rainfall, surface water, and groundwater, has a distinct fwua that is estimated based on the renewability rate of each geographical water cycle. The aggregated consumptive water use for food production (water footprint inventory; WI) was found to be 4344 km³/year, and the calculated global total WSF was 18,031 km³ H2Oeq/year when considering the difference in water sources. According to the fwua concept, which is based on the land area required to obtain a unit volume of water from each source, the calculated annual impact can also be represented as 98.5 × 10⁶ km². This value implies that current agricultural activities require a land area over six times larger than the global total cropland. We also present the net import of the WI and WSF, highlighting the importance of quantitative assessments for utilizing global water resources to achieve sustainable water use globally.
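
    The characterization step described above follows the usual LCIA pattern of multiplying each inventory flow by its characterization factor and summing; written generically (the indices are ours),

        \[
        \mathrm{WSF} \;=\; \sum_{i}\sum_{s} f_{\mathrm{wua},\,i,s}\;\mathrm{WI}_{i,s},
        \]

    where WI_{i,s} is the consumptive water use in region i drawn from source s (rainfall, surface water or groundwater) and f_{wua,i,s} the corresponding water unavailability factor, which is why the global WSF (18,031 km³ H2Oeq/year) exceeds the aggregated inventory WI (4344 km³/year).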

  19. Combination of qualitative and quantitative sources of knowledge for risk assessment in the framework of possibility theory

    NARCIS (Netherlands)

    Oussalah, M.; Newby, M.J.

    2004-01-01

    This paper focuses on a representation of system reliability in the framework of possibility theory. Particularly, given a (probabilistic) quantitative knowledge pertaining to the time to failure of a system (risk function) and some qualitative knowledge about the degree of pessimism and optimism of

  20. Quantitative analysis of semivolatile organic compounds in selected fractions of air sample extracts by GC/MI-IR spectrometry

    International Nuclear Information System (INIS)

    Childers, J.W.; Wilson, N.K.; Barbour, R.K.

    1990-01-01

    The authors are currently investigating the capabilities of gas chromatography/matrix isolation infrared (GC/MI-IR) spectrometry for the determination of semivolatile organic compounds (SVOCs) in environmental air sample extracts. Their efforts are focused on the determination of SVOCs such as alkylbenzene positional isomers, which are difficult to separate chromatographically and to distinguish by conventional electron-impact ionization GC/mass spectrometry. They have performed a series of systematic experiments to identify sources of error in quantitative GC/MI-IR analyses. These experiments were designed to distinguish between errors due to instrument design or performance and errors that arise from some characteristic inherent to the GC/MI-IR technique, such as matrix effects. They have investigated repeatability as a function of several aspects of GC/MI-IR spectrometry, including sample injection, spectral acquisition, cryogenic disk movement, and matrix deposition. The precision, linearity, dynamic range, and detection limits of a commercial GC/MI-IR system for target SVOCs were determined and compared to those obtained with the system's flame ionization detector. The use of deuterated internal standards in the quantitative GC/MI-IR analysis of selected fractions of ambient air sample extracts will be demonstrated. They will also discuss the current limitations of the technique in quantitative analyses and suggest improvements for future consideration

  1. Quantitative X-ray microanalysis of biological specimens

    International Nuclear Information System (INIS)

    Roomans, G.M.

    1988-01-01

    Quantitative X-ray microanalysis of biological specimens requires an approach that is somewhat different from that used in the materials sciences. The first step is deconvolution and background subtraction on the obtained spectrum. The further treatment depends on the type of specimen: thin, thick, or semithick. For thin sections, the continuum method of quantitation is most often used, but it should be combined with an accurate correction for extraneous background. However, alternative methods to determine local mass should also be considered. In the analysis of biological bulk specimens, the ZAF-correction method appears to be less useful, primarily because of the uneven surface of biological specimens. The peak-to-local background model may be a more adequate method for thick specimens that are not mounted on a thick substrate. Quantitative X-ray microanalysis of biological specimens generally requires the use of standards that preferably should resemble the specimen in chemical and physical properties. Special problems in biological microanalysis include low count rates, specimen instability and mass loss, extraneous contributions to the spectrum, and preparative artifacts affecting quantitation. A relatively recent development in X-ray microanalysis of biological specimens is the quantitative determination of local water content
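
    The continuum (Hall) method mentioned for thin sections normalizes the characteristic peak to the bremsstrahlung continuum, which is proportional to the local mass thickness; in its simplest form (neglecting the small correction for differences in mean atomic number between specimen and standard, and with symbols that are ours),

        \[
        \frac{C_x^{\text{spec}}}{C_x^{\text{std}}} \;\approx\;
        \frac{\left(I_x / I_{\text{cont}}\right)_{\text{spec}}}{\left(I_x / I_{\text{cont}}\right)_{\text{std}}},
        \]

    where I_x is the characteristic peak intensity of element x and I_cont the continuum intensity measured in a peak-free energy window.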

  2. Shedding quantitative fluorescence light on novel regulatory mechanisms in skeletal biomedicine and biodentistry.

    Science.gov (United States)

    Lee, Ji-Won; Iimura, Tadahiro

    2017-02-01

    Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remained to be established. Our research group has applied quantitative fluorescence imaging on tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.

  3. Quantitative Effectiveness Analysis of Solar Photovoltaic Policies, Introduction of Socio-Feed-in Tariff Mechanism (SocioFIT) and its Implementation in Turkey

    Science.gov (United States)

    Mustafaoglu, Mustafa Sinan

    Some of the main energy issues in developing countries are high dependence on non-renewable energy sources, low energy efficiency levels and, as a result, high CO2 emissions. In addition, a common problem of many countries, including developing countries, is economic inequality. In this study, the solar photovoltaic policies of Germany, Japan and the USA are analyzed quantitatively, and a new renewable energy support mechanism called the Socio Feed-in Tariff Mechanism (SocioFIT) is formulated on the basis of the analysis results to address the mentioned issues of developing countries, as well as the economic inequality problem, by using energy savings as a funding source for renewable energy systems. The applicability of the mechanism is demonstrated by calculations for an implementation of the mechanism in Turkey.

  4. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  5. Radiation sources working group summary

    International Nuclear Information System (INIS)

    Fazio, M.V.

    1998-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, components technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations

  6. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model

    International Nuclear Information System (INIS)

    Hiatt, Jessica R.; Davis, Stephen D.; Rivard, Mark J.

    2015-01-01

    Purpose: The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Methods: Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10^10 histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. Results: The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an 125I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose function

  7. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model

    Energy Technology Data Exchange (ETDEWEB)

    Hiatt, Jessica R. [Department of Radiation Oncology, Rhode Island Hospital, The Warren Alpert Medical School of Brown University, Providence, Rhode Island 02903 (United States); Davis, Stephen D. [Department of Medical Physics, McGill University Health Centre, Montreal, Quebec H3G 1A4 (Canada); Rivard, Mark J., E-mail: mark.j.rivard@gmail.com [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States)

    2015-06-15

    Purpose: The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Methods: Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10{sup 10} histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. Results: The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an {sup 125}I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose

  8. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  9. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  10. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concepts and scope of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration (outline and example experiments), chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  11. Quantitative studies with the gamma-camera: correction for spatial and energy distortion

    International Nuclear Information System (INIS)

    Soussaline, F.; Todd-Pokropek, A.E.; Raynaud, C.

    1977-01-01

    The gamma camera sensitivity distribution is an important source of error in quantitative studies. In addition, spatial distortion produces apparent variations in count density which degrade quantitative studies. The flood field image takes into account both effects and is influenced by the pile-up of the tail distribution. It is essential to measure each of these parameters separately. These were investigated using a point source displaced by a special scanning table with two X, Y stepping motors of 10 micron precision. The spatial distribution of the sensitivity, spatial distortion and photopeak in the field of view were measured and compared for different setups of the camera and PM gains. For well-tuned cameras, the sensitivity is fairly constant, while the variations appearing in the flood field image are primarily due to spatial distortion, the former being more dependent than the latter on the energy window setting. This indicates why conventional flood field uniformity correction must not be applied. A correction technique to improve the results in quantitative studies has been tested using a continuously matched energy window at every point within the field. A method for correcting spatial distortion is also proposed, where, after an adequately sampled measurement of this error, a transformation can be applied to calculate the true position of events. The knowledge of the magnitude of these parameters is essential in the routine use and design of detector systems

  12. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  13. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  14. A quantitative reading of competences documents of Law new degrees.

    OpenAIRE

    Leví Orta, Genoveva del Carmen; Ramos Méndez, Eduardo

    2014-01-01

    Documents formulating the competences of degrees are key sources for the analysis, evaluation and comparison of the training profiles currently offered by different university degrees. This work aims to make a quantitative reading of the competences documents of the Law degree from various Spanish universities, based on the ideas of Content Analysis. The methodology has two phases. Firstly, a dictionary of concepts related to the components of competences is identified in the documentary corpus. Next, the corpus...

  15. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
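
    Since ImatraNMR exports its integration results as CSV, downstream processing can be scripted; the sketch below (Python with pandas) assumes a hypothetical column layout (spectrum, signal, integral) and a hypothetical internal-reference signal name ("TSP"), since the actual export format is not described in the abstract.

        import pandas as pd

        # Hypothetical ImatraNMR-style export: one row per (spectrum, signal)
        # with an absolute integral; file name and column names are assumptions.
        df = pd.read_csv("integrals.csv")  # assumed columns: spectrum, signal, integral

        # Normalise every signal to an assumed internal reference signal ("TSP")
        # within the same spectrum.
        ref = df[df["signal"] == "TSP"].set_index("spectrum")["integral"]
        df["relative"] = df["integral"] / df["spectrum"].map(ref)

        # One row per spectrum, one column per signal: ready for spreadsheet
        # programs or general analysis software.
        table = df.pivot(index="spectrum", columns="signal", values="relative")
        print(table.head())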

  16. 13 CFR 120.102 - Funds not available from alternative sources, including personal resources of principals.

    Science.gov (United States)

    2010-01-01

    ... source) when that owner's liquid assets exceed the amounts specified in paragraphs (a) (1) through (3) of... applicant must inject any personal liquid assets which are in excess of two times the total financing... the applicant must inject any personal liquid assets which are in excess of one and one-half times the...

  17. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  18. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  19. A large source of low-volatility secondary organic aerosol

    DEFF Research Database (Denmark)

    Ehn, Mikael; Thornton, Joel A.; Kleist, Einhard

    2014-01-01

    radiation and by acting as cloud condensation nuclei. The quantitative assessment of such climate effects remains hampered by a number of factors, including an incomplete understanding of how biogenic VOCs contribute to the formation of atmospheric secondary organic aerosol. The growth of newly formed...... particles from sizes of less than three nanometres up to the sizes of cloud condensation nuclei (about one hundred nanometres) in many continental ecosystems requires abundant, essentially non-volatile organic vapours, but the sources and compositions of such vapours remain unknown. Here we investigate...... the oxidation of VOCs, in particular the terpene α-pinene, under atmospherically relevant conditions in chamber experiments. We find that a direct pathway leads from several biogenic VOCs, such as monoterpenes, to the formation of large amounts of extremely low-volatility vapours. These vapours form...

  20. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  1. Clinical utility of anterior segment swept-source optical coherence tomography in glaucoma

    Directory of Open Access Journals (Sweden)

    Dewang Angmo

    2016-01-01

    Full Text Available Optical coherence tomography (OCT), a noninvasive imaging modality that uses low-coherence light to obtain a high-resolution cross-section of biological structures, has evolved dramatically over the years. Swept-source OCT (SS-OCT) makes use of a single detector with a rapidly tunable laser as the light source. The Casia SS-1000 OCT is a Fourier-domain SS-OCT designed specifically for imaging the anterior segment. This system achieves high-resolution imaging of 10 µm (axial) and 30 µm (transverse) and high-speed scanning of 30,000 A-scans per second. With this substantial improvement in scan speed, the anterior chamber angle can be imaged over 360 degrees in 128 cross sections (each with 512 A-scans) in about 2.4 seconds. We summarize the clinical applications of anterior segment SS-OCT in glaucoma. Literature search: We searched PubMed and Medline using the phrases anterior segment optical coherence tomography in ophthalmology, swept-source OCT, use of AS-OCT in glaucoma, use of swept-source AS-OCT in glaucoma, quantitative assessment of angle, filtering bleb in AS-OCT, comparison of AS-OCT with gonioscopy, and comparison of AS-OCT with UBM. The search covered articles dating from 1990 to August 2015.

  2. Bending of electromagnetic beams and head-tail radio sources

    International Nuclear Information System (INIS)

    Bodo, G.; Ferrari, A.; Massaglia, S.; Turin Univ.

    1981-01-01

    An interpretation is presented of bridge bending in head-tail radio sources in the framework of an electromagnetic beam model. The physical effect responsible for the structural distortion is proposed to be the refraction of a large-amplitude wave in a medium with a density gradient perpendicular to the wave propagation vector; this gradient is consistently produced by the relative motion of the beam source in the surrounding medium with a velocity higher than the speed of sound. These effects are calculated in some detail and a quantitative fit of model parameters to the typical radio source associated with NGC 1265 is discussed. (author)

  3. Off-design performance analysis of Kalina cycle for low temperature geothermal source

    International Nuclear Information System (INIS)

    Li, Hang; Hu, Dongshuai; Wang, Mingkun; Dai, Yiping

    2016-01-01

    Highlights: • The off-design performance analysis of a Kalina cycle is conducted. • The off-design models are established. • A genetic algorithm is used in the design phase. • The sliding pressure control strategy is applied. - Abstract: Low-temperature geothermal sources have promising prospects and have attracted increasing attention. A Kalina cycle system using ammonia-water as the working fluid can exploit geothermal energy effectively. In this paper, a quantitative analysis of the off-design performance of a Kalina cycle for a low-temperature geothermal source is conducted. Off-design models of the turbine, pump and heat exchangers are first established. A genetic algorithm is used to maximize the net power output and determine the thermodynamic parameters in the design phase. The sliding pressure control strategy widely applied in existing Rankine cycle power plants is adopted to respond to variations in the geothermal source mass flow rate ratio (70–120%), geothermal source temperature (116–128 °C) and heat sink temperature (0–35 °C). Within the off-design research scope, guidance for pump rotational speed adjustment is listed to provide a reference for off-design operation of geothermal power plants. The required adjustment rate of the pump rotational speed is more sensitive to a unit change in geothermal source temperature than to a unit change in heat sink temperature. The influence of heat sink variation on the ranges of net power output and thermal efficiency is greater than that of geothermal source variation.
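
    A minimal sketch of the design-phase optimization described above (a genetic algorithm maximizing net power output) is given below; the two-variable plant model is a placeholder for the authors' full Kalina-cycle thermodynamic model, and scipy's differential evolution stands in for their genetic algorithm.

        import numpy as np
        from scipy.optimize import differential_evolution

        def negative_net_power(x):
            """Placeholder plant model: net power (kW) as a function of turbine
            inlet pressure (bar) and ammonia mass fraction. The real study
            evaluates a full Kalina-cycle thermodynamic model at this point."""
            p_in, x_nh3 = x
            net_power = 500.0 - 0.2 * (p_in - 30.0) ** 2 - 5000.0 * (x_nh3 - 0.85) ** 2
            return -net_power  # minimise the negative to maximise net power

        # Differential evolution stands in for the paper's genetic algorithm.
        result = differential_evolution(negative_net_power,
                                        bounds=[(15.0, 45.0), (0.70, 0.95)],
                                        seed=1)
        print("optimum (pressure, ammonia fraction):", result.x)
        print("estimated net power:", -result.fun, "kW")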

  4. Public and patient involvement in quantitative health research: A statistical perspective.

    Science.gov (United States)

    Hannigan, Ailish

    2018-06-19

    The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.

  5. Genotypic and phenotypic diversity of Ralstonia pickettii and Ralstonia insidiosa isolates from clinical and environmental sources including High-purity Water.

    LENUS (Irish Health Repository)

    Ryan, Michael P

    2011-08-30

    Abstract. Background: Ralstonia pickettii is a nosocomial infectious agent and a significant industrial contaminant. It has been found in many different environments including clinical situations, soil and industrial high-purity water. This study compares the phenotypic and genotypic diversity of a selection of strains of Ralstonia collected from a variety of sources. Results: Ralstonia isolates (fifty-nine) from clinical, industrial and environmental origins were compared genotypically using (i) species-specific PCR, (ii) PCR and sequencing of the 16S-23S rRNA interspatial region (ISR), (iii) the fliC gene, (iv) RAPD and BOX-PCR, and (v) phenotypically using biochemical testing. The species-specific PCR identified fifteen out of fifty-nine designated R. pickettii isolates as actually being the closely related species R. insidiosa. PCR-ribotyping of the 16S-23S rRNA ISR indicated few major differences between the isolates. Analysis of all isolates demonstrated different banding patterns for both the RAPD and BOX primers; however, these were found not to vary significantly. Conclusions: R. pickettii species isolated from wide geographic and environmental sources appear to be reasonably homogenous based on genotypic and phenotypic characteristics. R. insidiosa can at present only be distinguished from R. pickettii using species-specific PCR. R. pickettii and R. insidiosa isolates do not differ significantly phenotypically or genotypically based on environmental or geographical origin.

  6. Quantitative Compton suppression spectrometry at elevated counting rates

    International Nuclear Information System (INIS)

    Westphal, G.P.; Joestl, K.; Schroeder, P.; Lauster, R.; Hausch, E.

    1999-01-01

    For quantitative Compton suppression spectrometry the decrease of coincidence efficiency with counting rate should be made negligible to avoid a virtual increase of relative peak areas of coincident isomeric transitions with counting rate. To that aim, a separate amplifier and discriminator has been used for each of the eight segments of the active shield of a new well-type Compton suppression spectrometer, together with an optimized, minimum dead-time design of the anticoincidence logic circuitry. Chance coincidence losses in the Compton suppression spectrometer are corrected instrumentally by comparing the chance coincidence rate to the counting rate of the germanium detector in a pulse-counting Busy circuit (G.P. Westphal, J. Rad. Chem. 179 (1994) 55) which is combined with the spectrometer's LFC counting loss correction system. The normally not observable chance coincidence rate is reconstructed from the rates of germanium detector and scintillation detector in an auxiliary coincidence unit, after the destruction of true coincidence by delaying one of the coincidence partners. Quantitative system response has been tested in two-source measurements with a fixed reference source of 60Co of 14 kc/s, and various samples of 137Cs, up to aggregate counting rates of 180 kc/s for the well-type detector, and more than 1400 kc/s for the BGO shield. In these measurements, the net peak areas of the 1173.3 keV line of 60Co remained constant at typical values of 37 000 with and 95 000 without Compton suppression, with maximum deviations from the average of less than 1.5%
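
    For orientation, the chance (accidental) coincidence rate between two detectors with coincidence resolving time τ is commonly estimated from the singles rates by the relation below; whether the spectrometer's auxiliary coincidence unit implements exactly this form is not stated in the abstract.

        R_{\mathrm{chance}} \approx 2\,\tau\, R_{\mathrm{Ge}}\, R_{\mathrm{shield}}

    Here R_Ge and R_shield are the singles counting rates of the germanium detector and the scintillation shield, and τ is the coincidence resolving time.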

  7. Projects of SR sources including research and development for insertion devices in the USSR

    International Nuclear Information System (INIS)

    Kulipanov, G.

    1990-01-01

    Some technical information is given on the electron and positron storage rings (SR sources) that are being constructed, used or developed at the Novosibirsk Institute of Nuclear Physics (INP). The parameters and construction of wigglers and undulators (electromagnetic, superconducting, and based on permanent magnets) that are intended to be used at such storage rings are described. Various schemes for the installation of wigglers, undulators and FELs at storage rings are considered. The ways of minimizing the influence of their magnetic fields on particle motion in storage rings are treated. (author)

  8. Optimized protein extraction for quantitative proteomics of yeasts.

    Directory of Open Access Journals (Sweden)

    Tobias von der Haar

    2007-10-01

    Full Text Available The absolute quantification of intracellular protein levels is technically demanding, but has recently become more prominent because novel approaches like systems biology and metabolic control analysis require knowledge of these parameters. Current protocols for the extraction of proteins from yeast cells are likely to introduce artifacts into quantification procedures because of incomplete or selective extraction. We have developed a novel procedure for protein extraction from S. cerevisiae based on chemical lysis and simultaneous solubilization in SDS and urea, which can extract the great majority of proteins to apparent completeness. The procedure can be used for different Saccharomyces yeast species and varying growth conditions, is suitable for high-throughput extraction in a 96-well format, and the resulting extracts can easily be post-processed for use in non-SDS compatible procedures like 2D gel electrophoresis. An improved method for quantitative protein extraction has been developed that removes some of the sources of artefacts in quantitative proteomics experiments, while at the same time allowing novel types of applications.

  9. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  10. Input for seismic hazard assessment using Vrancea seismic source region

    International Nuclear Information System (INIS)

    Ivan, Iren-Adelina; Enescu, B.D.; Pantea, A.

    1998-01-01

    We use an extended and combined database including historical and modern, qualitative and quantitative data, i.e., more than 25 events during the period 1790 - 1990 with epicentral/maximum intensities ranging from X to V degree (MSK scale), the variation interval of isoseismal curves ranging from IXth to IIIrd degree. The data set was analysed using both the sum phasor techniques of Ridelek and Sacks (1984) for different magnitudes and depth intervals and Stepp's method. For the assessment of seismic hazard we need a pattern of seismic source regions including an estimation of the maximum expected magnitude and the return period for the studied regions. Another necessary step in seismic hazard assessment is to develop attenuation relationships specific to a seismogenic zone, particularly to sub-crustal earthquakes of the Vrancea region. The conceptual framework involves the use of appropriate decay models and consideration of the randomness in the attenuation, taking into account the azimuthal variation of the isoseist shapes. (authors)

  11. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
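
    As background on how lesion frequencies are typically extracted from such gel profiles, the number-average length method relates the average lesion frequency to the number-average molecular lengths of treated and control DNA populations; the relation below is the commonly used form and is offered as a hedged illustration rather than as a quotation from this report.

        \Phi \approx \frac{1}{\langle L_n \rangle_{\mathrm{treated}}} - \frac{1}{\langle L_n \rangle_{\mathrm{control}}}

    Here Φ is the average lesion frequency per unit length after lesion-specific cleavage, and ⟨L_n⟩ is the number-average fragment length estimated from each lane's mass profile.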

  12. Advancing the Fork detector for quantitative spent nuclear fuel verification

    Science.gov (United States)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms
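
    A minimal sketch of the kind of go/no-go comparison described above is shown below; the CSV layout, the 5% tolerance, and the function name are assumptions for illustration and are not the actual interface of the Fork data analysis module.

        import csv

        TOLERANCE = 0.05  # assumed relative tolerance for a "go" verdict

        def verify_declarations(path):
            """Flag assemblies whose measured Fork neutron rate disagrees with the
            ORIGEN-based prediction by more than the assumed tolerance.

            Assumed CSV columns: assembly_id, measured_rate, predicted_rate.
            """
            flagged = []
            with open(path, newline="") as fh:
                for row in csv.DictReader(fh):
                    measured = float(row["measured_rate"])
                    predicted = float(row["predicted_rate"])
                    if abs(measured - predicted) / predicted > TOLERANCE:
                        flagged.append((row["assembly_id"], measured / predicted))
            return flagged  # assemblies whose declarations merit follow-up

        print(verify_declarations("fork_measurements.csv"))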

  13. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards
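
    For a thick specimen excited by monochromatic radiation and measured against a pure elemental standard, the fundamental-parameters ratio method reduces to a relation of the following textbook form (incidence and take-off angles ψ1 and ψ2); this is offered as a hedged illustration of the approach, not a quotation from the report.

        C_i \approx \frac{I_i^{\mathrm{sample}}}{I_i^{\mathrm{pure}}}\,
        \frac{\mu_s(E_0)/\sin\psi_1 + \mu_s(E_i)/\sin\psi_2}
             {\mu_i(E_0)/\sin\psi_1 + \mu_i(E_i)/\sin\psi_2}

    Here μ_s and μ_i are the mass absorption coefficients of the sample and of pure element i at the excitation energy E_0 and the fluorescence energy E_i.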

  14. A framework for sourcing of evaporation between saturated and unsaturated zone in bare soil condition

    OpenAIRE

    Balugani, E.; Lubczynski, M.W.; Metselaar, Klaas

    2016-01-01

    Sourcing subsurface evaporation (Ess) into groundwater (Eg) and unsaturated zone (Eu) components has received little scientific attention so far, despite its importance in water management and agriculture. We propose a novel sourcing framework, with its implementation in dedicated post-processing software called SOURCE (used along with the HYDRUS1D model), to study evaporation sourcing dynamics, define quantitatively “shallow” and “deep” water table conditions and test the applicability of wa...

  15. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting...... quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed...... relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  16. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State Health Agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  17. 1st International Congress on Actuarial Science and Quantitative Finance

    CERN Document Server

    Garrido, José; Hernández-Hernández, Daniel; ICASQF

    2015-01-01

    Featuring contributions from industry and academia, this volume includes chapters covering a diverse range of theoretical and empirical aspects of actuarial science and quantitative finance, including portfolio management, derivative valuation, risk theory and the economics of insurance. Developed from the First International Congress on Actuarial Science and Quantitative Finance, held at the Universidad Nacional de Colombia in Bogotá in June 2014, this volume highlights different approaches to issues arising from industries in the Andean and Caribbean regions. Contributions address topics such as reverse mortgage schemes and urban dynamics, modeling spot price dynamics in the electricity market, and optimizing calibration and pricing with SABR models.

  18. Detection of aeroacoustic sound sources on aircraft and wind turbines

    NARCIS (Netherlands)

    Oerlemans, Stefan

    2009-01-01

    This thesis deals with the detection of aeroacoustic sound sources on aircraft and wind turbines using phased microphone arrays. First, the reliability of the array technique is assessed using airframe noise measurements in open and closed wind tunnels. It is demonstrated that quantitative acoustic

  19. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  20. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  1. Quantitative methods for structural characterization of proteins based on deep UV resonance Raman spectroscopy.

    Science.gov (United States)

    Shashilov, Victor A; Sikirzhytski, Vitali; Popova, Ludmila A; Lednev, Igor K

    2010-09-01

    Here we report on novel quantitative approaches for protein structural characterization using deep UV resonance Raman (DUVRR) spectroscopy. Specifically, we propose a new method combining hydrogen-deuterium (HD) exchange and Bayesian source separation for extracting the DUVRR signatures of various structural elements of aggregated proteins including the cross-beta core and unordered parts of amyloid fibrils. The proposed method is demonstrated using the set of DUVRR spectra of hen egg white lysozyme acquired at various stages of HD exchange. Prior information about the concentration matrix and the spectral features of the individual components was incorporated into the Bayesian equation to eliminate the ill-conditioning of the problem caused by 100% correlation of the concentration profiles of protonated and deuterated species. Secondary structure fractions obtained by partial least squares (PLS) and least squares support vector machines (LS-SVMs) were used as the initial guess for the Bayesian source separation. Advantages of the PLS and LS-SVMs methods over the classical least squares calibration (CLSC) are discussed and illustrated using the DUVRR data of the prion protein in its native and aggregated forms. Copyright (c) 2010 Elsevier Inc. All rights reserved.
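
    As a schematic of the calibration step mentioned above (partial least squares used to estimate secondary structure fractions from spectra), the sketch below fits a PLS model to synthetic spectra with known fractions; the data are random placeholders, not DUVRR measurements.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)

        # Synthetic stand-ins: 40 spectra (500 points each) built as noisy
        # mixtures of three pure "secondary structure" signatures.
        pure_signatures = rng.random((3, 500))
        fractions = rng.dirichlet(np.ones(3), size=40)   # known structure fractions
        spectra = fractions @ pure_signatures + 0.01 * rng.standard_normal((40, 500))

        # PLS calibration: predict structure fractions directly from spectra.
        pls = PLSRegression(n_components=3)
        pls.fit(spectra, fractions)
        print(pls.predict(spectra[:2]))  # estimated fractions for two spectra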

  2. A quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sex among HIV-seropositive men.

    Science.gov (United States)

    Gerbi, Gemechu B; Habtemariam, Tsegaye; Tameru, Berhanu; Nganwa, David; Robnett, Vinaida

    2012-01-01

    The objective of this study is to conduct a quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sexual practices among HIV-seropositive men. A knowledgebase was developed by reviewing different published sources. The data were collected from different sources including Centers for Disease Control and Prevention, selected journals, and reports. The risk pathway scenario tree was developed based on a comprehensive review of published literature. The variables are organized into nine major parameter categories. Monte Carlo simulations for the quantitative risk assessment of HIV/AIDS transmission were executed with the software @Risk 4.0 (Palisade Corporation). Results show that the value for the likelihood of unprotected sex due to having less knowledge about HIV/AIDS and negative attitude toward condom use and safer sex ranged from 1.24 × 10^-5 to 8.47 × 10^-4 with the mean and standard deviation of 1.83 × 10^-4 and 8.63 × 10^-5, respectively. The likelihood of unprotected sex due to having greater anger-hostility, anxiety, less satisfaction with aspects of life, and greater depressive symptoms ranged from 2.76 × 10^-9 to 5.34 × 10^-7 with the mean and standard deviation of 5.23 × 10^-8 and 3.58 × 10^-8, respectively. The findings suggest that HIV/AIDS research and intervention programs must be focused on behavior, and the broader setting within which individual risky behaviors occur.
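
    The study ran its Monte Carlo simulations in @Risk over a risk-pathway scenario tree; a minimal open-source analogue of that workflow is sketched below, with placeholder distributions for two pathway branches rather than the study's actual nine parameter categories and fitted values.

        import numpy as np

        rng = np.random.default_rng(42)
        n_iter = 100_000  # Monte Carlo iterations

        # Placeholder distributions for two branches of a risk pathway; the
        # study's parameter categories and fitted values are not reproduced here.
        p_low_knowledge = rng.triangular(1e-4, 5e-4, 1e-3, size=n_iter)
        p_unprotected_given_low_knowledge = rng.beta(2, 8, size=n_iter)

        likelihood = p_low_knowledge * p_unprotected_given_low_knowledge
        print("mean:", likelihood.mean())
        print("5th-95th percentile:", np.percentile(likelihood, [5, 95]))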

  3. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures: lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2, p quantitative measurements was obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE indicating the potential for detecting and following renal disease at an earlier stage than the conventional qualitative imaging techniques.
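
    The reproducibility figures above are mean ± SD absolute percent differences between paired scans taken ≥24 h apart; the small helper below computes that statistic, with made-up example values standing in for the measured parameters.

        import numpy as np

        def mean_abs_percent_difference(visit1, visit2):
            """Mean and SD of the absolute percent difference between paired scans."""
            a, b = np.asarray(visit1, float), np.asarray(visit2, float)
            apd = 100.0 * np.abs(a - b) / ((a + b) / 2.0)
            return apd.mean(), apd.std()

        # Made-up ADC values (x10^-3 mm^2/s) for ten subjects scanned twice.
        scan1 = [2.10, 2.05, 1.98, 2.20, 2.15, 2.00, 2.08, 2.12, 2.03, 2.18]
        scan2 = [2.00, 2.10, 2.05, 2.12, 2.25, 1.95, 2.15, 2.05, 2.10, 2.09]
        print(mean_abs_percent_difference(scan1, scan2))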

  4. Global anthropogenic emissions of particulate matter including black carbon

    Science.gov (United States)

    Klimont, Zbigniew; Kupiainen, Kaarle; Heyes, Chris; Purohit, Pallav; Cofala, Janusz; Rafaj, Peter; Borken-Kleefeld, Jens; Schöpp, Wolfgang

    2017-07-01

    This paper presents a comprehensive assessment of historical (1990-2010) global anthropogenic particulate matter (PM) emissions including the consistent and harmonized calculation of mass-based size distribution (PM1, PM2.5, PM10), as well as primary carbonaceous aerosols including black carbon (BC) and organic carbon (OC). The estimates were developed with the integrated assessment model GAINS, where source- and region-specific technology characteristics are explicitly included. This assessment includes a number of previously unaccounted or often misallocated emission sources, i.e. kerosene lamps, gas flaring, diesel generators, refuse burning; some of them were reported in the past for selected regions or in the context of a particular pollutant or sector but not included as part of a total estimate. Spatially, emissions were calculated for 172 source regions (as well as international shipping), presented for 25 global regions, and allocated to 0.5° × 0.5° longitude-latitude grids. No independent estimates of emissions from forest fires and savannah burning are provided and neither windblown dust nor unpaved roads emissions are included. We estimate that global emissions of PM have not changed significantly between 1990 and 2010, showing a strong decoupling from the global increase in energy consumption and, consequently, CO2 emissions, but there are significantly different regional trends, with a particularly strong increase in East Asia and Africa and a strong decline in Europe, North America, and the Pacific region. This in turn resulted in important changes in the spatial pattern of PM burden, e.g. European, North American, and Pacific contributions to global emissions dropped from nearly 30 % in 1990 to well below 15 % in 2010, while Asia's contribution grew from just over 50 % to nearly two-thirds of the global total in 2010. For all PM species considered, Asian sources represented over 60 % of the global anthropogenic total, and residential combustion

  5. Global anthropogenic emissions of particulate matter including black carbon

    Directory of Open Access Journals (Sweden)

    Z. Klimont

    2017-07-01

    Full Text Available This paper presents a comprehensive assessment of historical (1990–2010) global anthropogenic particulate matter (PM) emissions including the consistent and harmonized calculation of mass-based size distribution (PM1, PM2.5, PM10), as well as primary carbonaceous aerosols including black carbon (BC) and organic carbon (OC). The estimates were developed with the integrated assessment model GAINS, where source- and region-specific technology characteristics are explicitly included. This assessment includes a number of previously unaccounted or often misallocated emission sources, i.e. kerosene lamps, gas flaring, diesel generators, refuse burning; some of them were reported in the past for selected regions or in the context of a particular pollutant or sector but not included as part of a total estimate. Spatially, emissions were calculated for 172 source regions (as well as international shipping), presented for 25 global regions, and allocated to 0.5° × 0.5° longitude–latitude grids. No independent estimates of emissions from forest fires and savannah burning are provided and neither windblown dust nor unpaved roads emissions are included. We estimate that global emissions of PM have not changed significantly between 1990 and 2010, showing a strong decoupling from the global increase in energy consumption and, consequently, CO2 emissions, but there are significantly different regional trends, with a particularly strong increase in East Asia and Africa and a strong decline in Europe, North America, and the Pacific region. This in turn resulted in important changes in the spatial pattern of PM burden, e.g. European, North American, and Pacific contributions to global emissions dropped from nearly 30 % in 1990 to well below 15 % in 2010, while Asia's contribution grew from just over 50 % to nearly two-thirds of the global total in 2010. For all PM species considered, Asian sources represented over 60 % of the global

  6. Arguments and sources on Italian online forums on childhood vaccinations: Results of a content analysis.

    Science.gov (United States)

    Fadda, Marta; Allam, Ahmed; Schulz, Peter J

    2015-12-16

    Despite being committed to the immunization agenda set by the WHO, Italy is currently experiencing decreasing vaccination rates and increasing incidence of vaccine-preventable diseases. Our aim is to analyze Italian online debates on pediatric immunizations through a content analytic approach in order to quantitatively evaluate and summarize users' arguments and information sources. Threads were extracted from 3 Italian forums. Threads had to include the keyword Vaccin* in the title, focus on childhood vaccination, and include at least 10 posts. They had to have been started between 2008 and June 2014. High inter-coder reliability was achieved. Exploratory analysis using k-means clustering was performed to identify users' posting patterns for arguments about vaccines and sources. The analysis included 6544 posts mentioning 6223 arguments about pediatric vaccinations and citing 4067 sources. The analysis of argument posting patterns included users who published a sufficient number of posts; they generated 85% of all arguments on the forum. Dominating patterns of three groups were identified: (1) an anti-vaccination group (n=280) posted arguments against vaccinations, (2) a general pro-vaccination group (n=222) posted substantially diverse arguments supporting vaccination and (3) a safety-focused pro-vaccination group (n=158) mainly forwarded arguments that questioned the negative side effects of vaccination. The anti-vaccination group was shown to be more active than the others. They use multiple sources, own experience and media as their cited sources of information. Medical professionals were among the cited sources of all three groups, suggesting that vaccination-adverse professionals are gaining attention. Knowing which information is shared online on the topic of pediatric vaccinations could shed light on why immunization rates have been decreasing and what strategies would be best suited to address parental concerns. This suggests there is a high need for
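
    The exploratory clustering step described above (k-means over per-user counts of argument and source categories) can be outlined as follows; the feature matrix here is a random placeholder for the real per-user counts, and three clusters are requested to mirror the three groups reported.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(7)

        # Placeholder feature matrix: rows are users, columns are counts of
        # argument and source categories posted by each user.
        X = rng.poisson(lam=3.0, size=(660, 12)).astype(float)

        # Three clusters, mirroring the anti-vaccination, general pro-vaccination
        # and safety-focused pro-vaccination groups reported above.
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))  # number of users assigned to each cluster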

  7. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif

    2014-12-11

    Various examples are provided for land streamer seismic surveying using multiple sources. In one example, among others, a method includes disposing a land streamer in-line with first and second shot sources. The first shot source is at a first source location adjacent to a proximal end of the land streamer and the second shot source is at a second source location separated by a fixed length corresponding to a length of the land streamer. Shot gathers can be obtained when the shot sources are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first shot source. The second shot source is separated from the first shot source by a fixed overall length corresponding to the land streamer.

  8. Quantum key distribution with an unknown and untrusted source

    Science.gov (United States)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bi-directional "plug & play" quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we present the first quantitative security analysis on a general class of QKD protocols whose sources are unknown and untrusted. The securities of the standard BB84 protocol, weak+vacuum decoy state protocol, and one-decoy decoy state protocol, with unknown and untrusted sources are rigorously proved. We derive rigorous lower bounds to the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A 77, 052327 (2008).

  9. Occurrence of Staphylococcus nepalensis strains in different sources including human clinical material.

    Science.gov (United States)

    Nováková, Dana; Pantůcek, Roman; Petrás, Petr; Koukalová, Dagmar; Sedlácek, Ivo

    2006-10-01

    Five isolates of coagulase-negative staphylococci were obtained from human urine, the gastrointestinal tract of squirrel monkeys, pig skin and from the environment. All key biochemical characteristics of the tested strains corresponded with the description of the species Staphylococcus xylosus. However, partial 16S rRNA gene sequences obtained from the analysed strains corresponded with those of Staphylococcus nepalensis reference strains, except for two strains which differed in one residue. Ribotyping with EcoRI and HindIII restriction enzymes, whole cell protein profile analysis performed by SDS-PAGE, and SmaI macrorestriction analysis were used for more precise characterization and identification of the analysed strains. The results showed that EcoRI and HindIII ribotyping and whole cell protein fingerprinting are suitable and reliable methods for the differentiation of S. nepalensis strains from other novobiocin-resistant staphylococci, whereas macrorestriction analysis was found to be a good tool for strain typing. The isolation of S. nepalensis is sporadic, and to the best of our knowledge this study is the first report of the occurrence of this species in human clinical material as well as in other sources.

  10. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  11. Quantitative analysis of transmittance and photoluminescence using a low cost apparatus

    International Nuclear Information System (INIS)

    Onorato, P; Malgieri, M; De Ambrosis, A

    2016-01-01

    We show how a low cost spectrometer, based on the use of inexpensive diffraction transmission gratings coupled with a commercial digital photo camera or a cellphone, can be assembled and employed to obtain quantitative spectra of different sources. In particular, we discuss its use in studying the spectra of fluorescent colored ink, used in highlighting pens, for which the transmission band and the emission peaks are measured and related to the ink color. (paper)
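
    A minimal version of turning such a photograph into a quantitative spectrum is sketched below: average a horizontal band of the image to obtain intensity versus pixel, then map pixel position to wavelength from two known calibration lines. The file name, the calibration pixel positions, and the choice of mercury lines are placeholders, not details from the article.

        import numpy as np
        from PIL import Image

        # Photograph of the dispersed spectrum (placeholder file name).
        img = np.asarray(Image.open("spectrum_photo.jpg").convert("L"), dtype=float)

        # Average a horizontal band across the middle of the image to reduce noise.
        row = img.shape[0] // 2
        profile = img[row - 10:row + 10, :].mean(axis=0)

        # Two-point wavelength calibration from known lines at assumed pixel
        # positions, e.g. the 436 nm and 546 nm lines of a fluorescent lamp.
        px = np.array([310.0, 742.0])
        wl = np.array([436.0, 546.0])
        slope = (wl[1] - wl[0]) / (px[1] - px[0])
        wavelengths = wl[0] + slope * (np.arange(profile.size) - px[0])

        print(wavelengths[profile.argmax()], "nm at the strongest peak")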

  12. Quantitative analysis of transmittance and photoluminescence using a low cost apparatus

    Science.gov (United States)

    Onorato, P.; Malgieri, M.; De Ambrosis, A.

    2016-01-01

    We show how a low cost spectrometer, based on the use of inexpensive diffraction transmission gratings coupled with a commercial digital photo camera or a cellphone, can be assembled and employed to obtain quantitative spectra of different sources. In particular, we discuss its use in studying the spectra of fluorescent colored ink, used in highlighting pens, for which the transmission band and the emission peaks are measured and related to the ink color.

  13. Characterization and source identification of pollutants in runoff from a mixed land use watershed using ordination analyses.

    Science.gov (United States)

    Lee, Dong Hoon; Kim, Jin Hwi; Mendoza, Joseph A; Lee, Chang Hee; Kang, Joo-Hyon

    2016-05-01

    While identification of critical pollutant sources is the key initial step for cost-effective runoff management, it is challenging due to the highly uncertain nature of runoff pollution, especially during a storm event. To identify critical sources and their quantitative contributions to runoff pollution (especially focusing on phosphorus), two ordination methods were used in this study: principal component analysis (PCA) and positive matrix factorization (PMF). For the ordination analyses, we used runoff quality data for 14 storm events, including data for phosphorus, 11 heavy metal species, and eight ionic species measured at the outlets of subcatchments with different land use compositions in a mixed land use watershed. Five factors as sources of runoff pollutants were identified by PCA: agrochemicals, groundwater, native soils, domestic sewage, and urban sources (building materials and automotive activities). PMF identified similar factors to those identified by PCA, with more detailed source mechanisms for groundwater (i.e., nitrate leaching and cation exchange) and urban sources (vehicle components/motor oils/building materials and vehicle exhausts), confirming the sources identified by PCA. PMF was further used to quantify contributions of the identified sources to the water quality. Based on the results, agrochemicals and automotive activities were the two dominant and ubiquitous phosphorus sources (39-61 and 16-47 %, respectively) in the study area, regardless of land use types.
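
    The two ordination steps described above can be outlined with open-source tools as follows; non-negative matrix factorization (NMF) is used here only as a stand-in for PMF (PMF additionally weights the fit by measurement uncertainty), and the concentration matrix is a synthetic placeholder for the storm-event data.

        import numpy as np
        from sklearn.decomposition import NMF, PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Placeholder concentration matrix: 100 storm-event samples by 20 species
        # (phosphorus, heavy metals, ions), strictly positive.
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 20))

        # PCA on standardised data to suggest the number of factors.
        pca = PCA(n_components=5).fit(StandardScaler().fit_transform(X))
        print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

        # NMF as a non-negative factorisation, similar in spirit to PMF
        # (PMF additionally weights residuals by measurement uncertainty).
        nmf = NMF(n_components=5, init="nndsvda", max_iter=1000, random_state=0)
        contributions = nmf.fit_transform(X)  # sample-by-factor contributions
        profiles = nmf.components_            # factor-by-species profiles
        print(contributions.shape, profiles.shape)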

  14. Quantitative label-free proteomics for discovery of biomarkers in cerebrospinal fluid: assessment of technical and inter-individual variation.

    Directory of Open Access Journals (Sweden)

    Richard J Perrin

    Full Text Available Biomarkers are required for pre-symptomatic diagnosis, treatment, and monitoring of neurodegenerative diseases such as Alzheimer's disease. Cerebrospinal fluid (CSF) is a favored source because its proteome reflects the composition of the brain. Ideal biomarkers have low technical and inter-individual variability (subject variance) among control subjects to minimize overlaps between clinical groups. This study evaluates a process of multi-affinity fractionation (MAF) and quantitative label-free liquid chromatography tandem mass spectrometry (LC-MS/MS) for CSF biomarker discovery by (1) identifying reparable sources of technical variability, (2) assessing subject variance and residual technical variability for numerous CSF proteins, and (3) testing its ability to segregate samples on the basis of desired biomarker characteristics. Fourteen aliquots of pooled CSF and two aliquots from six cognitively normal individuals were randomized, enriched for low-abundance proteins by MAF, digested endoproteolytically, randomized again, and analyzed by nano-LC-MS. Nano-LC-MS data were time and m/z aligned across samples for relative peptide quantification. Among 11,433 aligned charge groups, 1360 relatively abundant ones were annotated by MS2, yielding 823 unique peptides. Analyses, including Pearson correlations of annotated LC-MS ion chromatograms, performed for all pairwise sample comparisons, identified several sources of technical variability: (i) incomplete MAF and keratins; (ii) globally- or segmentally-decreased ion current in isolated LC-MS analyses; and (iii) oxidized methionine-containing peptides. Exclusion of these sources yielded 609 peptides representing 81 proteins. Most of these proteins showed very low coefficients of variation (CV<5%) whether they were quantified from the mean of all or only the 2 most-abundant peptides. Unsupervised clustering, using only 24 proteins selected for high subject variance, yielded perfect segregation of pooled and

  15. Sources of groundwater contamination

    International Nuclear Information System (INIS)

    Assaf, H.; Al-Masri, M. S.

    2007-09-01

    In spite of the importance of water for life, whether for drinking, irrigation, industry or other wide uses in many fields, human beings continue to contaminate it and make it unsuitable for human use, largely through the disposal of untreated wastes into the environment. Population growth, expanding construction, rising living costs, and industrial and economic growth have all increased water consumption. Together these factors place increasing pressure on the water environment, both quantitatively and qualitatively. In addition, the disposal of domestic and industrial wastewater in areas near water sources increases the potential risks to the water environment, and the use of unsuitable irrigation systems may increase soil salinity and evaporation rates. The present report discusses some groundwater sources and problems, including hot and mineral waters that have become very important to our life and health owing to their chemical and radioactivity characteristics. (authors)

  16. A Quantitative Methodology for Vetting Dark Network Intelligence Sources for Social Network Analysis

    Science.gov (United States)

    2012-06-01

    Excerpted fragments from the thesis front matter and text: Figure V-7, Source Stress Contributions for the Example (p. V-24); Figure V-8, ROC Curve for the Example. Resilience is the ability of the organization "to avoid disintegration when coming under stress" (Milward & Raab, 2006, p. 351). Examples include subordinates directed to meetings in place of their superiors and virtual participation via telecommuting.

  17. Quantitative fluorescence angiography for neurosurgical interventions.

    Science.gov (United States)

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--as an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
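
    Although the abstract does not give the exact perfusion parameters, a common way to summarize an indocyanine-green bolus curve is through descriptors such as time-to-peak and rise time. The hypothetical Python sketch below (synthetic curve, invented frame rate; not the authors' algorithm) computes two such descriptors from an intensity-time curve:

```python
import numpy as np

# Synthetic indocyanine-green intensity-time curve for one cortical region
# (frame rate and curve shape are illustrative only).
fps = 25.0                                   # frames per second
t = np.arange(0, 20, 1.0 / fps)              # seconds
curve = np.exp(-((t - 6.0) ** 2) / 4.0)      # idealized bolus passage

baseline = curve[: int(2 * fps)].mean()
peak_idx = int(np.argmax(curve))
peak_val = curve[peak_idx]

# Time-to-peak and 10-90 % rise time of the inflow phase.
time_to_peak = t[peak_idx]
lo = baseline + 0.1 * (peak_val - baseline)
hi = baseline + 0.9 * (peak_val - baseline)
rise = curve[: peak_idx + 1]
t10 = t[np.argmax(rise >= lo)]
t90 = t[np.argmax(rise >= hi)]

print(f"time to peak: {time_to_peak:.2f} s, 10-90% rise time: {t90 - t10:.2f} s")
```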

  18. Quantitative Assays for RAS Pathway Proteins and Phosphorylation States

    Science.gov (United States)

    The NCI CPTAC program is applying its expertise in quantitative proteomics to develop assays for RAS pathway proteins. Targets include key phosphopeptides that should increase our understanding of how the RAS pathway is regulated.

  19. Developing a source-receptor methodology for the characterization of VOC sources in ambient air

    International Nuclear Information System (INIS)

    Borbon, A.; Badol, C.; Locoge, N.

    2005-01-01

    Since 2001, in France, continuous monitoring of about thirty ozone-precursor non-methane hydrocarbons (NMHC) has been carried out in several urban areas. The automated system for NMHC monitoring consists of sub-ambient preconcentration on a cooled multi-sorbent trap followed by thermal desorption and two-dimensional Gas Chromatography/Flame Ionisation Detection analysis. The large body of data collected, and its exploitation, should provide a qualitative and quantitative assessment of hydrocarbon sources. This should help define relevant emission-regulation strategies, as called for by the European Directive on ozone in ambient air (2002/3/EC). The purpose of this work is to present the basis and contributions of an original source-receptor methodology for the characterization of NMHC sources. It is a statistical and diagnostic approach, adaptable and transferable to any urban site, which integrates the spatial and temporal dynamics of the emissions. The methods for source identification combine descriptive or more complex complementary approaches: 1) a univariate approach through the analysis of NMHC time series and concentration roses, 2) a bivariate approach through Graphical Ratio Analysis and a characterization of scatterplot distributions of hydrocarbon pairs, 3) a multivariate approach with Principal Component Analyses on various time bases. A linear regression model is finally developed to estimate the spatial and temporal source contributions. Apart from vehicle exhaust emissions, sources of interest are: combustion and fossil fuel-related activities, petrol and/or solvent evaporation, the dual anthropogenic and biogenic origin of isoprene, and other industrial activities depending on local parameters. (author)
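
    The final step described above, estimating source contributions with a linear regression model, amounts to treating each ambient sample as a linear mixture of source profiles. The Python sketch below (hypothetical, with invented source profiles and one invented ambient sample; not the authors' model) solves that mixture with non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical source profiles (columns) as weight fractions per NMHC species.
#             exhaust  evaporation  biogenic
profiles = np.array([
    [0.30,    0.10,    0.00],   # ethane
    [0.25,    0.30,    0.00],   # propane
    [0.20,    0.05,    0.00],   # benzene
    [0.25,    0.50,    0.05],   # toluene
    [0.00,    0.05,    0.95],   # isoprene
])

# One ambient sample (ppbC); the receptor model is ambient = profiles @ contributions.
ambient = np.array([3.1, 4.0, 1.9, 4.6, 1.2])

contributions, residual = nnls(profiles, ambient)
for name, c in zip(["exhaust", "evaporation", "biogenic"], contributions):
    print(f"{name:12s} contribution: {c:.2f} ppbC")
print(f"residual norm: {residual:.3f}")
```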

  20. Arsenic levels in groundwater aquifer of the Neoplanta source area ...

    African Journals Online (AJOL)

    As part of a survey on the groundwater aquifer at the Neoplanta source site, standard laboratory analysis of water quality and an electromagnetic geophysical method were used for long-term quantitative and qualitative monitoring of arsenic levels. This study presents only the results of research conducted in the ...

  1. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
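
    One simple way to realize the calibration described above is to map the distribution of semi-quantitative likelihood scores onto the distribution of failure rates observed on peer pipeline systems, for example by quantile matching. The Python sketch below is a hypothetical illustration of that idea (scores, peer failure rates, and the consequence value are all invented), not the method as implemented by the authors:

```python
import numpy as np

# Hypothetical semi-quantitative likelihood scores for pipeline segments and
# failure frequencies (failures per km-year) observed on peer systems.
scores = np.array([12, 35, 47, 55, 63, 71, 80, 88, 95], dtype=float)
peer_rates = np.sort(np.array([1e-5, 3e-5, 8e-5, 2e-4, 5e-4, 1e-3, 3e-3]))

# Map each score's quantile onto the same quantile of the peer failure-rate
# distribution, interpolating in log space because the rates span decades.
score_q = (scores.argsort().argsort() + 0.5) / len(scores)
rate_q = (np.arange(len(peer_rates)) + 0.5) / len(peer_rates)
failure_freq = 10 ** np.interp(score_q, rate_q, np.log10(peer_rates))

# Deterministic risk = failure frequency x consequence (hypothetical cost per failure).
consequence = 2.0e6
risk = failure_freq * consequence
for s, f, r in zip(scores, failure_freq, risk):
    print(f"score {s:5.1f} -> {f:.2e} failures/km-yr -> risk {r:10.1f} per km-yr")
```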

  2. Absolute quantitative autoradiography of low concentrations of [125I]-labeled proteins in arterial tissue

    International Nuclear Information System (INIS)

    Schnitzer, J.J.; Morrel, E.M.; Colton, C.K.; Smith, K.A.; Stemerman, M.B.

    1987-01-01

    We developed a method for absolute quantitative autoradiographic measurement of very low concentrations of [125I]-labeled proteins in arterial tissue using Kodak NTB-2 nuclear emulsion. A precise linear relationship between measured silver grain density and isotope concentration was obtained with uniformly labeled standard sources composed of epoxy-embedded gelatin containing glutaraldehyde-fixed [125I]-albumin. For up to 308-day exposures of 1 micron-thick tissue sections, background grain densities ranged from about two to eight grains/1000 micron^2, and the technique was sensitive to as little as about one grain/1000 micron^2 above background, which corresponds to a radioactivity concentration of about 2 x 10^4 cpm/ml. A detailed statistical analysis of variability was performed and the sum of all sources of variation quantified. The half distance for spatial resolution was 1.7 micron. Both visual and automated techniques were employed for quantitative grain density analysis. The method was illustrated by measurement of in vivo transmural [125I]-low-density lipoprotein ([125I]-LDL) concentration profiles in de-endothelialized rabbit thoracic aortic wall.
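
    The linear relationship between grain density and isotope concentration lends itself to a simple calibration-and-inversion step. The Python sketch below is a hypothetical illustration of that arithmetic (the standard concentrations and grain densities are invented), not the authors' measurement procedure:

```python
import numpy as np

# Hypothetical calibration standards: known 125I concentration (cpm/ml) versus
# measured grain density (grains per 1000 um^2).
conc_std = np.array([0.0, 2e4, 5e4, 1e5, 2e5])
grains_std = np.array([4.0, 5.1, 6.9, 9.8, 15.7])

# Least-squares line: grains = slope * concentration + background.
slope, background = np.polyfit(conc_std, grains_std, 1)

def grains_to_concentration(grain_density):
    """Invert the calibration for a measured grain density."""
    return (grain_density - background) / slope

for g in (5.0, 8.0, 12.0):
    print(f"{g:5.1f} grains/1000 um^2 -> {grains_to_concentration(g):.2e} cpm/ml")
```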

  3. Sources of polarized neutrons

    International Nuclear Information System (INIS)

    Walter, L.

    1983-01-01

    Various sources of polarized neutrons are reviewed. Monoenergetic sources produced with unpolarized or polarized beams, white sources of polarized neutrons, production by transmission through polarized hydrogen targets, and polarized thermal neutrons are discussed, with appropriate applications included. (U.K.)

  4. Quantitative comparison of genotoxic (mutagenic and carcinogenic) risks and the choice of energy sources

    International Nuclear Information System (INIS)

    Latarjet, R.

    1983-01-01

    For 25 years, pollution from radiation has been governed by restrictive rules enacted and periodically revised by an international commission, and adopted by all countries. Nothing similar exists for mutagenic and carcinogenic chemicals. Since these substances affect the genetic material in the cells with reactions often similar to those caused by radiation, quantitative comparisons are possible, in particular for some of the compounds produced by the combustion of coal, oil and gas. This paper describes the main results obtained at the Institut Curie since 1975 with ethylene, ethylene oxide and vinyl chloride monomer. The consequences are discussed for: a) the establishment of control rules for the main genotoxic chemical pollutants; b) the assessment of long-term risks in the cases of nuclear energy and of the energies obtained by combustion [fr]

  5. A Hybrid, Current-Source/Voltage-Source Power Inverter Circuit

    DEFF Research Database (Denmark)

    Trzynadlowski, Andrzej M.; Patriciu, Niculina; Blaabjerg, Frede

    2001-01-01

    A combination of a large current-source inverter and a small voltage-source inverter circuit is analyzed. The resultant hybrid inverter inherits certain operating advantages from both of the constituent converters. In comparison with the popular voltage-source inverter, these advantages include reduced switching losses, improved quality of output current waveforms, and faster dynamic response to current control commands. Description of the operating principles and characteristics of the hybrid inverter is illustrated with results of an experimental investigation of a laboratory model.

  6. Quantitative SPECT brain imaging: Effects of attenuation and detector response

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Bowsher, J.E.; Turkington, T.G.; Liang, Z.; Greer, K.L.; Coleman, R.E.

    1993-01-01

    Two physical factors that substantially degrade quantitative accuracy in SPECT imaging of the brain are attenuation and detector response. In addition to the physical factors, random noise in the reconstructed image can greatly affect the quantitative measurement. The purpose of this work was to implement two reconstruction methods that compensate for attenuation and detector response, a 3D maximum likelihood-EM method (ML) and a filtered backprojection method (FB) with Metz filter and Chang attenuation compensation, and compare the methods in terms of quantitative accuracy and image noise. The methods were tested on simulated data of the 3D Hoffman brain phantom. The simulation incorporated attenuation and distance-dependent detector response. Bias and standard deviation of reconstructed voxel intensities were measured in the gray and white matter regions. The results with ML showed that in both the gray and white matter regions as the number of iterations increased, bias decreased and standard deviation increased. Similar results were observed with FB as the Metz filter power increased. In both regions, ML had smaller standard deviation than FB for a given bias. Reconstruction times for the ML method have been greatly reduced through efficient coding, limited source support, and by computing attenuation factors only along rays perpendicular to the detector
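
    For readers unfamiliar with the ML reconstruction referred to above, the classic ML-EM update is a multiplicative correction of the current image estimate by back-projected ratios of measured to predicted projections. The Python sketch below shows that update on a toy system matrix (random and far simpler than the 3D model with attenuation and detector response used in the study; all numbers are invented):

```python
import numpy as np

# Toy system matrix A (detector bins x image voxels) and a true activity map.
rng = np.random.default_rng(2)
A = rng.random((40, 16))
x_true = rng.random(16)
y = rng.poisson(A @ x_true * 50) / 50.0      # noisy projection data

# ML-EM multiplicative update: x <- x * A^T(y / Ax) / A^T 1
x = np.ones(16)
sensitivity = A.sum(axis=0)                  # A^T 1
for _ in range(50):
    ratio = y / np.clip(A @ x, 1e-12, None)
    x *= (A.T @ ratio) / sensitivity

print(f"mean bias after 50 iterations: {np.mean(x - x_true):.4f}")
```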

  7. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    International Nuclear Information System (INIS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J

    2008-01-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle
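
    The abstract does not state which viscoelastic model was fitted; a common choice for dispersive shear-wave data is the Kelvin-Voigt model, whose dispersion relation is c(w) = sqrt(2(mu^2 + w^2 eta^2) / (rho (mu + sqrt(mu^2 + w^2 eta^2)))). The Python sketch below fits that relation to invented speed-versus-frequency data with nonlinear least squares, purely as a hypothetical illustration of the fitting step:

```python
import numpy as np
from scipy.optimize import curve_fit

RHO = 1000.0                                   # assumed tissue density, kg/m^3

def voigt_speed(freq_hz, mu, eta):
    """Kelvin-Voigt shear-wave speed; mu = elasticity (Pa), eta = viscosity (Pa s)."""
    w = 2.0 * np.pi * freq_hz
    root = np.sqrt(mu ** 2 + (w * eta) ** 2)
    return np.sqrt(2.0 * (mu ** 2 + (w * eta) ** 2) / (RHO * (mu + root)))

# Hypothetical dispersive shear-wave speed measurements (m/s).
freqs = np.array([100.0, 150.0, 200.0, 250.0, 300.0])
speeds = np.array([2.1, 2.4, 2.7, 2.9, 3.1])

(mu_fit, eta_fit), _ = curve_fit(voigt_speed, freqs, speeds, p0=(3e3, 2.0))
print(f"shear modulus ~ {mu_fit / 1e3:.2f} kPa, viscosity ~ {eta_fit:.2f} Pa s")
```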

  8. Radiation Sources Working Group Summary Report

    International Nuclear Information System (INIS)

    Fazio, Michael V.

    1999-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, component technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to pulsed RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations

  9. Radiation Sources Working Group Summary Report

    International Nuclear Information System (INIS)

    Fazio, M.V.

    1999-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, component technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to pulsed RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations. copyright 1999 American Institute of Physics

  10. Kernel integration scatter model for parallel beam gamma camera and SPECT point source response

    International Nuclear Information System (INIS)

    Marinkovic, P.M.

    2001-01-01

    Scatter correction is a prerequisite for quantitative single photon emission computed tomography (SPECT). In this paper a kernel integration scatter model for parallel beam gamma camera and SPECT point source response, based on the Klein-Nishina formula, is proposed. This method models the primary photon distribution as well as first-order Compton scattering. It also includes a correction for multiple scattering by applying a point isotropic single-medium buildup factor for the path segment between the point of scatter and the point of detection. Gamma-ray attenuation in the object of imaging, based on a known μ-map distribution, is considered too. The intrinsic spatial resolution of the camera is approximated by a simple Gaussian function. The collimator is modeled simply using acceptance angles derived from its physical dimensions; any gamma rays satisfying this angle were passed through the collimator to the crystal. Septal penetration and scatter in the collimator were not included in the model. The method was validated by comparison with a Monte Carlo MCNP-4a numerical phantom simulation and excellent results were obtained. Physical phantom experiments to confirm this method are planned. (author)
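
    The Klein-Nishina formula underlying the model gives the differential cross section for Compton scattering of a photon of energy E through angle theta. A short, self-contained evaluation of that formula (standard physics, not the paper's full kernel-integration code) is sketched below:

```python
import numpy as np

R_E = 2.8179403262e-15      # classical electron radius, m
MEC2 = 0.5109989461         # electron rest energy, MeV

def klein_nishina(energy_mev, theta):
    """Differential Compton cross section dsigma/dOmega (m^2/sr) at angle theta."""
    k = energy_mev / MEC2
    ratio = 1.0 / (1.0 + k * (1.0 - np.cos(theta)))      # scattered/incident energy
    return 0.5 * R_E ** 2 * ratio ** 2 * (ratio + 1.0 / ratio - np.sin(theta) ** 2)

# Example: 140 keV (99mTc) photons scattered through a few angles.
for deg in (10, 45, 90, 135):
    d_sigma = klein_nishina(0.140, np.radians(deg))
    print(f"theta = {deg:3d} deg -> dsigma/dOmega = {d_sigma:.3e} m^2/sr")
```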

  11. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Quantitative description of woody plant communities: Part II ...

    African Journals Online (AJOL)

    These procedures are divided into primary and secondary calculations. The former is then divided into the calculation of spatial tree volume and preliminary calculations regarding the complete quantitative description. The latter include the calculation of the evapotranspiration tree equivalent (ETTE), browse tree equivalent ...

  13. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  14. Advanced neutron source project information management. A model for the future

    International Nuclear Information System (INIS)

    King-Jones, K.; Cleaves, J.

    1995-01-01

    The Advanced Neutron Source (ANS) is a proposed new research facility that will provide steady-state beams of neutrons for experiments by more than 1000 researchers per year in the fields of materials science and engineering, biology, chemistry, materials analysis, and nuclear science. The facility will also include irradiation capabilities to produce radioisotopes for medical applications, research, industry, and materials testing. This paper discusses the architecture and data flow used by the project, some quantitative examinations of potential cost savings and return on investment and software applications used to generate and manage data across IBM-compatible personal computers, Macintosh, and Unix-based workstations. Personnel management aspects addressed include providing paper copy to users only when needed for adequate technical review, using graded approaches to providing support for numerous user-needed software applications, and implementing a phased approach to compliance with computer-aided acquisition and logistic support (CALS) standards that allows sufficient user flexibility for performing technical tasks while providing needed data sharing and integration

  15. Advanced Neutron Source project information management: A model for the future

    International Nuclear Information System (INIS)

    King-Jones, K.; Cleaves, J.

    1995-01-01

    The Advanced Neutron Source (ANS) is a proposed new research facility that will provide steady-state beams of neutrons for experiments by more than 1,000 researchers per year in the fields of materials science and engineering, biology, chemistry, materials analysis, and nuclear science. The facility will also include irradiation capabilities to produce radioisotopes for medical applications, research, industry, and materials testing. This paper discusses the architecture and data flow used by the project, some quantitative examinations of potential cost savings and return on investment, and software applications used to generate and manage data across IBM-compatible personal computers, Macintosh, and Unix-based workstations. Personnel management aspects addressed include providing paper copy to users only when needed for adequate technical review, using graded approaches to providing support for numerous user-needed software applications, and implementing a phased approach to compliance with computer-aided acquisition and logistic support (CALS) standards that allows sufficient user flexibility for performing technical tasks while providing needed data sharing and integration

  16. Photoionization mass spectrometer for studies of flame chemistry with a synchrotron light source

    International Nuclear Information System (INIS)

    Cool, Terrill A.; McIlroy, Andrew; Qi, Fei; Westmoreland, Phillip R.; Poisson, Lionel; Peterka, Darcy S.; Ahmed, Musahid

    2005-01-01

    A flame-sampling molecular-beam photoionization mass spectrometer, recently designed and constructed for use with a synchrotron-radiation light source, provides significant improvements over previous molecular-beam mass spectrometers that have employed either electron-impact ionization or vacuum ultraviolet laser photoionization. These include superior signal-to-noise ratio, soft ionization, and photon energies easily and precisely tunable [E/ΔE(FWHM)≅250-400] over the 7.8-17-eV range required for quantitative measurements of the concentrations and isomeric compositions of flame species. Mass resolution of the time-of-flight mass spectrometer is m/Δm=400 and sensitivity reaches ppm levels. The design of the instrument and its advantages for studies of flame chemistry are discussed

  17. Testicular dysgenesis syndrome and the estrogen hypothesis: a quantitative meta-analysis.

    Science.gov (United States)

    Martin, Olwenn V; Shialis, Tassos; Lester, John N; Scrimshaw, Mark D; Boobis, Alan R; Voulvoulis, Nikolaos

    2008-02-01

    Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-alpha-mediated mode of action was specifically explored. We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population.

  18. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    Science.gov (United States)

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
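
    Step (1) of the quantitation workflow, normalizing ion intensity to an external quadratic regression curve and applying limits of quantitation, can be illustrated with a small hypothetical calibration (the standard densities, intensities, and LOQ choices below are invented, not values from the paper):

```python
import numpy as np

# Hypothetical external standards: surface density (pmol/mm^2) vs normalized intensity.
density_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
intensity_std = np.array([0.02, 0.20, 0.37, 0.66, 1.10])

# Quadratic regression of intensity on surface density.
a, b, c = np.polyfit(density_std, intensity_std, 2)

def intensity_to_density(i):
    """Invert the quadratic calibration, keeping the physical (smaller, non-negative) root."""
    roots = np.roots([a, b, c - i])
    real = roots[np.isreal(roots)].real
    return float(real[real >= 0].min())

lloq, uloq = density_std[1], density_std[-1]     # crude limits of quantitation
for i in (0.10, 0.50, 0.90):
    d = intensity_to_density(i)
    flag = "" if lloq <= d <= uloq else "  (outside LOQ range)"
    print(f"intensity {i:.2f} -> {d:.2f} pmol/mm^2{flag}")
```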

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  20. Quantitative radiomics studies for tissue characterization: a review of technology and methodological procedures.

    Science.gov (United States)

    Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter

    2017-02-01

    Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.

  1. Source-space ICA for MEG source imaging.

    Science.gov (United States)

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) on the component extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both high spatial resolution of beamformer and be able to take on multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA both in simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli were also used to compare performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.
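
    A schematic of the pipeline order described above (beamformer first, then SVD and ICA) can be sketched in a few lines; the version below uses a generic LCMV-style spatial filter and FastICA on synthetic data, and is only a hypothetical outline, not the authors' weight-normalized implementation:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_sensors, n_voxels, n_samples = 32, 50, 2000

# Synthetic lead field and sensor data (stand-ins for a real MEG forward model).
L = rng.standard_normal((n_sensors, n_voxels))
data = rng.standard_normal((n_sensors, n_samples))
C = np.cov(data)
Cinv = np.linalg.pinv(C)

# Step 1: LCMV-style beamformer weights per voxel, w = C^-1 l / (l^T C^-1 l).
W = np.stack([(Cinv @ L[:, v]) / (L[:, v] @ Cinv @ L[:, v]) for v in range(n_voxels)])
source_ts = W @ data                          # voxel-by-time source estimates

# Step 2: SVD to reduce the dimensionality of the source time series.
_, _, Vt = np.linalg.svd(source_ts, full_matrices=False)
reduced = Vt[:10]                             # ten strongest temporal components

# Step 3: ICA on the reduced time courses to separate concurrent sources.
ica = FastICA(n_components=5, random_state=0)
components = ica.fit_transform(reduced.T).T
print("independent component time courses:", components.shape)
```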

  2. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    Science.gov (United States)

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and

  3. New perspectives from new generations of neutron sources

    International Nuclear Information System (INIS)

    Mezei, F.

    2007-01-01

    Since the early fifties the vital multidisciplinary progress in understanding condensed matter is, in a substantial fraction, based on results of neutron scattering experiments. Neutron scattering is an inherently intensity limited method and after 50 years of considerable advance - primarily achieved by improving the scattering instruments - the maturation of the technique of pulsed spallation sources now opens up the way to provide more neutrons with improved cost and energy efficiency. A quantitative analysis of the figure-of-merit of the specialized instruments for pulsed source operation shows that up to 2 orders of magnitude intensity gains can be achieved in the next decade, with the advent of high power spallation sources. The first stations on this road, the MW class short pulse spallation sources SNS in the USA (under commissioning), and J-PARC in Japan (under construction) will be followed by the 5 MW long pulse European Spallation Source (ESS). Further progress, that can be envisaged on the longer term, could amount to as much as another factor of 10 improvement. (author)

  4. Identifiability and Identification of Trace Continuous Pollutant Source

    Directory of Open Access Journals (Sweden)

    Hongquan Qu

    2014-01-01

    Full Text Available Accidental pollution events often threaten people's health and lives, and identification of the pollutant source is very necessary so that prompt remedial actions can be taken. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous emission pollutant source in an enclosed space. The location probability model is set up first, and then the identification method is realized by searching for a global optimal objective value of the location probability. In order to discuss the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is presented in order to quantitatively analyze the impact of the velocity field on the identification performance. Based on this concept, some simulation cases were conducted. The application conditions of this method are obtained according to the simulation studies. In order to verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions.

  5. New perspectives from new generations of neutron sources

    Science.gov (United States)

    Mezei, Ferenc

    2007-09-01

    Since the early 1950s the vital multidisciplinary progress in understanding condensed matter is, in a substantial fraction, based on results of neutron scattering experiments. Neutron scattering is an inherently intensity limited method and after 50 years of considerable advance—primarily achieved by improving the scattering instruments—the maturation of the technique of pulsed spallation sources now opens up the way to provide more neutrons with improved cost and energy efficiency. A quantitative analysis of the figure-of-merit of the specialized instruments for pulsed source operation shows that up to 2 orders of magnitude intensity gains can be achieved in the next decade, with the advent of high power spallation sources. The first stations on this road, the MW class short pulse spallation sources SNS in the USA (under commissioning), and J-PARC in Japan (under construction) will be followed by the 5 MW long pulse European Spallation Source (ESS). Further progress, that can be envisaged on the longer term, could amount to as much as another factor of 10 improvement. To cite this article: F. Mezei, C. R. Physique 8 (2007).

  6. Environmental Radioactive Pollution Sources and Effects on Man

    International Nuclear Information System (INIS)

    El-Naggar, A.M.

    1999-01-01

    The sources of environmental radioactivity are essentially the naturally occurring radionuclides in the earth's crust and the cosmogenic radionuclides reaching the environmental ecosystems. The other sources of environmental radioactivity are man-made, resulting from the use of radioactive materials in human activities. The naturally occurring environmental radioactivity is an integral component of the terrestrial and extraterrestrial creation, and therefore it is not considered a source of radioactive pollution to the environment. Radioactive waste from human activities is released into the environment, and its radionuclide content becomes incorporated into the different ecosystems. This results in a situation of environmental radioactive pollution. This review presents the main features of environmental radioactive pollution, the behaviour of radionuclides in the ecosystems, pathway models of radionuclides in the body, and the probability of associated health hazards. The dose-effect relationship of internal radiation exposure and its quantitative aspects are considered because of their relevance to this subject.

  7. Radiation sources and technical services

    International Nuclear Information System (INIS)

    Stonek, K.; Satorie, Z.; Vyskocil, I.

    1981-01-01

    The work of the Institute's department for sealed source production is briefly described, including leak testing and surface contamination checks of sealed sources. The department also provides technical services, including the inspection of sealed sources used in medicine and geology and the repair of damaged sources. It carries out research on the mechanical and thermal strength of sealed sources and on the possibility of reprocessing used 226Ra sources. The despatch department is responsible for supplying the entire country with domestic and imported radionuclides. The department of technical services is responsible for testing imported radionuclides, assembling materials-testing, industrial and medical irradiation devices, and for the collection and storage of low-level wastes on a national scale. (M.D.)

  8. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  9. Assessment of a synchrotron X-ray method for quantitative analysis of calcium hydroxide

    International Nuclear Information System (INIS)

    Williams, P. Jason; Biernacki, Joseph J.; Bai Jianming; Rawn, Claudia J.

    2003-01-01

    Thermogravimetric analysis (TGA) and quantitative X-ray diffraction (QXRD) are widely used to determine the calcium hydroxide (CH) content in cementitious systems containing blends of Portland cement, fly ash, blast furnace slag, silica fume and other pozzolanic and hydraulic materials. These techniques, however, are destructive to cement samples and subject to various forms of error. While precise weight losses can be measured by TGA, extracting information from samples with multiple overlapping thermal events is difficult. However, while QXRD can offer easier deconvolution, the accuracy for components below about 5 wt.% is typically poor when a laboratory X-ray source is used. Furthermore, the destructive nature of both techniques prevents their use in studying the in situ hydration of a single contiguous sample for kinetic analysis. In an attempt to overcome these problems, the present research evaluated the use of synchrotron X-rays for quantitative analysis of CH. A synchrotron X-ray source was used to develop calibration data for quantification of the amount of CH in mixtures with fly ash. These data were compared to conventional laboratory XRD data for like samples. While both methods were found to offer good quantification, synchrotron XRD (SXRD) provided a broader range of detectability and higher accuracy than laboratory diffraction and removed the subjectivity associated with TGA analysis. Further, the sealed glass capillaries used with the synchrotron source provided a nondestructive, closed, in situ environment for tracking hydrating specimens from zero to any desired age

  10. Sources of negative tunneling magnetoresistance in multilevel quantum dots with ferromagnetic contacts

    DEFF Research Database (Denmark)

    Koller, Sonja; Grifoni, Milena; Paaske, Jens

    2012-01-01

    We analyze distinct sources of spin-dependent energy level shifts and their impact on the tunneling magnetoresistance (TMR) of interacting quantum dots coupled to collinearly polarized ferromagnetic leads. Level shifts due to virtual charge fluctuations can be quantitatively evaluated within...

  11. Application of californium-252 neutron sources for analytical chemistry

    International Nuclear Information System (INIS)

    Ishii, Daido

    1976-01-01

    Research on the application of Cf-252 neutron sources to analytical chemistry during the period from 1970 to 1974, and partly 1975, is reviewed. The first part is the introduction to the above. The second part deals with a general review of symposia, publications and the like. Attention is directed to ERDA publishing the periodical ''Californium-252 Progress'' and to a study group on Cf-252 utilization held by the Japanese Radioisotope Association in 1974. The third part deals with its application for radioactivation analysis. The automated absolute activation analysis (AAAA) of Savannah River is briefly explained. The joint experiment of the Savannah River operations office with the New Brunswick laboratory is mentioned. A Cf-252 radiation source was used for the non-destructive analysis of elements in river water. Fast neutrons from Cf-252 were used for the quantitative analysis of lead in paints. Many applications for industrial control processes have been reported. Attention is drawn to the application of Cf-252 neutron sources for field searches for natural resources. For example, a logging sonde for searching for uranium resources was developed. The fourth part deals with the application of analysis with gamma rays from neutron capture. For example, a borehole sonde and the process control analysis of sulfur in fuel utilized capture gamma rays. Prompt gamma rays from neutron capture may be used for the nondestructive analysis of the environment. (Iwakiri, K.)

  12. The Human Face of Health News: A Multi-Method Analysis of Sourcing Practices in Health-Related News in Belgian Magazines.

    Science.gov (United States)

    De Dobbelaer, Rebeca; Van Leuven, Sarah; Raeymaeckers, Karin

    2018-05-01

    Health journalists are central gatekeepers who select, frame, and communicate health news to a broad audience, but the selection and content of health news are also influenced by the sources journalists rely on (Hinnant, Len-Rios, & Oh, 2012). In this paper, we examine whether the traditional elitist sourcing practices (e.g., research institutions, government) are still important in a digitalized news environment where bottom-up non-elite actors (e.g., patients, civil society organizations) can act as producers (Bruns, 2003). Our main goal, therefore, is to detect whether sourcing practices in health journalism can be linked with strategies of empowerment. We use a multi-method approach combining quantitative and qualitative research methods. First, two content analyses are developed to examine health-related news in Belgian magazines (popular weeklies, health magazines, general interest magazines, and women's magazines). The analyses highlight sourcing practices as visible in the texts and give an overview of the different stakeholders represented as sources. In the first wave, the content analysis includes 1047 health-related news items in 19 different Belgian magazines (March-June 2013). In the second wave, a smaller sample of 202 health-related items in 10 magazines was studied for follow-up reasons (February 2015). Second, to contextualize the findings of the quantitative analysis, we interviewed 16 health journalists and editors-in-chief. The results illustrate that journalists consider patients and blogs as relevant sources for health news; nonetheless, elitist sourcing practices still prevail at the cost of bottom-up communication. However, the in-depth interviews demonstrate that journalists increasingly consult patients and civil society actors to give health issues a more "human" face. Importantly, the study reveals that this strategy is differently applied by the various types of magazines. While popular weeklies and women's magazines give a voice to

  13. MR morphology of triangular fibrocartilage complex: correlation with quantitative MR and biomechanical properties

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Won C.; Chang, Eric Y.; Chung, Christine B. [VA San Diego Healthcare System, Radiology Service, San Diego, CA (United States); University of California-San Diego, Department of Radiology, San Diego, CA (United States); Ruangchaijatuporn, Thumanoon [Mahidol University, Department of Diagnostic and Therapeutic Radiology, Faculty of Medicine Ramathibodi Hospital, Rachathewi, Bangkok (Thailand); Biswas, Reni; Du, Jiang; Statum, Sheronda [University of California-San Diego, Department of Radiology, San Diego, CA (United States)

    2016-04-15

    To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high-resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Five cadaveric wrists (22-70 years) were imaged at 3 T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. (orig.)

  14. MR morphology of triangular fibrocartilage complex: correlation with quantitative MR and biomechanical properties

    International Nuclear Information System (INIS)

    Bae, Won C.; Chang, Eric Y.; Chung, Christine B.; Ruangchaijatuporn, Thumanoon; Biswas, Reni; Du, Jiang; Statum, Sheronda

    2016-01-01

    To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high-resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Five cadaveric wrists (22-70 years) were imaged at 3 T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. (orig.)

  15. MR morphology of triangular fibrocartilage complex: correlation with quantitative MR and biomechanical properties.

    Science.gov (United States)

    Bae, Won C; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda; Chung, Christine B

    2016-04-01

    To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high-resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Five cadaveric wrists (22-70 years) were imaged at 3 T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques.

  16. Digital intelligence sources transporter

    International Nuclear Information System (INIS)

    Zhang Zhen; Wang Renbo

    2011-01-01

    Starting from particle and ray counting, infrared data communication, real-time monitoring and alarming, GPRS communication and related issues, the digital management of radioactive sources is realized: real-time monitoring covers all aspects of the storage, transport and use of radioactive sources, and an intelligent radioactive source transporter is framed, thereby achieving reliable security supervision of radioactive sources. (authors)

  17. Synthesis of quantitative and qualitative research: an example using Critical Interpretive Synthesis.

    Science.gov (United States)

    Flemming, Kate

    2010-01-01

    This paper is a report of a Critical Interpretive Synthesis to synthesize quantitative research, in the form of an effectiveness review and a guideline, with qualitative research to examine the use of morphine to treat cancer-related pain. Critical Interpretive Synthesis is a new method of reviewing, developed from meta-ethnography, which integrates systematic review methodology with a qualitative tradition of enquiry. It has not previously been used specifically to synthesize effectiveness and qualitative literature. Data sources. An existing systematic review of quantitative research and a guideline examining the effectiveness of oral morphine to treat cancer pain were identified. Electronic searches of Medline, CINAHL, Embase, PsychINFO, Health Management Information Consortium database and the Social Science Citation Index to identify qualitative research were carried out in May 2008. Qualitative research papers reporting on the use of morphine to treat cancer pain were identified. The findings of the effectiveness research were used as a framework to guide the translation of findings from qualitative research using an integrative grid. A secondary translation of findings from the qualitative research, not specifically mapped to the effectiveness literature, was guided by the framework. Nineteen qualitative papers were synthesized with the quantitative effectiveness literature, producing 14 synthetic constructs. These were developed into four synthesizing arguments which drew on patients', carers' and healthcare professionals' interpretations of the meaning and context of the use of morphine to treat cancer pain. Critical Interpretive Synthesis can be adapted to synthesize reviews of quantitative research into effectiveness with qualitative research and fits into an existing typology of approaches to synthesizing qualitative and quantitative research.

  18. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the

  19. Use of radioactive indicators for the quantitative determination of non-metallic inclusions in steel

    International Nuclear Information System (INIS)

    Rewienska-Kosciuk, B.; Michalik, J.

    1979-01-01

    Methods of determining and investigating the sources of non-metal inclusions in steel are presented together with some results of radiometric investigations. The experience of several years of research in industries as well as profound studies of world literature were used as a basis for systematic and critical discussion of the methods used. Optimum methods have been chosen for the quantitative determination of oxide inclusions and for the identification of their origin (e.g. from the refractory furnace lining, the tap-hole, the runner, the ladle or mold slag). Problems of tracers (type, quantity, condition, activity), of the labelling method suitable for the various origins of inclusions, of sampling, of chemical processing of the material sampled, as well as of radiometric measuring techniques (including possible activation) are discussed. Finally, a method for the determination of inclusions resulting from the deoxidation of steel is briefly outlined. (author)

  20. Evaluation of environmental impact of air pollution sources

    Energy Technology Data Exchange (ETDEWEB)

    Holnicki, P. [Polish Academy of Science, Warsaw (Poland). Systems Research Inst.

    2004-10-15

    This paper addresses the problem of evaluation and comparison of environmental impact of emission sources in the case of a complex, multisource emission field. The analysis is based on the forecasts of a short-term, dynamic dispersion model. The aim is to get a quantitative evaluation of the contribution of the selected sources according to the predefined, environmental cost function. The approach utilizes the optimal control technique for distributed parameter systems. The adjoint equation, related to the main transport equation of the forecasting model, is applied to calculate the sensitivity of the cost function to the emission intensity of the specified sources. An example implementation of a regional-scale, multilayer dynamic model of SOx transport is discussed as the main forecasting tool. The test computations have been performed for a set of the major power plants in a selected industrial region of Poland.
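    The adjoint-based sensitivity calculation mentioned above can be written generically as follows; this is an illustrative formulation for a linear transport model with a cost functional that is linear in concentration, not necessarily the exact cost function or model used in the paper:

    ```latex
    % Concentration c(x,t) driven by point sources of intensity q_i at locations x_i;
    % environmental cost functional weighted by w(x,t):
    J \;=\; \int_{0}^{T}\!\!\int_{\Omega} w(x,t)\, c(x,t)\, \mathrm{d}x\, \mathrm{d}t .
    % If c^{*}(x,t) solves the adjoint transport equation (integrated backwards in time,
    % with w as its source term), the sensitivity of J to each emission intensity follows:
    \frac{\partial J}{\partial q_i} \;=\; \int_{0}^{T} c^{*}(x_i, t)\, \mathrm{d}t .
    ```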

  1. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  2. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices. This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  3. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Science.gov (United States)

    2013-02-11

    ... on the sources of L. monocytogenes contamination, the effects of individual manufacturing and/or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-1182] Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

  4. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the

  5. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses process improvement, quality management and analytical techniques taught to students in U.S. colleges and universities undergraduate and graduate systems engineering and the computing science discipline (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle processes needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process-performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process-performance; reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings of process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  6. Data for iTRAQ secretomic analysis of Aspergillus fumigatus in response to different carbon sources

    OpenAIRE

    Sunil S. Adav; Anita Ravindran; Siu Kwan Sze

    2015-01-01

    Here, we provide data related to the research article entitled "Quantitative proteomics study of Aspergillus fumigatus secretome revealed deamidation of secretory enzymes" by Adav et al. (J. Proteomics (2015) [1]). Aspergillus sp. plays an important role in lignocellulosic biomass recycling. To explore biomass hydrolyzing enzymes of A. fumigatus, we profiled secretome under different carbon sources such as glucose, cellulose, xylan and starch by high throughput quantitative proteomics using i...

  7. Individual patient dosimetry using quantitative SPECT imaging

    International Nuclear Information System (INIS)

    Gonzalez, J.; Oliva, J.; Baum, R.; Fisher, S.

    2002-01-01

    An approach is described to provide individual patient dosimetry for routine clinical use. Accurate quantitative SPECT imaging was achieved using appropriate methods. The volume of interest (VOI) was defined semi-automatically using a fixed threshold value obtained from phantom studies. The calibration factor to convert the voxel counts from SPECT images into activity values was determined from a calibrated point source using the same threshold value as in the phantom studies. For the selected radionuclide, the dose within and outside a sphere of voxel dimension at different distances was computed through dose point-kernels to obtain a discrete absorbed dose kernel representation around the volume source with uniform activity distribution. The spatial activity distribution from SPECT imaging was convolved with this kernel representation using the discrete Fourier transform method to yield the three-dimensional absorbed dose rate distribution. The accuracy of the dose rate calculations was validated with software phantoms. The absorbed dose was determined by integration of the dose rate distribution for each volume of interest (VOI). Parameters for treatment optimization such as dose rate volume histograms and dose rate statistics are provided. A patient example was used to illustrate our dosimetric calculations
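    The kernel-convolution step described above can be sketched in a few lines of Python; this is not the authors' implementation, and the array names, grid assumptions, and the use of scipy.signal.fftconvolve are illustrative only:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def dose_rate_from_activity(activity, kernel):
        """Return a 3D dose-rate map by convolving a quantitative SPECT activity map
        with a discrete dose-point kernel (dose rate per unit activity, same voxel grid).
        The kernel is assumed to be centred in its own (odd-sized) array."""
        # mode='same' keeps the result on the activity grid (FFT-based convolution).
        return fftconvolve(activity, kernel, mode='same')

    def dose_rate_volume_histogram(dose_rate, voi_mask, bins=50):
        """Cumulative dose-rate volume histogram inside a volume of interest (VOI)."""
        values = dose_rate[voi_mask]
        hist, edges = np.histogram(values, bins=bins)
        # Fraction of the VOI receiving at least each dose-rate level.
        cumulative = 1.0 - np.cumsum(hist) / values.size
        return edges[1:], cumulative
    ```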

  8. Combinatorial hexapeptide ligand libraries (ProteoMiner): an innovative fractionation tool for differential quantitative clinical proteomics.

    Science.gov (United States)

    Hartwig, Sonja; Czibere, Akos; Kotzka, Jorg; Passlack, Waltraud; Haas, Rainer; Eckel, Jürgen; Lehr, Stefan

    2009-07-01

    Blood serum samples are the major source for clinical proteomics approaches, which aim to identify diagnostically relevant or treatment-response related proteins. But, the presence of very high-abundance proteins and the enormous dynamic range of protein distribution hinders whole serum analysis. An innovative tool to overcome these limitations, utilizes combinatorial hexapeptide ligand libraries (ProteoMiner). Here, we demonstrate that ProteoMiner can be used for comparative and quantitative analysis of complex proteomes. We spiked serum samples with increasing amounts (3 microg to 300 microg) of whole E. coli lysate, processed it with ProteoMiner and performed quantitative analyses of 2D-gels. We found, that the concentration of the spiked bacteria proteome, reflected by the maintained proportional spot intensities, was not altered by ProteoMiner treatment. Therefore, we conclude that the ProteoMiner technology can be used for quantitative analysis of low abundant proteins in complex biological samples.

  9. Parametric biomedical imaging - what defines the quality of quantitative radiological approaches?

    International Nuclear Information System (INIS)

    Glueer, C.C.; Barkmann, R.; Bolte, H.; Heller, M.; Hahn, H.K.; Dicken, V.; Majumdar, S.; Eckstein, F.; Nickelsen, T.N.

    2006-01-01

    Quantitative parametric imaging approaches provide new perspectives for radiological imaging. These include quantitative 2D, 3D, and 4D visualization options along with the parametric depiction of biological tissue properties and tissue function. This allows the interpretation of radiological data from a biochemical, biomechanical, or physiological perspective. Quantification permits the detection of small changes that are not yet visually apparent, thus allowing application in early disease diagnosis and monitoring therapy with enhanced sensitivity. This review outlines the potential of quantitative parametric imaging methods and demonstrates this on the basis of a few exemplary applications. One field of particular interest, the use of these methods for investigational new drug application studies, is presented. Assessment criteria for judging the quality of quantitative imaging approaches are discussed in the context of the potential and the limitations of these methods. While quantitative parametric imaging methods do not replace but rather supplement established visual interpretation methods in radiology, they do open up new perspectives for diagnosis and prognosis and in particular for monitoring disease progression and therapy. (orig.)

  10. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    Science.gov (United States)

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection revealed 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
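    The quantities listed above (lag time, maximum specific growth rate, doubling time, delta OD) are commonly extracted by fitting a parametric growth model to each well. The sketch below is not the published tool; it is a minimal Python illustration using a modified Gompertz fit, with all function and variable names assumed:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, a, mu_max, lag):
        """Modified Gompertz growth model (Zwietering form), fitted to y = ln(OD/OD0)."""
        return a * np.exp(-np.exp(mu_max * np.e / a * (lag - t) + 1.0))

    def fit_growth_curve(t_hours, od):
        """Fit one microplate well and report the growth parameters described above."""
        t_hours = np.asarray(t_hours, float)
        od = np.asarray(od, float)
        y = np.log(od / od[0])                      # growth on a log scale relative to start
        p0 = [max(y.max(), 0.1), 0.5, 1.0]          # rough guesses: asymptote, rate (1/h), lag (h)
        (a, mu_max, lag), _ = curve_fit(gompertz, t_hours, y, p0=p0, maxfev=10000)
        return {
            "lag_time_h": lag,
            "mu_max_per_h": mu_max,
            "doubling_time_h": np.log(2.0) / mu_max,
            "delta_od": float(od.max() - od.min()),  # simple rise in optical density
        }
    ```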

  11. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  12. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    Use of the Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  13. Characterising Ageing in the Human Brainstem Using Quantitative Multimodal MRI Analysis

    Directory of Open Access Journals (Sweden)

    Christian eLambert

    2013-08-01

    Ageing is ubiquitous to the human condition. The MRI correlates of healthy ageing have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI and DTI. Despite this, the reported brainstem related changes remain sparse. This is, in part, due to the technical and methodological limitations in quantitatively assessing and statistically analysing this region. By utilising a new method of brainstem segmentation, a large cohort of 100 healthy adults were assessed in this study for the effects of ageing within the human brainstem in vivo. Using quantitative MRI (qMRI), tensor based morphometry (TBM) and voxel based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19-75 years were characterised. In addition to the increased R2* in substantia nigra corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetisation transfer (MT) and increase in proton density (PD), accounting for the previously described midbrain shrinkage. Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterised, functional age-related changes, and propose potential biophysical mechanisms. This study provides detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterised by early, pre-clinical involvement of the brainstem, such as Parkinson's and Alzheimer's diseases.

  14. On the Dichotomy of Qualitative and Quantitative Researches in Contemporary Scientific Methodology

    Directory of Open Access Journals (Sweden)

    U V Suvakovic

    2011-12-01

    Argumentation in favor of overcoming the long-ago-established dichotomy of qualitative and quantitative scientific research is presented in the article. Proceeding from the view of materialistic dialecticians that every scientific research must deal with a subject, the author assumes that it is impossible to conduct quantitative research without first establishing the quality to be studied. This also concerns measuring, which the literature attributes only to quantitative procedures. By way of illustration, the author designs two instruments for measuring the successfulness of political parties - the scale and the quotient of party successfulness. On the other hand, even qualitative analysis usually involves certain quantifications. The author concludes that to achieve methodological correctness the existing dichotomy of qualitative and quantitative research should be considered as overcome, and a typology of scientific research including predominantly qualitative and predominantly quantitative studies, depending on the methodological components prevailing in them, should be used.

  15. 2011 NATA - Emissions Sources

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset includes all emissions sources that were modeled in the 2011 National Air Toxics Assessment (NATA), including point, nonpoint, and mobile sources, and...

  16. Quantitative computed tomography: emphysema and airway wall thickness by sex, age and smoking

    DEFF Research Database (Denmark)

    Grydeland, T B; Dirksen, A; Coxson, H O

    2009-01-01

    We investigated how quantitative high-resolution computed tomography (HRCT) measures of emphysema and airway wall thickness (AWT) vary with sex, age and smoking history. We included 463 chronic obstructive pulmonary disease (COPD) cases and 431 controls. All included subjects were current or ex...... cases, respectively, and 0.488+/-0.028 and 0.463+/-0.025 in male and female controls, respectively. AWT decreased with increasing age in cases, and increased with the degree of current smoking in all subjects. We found significant differences in quantitative HRCT measures of emphysema and AWT between...

  17. Theory for source-responsive and free-surface film modeling of unsaturated flow

    Science.gov (United States)

    Nimmo, J.R.

    2010-01-01

    A new model explicitly incorporates the possibility of rapid response, across significant distance, to substantial water input. It is useful for unsaturated flow processes that are not inherently diffusive, or that do not progress through a series of equilibrium states. The term source-responsive is used to mean that flow responds sensitively to changing conditions at the source of water input (e.g., rainfall, irrigation, or ponded infiltration). The domain of preferential flow can be conceptualized as laminar flow in free-surface films along the walls of pores. These films may be considered to have uniform thickness, as suggested by field evidence that preferential flow moves at an approximately uniform rate when generated by a continuous and ample water supply. An effective facial area per unit volume quantitatively characterizes the medium with respect to source-responsive flow. A flow-intensity factor dependent on conditions within the medium represents the amount of source-responsive flow at a given time and position. Laminar flow theory provides relations for the velocity and thickness of flowing source-responsive films. Combination with the Darcy-Buckingham law and the continuity equation leads to expressions for both fluxes and dynamic water contents. Where preferential flow is sometimes or always significant, the interactive combination of source-responsive and diffuse flow has the potential to improve prediction of unsaturated-zone fluxes in response to hydraulic inputs and the evolving distribution of soil moisture. Examples for which this approach is efficient and physically plausible include (i) rainstorm-generated rapid fluctuations of a deep water table and (ii) space- and time-dependent soil water content response to infiltration in a macroporous soil. © Soil Science Society of America.
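    As a point of reference for the laminar-film relations invoked above, the standard result for a gravity-driven free-surface film of uniform thickness on a vertical wall is given below; this is illustrative only, and the exact parameterization used in the paper may differ:

    ```latex
    % Mean velocity and volumetric flux per unit wetted width of a laminar,
    % gravity-driven free-surface film of thickness \delta (density \rho, viscosity \mu):
    \bar{u} = \frac{\rho g \delta^{2}}{3\mu},
    \qquad
    q = \bar{u}\,\delta = \frac{\rho g \delta^{3}}{3\mu}.
    % With M the effective facial (film-carrying) area per unit volume and f a
    % dimensionless flow-intensity factor, a macroscopic source-responsive flux
    % density of order  q_{s} \approx f\, M\, q  follows (illustrative scaling only).
    ```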

  18. Determination of correction coefficients for quantitative analysis by mass spectrometry. Application to uranium impurities analysis; Recherche des coefficients de correction permettant l'analyse quantitative par spectrometrie de masse. Application a l'analyse d'impuretes dans l'uranium

    Energy Technology Data Exchange (ETDEWEB)

    Billon, J P [Commissariat a l' Energie Atomique, Bruyeres-le-Chatel (France). Centre d' Etudes

    1970-07-01

    Some basic principles of spark source mass spectrometry are recalled, and it is shown that, provided a number of precautions are taken, the method can be used for quantitative analysis. Assuming a constant relation in time between the analysed solid sample and the ionic beam it produces, experimental relative sensitivity factors were first determined for impurities in uranium matrices. Since the first practical results appeared to agree with a simple theory of the ionization yield in the spark source, the possibility of directly applying the theoretically derived relative sensitivity factors was then investigated, the application again being made on uranium matrices. (author)

  19. Impedance Source Power Electronic Converters

    DEFF Research Database (Denmark)

    Liu, Yushan; Abu-Rub, Haitham; Ge, Baoming

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding...

  20. An improved fast neutron radiography quantitative measurement method

    International Nuclear Information System (INIS)

    Matsubayashi, Masahito; Hibiki, Takashi; Mishima, Kaichiro; Yoshii, Koji; Okamoto, Koji

    2004-01-01

    The validity of a fast neutron radiography quantification method, the Σ-scaling method, which was originally proposed for thermal neutron radiography, was examined with Monte Carlo calculations and experiments conducted at the YAYOI fast neutron source reactor. Water and copper were selected as comparative samples for a thermal neutron radiography case and a dense object, respectively. Although different characteristics of the effective macroscopic cross-sections were implied by the simulation, the Σ-scaled experimental results with the fission neutron spectrum cross-sections were well fitted to the measurements for both the water and copper samples. This indicates that the Σ-scaling method could be successfully adopted for quantitative measurements in fast neutron radiography
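    The quantification underlying the Σ-scaling approach rests on exponential attenuation of the beam through the sample; as a hedged illustration (the method's spectrum-averaging details are in the paper), the basic relation and its inversion are:

    ```latex
    % Transmission through a sample of thickness t with effective macroscopic
    % cross-section \Sigma (here taken from fission-spectrum data rather than
    % thermal data), and the inversion used for quantitative thickness estimates:
    I = I_{0}\, e^{-\Sigma t}
    \qquad\Longrightarrow\qquad
    t = \frac{1}{\Sigma}\,\ln\frac{I_{0}}{I}.
    ```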

  1. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been used widely and approved in some industries. However, there are limitations when we use them for complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The proposed method in this paper combines qualitative and quantitative evidence and performs an inference like experts do. Furthermore, it does the work in a formal and quantitative way, unlike human experts, by the benefits of Bayesian Nets (BNs)

  2. A human fecal contamination index for ranking impaired recreational waters using the HF183 quantitative real-time PCR method

    Science.gov (United States)

    Human fecal pollution of surface water remains a public health concern worldwide. As a result, there is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for recreational water quality risk managem...

  3. Comprehensive quantitative comparison of the membrane proteome, phosphoproteome, and sialiome of human embryonic and neural stem cells

    DEFF Research Database (Denmark)

    Melo-Braga, Marcella Nunes; Schulz, Melanie; Liu, Qiuyue

    2014-01-01

    Human embryonic stem cells (hESCs) can differentiate into neural stem cells (NSCs), which can further be differentiated into neurons and glia cells. Therefore, these cells have huge potential as a source for treatment of neurological diseases. Membrane-associated proteins are very important... To compare hESCs and NSCs, as well as to investigate potential new markers for these two cell stages, we performed large-scale quantitative membrane proteomics of hESCs and NSCs. This approach employed membrane purification followed by peptide dimethyl labeling and peptide enrichment to study the membrane subproteome as well... in which 78% of phosphopeptides were identified with ≥99% confidence in site assignment and 1810 unique formerly sialylated N-linked glycopeptides. Several proteins were identified as significantly regulated in hESCs and NSCs, including proteins involved in the early embryonic and neural development...

  4. Radioactivity measurements of metallic 192Ir sources by calorimetric methods

    International Nuclear Information System (INIS)

    Genka, Tsuguo; Iwamoto, Seikichi; Takeuchi, Norio

    1992-01-01

    The necessity of establishing the traceability of dose measurement in brachytherapy 192Ir sources is realized by physicians and researchers in the medical field. Standard sources of various shapes such as "hairpin", "single pin", "thin wire", and "seed" for calibrating ionization chambers in hospitals are being demanded. Nominal activities of not only these source products but also the standard sources have been so far specified by "apparent" values. Determination of "absolute" activity by an established means such as 4pi-beta-gamma coincidence counting is not practical because quantitative dissolution of metallic iridium is very difficult. We tried to determine the "absolute" activity by a calorimetric method in a fully nondestructive way

  5. Electric Power Monthly, August 1990. [Glossary included

    Energy Technology Data Exchange (ETDEWEB)

    1990-11-29

    The Electric Power Monthly (EPM) presents monthly summaries of electric utility statistics at the national, Census division, and State level. The purpose of this publication is to provide energy decisionmakers with accurate and timely information that may be used in forming various perspectives on electric issues that lie ahead. Data includes generation by energy source (coal, oil, gas, hydroelectric, and nuclear); generation by region; consumption of fossil fuels for power generation; sales of electric power; cost data; and unusual occurrences. A glossary is included.

  6. Quantitative parameters to compare image quality of non-invasive coronary angiography with 16-slice, 64-slice and dual-source computed tomography

    International Nuclear Information System (INIS)

    Burgstahler, Christof; Reimann, Anja; Brodoefel, Harald; Tsiflikas, Ilias; Thomas, Christoph; Heuschmid, Martin; Daferner, Ulrike; Drosch, Tanja; Schroeder, Stephen; Herberts, Tina

    2009-01-01

    Multi-slice computed tomography (MSCT) is a non-invasive modality to visualize coronary arteries with an overall good image quality. The improved spatial and temporal resolution of 64-slice and dual-source computed tomography (DSCT) scanners is expected to have a positive impact on diagnostic accuracy and image quality. However, quantitative parameters to compare image quality of 16-slice, 64-slice MSCT and DSCT are missing. A total of 256 CT examinations were evaluated (Siemens, Sensation 16: n=90; Siemens Sensation 64: n=91; Siemens Definition: n=75). Mean Hounsfield units (HU) were measured in the cavum of the left ventricle (LV), the ascending aorta (Ao), the left ventricular myocardium (My) and the proximal part of the left main (LM), the left anterior descending artery (LAD), the right coronary artery (RCA) and the circumflex artery (CX). Moreover, the ratio of intraluminal attenuation (HU) to myocardial attenuation was assessed for all coronary arteries. Clinical data [body mass index (BMI), gender, heart rate] were accessible for all patients. Mean attenuation (CA) of the coronary arteries was significantly higher for DSCT in comparison to 64- and 16-slice MSCT within the RCA [347±13 vs. 254±14 (64-MSCT) vs. 233±11 (16-MSCT) HU], LM (362±11/275±12/262±9), LAD (332±17/248±19/219±14) and LCX (310±12/210±13/221±10, all p<0.05), whereas there was no significant difference between DSCT and 64-MSCT for the LV, the Ao and My. Heart rate had a significant impact on CA ratio in 16-slice and 64-slice CT only (p<0.05). BMI had no impact on the CA ratio in DSCT only (p<0.001). Improved spatial and temporal resolution of dual-source CT is associated with better opacification of the coronary arteries and a better contrast with the myocardium, which is independent of heart rate. In comparison to MSCT, opacification of the coronary arteries at DSCT is not affected by BMI. The main advantage of DSCT lies in its heart rate independence, which might have a

  7. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, is a more robust PIRT process for more precise quantification of uncertainties: a two-step process for identifying severe accident phenomena and ranking them by uncertainty importance. In this process, identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytical Hierarchical Process (AHP) serves here as a systematic approach for ranking severe accident phenomena. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. For this step, the methodology uses subjective justification by evaluating available information and data from experiments and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower calculation cost. The quantitative results are used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • Uncertainty importance measure quantitatively calculates the effect of each uncertainty source. • Methodology is applied successfully on the ACRR MP-2 severe accident test facility
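    The AHP ranking step mentioned above can be illustrated with a short sketch that derives priority weights from a pairwise comparison matrix via its principal eigenvector; this is a generic AHP illustration in Python, not the authors' implementation, and the example matrix values are invented:

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights and consistency ratio for an AHP pairwise-comparison matrix.

        pairwise[i, j] expresses how much more important phenomenon i is judged to be
        than phenomenon j (reciprocal matrix: pairwise[j, i] = 1 / pairwise[i, j])."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                 # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                # normalised priority vector
        # Saaty's consistency index/ratio (random index RI tabulated for n = 1..9).
        ri = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45]
        ci = (eigvals.real[k] - n) / (n - 1) if n > 2 else 0.0
        cr = ci / ri[n - 1] if n > 2 else 0.0
        return w, cr

    # Example: ranking three phenomena by their judged effect on the figure of merit.
    M = np.array([[1, 3, 5],
                  [1 / 3, 1, 2],
                  [1 / 5, 1 / 2, 1]])
    weights, consistency_ratio = ahp_weights(M)
    ```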

  8. Quantitative evaluation of emission controls on primary and secondary organic aerosol sources during Beijing 2008 Olympics

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-08-01

    To assess the primary and secondary sources of fine organic aerosols after the aggressive implementation of air pollution controls during the 2008 Beijing Olympic Games, 12 h PM2.5 values were measured at an urban site at Peking University (PKU) and an upwind rural site at Yufa during the CAREBEIJING-2008 (Campaigns of Air quality REsearch in BEIJING and surrounding region) summer field campaign. The average PM2.5 concentrations were 72.5 ± 43.6 μg m−3 and 64.3 ± 36.2 μg m−3 (average ± standard deviation, the same below) at PKU and Yufa, respectively, showing the lowest concentrations in recent years. Combining the results from a CMB (chemical mass balance) model and a secondary organic aerosol (SOA) tracer-yield model, five primary and four secondary fine organic aerosol sources were compared with the results from previous studies in Beijing. The relative contribution of mobile sources to PM2.5 concentrations was increased in 2008, with diesel engines contributing 16.2 ± 5.9% and 14.5 ± 4.1% and gasoline vehicles contributing 10.3 ± 8.7% and 7.9 ± 6.2% to organic carbon (OC) at PKU and Yufa, respectively. Due to the implementation of emission controls, the absolute OC concentrations from primary sources were reduced during the Olympics, and the contributions from secondary formation of OC represented a larger relative source of fine organic aerosols. Compared with the non-controlled period prior to the Olympics, primary vehicle contributions were reduced by 30% at the urban site and 24% at the rural site. The reductions in coal combustion contributions were 57% at PKU and 7% at Yufa. Our results demonstrate that the emission control measures implemented in 2008 significantly alleviated the primary organic particle pollution in and around Beijing. However, additional studies are needed to provide a more comprehensive assessment of the emission control effectiveness on SOA formation.

  9. Quantification of mitral regurgitation on cardiac computed tomography: comparison with qualitative and quantitative echocardiographic parameters.

    LENUS (Irish Health Repository)

    Arnous, Samer

    2012-02-01

    PURPOSE: To assess whether cardiac computed tomographic angiography (CCTA) can quantify the severity of chronic mitral regurgitation (MR) compared to qualitative and quantitative echocardiographic parameters. MATERIALS AND METHODS: Cardiac computed tomographic angiography was performed in 23 patients (mean ± SD age, 63 ± 16 years; range, 24-86 years) with MR and 20 patients without MR (controls) as determined by transthoracic echocardiography. Multiphasic reconstructions (20 data sets reconstructed at 5% increments of the electrocardiographic gated R-R interval) were used to analyze the mitral valve. Using CCTA planimetry, 2 readers measured the regurgitant mitral orifice area (CCTA ROA) during systole. A qualitative echocardiographic assessment of severity of MR was made by visual assessment of the length of the regurgitant jet. Quantitative echocardiographic measurements included the vena contracta, proximal isovelocity surface area, regurgitant volume, and estimated regurgitant orifice (ERO). Comparisons were performed using the independent t test, and correlations were assessed using the Spearman rank test. RESULTS: All controls and the patients with MR were correctly identified by CCTA. For patients with mild, moderate, or severe MR, mean ± SD EROs were 0.16 ± 0.03, 0.31 ± 0.08, and 0.52 ± 0.03 cm(2) (P < 0.0001) compared with mean ± SD CCTA ROAs 0.09 ± 0.05, 0.30 ± 0.04, and 0.97 ± 0.26 cm(2) (P < 0.0001), respectively. When echocardiographic measurements were graded qualitatively as mild, moderate, or severe, strong correlations were seen with CCTA ROA (R = 0.89; P < 0.001). When echocardiographic measurements were graded quantitatively, the vena contracta and the ERO showed modest correlations with CCTA ROA (0.48 and 0.50; P < 0.05 for both). Neither the proximal isovelocity surface area nor the regurgitant volume demonstrated significant correlations with CCTA ROA. CONCLUSIONS: Single-source 64-slice CCTA provides a

  10. Quantitative EDXS: Influence of geometry on a four detector system

    International Nuclear Information System (INIS)

    Kraxner, Johanna; Schäfer, Margit; Röschel, Otto; Kothleitner, Gerald; Haberfehlner, Georg; Paller, Manuel; Grogger, Werner

    2017-01-01

    The influence of the geometry on quantitative energy dispersive X-ray spectrometry (EDXS) analysis is determined for a ChemiSTEM system (Super-X) in combination with a low-background double-tilt specimen holder. For the first time a combination of experimental measurements with simulations is used to determine the positions of the individual detectors of a Super-X system. These positions allow us to calculate the detectors' solid angles and estimate the amount of detector shadowing and its influence on quantitative EDXS analysis, including absorption correction using the ζ-factor method. Both shadowing by the brass portions and the beryllium specimen carrier of the holder severely affect the quantification of low to medium atomic number elements. A multi-detector system is discussed in terms of practical consequences of the described effects, and a quantitative evaluation of a fayalite sample is demonstrated. Corrections and suggestions for minimizing systematic errors are discussed to improve quantitative methods for a multi-detector system. - Highlights: • Geometrical issues for EDXS quantification on a Super-X system. • Realistic model of a specimen holder using X-ray computed tomography. • Determination of the exact detector positions of a Super-X system. • Influence of detector shadowing and Be specimen carrier on quantitative EDXS.
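    Once a detector's position and orientation are known, its solid angle can be approximated with the usual point-detector relation; the Python sketch below is an illustration only (the study itself derived the positions from measurements combined with simulations), and all argument names are assumed:

    ```python
    import numpy as np

    def detector_solid_angle(det_position_mm, det_area_mm2, det_normal,
                             sample_position_mm=(0.0, 0.0, 0.0)):
        """Approximate solid angle (sr) subtended by a small planar EDX detector.

        Uses Omega ~ A * cos(theta) / r**2, valid when the detector's extent is small
        compared with its distance r from the analysis point."""
        r_vec = np.asarray(det_position_mm, float) - np.asarray(sample_position_mm, float)
        r = np.linalg.norm(r_vec)
        n = np.asarray(det_normal, float)
        n = n / np.linalg.norm(n)
        cos_theta = abs(np.dot(r_vec / r, n))   # tilt of the detector face toward the sample
        return det_area_mm2 * cos_theta / r**2
    ```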

  11. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstracts Service Registry Numbers (CASRN) (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
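    The published model is probabilistic; purely as an illustration of the quantities it conditions on, the following Python sketch computes simple bounds implied by a descending, thresholded ingredient list. The function is hypothetical and is not the EPA model:

    ```python
    def weight_fraction_bounds(n_reported, rank, w_min=0.0, unreported_total=0.0):
        """Illustrative bounds on an ingredient's weight fraction from a rank-ordered list.

        If the n_reported listed fractions are in descending order and together account for
        (1 - unreported_total) of the product mass, the ingredient at position `rank`
        (1 = first listed) can be at most that mass divided by its rank, and at least the
        reporting threshold w_min."""
        reported_mass = 1.0 - unreported_total
        upper = min(reported_mass / rank, 1.0)
        lower = w_min
        naive_point = reported_mass / n_reported   # crude central estimate: equal split
        return lower, naive_point, upper

    # Example: 6 listed ingredients, 5% of the mass unreported, 1% reporting threshold.
    low, mid, high = weight_fraction_bounds(n_reported=6, rank=2, w_min=0.01,
                                            unreported_total=0.05)
    ```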

  12. Total quantitative recording of elemental maps and spectra with a scanning microprobe

    International Nuclear Information System (INIS)

    Legge, G.J.F.; Hammond, I.

    1979-01-01

    A system of data recording and analysis has been developed by means of which simultaneously all data from a scanning instrument such as a microprobe can be quantitatively recorded and permanently stored, including spectral outputs from several detectors. Only one scanning operation is required on the specimen. Analysis is then performed on the stored data, which contain quantitative information on distributions of all elements and spectra of all regions

  13. An efficient polyenergetic SART (pSART) reconstruction algorithm for quantitative myocardial CT perfusion

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Yuan, E-mail: yuan.lin@duke.edu; Samei, Ehsan [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, 2424 Erwin Road, Suite 302, Durham, North Carolina 27705 (United States)

    2014-02-15

    Purpose: In quantitative myocardial CT perfusion imaging, beam hardening effect due to dense bone and high concentration iodinated contrast agent can result in visible artifacts and inaccurate CT numbers. In this paper, an efficient polyenergetic Simultaneous Algebraic Reconstruction Technique (pSART) was presented to eliminate the beam hardening artifacts and to improve the CT quantitative imaging ability. Methods: Our algorithm made three a priori assumptions: (1) the human body is composed of several base materials (e.g., fat, breast, soft tissue, bone, and iodine); (2) images can be coarsely segmented to two types of regions, i.e., nonbone regions and noniodine regions; and (3) each voxel can be decomposed into a mixture of two most suitable base materials according to its attenuation value and its corresponding region type information. Based on the above assumptions, energy-independent accumulated effective lengths of all base materials can be fast computed in the forward ray-tracing process and be used repeatedly to obtain accurate polyenergetic projections, with which a SART-based equation can correctly update each voxel in the backward projecting process to iteratively reconstruct artifact-free images. This approach effectively reduces the influence of polyenergetic x-ray sources and it further enables monoenergetic images to be reconstructed at any arbitrarily preselected target energies. A series of simulation tests were performed on a size-variable cylindrical phantom and a realistic anthropomorphic thorax phantom. In addition, a phantom experiment was also performed on a clinical CT scanner to further quantitatively validate the proposed algorithm. Results: The simulations with the cylindrical phantom and the anthropomorphic thorax phantom showed that the proposed algorithm completely eliminated beam hardening artifacts and enabled quantitative imaging across different materials, phantom sizes, and spectra, as the absolute relative errors were reduced
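    The polyenergetic forward step described above (per-material effective path lengths combined with a source spectrum into a single projection value) can be sketched as follows; this is an illustrative reading in Python, not the authors' code, and the dictionary/array layout is assumed:

    ```python
    import numpy as np

    def polyenergetic_projection(path_lengths_cm, mu_table, spectrum):
        """Polyenergetic line integral for one ray.

        path_lengths_cm : dict  material -> accumulated effective length along the ray (cm)
        mu_table        : dict  material -> 1D array of linear attenuation coefficients (1/cm),
                          one value per spectral energy bin
        spectrum        : 1D array of relative source photon weights per energy bin
        Returns -ln of the transmitted fraction, i.e. the value a polyenergetic forward
        projector would compare against the measured projection."""
        spectrum = np.asarray(spectrum, dtype=float)
        spectrum = spectrum / spectrum.sum()
        # Total attenuation exponent per energy bin, summed over the base materials.
        exponent = sum(length * np.asarray(mu_table[m], dtype=float)
                       for m, length in path_lengths_cm.items())
        transmitted = np.sum(spectrum * np.exp(-exponent))
        return -np.log(transmitted)
    ```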

  14. An efficient polyenergetic SART (pSART) reconstruction algorithm for quantitative myocardial CT perfusion

    International Nuclear Information System (INIS)

    Lin, Yuan; Samei, Ehsan

    2014-01-01

    Purpose: In quantitative myocardial CT perfusion imaging, beam hardening effect due to dense bone and high concentration iodinated contrast agent can result in visible artifacts and inaccurate CT numbers. In this paper, an efficient polyenergetic Simultaneous Algebraic Reconstruction Technique (pSART) was presented to eliminate the beam hardening artifacts and to improve the CT quantitative imaging ability. Methods: Our algorithm made three a priori assumptions: (1) the human body is composed of several base materials (e.g., fat, breast, soft tissue, bone, and iodine); (2) images can be coarsely segmented to two types of regions, i.e., nonbone regions and noniodine regions; and (3) each voxel can be decomposed into a mixture of two most suitable base materials according to its attenuation value and its corresponding region type information. Based on the above assumptions, energy-independent accumulated effective lengths of all base materials can be fast computed in the forward ray-tracing process and be used repeatedly to obtain accurate polyenergetic projections, with which a SART-based equation can correctly update each voxel in the backward projecting process to iteratively reconstruct artifact-free images. This approach effectively reduces the influence of polyenergetic x-ray sources and it further enables monoenergetic images to be reconstructed at any arbitrarily preselected target energies. A series of simulation tests were performed on a size-variable cylindrical phantom and a realistic anthropomorphic thorax phantom. In addition, a phantom experiment was also performed on a clinical CT scanner to further quantitatively validate the proposed algorithm. Results: The simulations with the cylindrical phantom and the anthropomorphic thorax phantom showed that the proposed algorithm completely eliminated beam hardening artifacts and enabled quantitative imaging across different materials, phantom sizes, and spectra, as the absolute relative errors were reduced

  15. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
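
    The log-space correction described above can be sketched in a few lines of Python; the arrays, the normalization choice and the toy heel-effect gradient below are assumptions made for illustration only.

        import numpy as np

        def correct_field_nonuniformity(mammogram, phantom_image, eps=1e-6):
            """Correct a digitized mammogram for field nonuniformity by adding a
            normalized calibration-phantom image in logarithmic space."""
            log_mammo = np.log(mammogram + eps)
            log_phantom = np.log(phantom_image + eps)
            correction = np.mean(log_phantom) - log_phantom   # zero-mean correction map
            return np.exp(log_mammo + correction)

        # Toy example: a uniform object imaged under a heel-effect-like lateral gradient
        gradient = np.linspace(0.8, 1.2, 256)[None, :] * np.ones((256, 256))
        mammogram = 1000.0 * gradient      # uniform tissue modulated by the field
        phantom = 2000.0 * gradient        # the phantom sees the same field pattern
        corrected = correct_field_nonuniformity(mammogram, phantom)
        print(corrected.std() / corrected.mean())   # close to zero after correction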

  16. Characteristics of quantitative nursing research from 1990 to 2010.

    Science.gov (United States)

    Yarcheski, Adela; Mahon, Noreen E

    2013-12-01

    To assess author credentials of quantitative research in nursing, the composition of the research teams, and the disciplinary focus of the theories tested. Nursing Research, Western Journal of Nursing Research, and Journal of Advanced Nursing were selected for this descriptive study; 1990, 1995, 2000, 2005, and 2010 were included. The final sample consisted of 484 quantitative research articles. From 1990 to 2010, there was an increase in first authors holding doctoral degrees, research from other countries, and funding. Solo authorship decreased; multi-authorship and multidisciplinary teams increased. Theories tested were mostly from psychology; the testing of nursing theory was modest. Multidisciplinary research far outdistanced interdisciplinary research. Quantitative nursing research can be characterized as multidisciplinary (distinct theories from different disciplines) rather than discipline-specific to nursing. Interdisciplinary (theories synthesized from different disciplines) research has been conducted minimally. This study provides information about the growth of the scientific knowledge base of nursing, which has implications for practice. © 2013 Sigma Theta Tau International.

  17. Four Popular Books on Consumer Debt: A Context for Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Andrew J. Miller

    2011-01-01

    Full Text Available The topics of credit cards, mortgages, subprime lending, and fringe banking are rich sources of problems and discussions for classes focused on quantitative literacy. In this theme book review, we look at four recent books on the consumer debt industry: Credit Card Nation, by Robert Manning; Maxed Out, by James Scurlock; Collateral Damaged, by Charles Geisst; and Broke, USA, by Gary Rivlin. Credit Card Nation takes a scholarly look at the history of credit in America with a focus on the genesis and growth of the credit card industry up to the turn of the 20th century. Maxed Out also examines the credit card industry, but its approach is to highlight the stories of individuals struggling with debt and thereby examine some of the damaging effects of credit card debt in the United States. Collateral Damaged is a timely exploration of the root causes at the institutional level of the credit crisis that began in 2008. Broke, USA focuses on high-cost financing (pawn shops, payday loans, title loans), describing the history of what Rivlin calls the "poverty industry" and the political and legal challenges critics have mounted against the industry. Each of these books has something to offer a wide variety of quantitative literacy classes, providing scenarios, statistics, and problems worthy of examination. After reviewing each of the four books, we provide several examples of such quantitative literacy applications and close with some thoughts on the relationship between financial literacy and quantitative literacy.

  18. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  19. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (more systematic increase caused by overlap in the spectrum) and stutters (peaks located four basepairs before the true peak). We present filtering techniques for all three technical artifacts based on statistical analysis of data from controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.
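
    A minimal sketch of such artifact filtering is given below; the peak representation, the noise floor and the stutter ratio are illustrative placeholders rather than the statistically estimated thresholds of the study.

        # Peaks are (basepair_position, height_rfu) pairs; thresholds are assumed values.
        NOISE_FLOOR = 50        # rfu, background-noise cut-off (assumed)
        STUTTER_RATIO = 0.15    # maximum stutter height relative to its parent peak (assumed)

        def filter_peaks(peaks, pullup_positions=()):
            """Remove background noise, stutter (parent - 4 bp) and flagged pull-up peaks."""
            heights = {bp: h for bp, h in peaks}
            kept = []
            for bp, h in peaks:
                if h < NOISE_FLOOR:
                    continue                                # background noise
                if bp in pullup_positions:
                    continue                                # pull-up from another dye channel
                parent = heights.get(bp + 4, 0.0)
                if parent and h < STUTTER_RATIO * parent:
                    continue                                # stutter one repeat before the parent
                kept.append((bp, h))
            return kept

        observed = [(120, 1500.0), (116, 180.0), (112, 40.0), (132, 1400.0)]
        print(filter_peaks(observed))    # [(120, 1500.0), (132, 1400.0)]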

  20. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    Energy Technology Data Exchange (ETDEWEB)

    Omata, Noriyasu; Shirayama, Susumu, E-mail: omata@nakl.t.u-tokyo.ac.jp, E-mail: sirayama@sys.t.u-tokyo.ac.jp [Department of Systems Innovation, School of Engineering, The University of Tokyo, Hongo 7-3-1, Bunkyo-ku, Tokyo, 113-8656 (Japan)

    2017-10-15

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)
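
    The stereo step can be sketched as linear triangulation of a tuft's root and tip from two calibrated views; the projection matrices are assumed to come from a prior camera calibration, and the toy numbers below are purely illustrative.

        import numpy as np

        def triangulate(P1, P2, x1, x2):
            """Linear (DLT) triangulation of one 3-D point from two calibrated views.
            P1, P2 are 3x4 projection matrices; x1, x2 are (u, v) image coordinates."""
            A = np.vstack([
                x1[0] * P1[2] - P1[0],
                x1[1] * P1[2] - P1[1],
                x2[0] * P2[2] - P2[0],
                x2[1] * P2[2] - P2[1],
            ])
            _, _, vt = np.linalg.svd(A)
            X = vt[-1]
            return X[:3] / X[3]

        def tuft_direction(P1, P2, root_1, root_2, tip_1, tip_2):
            """Unit 3-D flow direction indicated by one tuft, from its root and tip
            detected in both camera images."""
            root = triangulate(P1, P2, root_1, root_2)
            tip = triangulate(P1, P2, tip_1, tip_2)
            d = tip - root
            return d / np.linalg.norm(d)

        # Toy rectified stereo rig: identical normalized cameras, 0.2 m baseline along x
        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
        print(tuft_direction(P1, P2, root_1=(0.10, 0.00), root_2=(0.06, 0.00),
                             tip_1=(0.12, 0.01), tip_2=(0.08, 0.01)))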

  1. A study on quantitative V and V of safety-critical software

    International Nuclear Information System (INIS)

    Eom, H. S.; Kang, H. G.; Chang, S. C.; Ha, J. J.; Son, H. S.

    2004-03-01

    Practical needs have recently required quantitative measures of software reliability for Probabilistic Safety Assessment (PSA), which is one of the important methods used in assessing the overall safety of a nuclear power plant (NPP). The conventional assessment methods of software reliability cannot provide enough information for the PSA of an NPP; therefore, current assessments of a digital system which includes safety-critical software usually exclude the software part or use arbitrary values. This paper describes a Bayesian Belief Network (BBN) based method that models the rule-based qualitative software assessment method for practical use and can produce quantitative results for PSA. The framework was constructed by utilizing a BBN that can combine the qualitative and quantitative evidence relevant to the reliability of safety-critical software and can infer a conclusion in a formal and quantitative way. A case study was performed by applying the method to assess the quality of the software requirement specification of safety-critical software that will be embedded in a reactor protection system.
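
    A minimal sketch of the kind of belief-network inference involved is given below, with a single qualitative evidence node; the node definitions and all probabilities are illustrative placeholders, not values from the study.

        # Two-node belief network: specification quality (parent) -> V&V review outcome (child).
        p_quality = {"good": 0.7, "poor": 0.3}                  # prior on specification quality (assumed)
        p_pass_given_quality = {"good": 0.95, "poor": 0.40}     # conditional probability table (assumed)

        def posterior_quality(review_passed):
            """P(quality | review outcome) by direct enumeration (Bayes' rule)."""
            joint = {}
            for q, pq in p_quality.items():
                likelihood = p_pass_given_quality[q] if review_passed else 1 - p_pass_given_quality[q]
                joint[q] = pq * likelihood
            z = sum(joint.values())
            return {q: v / z for q, v in joint.items()}

        print(posterior_quality(True))    # belief in "good" increases when the review passes
        print(posterior_quality(False))   # belief in "good" decreases when it fails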

  2. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    International Nuclear Information System (INIS)

    Omata, Noriyasu; Shirayama, Susumu

    2017-01-01

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  3. Quantitative Cardiac Assessment in Fetal Tetralogy of Fallot.

    Science.gov (United States)

    Jatavan, Phudit; Tongprasert, Fuanglada; Srisupundit, Kasemsri; Luewan, Suchaya; Traisrisilp, Kuntharee; Tongsong, Theera

    2016-07-01

    The purpose of this study was to quantitatively assess cardiac function and biometric parameters in fetuses with a diagnosis of tetralogy of Fallot and compare them to those in healthy fetuses. Two hundred healthy fetuses and 20 fetuses with a diagnosis of classic tetralogy of Fallot were quantitatively assessed for 16 cardiac parameters, including morphologic characteristics and functions. All recruited fetuses were in the second trimester with correct gestational ages. The measured values that were out of normal reference ranges were considered abnormal. Rates of abnormalities of these parameters were compared between the groups. The significant parameters were further analyzed for their sensitivity, specificity, and likelihood ratio. Of the 16 parameters, rates of abnormalities in 7 parameters, including right ventricular wall thickness, peak systolic velocities (PSVs) in the pulmonary artery and aorta, time to peak velocity, or acceleration time, in the pulmonary artery, aortic valve diameter, pulmonary valve diameter, and aortic-to-pulmonary valve diameter ratio, were significantly higher in fetuses with tetralogy of Fallot.

  4. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
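
    The watts-per-organism bookkeeping can be illustrated with a toy calculation; every number below is an order-of-magnitude placeholder, not a measurement from the paper.

        # Habitability as power supply versus power demand per organism (illustrative values only)
        energy_per_mole = 100e3      # J released per mole of substrate oxidized (assumed)
        substrate_flux = 1e-12       # mol/s of substrate delivered to the habitat (assumed)
        organisms = 1e6              # number of cells sharing that supply (assumed)

        power_supply_per_cell = energy_per_mole * substrate_flux / organisms   # W per organism
        power_demand_per_cell = 1e-15                                          # maintenance power (assumed)

        print(power_supply_per_cell)                           # 1e-13 W per organism
        print(power_supply_per_cell > power_demand_per_cell)   # habitable if supply exceeds demand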

  5. Addition of Adapted Optics towards obtaining a quantitative detection of diabetic retinopathy

    Science.gov (United States)

    Yust, Brian; Obregon, Isidro; Tsin, Andrew; Sardar, Dhiraj

    2009-04-01

    An adaptive optics system was assembled for correcting the aberrated wavefront of light reflected from the retina. The adaptive optics setup includes a superluminous diode light source, Hartmann-Shack wavefront sensor, deformable mirror, and imaging CCD camera. Aberrations found in the reflected wavefront are caused by changes in the index of refraction along the light path as the beam travels through the cornea, lens, and vitreous humour. The Hartmann-Shack sensor allows for detection of aberrations in the wavefront, which may then be corrected with the deformable mirror. It has been shown that there is a change in the polarization of light reflected from neovascularizations in the retina due to certain diseases, such as diabetic retinopathy. The adaptive optics system was assembled towards the goal of obtaining a quantitative measure of onset and progression of this ailment, as one does not currently exist. The study was done to show that the addition of adaptive optics results in a more accurate detection of neovascularization in the retina by measuring the expected changes in polarization of the corrected wavefront of reflected light.

  6. A systems study of an RF power source for a 1 TeV next linear collider based upon the relativistic-klystron two-beam accelerator

    International Nuclear Information System (INIS)

    Yu, S.; Goffeney, N.; Deadrick, F.

    1994-11-01

    A systems study, including physics, engineering and costing, has been conducted to assess the feasibility of a relativistic-klystron two-beam-accelerator (RK-TBA) system as a RF power source candidate for a 1 TeV linear collider. Several key issues associated with a realizable RK-TBA system have been addressed, and corresponding schemes have been developed and examined quantitatively. A point design example has been constructed to present a concrete conceptual design which has acceptable transverse and longitudinal beam stability properties. The overall efficiency of RF production for such a power source is estimated to be 36%, and the cost of the full system is estimated to be less than 1 billion dollars

  7. Parts of the Whole: Quantitative Literacy on a Desert Island

    Directory of Open Access Journals (Sweden)

    Dorothy Wallace

    2015-07-01

    Full Text Available Some of the specific institutional problems faced by quantitative reasoning courses, programs and requirements arise from the fragile intellectual position of “quantitative reasoning” as an idea, or meme. The process of isolation and reintroduction explains both the proliferation of living species and the way in which some difficult ideas take their place in a culture. Using evolutionary explanations as metaphor and the Copernican revolution as an example of a difficult idea, we draw lessons that can be applied to the “quantitative reasoning” meme, including the function of the National Numeracy Network as an island of protected discourse favoring the growth of the QR meme. We conclude that the mission of the National Numeracy Network should focus on attributes of that island, and in particular extend the mission beyond being a network, to being an actual community.

  8. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  9. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  10. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    Science.gov (United States)

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-05-05

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.

  11. Repeatability, interocular correlation and agreement of quantitative swept-source optical coherence tomography angiography macular metrics in healthy subjects.

    Science.gov (United States)

    Fang, Danqi; Tang, Fang Yao; Huang, Haifan; Cheung, Carol Y; Chen, Haoyu

    2018-05-29

    To investigate the repeatability, interocular correlation and agreement of quantitative swept-source optical coherence tomography angiography (SS-OCTA) metrics in healthy subjects. Thirty-three healthy normal subjects were enrolled. The macula was scanned four times by an SS-OCTA system using the 3 mm×3 mm mode. The superficial capillary map images were analysed using a MATLAB program. A series of parameters were measured: foveal avascular zone (FAZ) area, FAZ perimeter, FAZ circularity, parafoveal vessel density, fractal dimension and vessel diameter index (VDI). The repeatability of four scans was determined by intraclass correlation coefficient (ICC). Then the averaged results were analysed for intereye difference, correlation and agreement using paired t-test, Pearson's correlation coefficient (r), ICC and Bland-Altman plot. The repeatability assessment of the macular metrics yielded high ICC values (ranging from 0.853 to 0.996). There was no statistically significant difference in the OCTA metrics between the two eyes. FAZ area (ICC=0.961, r=0.929) and FAZ perimeter (ICC=0.884, r=0.802) showed excellent binocular correlation. Fractal dimension (ICC=0.732, r=0.578) and VDI (ICC=0.707, r=0.547) showed moderate binocular correlation, while parafoveal vessel density had poor binocular correlation. Bland-Altman plots showed the range of agreement was from -0.0763 to 0.0954 mm² for FAZ area and from -0.0491 to 0.1136 for parafoveal vessel density. The macular metrics obtained using SS-OCTA showed excellent repeatability in healthy subjects. We showed high intereye correlation in FAZ area and perimeter, moderate correlation in fractal dimension and VDI, while vessel density had poor correlation in normal healthy subjects. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
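
    The agreement statistics used above (Bland-Altman limits and Pearson correlation) are straightforward to reproduce; the sketch below uses hypothetical FAZ areas rather than the study's data.

        import numpy as np

        def bland_altman_limits(measure_a, measure_b):
            """Bias and 95% limits of agreement between paired measurements
            (e.g., FAZ area of right versus left eyes)."""
            a = np.asarray(measure_a, dtype=float)
            b = np.asarray(measure_b, dtype=float)
            diff = a - b
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # Hypothetical FAZ areas (mm^2) for five subjects, right and left eyes
        right = [0.31, 0.28, 0.35, 0.30, 0.27]
        left = [0.30, 0.29, 0.33, 0.31, 0.26]
        bias, low, high = bland_altman_limits(right, left)
        r = np.corrcoef(right, left)[0, 1]      # interocular Pearson correlation
        print(bias, low, high, r)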

  12. A GIS-based multi-source and multi-box modeling approach (GMSMB) for air pollution assessment--a North American case study.

    Science.gov (United States)

    Wang, Bao-Zhen; Chen, Zhi

    2013-01-01

    This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutant on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model are developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data including emission sources, air quality monitoring, meteorological data, and spatial location information required for air quality modeling are brought into an integrated modeling environment. It allows more detailed spatial variation in source distribution and meteorological conditions to be quantitatively analyzed. The developed modeling approach has been examined to predict the spatial concentration distribution of four air pollutants (CO, NO2, SO2 and PM2.5) for the State of California. The modeling results are compared with the monitoring data. Good agreement is acquired which demonstrated that the developed modeling approach could deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
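
    A stripped-down version of combining point-source Gaussian plumes with a single-box estimate for area sources might look like the sketch below; the dispersion coefficients, emission rates and box parameters are illustrative assumptions, not the GMSMB implementation.

        import numpy as np

        def plume_concentration(Q, u, x, y, H, a=0.08, b=0.06):
            """Ground-level concentration (g/m^3) from one point source using a basic
            Gaussian plume with total ground reflection. Q: emission rate (g/s);
            u: wind speed (m/s); x, y: downwind/crosswind receptor distances (m);
            H: effective stack height (m); a, b: crude linear dispersion coefficients."""
            if x <= 0:
                return 0.0
            sigma_y, sigma_z = a * x, b * x
            lateral = np.exp(-y**2 / (2 * sigma_y**2))
            vertical = 2.0 * np.exp(-H**2 / (2 * sigma_z**2))   # receptor at ground level
            return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Hypothetical receptor downwind of two stacks, plus an area-source box estimate
        point_sources = [dict(Q=50.0, x=2000.0, y=100.0, H=40.0),
                         dict(Q=20.0, x=1500.0, y=-300.0, H=25.0)]
        c_points = sum(plume_concentration(u=3.0, **s) for s in point_sources)

        area_flux = 5e-6            # g/s per m^2 emitted by distributed area sources (assumed)
        mixing_height = 500.0       # m, box height for the area-source contribution (assumed)
        residence_time = 3600.0     # s, assumed ventilation time of the box
        c_area = area_flux * residence_time / mixing_height

        print(c_points + c_area)    # combined point- and area-source estimate (g/m^3)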

  13. Optical Computed-Tomographic Microscope for Three-Dimensional Quantitative Histology

    Directory of Open Access Journals (Sweden)

    Ravil Chamgoulov

    2004-01-01

    Full Text Available A novel optical computed‐tomographic microscope has been developed allowing quantitative three‐dimensional (3D) imaging and analysis of fixed pathological material. Rather than a conventional two‐dimensional (2D) image, the instrument produces a 3D representation of fixed absorption‐stained material, from which quantitative histopathological features can be measured more accurately. The accurate quantification of these features is critically important in disease diagnosis and the clinical classification of cancer. The system consists of two high NA objective lenses, a light source, a digital spatial light modulator (DMD, by Texas Instruments), an x–y stage, and a CCD detector. The DMD, positioned at the back pupil‐plane of the illumination objective, is employed to illuminate the specimen with parallel rays at any desired angle. The system uses a modification of the convolution backprojection algorithm for reconstruction. In contrast to fluorescent images acquired by a confocal microscope, this instrument produces 3D images of absorption stained material. Microscopic 3D volume reconstructions of absorption‐stained cells have been demonstrated. Reconstructed 3D images of individual cells and tissue can be cut virtually with the distance between the axial slices less than 0.5 μm.

  14. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for study of proteins and identification of diseases, reveals more comprehensive and accurate information of an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication process to have low-cost and small-size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  15. Advanced source apportionment of carbonaceous aerosols by coupling offline AMS and radiocarbon size-segregated measurements over a nearly 2-year period

    OpenAIRE

    Vlachou, Athanasia; Daellenbach, Kaspar R.; Bozzetti, Carlo; Chazeau, Benjamin; Salazar Quintero, Gary Abdiel; Szidat, Sönke; Jaffrezo, Jean-Luc; Hueglin, Christoph; Baltensperger, Urs; Haddad, Imad El; Prévôt, André S. H.

    2018-01-01

    Carbonaceous aerosols are related to adverse human health effects. Therefore, identification of their sources and analysis of their chemical composition is important. The offline AMS (aerosol mass spectrometer) technique offers quantitative separation of organic aerosol (OA) factors which can be related to major OA sources, either primary or secondary. While primary OA can be more clearly separated into sources, secondary (SOA) source apportionment is more challenging because different source...

  16. Ionization detector with improved radiation source

    International Nuclear Information System (INIS)

    Solomon, E.F.

    1977-01-01

    The detector comprises a chamber having at least one radiation source disposed therein. The chamber includes spaced collector plates which form a part of a detection circuit for sensing changes in the ionization current in the chamber. The radiation source in one embodiment is in the form of a wound wire or ribbon suitably supported in the chamber and preferably a source of beta particles. The chamber may also include an adjustable electrode and the source may function as an adjustable current source by forming the wire or ribbon in an elliptical shape and rotating the structure. In another embodiment the source has a random shape and is homogeneously disposed in the chamber. 13 claims, 5 drawing figures

  17. Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.

    Science.gov (United States)

    Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie

    2018-01-01

    The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine Ad Hoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  18. Convergent and sequential synthesis designs: implications for conducting and reporting systematic reviews of qualitative and quantitative evidence.

    Science.gov (United States)

    Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy

    2017-03-23

    Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an

  19. Impedance source power electronic converters

    CERN Document Server

    Liu, Yushan; Ge, Baoming; Blaabjerg, Frede; Ellabban, Omar; Loh, Poh Chiang

    2016-01-01

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding control methods. Presents the latest power conversion solutions that aim to advance the role of pow...

  20. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
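
    One of the simplest options, scaling every sample to a common total signal, can be sketched as follows; the intensity matrix is hypothetical.

        import numpy as np

        def total_sum_normalize(intensity_matrix):
            """Scale each sample (row) to the same total metabolite signal, reducing the
            effect of different total sample amounts before comparing individual metabolites."""
            X = np.asarray(intensity_matrix, dtype=float)
            row_sums = X.sum(axis=1, keepdims=True)
            return X * (row_sums.mean() / row_sums)

        # Hypothetical peak intensities: 3 samples x 4 metabolites; sample 2 is twice as concentrated
        X = np.array([[10.0, 20.0, 5.0, 65.0],
                      [20.0, 40.0, 10.0, 130.0],
                      [12.0, 18.0, 6.0, 64.0]])
        print(total_sum_normalize(X).sum(axis=1))   # equal row totals after normalization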

  1. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  2. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  3. Health impacts of different energy sources

    International Nuclear Information System (INIS)

    1982-01-01

    Energy is needed to sustain the economy, health and welfare of nations. As a consequence of this, energy consumption figures are frequently used as an index of a nation's advancement. As a result of the global energy crisis, almost every nation has had to develop all its available energy resources and plan its future use of energy. The planners of national and international energy policies are, however, often faced with a problem of 'public acceptance' arising from the potential health and environmental impacts of developing energy resources. The public's desire to preserve the quality of man's health and his environment frequently results in opposition to many industrial innovations, including the generation and use of energy. Reliable, quantitative data and information are needed on the risks to health and the environment of different contemporary energy systems, to improve public understanding, and to serve as the basis from which national planners can choose between different energy supply options. With the exception of nuclear energy, even in technologically advanced countries little systematic research and development has been done on the quantitative assessment of the effects on health and the environment of the conventional energy sources. The need for this information has only been realized over the past decade as the climate and environment in many regions of the world have deteriorated with the unabated release of pollutants from factories and energy generating plants in particular. A number of countries have started national environmental health research programmes to monitor and regulate toxic emissions from industry and energy plants. Energy-related environmental health research has been supported and co-ordinated by various international organizations such as the International Atomic Energy Agency (IAEA), World Health Organization (WHO) and United Nations Environment Programme (UNEP). WHO has supported expert reviews on the potential health risks posed

  4. Planar gamma camera imaging and quantitation of Yttrium-90 bremsstrahlung

    International Nuclear Information System (INIS)

    Shen, S.; DeNardo, G.L.; Yuan, A.

    1994-01-01

    Yttrium-90 is a promising radionuclide for radioimmunotherapy of cancer because of its energetic beta emissions. Therapeutic management requires quantitative imaging to assess the pharmacokinetics and radiation dosimetry of the 90Y-labeled antibody. Conventional gamma photon imaging methods cannot be easily applied to imaging of 90Y-bremsstrahlung because of its continuous energy spectrum. The sensitivity, resolution and source-to-background signal ratio (S/B) of the detector system for 90Y-bremsstrahlung were investigated for various collimators and energy windows in order to determine optimum conditions for quantitative imaging. After these conditions were determined, the accuracy of quantitation of 90Y activity in an Alderson abdominal phantom was examined. When the energy-window width was increased, the benefit of increased sensitivity outweighed degradation in resolution and S/B ratio until the manufacturer's energy specifications for the collimator were exceeded. Using the same energy window, resolution and S/B were better for the medium-energy (ME) collimator than for the low-energy, all-purpose (LEAP) collimator, and there was little additional improvement using the high-energy (HE) collimator. Camera sensitivity under tissue equivalent conditions was 4.2 times greater for the LEAP and 1.7 times greater for the ME collimators when compared to the HE collimator. Thus, the best, most practical selections were found to be the ME collimator and an energy window of 55-285 keV. When these optimal conditions were used for image acquisition, the estimation of 90Y activity in organs and tumors was within 15% of the true activities. The results of this study suggest that reasonable accuracy can be achieved in clinical radioimmunotherapy using 90Y-bremsstrahlung quantitation. 28 refs., 5 figs., 7 tabs
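
    The basic conversion from ROI counts to activity implied by such a calibration can be sketched as below; the sensitivity factor, effective attenuation coefficient and source depth are illustrative assumptions, not the values measured in the study.

        import numpy as np

        def roi_activity_mbq(net_counts, acquisition_s, sensitivity_cps_per_mbq,
                             depth_cm, mu_eff_per_cm=0.12):
            """Estimate 90Y activity (MBq) in a planar ROI from net bremsstrahlung counts,
            using a camera sensitivity calibration and a first-order exponential
            correction for attenuation at the assumed organ depth."""
            count_rate = net_counts / acquisition_s
            attenuation = np.exp(-mu_eff_per_cm * depth_cm)
            return count_rate / (sensitivity_cps_per_mbq * attenuation)

        # Hypothetical 10-minute acquisition over a tumour ROI at 6 cm depth
        print(roi_activity_mbq(net_counts=90000, acquisition_s=600,
                               sensitivity_cps_per_mbq=2.5, depth_cm=6.0))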

  5. WormGender - Open-Source Software for Automatic Caenorhabditis elegans Sex Ratio Measurement.

    Directory of Open Access Journals (Sweden)

    Marta K Labocha

    Full Text Available Fast and quantitative analysis of animal phenotypes is one of the major challenges of current biology. Here we report the WormGender open-source software, which is designed for accurate quantification of sex ratio in Caenorhabditis elegans. The software functions include: (i) automatic recognition and counting of adult hermaphrodites and males, (ii) a manual inspection feature that enables manual correction of errors, and (iii) flexibility to use new training images to optimize the software for different imaging conditions. We evaluated the performance of our software by comparing manual and automated assessment of sex ratio. Our data showed that the WormGender software provided overall accurate sex ratio measurements. We further demonstrated the usage of WormGender by quantifying the high incidence of male (him) phenotype in 27 mutant strains. Mutants of nine genes (brc-1, C30G12.6, cep-1, coh-3, him-3, him-5, him-8, skr-1, unc-86) showed a significant him phenotype. WormGender is written in Java and can be installed and run on both Windows and Mac platforms. The source code is freely available together with a user manual and sample data at http://www.QuantWorm.org/. The source code and sample data are also available at http://dx.doi.org/10.6084/m9.figshare.1541248.

  6. Source apportionment of fine particulate matter in China in 2013 using a source-oriented chemical transport model.

    Science.gov (United States)

    Shi, Zhihao; Li, Jingyi; Huang, Lin; Wang, Peng; Wu, Li; Ying, Qi; Zhang, Hongliang; Lu, Li; Liu, Xuejun; Liao, Hong; Hu, Jianlin

    2017-12-01

    China has been suffering high levels of fine particulate matter (PM2.5). Designing effective PM2.5 control strategies requires information about the contributions of different sources. In this study, a source-oriented Community Multiscale Air Quality (CMAQ) model was applied to quantitatively estimate the contributions of different source sectors to PM2.5 in China. Emissions of primary PM2.5 and gas pollutants of SO2, NOx, and NH3, which are precursors of particulate sulfate, nitrate, and ammonium (SNA, major PM2.5 components in China), from eight source categories (power plants, residential sources, industries, transportation, open burning, sea salt, windblown dust and agriculture) were separately tracked to determine their contributions to PM2.5 in 2013. The industrial sector is the largest source of SNA in Beijing, Xi'an and Chongqing, followed by agriculture and power plants. Residential emissions are also important sources of SNA, especially in winter when severe pollution events often occur. Nationally, the contributions of different source sectors to annual total PM2.5 from high to low are industries, residential sources, agriculture, power plants, transportation, windblown dust, open burning and sea salt. Provincially, residential sources and industries are the major anthropogenic sources of primary PM2.5, while industries, agriculture, power plants and transportation are important for SNA in most provinces. For total PM2.5, residential and industrial emissions are the top two sources, with a combined contribution of 40-50% in most provinces. The contributions of power plants and agriculture to total PM2.5 are each about 10%. Secondary organic aerosol accounts for about 10% of annual PM2.5 in most provinces, with higher contributions in southern provinces such as Yunnan (26%), Hainan (25%) and Taiwan (21%). Windblown dust is an important source in western provinces such as Xizang (55% of total PM2.5), Qinghai (74%), Xinjiang (59

  7. Mass spectrometry as a quantitative tool in plant metabolomics

    Science.gov (United States)

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  8. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
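
    The mobility definition quoted above translates directly into code; the velocity and force-field values below are illustrative, not measurements from the paper.

        def magnetophoretic_mobility(velocity, force_field_strength):
            """Magnetophoretic mobility: velocity imparted to a suspended cell per unit of
            magnetic ponderomotive force field strength (here m/s per N/m^3, i.e. m^3*s/kg)."""
            return velocity / force_field_strength

        v = 30e-6      # m/s, cell velocity measured by the tracking velocimeter (assumed)
        s_m = 2.0e2    # N/m^3, ponderomotive force field strength of the instrument (assumed)
        print(magnetophoretic_mobility(v, s_m))    # 1.5e-07 m^3*s/kg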

  9. Chandra Source Catalog: User Interface

    Science.gov (United States)

    Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one "master source" entry and one or more "source observation" entries, the details of which are documented on the CSC "Catalog Columns" pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials ("threads") available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.

  10. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    Science.gov (United States)

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-11-01

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild and moderate to severe groups. The quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups. Of the quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The PSV, EDV, and pixel intensity are valuable in determining moderate to severe CKD. The value of shear wave velocity in
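
    The receiver operating characteristic analysis used above can be reproduced from first principles with a rank-based estimate of the area under the curve; the marker values and labels below are hypothetical.

        import numpy as np

        def auc(marker_values, labels):
            """Area under the ROC curve for one continuous marker against a binary
            reference standard (1 = moderate to severe CKD on biopsy), computed from
            pairwise comparisons (Mann-Whitney U)."""
            x = np.asarray(marker_values, dtype=float)
            y = np.asarray(labels, dtype=int)
            pos, neg = x[y == 1], x[y == 0]
            greater = (pos[:, None] > neg[None, :]).sum()
            ties = (pos[:, None] == neg[None, :]).sum()
            return (greater + 0.5 * ties) / (len(pos) * len(neg))

        # Hypothetical EDV values (cm/s); EDV tends to fall with worsening CKD,
        # so the marker is negated to make larger values indicate disease.
        edv = np.array([18.0, 16.5, 15.0, 9.0, 8.0, 7.5, 10.0, 17.0])
        severe = np.array([0, 0, 0, 1, 1, 1, 1, 0])
        print(auc(-edv, severe))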

  11. Impact of intentionally introduced sources on indoor VOC levels

    Energy Technology Data Exchange (ETDEWEB)

    Davis, C.S. [BOVAR Environmental, Downsview, Ontario (Canada); Otson, R. [Health Canada, Ottawa, Ontario (Canada). Environmental Health Centre

    1997-12-31

    The concentrations of 33 target volatile organic compounds (VOC) were measured in outdoor air and in indoor air before and after the introduction of dry-cleaned clothes, and consumer products into two suburban homes. Emissions from the household products (air fresheners, furniture polishes, mothballs, and dry-cleaned clothes), showering, and two paints were analyzed to obtain source profiles. There were measurable increases in the 24 h average concentrations for 10 compounds in one house and 8 compounds in the second house after introduction of the sources. A contribution by showering to indoor VOC was not evident although the impact of the other sources and outdoor air could be discerned, based on results for the major constituents of source emissions. Also, contributions by paints, applied three to six weeks prior to the monitoring, to indoor VOC concentrations were evident. The pattern of concentrations indicated that sink effects need to be considered in explaining the indoor concentrations that result when sources are introduced into homes. Quantitative estimates of the relative contributions of the sources to indoor VOC levels were not feasible through the use of chemical mass balance since the number of tracer species detected (up to 6) and that could be used for source apportionment was similar to the number of sources to be apportioned (up to 7).
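
    The chemical mass balance approach mentioned above amounts to a non-negative least-squares fit of source emission profiles to the measured excess concentrations, which requires the tracer species to outnumber the candidate sources for a stable fit; the sketch below uses invented profiles and concentrations.

        import numpy as np
        from scipy.optimize import nnls

        # Columns: VOC emission profiles of candidate indoor sources (mass fractions);
        # rows: tracer species. All numbers are illustrative, not the study's profiles.
        profiles = np.array([
            # air freshener  mothballs  dry cleaning
            [0.60,           0.00,      0.05],   # limonene
            [0.05,           0.90,      0.00],   # naphthalene / p-dichlorobenzene
            [0.00,           0.02,      0.85],   # tetrachloroethylene
            [0.35,           0.08,      0.10],   # remaining target VOCs
        ])

        excess_indoor = np.array([6.2, 9.1, 4.3, 4.0])   # ug/m^3 above outdoor levels (assumed)

        # Chemical mass balance: non-negative source contributions minimizing the residual
        contributions, residual = nnls(profiles, excess_indoor)
        print(dict(zip(["air freshener", "mothballs", "dry cleaning"], contributions.round(2))))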

  12. Quantitative imaging of protein targets in the human brain with PET

    International Nuclear Information System (INIS)

    Gunn, Roger N; Slifstein, Mark; Searle, Graham E; Price, Julie C

    2015-01-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins; G-protein coupled receptors, membrane transporters, ligand gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data, associated blood and anatomical MRI data as the inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross sectional studies, classification studies, intervention studies and longitudinal studies. Finally we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data including motion artifacts
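
    As one concrete illustration of the compartmental-modelling step in such a workflow (not any specific model reviewed here), the sketch below simulates and refits a one-tissue compartment model. The arterial input function, rate constants and noise level are all assumptions chosen for illustration.

    ```python
    # One-tissue compartment model: C_T(t) = K1 * exp(-k2 t) convolved with C_p(t).
    # A synthetic tissue curve is generated and the rate constants recovered by fitting.
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0, 60, 121)                     # minutes
    cp = 30 * t * np.exp(-t / 4.0)                  # toy arterial input function

    def one_tissue(t, K1, k2):
        dt = t[1] - t[0]
        irf = K1 * np.exp(-k2 * t)                  # impulse response function
        return np.convolve(cp, irf)[:t.size] * dt   # discrete convolution

    true_curve = one_tissue(t, 0.10, 0.05)
    noisy = true_curve + np.random.normal(0, 0.5, t.size)

    (K1, k2), _ = curve_fit(one_tissue, t, noisy, p0=[0.05, 0.02])
    print(f"K1={K1:.3f} mL/cm3/min, k2={k2:.3f} /min, V_T={K1/k2:.2f}")
    ```

    The distribution volume V_T = K1/k2 is the kind of regional or parametric outcome measure the workflow described above would output.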

  13. Quantitative imaging of protein targets in the human brain with PET

    Science.gov (United States)

    Gunn, Roger N.; Slifstein, Mark; Searle, Graham E.; Price, Julie C.

    2015-11-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins; G-protein coupled receptors, membrane transporters, ligand gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data, associated blood and anatomical MRI data as the inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross sectional studies, classification studies, intervention studies and longitudinal studies. Finally we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data including motion artifacts

  14. A simplistic view of the iodine chemistry influence on source term assessment

    International Nuclear Information System (INIS)

    Herranz, L.E.; Rodriguez, J.J.

    1994-01-01

    The intrinsic characteristics of iodine make it a relevant concern as to its potential radiobiological impact in case of a hypothetical severe accident in nuclear power plants. This paper summarizes the major results drawn from a very simple but illustrative calculation exercise aimed at weighing how significant taking iodine chemistry in containment into account could be for source term assessments in case of a postulated severe reactor accident. The scenario chosen as representative of expected conditions in containment was the LA-4 test of the LACE programme. Several approximations and hypotheses concerning the scenario were necessary. Iodine chemistry analyses were performed with the IODE code, while thermal-hydraulic and aerosol behaviour analyses, providing initial and boundary conditions for the iodine calculations, were carried out with the CONTEMPT4/MOD5 and NAUA/MOD5 codes, respectively. In general, the results obtained agreed qualitatively with the current knowledge in the area; from a quantitative point of view, one of the major results was that iodine chemistry under acidic conditions could produce a substantial increase in the mass leaked from containment under the postulated circumstances. Hence, this study underlines the need to include iodine chemistry in source term assessments. (author)

  15. Quantitative interpretation of nuclear logging data by adopting point-by-point spectrum striping deconvolution technology

    International Nuclear Information System (INIS)

    Tang Bin; Liu Ling; Zhou Shumin; Zhou Rongsheng

    2006-01-01

    The paper discusses gamma-ray spectrum interpretation technology for nuclear logging. The principles of commonly used quantitative interpretation methods, including the average content method and the traditional spectrum striping method, are introduced, and their limitations in determining the contents of radioactive elements on unsaturated ledges (where radioactive elements are distributed unevenly) are presented. Building on the quantitative interpretation of intensity gamma logging by the deconvolution method, a new quantitative interpretation method that separates the radioactive elements is presented for interpreting gamma spectrum logs. This point-by-point spectrum striping deconvolution technique yields a quantitative interpretation of the logging data. (authors)
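
    To make the two ingredients named in this abstract tangible, the sketch below first performs a window-based spectrum striping (solving window count rates for element contributions with an assumed sensitivity matrix) and then a point-by-point deconvolution of an assumed tool depth response. The coefficients, impulse response and profile are all invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # (1) striping: window count rates = S @ element contents
    S = np.array([[1.00, 0.80, 0.60],
                  [0.05, 1.00, 0.90],
                  [0.01, 0.05, 1.00]])          # arbitrary sensitivity matrix
    windows = np.array([120.0, 45.0, 20.0])     # counts per second, invented
    contents = np.linalg.solve(S, windows)      # K, U, Th style contributions

    # (2) point-by-point depth deconvolution: observed log = profile * tool response
    g = np.array([0.1, 0.2, 0.4, 0.2, 0.1])     # assumed, normalised impulse response
    true_profile = np.r_[np.zeros(10), 3.0 * np.ones(5), np.zeros(10)]
    observed = np.convolve(true_profile, g, mode="same")

    # build the convolution operator column by column and invert with ridge damping
    n = observed.size
    A = np.column_stack([np.convolve(np.eye(n)[:, i], g, mode="same")
                         for i in range(n)])
    lam = 1e-3
    recovered = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ observed)
    print(contents.round(2), recovered.round(2))
    ```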

  16. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    Science.gov (United States)

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright

  17. Quantitative optical measurement of mitochondrial superoxide dynamics in pulmonary artery endothelial cells

    Directory of Open Access Journals (Sweden)

    Zahra Ghanian

    2018-01-01

    Full Text Available Reactive oxygen species (ROS) play a vital role in cell signaling and redox regulation, but when present in excess, lead to numerous pathologies. Detailed quantitative characterization of mitochondrial superoxide anion (O2•−) production in fetal pulmonary artery endothelial cells (PAECs) has never been reported. The aim of this study is to assess mitochondrial O2•− production in cultured PAECs over time using a novel quantitative optical approach. The rate, the sources, and the dynamics of O2•− production were assessed using targeted metabolic modulators of the mitochondrial electron transport chain (ETC) complexes, specifically an uncoupler and inhibitors of the various ETC complexes, and inhibitors of extra-mitochondrial sources of O2•−. After stabilization, the cells were loaded with nanomolar mitochondrial-targeted hydroethidine (Mito-HE, MitoSOX) online during the experiment without washout of the residual dye. Time-lapse fluorescence microscopy was used to monitor the dynamic changes in O2•− fluorescence intensity over time in PAECs. The transient behaviors of the fluorescence time course showed exponential increases in the rate of O2•− production in the presence of the ETC uncoupler or inhibitors. The most dramatic and the fastest increase in O2•− production was observed when the cells were treated with the uncoupling agent, PCP. We also showed that only the complex IV inhibitor, KCN, attenuated the marked surge in O2•− production induced by PCP. The results showed that mitochondrial respiratory complexes I, III and IV are sources of O2•− production in PAECs, and a new observation that ROS production during uncoupling of mitochondrial respiration is mediated in part via complex IV. This novel method can be applied in other studies that examine ROS production under stress conditions and during ROS-mediated injuries in vitro.
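
    A minimal illustration of how a rate constant can be extracted from such a fluorescence time course is sketched below; the exponential-rise model, parameter values and noise level are assumptions for illustration, not the authors' fitting procedure.

    ```python
    # Fit an exponential rise to a synthetic fluorescence time course and report
    # the rate constant, which could then be compared before/after a modulator.
    import numpy as np
    from scipy.optimize import curve_fit

    def expo_rise(t, f0, amp, k):
        return f0 + amp * (1.0 - np.exp(-k * t))

    t = np.arange(0, 30, 0.5)                                # minutes
    signal = expo_rise(t, 100, 80, 0.15) + np.random.normal(0, 2, t.size)

    (f0, amp, k), _ = curve_fit(expo_rise, t, signal, p0=[90, 50, 0.1])
    print(f"rate constant k = {k:.3f} 1/min")
    ```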

  18. A Directory of Sources of Information and Data Bases on Education and Training.

    Science.gov (United States)

    1980-09-01

    diaries and reports. CRT terminals (to support the personnel function at each NATRACOM activity), with the capability to call up standard formats for...quantitative for all Navy activities. Input or Source of Data: Primary sources of info to the personnel files are OCR and diary entries from all Navy...current cataloging output of the US Lib. of Cong. and for making these data available to outside libraries and institutions on a subscription basis. The OFFICE is

  19. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection...... can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate...... the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...
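
    As a toy picture of the iterative scheme described (and not the paper's actual partial-equilibrium model), the sketch below alternates between a linear inverse-demand equilibrium step and a mean-variance style risk adjustment of the required return; every parameter value is invented.

    ```python
    # Toy iteration between an "equilibrium" step (price from installed capacity)
    # and a "risk-adjustment" step (required return raised with revenue variance).
    capacity, demand = 80.0, 100.0
    cost, base_return, risk_aversion = 30.0, 5.0, 0.10
    rev_std = 12.0                                   # assumed revenue volatility

    for _ in range(50):
        price = max(demand - capacity, 0.0)          # linear inverse demand
        required = base_return + risk_aversion * rev_std**2 / max(capacity, 1e-6)
        profit = price - cost - required
        capacity += 0.1 * profit                     # invest while profitable, divest otherwise
    print(f"equilibrium capacity ~ {capacity:.1f}, price ~ {price:.1f}")
    ```

    The damped update plays the role of the iterative interaction between the two modules: investment responds to risk-adjusted profitability until the two sub-models agree.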

  20. Quinoa seed coats as an expanding and sustainable source of bioactive compounds

    DEFF Research Database (Denmark)

    Ruiz, Karina B.; Khakimov, Bekzod; Engelsen, Søren Balling

    2017-01-01

    Saponins (SAPs) are a diverse family of plant secondary metabolites and due to their biological activities, SAPs can be utilised as biopesticides and as therapeutic compounds. Given their widespread industrial use, a search for alternative sources of SAPs is a priority. Quinoa (Chenopodium quinoa...... Willd) is a valuable food source that is gaining importance worldwide for its nutritional and nutraceutical properties. SAPs from quinoa seed coats could represent a new sustainable source to obtain these compounds in high quantities due to the increasing production and worldwide expansion of the crop....... This research aims to characterise saponins of seed coat waste products from six different quinoa varieties for their potential use as a saponin source. Gas chromatography (GC)- and Liquid chromatography (LC)- with mass spectrometry (MS) were applied for qualitative and relative quantitative analysis...

  1. On the use of a laser ablation as a laboratory seismic source

    Science.gov (United States)

    Shen, Chengyi; Brito, Daniel; Diaz, Julien; Zhang, Deyuan; Poydenot, Valier; Bordes, Clarisse; Garambois, Stéphane

    2017-04-01

    Mimicking near-surface seismic imaging under well-controlled laboratory conditions is potentially a powerful tool for studying large-scale wave propagation in geological media by means of upscaling. Laboratory measurements are indeed particularly suited for tests of theoretical modellings and comparisons with numerical approaches. We have developed an automated Laser Doppler Vibrometer (LDV) platform, which is able to detect and register broadband nano-scale displacements on the surface of various materials. This laboratory equipment has already been validated in experiments where piezoelectric transducers were used as seismic sources. We are currently exploring a new seismic source in our experiments, a laser ablation, in order to compensate for some drawbacks encountered with piezoelectric sources. The laser ablation source has been considered an interesting ultrasound wave generator since the 1960s. It was believed to have numerous potential applications such as Non-Destructive Testing (NDT) and the measurement of velocities and attenuations in solid samples. We aim to adapt and develop this technique for geophysical experimental investigations in order to produce and explore complete micro-seismic data sets in the laboratory. We will first present the laser characteristics, including its mechanism, stability, and reproducibility, and will evaluate in particular the directivity patterns of such a seismic source. We have started by applying the laser ablation source on the surfaces of multi-scale homogeneous aluminum samples and are now testing it on heterogeneous and fractured limestone cores. Some other results of data processing will also be shown, especially the 2D-slice VP and VS tomographic images obtained in limestone samples. Apart from the experimental records, numerical simulations will be carried out for both the laser source modelling and the wave propagation in different media. First attempts will be made to compare quantitatively the

  2. Dissociative conceptual and quantitative problem solving outcomes across interactive engagement and traditional format introductory physics

    Directory of Open Access Journals (Sweden)

    Mark A. McDaniel

    2016-11-01

    Full Text Available The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE-based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment that included three representative conservation of energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on a Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics on gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than do the kinds of problems emphasized in the courses and homework and the overlap of these problems with those on the assessment.

  3. Differentiation of Cuscuta chinensis and Cuscuta australis by HPLC-DAD-MS analysis and HPLC-UV quantitation.

    Science.gov (United States)

    He, Xianghui; Yang, Wenzhi; Ye, Min; Wang, Qing; Guo, Dean

    2011-11-01

    Cuscuta chinensis and Cuscuta australis, the two botanical sources of the Chinese herbal medicine Tu-Si-Zi, were distinguished from each other based on qualitative and quantitative chemical analysis. By HPLC‑DAD‑MS, a total of 36 compounds were characterized from these two Cuscuta species, including 14 flavonoids, 17 quinic acid derivatives, and 5 lignans. In addition, HPLC‑UV was applied to determine seven major compounds (6 flavonoids plus chlorogenic acid) in 27 batches of Tu-Si-Zi. The results revealed that the amounts of the three classes of compounds varied significantly between the species. C. australis contained more flavonoids but less quinic acid derivatives and lignans than C. chinensis. Particularly, the amounts of kaempferol and astragalin in C. australis were remarkably higher than in C. chinensis. This finding could be valuable for the quality control of Tu-Si-Zi. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    Science.gov (United States)

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  5. Photogrammetry of the human brain: a novel method for the 3D quantitative exploration of structural connectivity in neurosurgery and neurosciences.

    Science.gov (United States)

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomical awareness of the brain's structural connectivity is mandatory for neurosurgeons to select the most effective approaches for brain resections. Although standard micro-dissection is a validated technique to investigate the different white matter (WM) pathways and to verify results coming from tractography, the possibility of an interactive exploration of the specimens and of a reliable acquisition of quantitative information has not been described so far. Photogrammetry is a well-established technique allowing accurate metrology on highly defined 3D models. The aim of this work is to propose the application of the photogrammetric technique for supporting the 3D exploration and the quantitative analysis of cerebral WM connectivity. The main peri-sylvian pathways, including the superior longitudinal fascicle (SLF) and the arcuate fascicle (AF), were exposed using Klingler's technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference MRI of the specimen. All the acquisitions were co-registered into an open-source model. We analyzed five steps, including: the cortical surface, the short intergyral fibers, the indirect posterior and anterior SLF, and the AF. The co-registration between the MRI mesh and the point-cloud models proved highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during post-dissection analysis. These results open many new promising neuroscientific and educational perspectives, also for optimizing the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Radioisotopic heat source

    Science.gov (United States)

    Jones, G.J.; Selle, J.E.; Teaney, P.E.

    1975-09-30

    Disclosed is a radioisotopic heat source and method for a long life electrical generator. The source includes plutonium dioxide shards and yttrium or hafnium in a container of tantalum-tungsten-hafnium alloy, all being in a nickel alloy outer container, and subjected to heat treatment of from about 1570°F to about 1720°F for about one h. (auth)

  7. Negative hydrogen ion sources for accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Moehs, D.P.; /Fermilab; Peters, J.; /DESY; Sherman, J.; /Los Alamos

    2005-08-01

    A variety of H{sup -} ion sources are in use at accelerator laboratories around the world. A list of these ion sources includes surface plasma sources with magnetron, Penning and surface converter geometries as well as magnetic-multipole volume sources with and without cesium. Just as varied is the means of igniting and maintaining magnetically confined plasmas. Hot and cold cathodes, radio frequency, and microwave power are all in use, as well as electron tandem source ignition. The extraction systems of accelerator H{sup -} ion sources are highly specialized utilizing magnetic and electric fields in their low energy beam transport systems to produce direct current, as well as pulsed and/or chopped beams with a variety of time structures. Within this paper, specific ion sources utilized at accelerator laboratories shall be reviewed along with the physics of surface and volume H{sup -} production in regard to source emittance. Current research trends including aperture modeling, thermal modeling, surface conditioning, and laser diagnostics will also be discussed.

  8. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    Science.gov (United States)

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  9. Quantification of source-term profiles from near-field geochemical models

    International Nuclear Information System (INIS)

    McKinley, I.G.

    1985-01-01

    A geochemical model of the near-field is described which quantitatively treats the processes of engineered barrier degradation, buffering of aqueous chemistry by solid phases, nuclide solubilization and transport through the near-field and release to the far-field. The radionuclide source-terms derived from this model are compared with those from a simpler model used for repository safety analysis. 10 refs., 2 figs., 2 tabs

  10. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif; Schuster, Gerard T.

    2014-01-01

    are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first

  11. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
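
    A minimal sketch of the kind of pixel-wise fitting such a tool automates is shown below: a log-linear T2 fit applied to a synthetic multi-echo series. The echo times, image size and noise level are arbitrary assumptions, and this is not MRmap's own algorithm.

    ```python
    # Pixel-wise T2 estimation from a multi-echo series via log-linear least squares:
    # ln S = ln S0 - TE / T2, solved simultaneously for every pixel.
    import numpy as np

    echo_times = np.array([10.0, 30.0, 50.0, 70.0])            # ms, illustrative
    true_t2 = np.full((64, 64), 80.0)                          # synthetic phantom
    images = np.stack([1000 * np.exp(-te / true_t2) for te in echo_times])
    images += np.random.normal(0, 5, images.shape)             # add noise

    logs = np.log(np.clip(images, 1e-6, None)).reshape(len(echo_times), -1)
    A = np.c_[np.ones_like(echo_times), -echo_times]           # [ln S0, 1/T2] design
    coeffs, *_ = np.linalg.lstsq(A, logs, rcond=None)
    t2_map = (1.0 / coeffs[1]).reshape(64, 64)                 # ms
    print(f"median T2 = {np.median(t2_map):.1f} ms")
    ```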

  12. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  13. Can quantitative sensory testing predict responses to analgesic treatment?

    Science.gov (United States)

    Grosen, K; Fischer, I W D; Olesen, A E; Drewes, A M

    2013-10-01

    The role of quantitative sensory testing (QST) in prediction of analgesic effect in humans is scarcely investigated. This updated review assesses the effectiveness in predicting analgesic effects in healthy volunteers, surgical patients and patients with chronic pain. A systematic review of English written, peer-reviewed articles was conducted using PubMed and Embase (1980-2013). Additional studies were identified by chain searching. Search terms included 'quantitative sensory testing', 'sensory testing' and 'analgesics'. Studies on the relationship between QST and response to analgesic treatment in human adults were included. Appraisal of the methodological quality of the included studies was based on evaluative criteria for prognostic studies. Fourteen studies (including 720 individuals) met the inclusion criteria. Significant correlations were observed between responses to analgesics and several QST parameters including (1) heat pain threshold in experimental human pain, (2) electrical and heat pain thresholds, pressure pain tolerance and suprathreshold heat pain in surgical patients, and (3) electrical and heat pain threshold and conditioned pain modulation in patients with chronic pain. Heterogeneity among studies was observed especially with regard to application of QST and type and use of analgesics. Although promising, the current evidence is not sufficiently robust to recommend the use of any specific QST parameter in predicting analgesic response. Future studies should focus on a range of different experimental pain modalities rather than a single static pain stimulation paradigm. © 2013 European Federation of International Association for the Study of Pain Chapters.

  14. Advanced source apportionment of carbonaceous aerosols by coupling offline AMS and radiocarbon size-segregated measurements over a nearly 2-year period

    OpenAIRE

    A. Vlachou; K. R. Daellenbach; C. Bozzetti; B. Chazeau; G. A. Salazar; S. Szidat; J.-L. Jaffrezo; C. Hueglin; U. Baltensperger; I. E. Haddad; A. S. H. Prévôt

    2018-01-01

    Carbonaceous aerosols are related to adverse human health effects. Therefore, identification of their sources and analysis of their chemical composition is important. The offline AMS (aerosol mass spectrometer) technique offers quantitative separation of organic aerosol (OA) factors which can be related to major OA sources, either primary or secondary. While primary OA can be more clearly separated into sources, secondary (SOA) source apportionment is more challenging because d...

  15. Rapid Quantitative Determination of Squalene in Shark Liver Oils by Raman and IR Spectroscopy.

    Science.gov (United States)

    Hall, David W; Marshall, Susan N; Gordon, Keith C; Killeen, Daniel P

    2016-01-01

    Squalene is sourced predominantly from shark liver oils and to a lesser extent from plants such as olives. It is used for the production of surfactants, dyes, sunscreen, and cosmetics. The economic value of shark liver oil is directly related to the squalene content, which in turn is highly variable and species-dependent. Presented here is a validated gas chromatography-mass spectrometry analysis method for the quantitation of squalene in shark liver oils, with an accuracy of 99.0 %, precision of 0.23 % (standard deviation), and linearity of >0.999. The method has been used to measure the squalene concentration of 16 commercial shark liver oils. These reference squalene concentrations were related to infrared (IR) and Raman spectra of the same oils using partial least squares regression. The resultant models were suitable for the rapid quantitation of squalene in shark liver oils, with cross-validation r² values of >0.98 and root mean square errors of validation of ≤4.3 % w/w. Independent test set validation of these models found mean absolute deviations of 4.9 and 1.0 % w/w for the IR and Raman models, respectively. Both techniques were more accurate than results obtained by an industrial refractive index analysis method, which is used for rapid, cheap quantitation of squalene in shark liver oils. In particular, the Raman partial least squares regression was suited to quantitative squalene analysis. The intense and highly characteristic Raman bands of squalene made quantitative analysis possible irrespective of the lipid matrix.
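
    The partial least squares step can be sketched in a few lines; the example below uses synthetic spectra and synthetic reference values purely to show the shape of the calibration and cross-validation workflow, not the published models.

    ```python
    # PLS regression from spectra to analyte content with cross-validated error.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    squalene = rng.uniform(20, 80, 40)                          # % w/w, synthetic references
    wavenumbers = np.linspace(600, 1800, 300)
    band = np.exp(-0.5 * ((wavenumbers - 1670) / 15) ** 2)      # pseudo spectral band
    spectra = np.outer(squalene, band) + rng.normal(0, 0.5, (40, 300))

    pls = PLSRegression(n_components=3)
    predicted = cross_val_predict(pls, spectra, squalene, cv=5).ravel()
    rmsecv = np.sqrt(np.mean((predicted - squalene) ** 2))
    print(f"RMSECV ~ {rmsecv:.2f} % w/w")
    ```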

  16. Emission sources and quantities

    International Nuclear Information System (INIS)

    Heinen, B.

    1991-01-01

    The paper examines emission sources and quantities for SO2 and NOx. Natural SO2 is released from volcanic sources and to a much lower extent from marsh gases. In nature NOx is mainly produced in the course of the chemical and bacterial denitrification processes going on in the soil. Manmade pollutants are produced in combustion processes. The paper concentrates on manmade pollution. Aspects discussed include: mechanism of pollution development; manmade emission sources (e.g. industry, traffic, power plants and domestic sources); and emission quantities and forecasts. 11 refs., 2 figs., 5 tabs

  17. Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking

    Science.gov (United States)

    Miele, David B.; Wigfield, Allan

    2014-01-01

    The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…

  18. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    Science.gov (United States)

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, using typically reverse phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.
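
    The relative-quantitation step at the end of such a workflow amounts to ratioing light (14N) and heavy (15N) signals per protein. A minimal sketch with invented peptide intensities and hypothetical protein names is given below; real values would come from the processed LC-MS/MS peak lists.

    ```python
    # Median light/heavy ratio per protein from pooled 14N/15N peptide intensities.
    from statistics import median

    peptides = [
        # (protein, intensity_14N, intensity_15N) -- all values illustrative
        ("protein_A", 1.8e6, 0.9e6),
        ("protein_A", 2.2e6, 1.1e6),
        ("protein_B", 3.0e5, 3.2e5),
        ("protein_B", 2.8e5, 3.1e5),
    ]

    ratios = {}
    for protein, light, heavy in peptides:
        ratios.setdefault(protein, []).append(light / heavy)

    for protein, r in ratios.items():
        print(f"{protein}: median 14N/15N = {median(r):.2f}")
    ```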

  19. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative...... geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships......) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...

  20. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physics system are usually constructed based on the stochastic models, which have both qualitative and quantitative characteristics inherently. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with the expert knowledge, uncertain reasoning, and other qualitative information, a qualitative and quantitative combined modeling specification was proposed based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The result shows that the proposed method can describe the complex system more comprehensively, and the survival probability of the target is higher by introducing qualitative models into quantitative simulation.

  2. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.
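
    As a schematic of the calibration issues listed (background correction, flat-field response, long-term sensitivity change), the sketch below applies the standard corrections to a synthetic frame; it is not the actual UVI processing pipeline, and all array sizes and values are invented.

    ```python
    # Generic radiometric correction: subtract background, divide by flat field,
    # and rescale by a sensitivity factor that may drift over the mission.
    import numpy as np

    raw = np.random.poisson(200, (256, 256)).astype(float)      # counts, synthetic
    background = np.full_like(raw, 20.0)                        # assumed background frame
    flat_field = np.clip(np.random.normal(1.0, 0.03, raw.shape), 0.8, 1.2)
    sensitivity = 0.95                                          # relative to initial calibration

    calibrated = (raw - background) / (flat_field * sensitivity)
    ```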

  3. Rosmarinus Officinalis Leaves as a Natural Source of Bioactive Compounds

    Directory of Open Access Journals (Sweden)

    Isabel Borrás-Linares

    2014-11-01

    Full Text Available In an extensive search for bioactive compounds from plant sources, the composition of different extracts of rosemary leaves collected from different geographical zones of Serbia was studied. The qualitative and quantitative characterization of 20 rosemary (Rosmarinus officinalis) samples, obtained by microwave-assisted extraction (MAE), was determined by high performance liquid chromatography coupled to electrospray quadrupole-time of flight mass spectrometry (HPLC–ESI-QTOF-MS). The high mass accuracy and true isotopic pattern in both MS and MS/MS spectra provided by the QTOF-MS analyzer enabled the characterization of a wide range of phenolic compounds in the extracts, including flavonoids, phenolic diterpenes and abietane-type triterpenoids, among others. According to the data compiled, rosemary samples from Sokobanja presented the highest levels of flavonoids and other compounds such as carnosol, rosmaridiphenol, rosmadial, rosmarinic acid, and carnosic acid. On the other hand, higher contents of triterpenes were found in the extracts of rosemary from Gložan (Vojvodina).

  4. Stellar X-ray sources

    International Nuclear Information System (INIS)

    Katz, J.I.; Washington Univ., St. Louis, MO

    1988-01-01

    I review some of the salient accomplishments of X-ray studies of compact objects. Progress in this field has closely followed the improvement of observational methods, particularly in angular resolution and duration of exposure. Luminous compact X-ray sources are accreting neutron stars or black holes. Accreting neutron stars may have characteristic temporal signatures, but the only way to establish that an X-ray source is a black hole is to measure its mass. A rough phenomenological theory is successful, but the transport of angular momentum in accretion flows is not understood. A number of interesting complications have been observed, including precessing accretion discs, X-ray bursts, and the acceleration of jets in SS433. Many puzzles remain unsolved, including the excitation of disc precession, the nature of the enigmatic X- and gamma-ray source Cyg X-3, the mechanism by which slowly spinning accreting neutron stars lose angular momentum, and the superabundance of X-ray sources in globular clusters. 41 refs.; 5 figs.

  5. Photon sources for absorptiometric measurements

    International Nuclear Information System (INIS)

    Witt, R.M.; Sandrik, J.M.; Cameron, J.R.

    1976-01-01

    Photon absorptiometry is defined and the requirements of photon sources for these measurements are described. Both x-ray tubes and radionuclide sources are discussed, including the advantages of each in absorptiometric systems

  6. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including...... staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm......

  7. Receptor modeling for source apportionment of polycyclic aromatic hydrocarbons in urban atmosphere.

    Science.gov (United States)

    Singh, Kunwar P; Malik, Amrita; Kumar, Ranjan; Saxena, Puneet; Sinha, Sarita

    2008-01-01

    This study reports source apportionment of polycyclic aromatic hydrocarbons (PAHs) in particulate depositions on vegetation foliage near a highway in the urban environment of Lucknow city (India) using the principal components analysis/absolute principal components scores (PCA/APCS) receptor modeling approach. The multivariate method enables identification of major PAH sources along with their quantitative contributions with respect to individual PAHs. The PCA identified three major sources of PAHs, viz. combustion, vehicular emissions, and diesel-based activities. The PCA/APCS receptor modeling approach revealed that the combustion sources (natural gas, wood, coal/coke, biomass) contributed 19-97% of various PAHs, vehicular emissions 0-70%, diesel-based sources 0-81% and other miscellaneous sources 0-20% of different PAHs. The contributions of major pyrolytic and petrogenic sources to the total PAHs were 56 and 42%, respectively. Further, the combustion-related sources contribute a major fraction of the carcinogenic PAHs in the study area. A high correlation coefficient (R² > 0.75 for most PAHs) between the measured and predicted concentrations of PAHs suggests the applicability of the PCA/APCS receptor modeling approach for estimating source contributions to PAHs in particulates.
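
    A compact sketch of the PCA/APCS idea (in the spirit of Thurston and Spengler's formulation, not the paper's exact implementation) is given below with synthetic data: components are extracted from standardized species concentrations, absolute scores are formed by subtracting the score of a hypothetical zero-concentration sample, and each species is regressed on those scores to estimate source contributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.lognormal(mean=1.0, sigma=0.4, size=(60, 8))      # samples x species, synthetic

    # PCA on standardized concentrations
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigval)[::-1][:3]                      # keep 3 components
    loadings = eigvec[:, order]

    # absolute principal component scores (APCS)
    scores = Z @ loadings
    z0 = (0.0 - mu) / sd                                      # hypothetical zero-concentration sample
    apcs = scores - (z0 @ loadings)

    # regress each species on the APCS to obtain per-source contributions
    A = np.c_[np.ones(len(apcs)), apcs]
    coef, *_ = np.linalg.lstsq(A, X, rcond=None)              # intercept + slope per source
    mean_contrib = coef[1:] * apcs.mean(0)[:, None]           # sources x species
    share = mean_contrib / X.mean(0)                          # fraction of each species explained
    print(share.round(2))
    ```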

  8. Ion source and injector development

    International Nuclear Information System (INIS)

    Curtis, C.D.

    1976-01-01

    This is a survey of low energy accelerators which inject into proton linacs. Laboratories covered include Argonne, Brookhaven, CERN, Chalk River, Fermi, ITEP, KEK, Rutherford, and Saclay. This paper emphasizes complete injector systems, comparing significant hardware features and beam performance data, including recent additions. There is increased activity now in the acceleration of polarized protons, H + and H - , and of unpolarized H - . New source development and programs for these ion beams is outlined at the end of the report. Heavy-ion sources are not included

  9. Large cryoconite aggregates on a Svalbard glacier support a diverse microbial community including ammonia-oxidizing archaea

    Science.gov (United States)

    Zarsky, Jakub D.; Stibal, Marek; Hodson, Andy; Sattler, Birgit; Schostag, Morten; Hansen, Lars H.; Jacobsen, Carsten S.; Psenner, Roland

    2013-09-01

    The aggregation of surface debris particles on melting glaciers into larger units (cryoconite) provides microenvironments for various microorganisms and metabolic processes. Here we investigate the microbial community on the surface of Aldegondabreen, a valley glacier in Svalbard which is supplied with carbon and nutrients from different sources across its surface, including colonies of seabirds. We used a combination of geochemical analysis (of surface debris, ice and meltwater), quantitative polymerase chain reactions (targeting the 16S ribosomal ribonucleic acid and amoA genes), pyrosequencing and multivariate statistical analysis to suggest possible factors driving the ecology of prokaryotic microbes on the surface of Aldegondabreen and their potential role in nitrogen cycling. The combination of high nutrient input with subsidy from the bird colonies, supraglacial meltwater flow and the presence of fine, clay-like particles supports the formation of centimetre-scale cryoconite aggregates in some areas of the glacier surface. We show that a diverse microbial community is present, dominated by the cyanobacteria, Proteobacteria, Bacteroidetes, and Actinobacteria, that are well-known in supraglacial environments. Importantly, ammonia-oxidizing archaea were detected in the aggregates for the first time on an Arctic glacier.

  10. Large cryoconite aggregates on a Svalbard glacier support a diverse microbial community including ammonia-oxidizing archaea

    International Nuclear Information System (INIS)

    Zarsky, Jakub D; Sattler, Birgit; Psenner, Roland; Stibal, Marek; Schostag, Morten; Jacobsen, Carsten S; Hodson, Andy; Hansen, Lars H

    2013-01-01

    The aggregation of surface debris particles on melting glaciers into larger units (cryoconite) provides microenvironments for various microorganisms and metabolic processes. Here we investigate the microbial community on the surface of Aldegondabreen, a valley glacier in Svalbard which is supplied with carbon and nutrients from different sources across its surface, including colonies of seabirds. We used a combination of geochemical analysis (of surface debris, ice and meltwater), quantitative polymerase chain reactions (targeting the 16S ribosomal ribonucleic acid and amoA genes), pyrosequencing and multivariate statistical analysis to suggest possible factors driving the ecology of prokaryotic microbes on the surface of Aldegondabreen and their potential role in nitrogen cycling. The combination of high nutrient input with subsidy from the bird colonies, supraglacial meltwater flow and the presence of fine, clay-like particles supports the formation of centimetre-scale cryoconite aggregates in some areas of the glacier surface. We show that a diverse microbial community is present, dominated by the cyanobacteria, Proteobacteria, Bacteroidetes, and Actinobacteria, that are well-known in supraglacial environments. Importantly, ammonia-oxidizing archaea were detected in the aggregates for the first time on an Arctic glacier. (letter)

  11. Large cryoconite aggregates on a Svalbard glacier support a diverse microbial community including ammonia-oxidizing archaea

    Energy Technology Data Exchange (ETDEWEB)

    Zarsky, Jakub D; Sattler, Birgit; Psenner, Roland [Institute of Ecology, University of Innsbruck, Innsbruck (Austria); Stibal, Marek; Schostag, Morten; Jacobsen, Carsten S [Department of Geochemistry, Geological Survey of Denmark and Greenland (GEUS), Copenhagen (Denmark); Hodson, Andy [Department of Geography, University of Sheffield, Sheffield (United Kingdom); Hansen, Lars H, E-mail: j.zarsky@gmail.com [Department of Biology, University of Copenhagen, Copenhagen (Denmark)

    2013-09-15

    The aggregation of surface debris particles on melting glaciers into larger units (cryoconite) provides microenvironments for various microorganisms and metabolic processes. Here we investigate the microbial community on the surface of Aldegondabreen, a valley glacier in Svalbard which is supplied with carbon and nutrients from different sources across its surface, including colonies of seabirds. We used a combination of geochemical analysis (of surface debris, ice and meltwater), quantitative polymerase chain reactions (targeting the 16S ribosomal ribonucleic acid and amoA genes), pyrosequencing and multivariate statistical analysis to suggest possible factors driving the ecology of prokaryotic microbes on the surface of Aldegondabreen and their potential role in nitrogen cycling. The combination of high nutrient input with subsidy from the bird colonies, supraglacial meltwater flow and the presence of fine, clay-like particles supports the formation of centimetre-scale cryoconite aggregates in some areas of the glacier surface. We show that a diverse microbial community is present, dominated by the cyanobacteria, Proteobacteria, Bacteroidetes, and Actinobacteria, that are well-known in supraglacial environments. Importantly, ammonia-oxidizing archaea were detected in the aggregates for the first time on an Arctic glacier. (letter)

  12. Assessing the Applicability of Currently Available Methods for Attributing Foodborne Disease to Sources, Including Food and Food Commodities

    DEFF Research Database (Denmark)

    Pires, Sara Monteiro

    2013-01-01

    on the public health question being addressed, on the data requirements, on advantages and limitations of the method, and on the data availability of the country or region in question. Previous articles have described available methods for source attribution, but have focused only on foodborne microbiological...

  13. Quantitative Metrics for Generative Justice: Graphing the Value of Diversity

    Directory of Open Access Journals (Sweden)

    Brian Robert Callahan

    2016-12-01

    Full Text Available Scholarship utilizing the Generative Justice framework has focused primarily on qualitative data collection and analysis for its insights. This paper introduces a quantitative data measurement, contributory diversity, which can be used to enhance the analysis of ethical dimensions of value production under the Generative Justice lens. It is well known that the identity of contributors—gender, ethnicity, and other categories—is a key issue for social justice in general. Using the example of Open Source Software communities, we note that typical diversity measures, focusing exclusively on workforce demographics, can fail to fully illuminate issues in value generation. Using Shannon's entropy measure, we offer an alternative metric which combines the traditional assessment of demographics with a measure of value generation. This mapping allows for previously unacknowledged contributions to be recognized, and can avoid some of the ways in which exclusionary practices are obscured. We offer contributory diversity not as the single optimal metric, but rather as a call for others to begin investigating the possibilities for quantitative measurements of the communities and value flows that are studied using the Generative Justice framework.
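
    The entropy-based measure can be illustrated directly. The sketch below compares a headcount-only Shannon diversity with a contribution-weighted one, using invented group names and numbers; it is one reading of the idea, not the authors' exact metric.

    ```python
    # Shannon entropy over group proportions, once weighted by headcount and once
    # by contributed value (e.g. commits), to show how the two can diverge.
    from math import log

    def shannon_diversity(weights):
        total = sum(weights)
        props = [w / total for w in weights if w > 0]
        return -sum(p * log(p, 2) for p in props)        # bits

    headcount = {"group_a": 40, "group_b": 40, "group_c": 20}     # illustrative
    commits   = {"group_a": 900, "group_b": 80, "group_c": 20}    # illustrative

    print("headcount diversity   :", round(shannon_diversity(headcount.values()), 3))
    print("contributory diversity:", round(shannon_diversity(commits.values()), 3))
    ```

    With these invented numbers the community looks diverse by headcount (about 1.5 bits) but far less so by contribution (about 0.5 bits), which is precisely the gap the proposed metric is meant to expose.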

  14. Attributing Methane and Carbon Dioxide Emissions from Anthropogenic and Natural Sources Using AVIRIS-NG

    Science.gov (United States)

    Thorpe, A. K.; Frankenberg, C.; Thompson, D. R.; Duren, R. M.; Aubrey, A. D.; Bue, B. D.; Green, R. O.; Gerilowski, K.; Krings, T.; Borchardt, J.; Kort, E. A.; Sweeney, C.; Conley, S. A.; Roberts, D. A.; Dennison, P. E.; Ayasse, A.

    2016-12-01

    Imaging spectrometers like the next-generation Airborne Visible/Infrared Imaging Spectrometer (AVIRIS-NG) can map large regions with the high spatial resolution necessary to resolve methane (CH4) and carbon dioxide (CO2) emissions. This capability is aided by real-time detection and geolocation of gas plumes, permitting unambiguous identification of individual emission source locations and communication to ground teams for rapid follow-up. We present results from AVIRIS-NG flight campaigns in the Four Corners region (Colorado and New Mexico) and the San Joaquin Valley (California). Over three hundred plumes were observed, reflecting emissions from anthropogenic and natural sources. Examples of plumes will be shown for a number of sources, including CH4 from well completions, gas processing plants, tanks, pipeline leaks and natural seeps, and CO2 from power plants. Despite these promising results, an imaging spectrometer built exclusively for quantitative mapping of gas plumes would have improved sensitivity compared to AVIRIS-NG. For example, an instrument providing 1 nm spectral sampling (2,000-2,400 nm) would permit mapping CH4, CO2, H2O, CO, and N2O from more diffuse sources using both airborne and orbital platforms. The ability to identify emission sources offers the potential to constrain regional greenhouse gas budgets and improve partitioning between anthropogenic and natural emission sources. Because the CH4 lifetime is only about 9 years and CH4 has a Global Warming Potential 86 times that of CO2 over a 20-year time interval, mitigating these emissions is a particularly cost-effective approach to reducing overall atmospheric radiative forcing. Fig. 1. True color image subset with superimposed gas plumes showing concentrations in ppm-m. Left: AVIRIS-NG-observed CH4 plumes from a natural gas processing plant extending over 500 m downwind of multiple emission sources. Right: Multiple CO2 plumes observed from a coal-fired power plant.
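
    To make the closing comparison concrete, the short sketch below converts a methane emission rate into CO2-equivalent terms using the 20-year Global Warming Potential quoted in the record; the leak rate is an invented example value, not a result from the campaigns.

```python
# Converting a hypothetical methane leak to CO2-equivalent on a 20-year horizon,
# using the GWP-20 factor of 86 quoted in the record. The leak rate is invented.
GWP20_CH4 = 86

leak_rate_kg_per_hour = 50.0
annual_ch4_tonnes = leak_rate_kg_per_hour * 24 * 365 / 1000.0
annual_co2e_tonnes = annual_ch4_tonnes * GWP20_CH4

print(f"CH4 emitted:            {annual_ch4_tonnes:,.0f} t/yr")
print(f"CO2-equivalent (20 yr): {annual_co2e_tonnes:,.0f} t/yr")
```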

  15. Chandra Source Catalog: User Interfaces

    Science.gov (United States)

    Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.

    2010-03-01

    The CSCview data mining interface is available for browsing the Chandra Source Catalog (CSC) and downloading tables of quality-assured source properties and data products. Once the desired source properties and search criteria are entered into the CSCview query form, the resulting source matches are returned in a table along with the values of the requested source properties for each source. (The catalog can be searched on any source property, not just position.) At this point, the table of search results may be saved to a text file, and the available data products for each source may be downloaded. CSCview save files are output in RDB-like and VOTable formats. The available CSC data products include event files, spectra, lightcurves, and images, all of which are processed with the CIAO software. CSC data may also be accessed non-interactively with Unix command-line tools such as cURL and Wget, using ADQL 2.0 query syntax. In fact, CSCview features a separate ADQL query form for those who wish to specify this type of query within the GUI. Several interfaces are available for determining whether a source is included in the catalog (in addition to CSCview): 1) the CSC interface to Sky in Google Earth shows the footprint of each Chandra observation on the sky, along with the CSC footprint for comparison (CSC source properties are also accessible when a source within a Chandra field-of-view is clicked); 2) the CSC Limiting Sensitivity online tool indicates if a source at an input celestial location was too faint for detection; 3) an IVOA Simple Cone Search interface locates all CSC sources within a specified radius of an R.A. and Dec.; and 4) the CSC-SDSS cross-match service returns the list of sources common to the CSC and SDSS, either all such sources or a subset based on search criteria.
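
    As an illustration of the non-interactive access routes described above, the sketch below issues an IVOA Simple Cone Search request from Python. The endpoint URL is a placeholder assumption (the record does not quote the actual service address); RA, DEC and SR are the standard Simple Cone Search parameters, in decimal degrees, and the service returns a VOTable.

```python
# Minimal sketch of an IVOA Simple Cone Search request against the CSC.
# The endpoint URL below is a placeholder assumption; consult the Chandra X-ray
# Center documentation for the actual service address.
from urllib.parse import urlencode
from urllib.request import urlopen

CONE_SEARCH_URL = "https://example.cfa.harvard.edu/csc/coneSearch"  # placeholder

def cone_search(ra_deg, dec_deg, radius_deg):
    """Return the VOTable (XML text) listing catalog sources within the cone."""
    params = urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})
    with urlopen(f"{CONE_SEARCH_URL}?{params}") as response:
        return response.read().decode("utf-8")

# Example: all catalogued sources within 0.05 degrees of a position of interest.
votable_xml = cone_search(ra_deg=246.0, dec_deg=-24.5, radius_deg=0.05)
print(votable_xml[:500])
```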

  16. 75 FR 19302 - Radiation Sources on Army Land

    Science.gov (United States)

    2010-04-14

    ... possession of ionizing radiation sources by non-Army agencies (including their civilian contractors) on an... radiation sources on Army land. The Army requires Non-Army agencies (including their civilian contractors... ionizing radiation sources on an Army Installation. For the purpose of this proposed rule, ``ionizing...

  17. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    Science.gov (United States)

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method, comprising semiquantitative and quantitative assessment of valvular regurgitation, was used as the reference method, including ERO area by 2D PISA for assigning the regurgitation severity grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves, compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation with assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determining regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
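
    For readers unfamiliar with the flow-convergence method being compared against, the hemispheric PISA relations can be written out directly: regurgitant flow = 2πr²·Va, ERO = flow / Vmax, and regurgitant volume = ERO × VTI. The sketch below is a generic textbook calculation with invented input values, not code or data from the study.

```python
import math

def pisa_ero_area(pisa_radius_cm, aliasing_velocity_cm_s, peak_velocity_cm_s):
    """ERO area (cm^2) from the hemispheric PISA model:
    regurgitant flow = 2*pi*r^2 * Va, and ERO = flow / Vmax."""
    regurgitant_flow = 2.0 * math.pi * pisa_radius_cm ** 2 * aliasing_velocity_cm_s
    return regurgitant_flow / peak_velocity_cm_s

def regurgitant_volume_ml(ero_area_cm2, regurgitant_vti_cm):
    """Regurgitant volume per beat (mL) = ERO area x velocity-time integral."""
    return ero_area_cm2 * regurgitant_vti_cm

# Illustrative mitral regurgitation example (all values invented):
ero = pisa_ero_area(pisa_radius_cm=0.9, aliasing_velocity_cm_s=38.0,
                    peak_velocity_cm_s=500.0)
rvol = regurgitant_volume_ml(ero, regurgitant_vti_cm=140.0)
print(f"ERO area ~ {ero:.2f} cm^2, regurgitant volume ~ {rvol:.0f} mL")
```

    By contrast, 3D VCA is obtained by direct planimetry of the vena contracta in a 3D colour Doppler dataset, so it does not rely on the hemispheric flow-convergence assumption, which is one reason it copes better with multiple jets, as the abstract reports.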

  18. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and cannot be replaced by any other metrological tool. The uniqueness of MS stems from the fact that it enables direct identification of molecules based on mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories have focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working in some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  19. Source apportionment of fine organic aerosol in Mexico City during the MILAGRO experiment 2006

    Directory of Open Access Journals (Sweden)

    E. A. Stone

    2008-03-01

    Full Text Available Organic carbon (OC) comprises a large fraction of fine particulate matter (PM2.5) in Mexico City. Daily and select 12-h PM2.5 samples were collected at urban and peripheral sites in Mexico City from 17–30 March 2006. Samples were analyzed for OC and elemental carbon (EC) using thermal-optical filter-based methods. Real-time water-soluble organic carbon (WSOC) was measured at the peripheral site. Organic compounds, particularly molecular markers, were quantified by Soxhlet extraction with methanol and dichloromethane, derivatization, and gas chromatography with mass spectrometric detection (GC-MS). A chemical mass balance (CMB) model based on molecular marker species was used to determine the relative contribution of major sources to ambient OC. Motor vehicles, including diesel and gasoline, consistently accounted for 49% of OC in the urban area and 32% on the periphery. The daily contribution of biomass burning to OC was highly variable, and ranged from 5–26% at the urban site and 7–39% at the peripheral site. The remaining OC unapportioned to primary sources showed a strong correlation with WSOC and was considered to be secondary in nature. Comparison of temporally resolved OC showed that contributions from primary aerosol sources during daylight hours were not significantly different from those at night. This study provides a quantitative understanding of the important sources of OC during the MILAGRO 2006 field campaign.
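
    The chemical mass balance approach itself is a linear model: each ambient marker concentration is expressed as the sum over sources of a fixed source-profile fraction multiplied by an unknown source contribution, and the contributions are obtained by (weighted) least squares. The sketch below illustrates the idea with ordinary least squares; the marker names, source profiles and concentrations are invented placeholders, not MILAGRO data, and real CMB applications use effective-variance weighting with non-negativity constraints.

```python
# Minimal sketch of the chemical mass balance (CMB) idea behind molecular-marker
# source apportionment. All profiles and concentrations are fictitious.
import numpy as np

markers = ["levoglucosan", "hopanes", "elemental_carbon"]
sources = ["biomass_burning", "gasoline_vehicles", "diesel_vehicles"]

# F[i, j] = mass fraction of marker i in the OC emitted by source j (invented).
F = np.array([
    [0.0800, 0.000, 0.00],
    [0.0001, 0.002, 0.001],
    [0.0500, 0.300, 0.90],
])

# Ambient marker concentrations (ug per m^3, invented).
c = np.array([0.12, 0.004, 3.5])

# Solve c ~= F @ s for the source contributions s by ordinary least squares.
s, _, _, _ = np.linalg.lstsq(F, c, rcond=None)
for name, contribution in zip(sources, s):
    print(f"{name}: {contribution:.2f} ug OC per m^3")
```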

  20. Orphan sources in Slovenia

    International Nuclear Information System (INIS)

    Janzekovic, H.; Cesarek, J.

    2005-01-01

    For decades, international standards and requirements have mandated strict control over all life-cycle phases of radioactive sources in order to prevent risks associated with exposure of people and the environment. Despite this, orphan sources have become a serious problem as a consequence of the growth of economic transactions in many countries in Europe and worldwide. Countries and international organisations, aware of this emerging problem, are trying to gain control over orphan sources using different approaches. These approaches include control over sources before they can become orphan sources. In addition, countries are developing action plans for cases in which an orphan source is found. The problem of orphan sources in Slovenia is discussed on the basis of case studies from recent years. While in the 1990s only a few cases of orphan sources were identified, their number has increased substantially since 2003. The paper discusses the general reasons for the phenomenon of orphan sources as well as the experience gained in regaining control over them. (author)