WorldWideScience

Sample records for corrected satellite-based quantitative

  1. Evaluation of quantitative satellite-based retrievals of volcanic ash clouds

    Science.gov (United States)

    Schneider, D. J.; Pavolonis, M. J.; Bojinski, S.; Siddans, R.; Thomas, G.

    2015-12-01

    Volcanic ash clouds are a serious hazard to aviation, and mitigation requires a robust system of volcano monitoring, eruption detection, characterization of cloud properties, forecast of cloud movement, and communication of warnings. Several research groups have developed quantitative satellite-based volcanic ash products, and some of these are in operational use by Volcanic Ash Advisory Centers around the world to aid in characterizing cloud properties and forecasting regions of ash hazard. The algorithms applied to the satellite data utilize a variety of techniques, and thus produce results that differ. The World Meteorological Organization has recently sponsored an intercomparison study of satellite-based retrievals with four goals: 1) to establish a validation protocol for satellite-based volcanic ash products, 2) to quantify and understand differences in products, 3) to develop best practices, and 4) to standardize volcanic cloud geophysical parameters. Six volcanic eruption cases were considered in the intercomparison: Eyjafjallajökull, Grímsvötn, Kelut, Kirishimayama, Puyehue-Cordón Caulle, and Sarychev Peak. Twenty-four algorithms were utilized, which retrieved parameters including ash cloud top height, ash column mass loading, ash effective radius, and ash optical depth at visible and thermal-infrared wavelengths. Results were compared to space-based, airborne, and ground-based lidars; complementary satellite retrievals; and manual "expert evaluation" of ash extent. The intercomparison results will feed into the International Civil Aviation Organization "Roadmap for International Airways Volcano Watch", which integrates volcanic meteorological information into decision support systems for aircraft operations.

  2. Evaluation of Bias Correction Method for Satellite-Based Rainfall Data

    Directory of Open Access Journals (Sweden)

    Haris Akram Bhatti

    2016-06-01

    With advances in remote sensing technology, satellite-based rainfall estimates are gaining traction in the field of hydrology, particularly in rainfall-runoff modeling. Since these estimates are affected by errors, correction is required. In this study, we tested the high-resolution National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution are aggregated to daily totals to match in-situ observations for the period 2003–2010. The study objectives are to assess the bias of the satellite estimates, to identify the optimum window size for application of bias correction, and to test the effectiveness of the bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SWs) of 3, 5, 7, 9, …, 31 days with the aim of assessing the error distribution between the in-situ observations and CMORPH estimates. We tested forward, central, and backward window (FW, CW, and BW) schemes to assess the effect of time integration on accumulated rainfall. The accuracy of cumulative rainfall depth is assessed by the root mean squared error (RMSE). To systematically correct all CMORPH estimates, station-based bias factors are spatially interpolated to yield a bias factor map. The reliability of the interpolation is assessed by cross-validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to yield bias-corrected CMORPH estimates. Findings are evaluated by RMSE, the correlation coefficient (r), and the standard deviation (SD). Results showed the existence of bias in the CMORPH rainfall. The 7-day SW approach is found to perform best for bias correction of CMORPH rainfall. The outcome of this study demonstrates the efficiency of our bias correction approach.
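
    The multiplicative, window-based correction described above can be sketched in a few lines. The daily series and the `sw_bias_factor` helper below are invented for illustration, not the study's actual data or code:

```python
import numpy as np

def sw_bias_factor(gauge, cmorph, window=7):
    """Multiplicative bias factors over sequential (non-overlapping) windows,
    a hypothetical helper illustrating the 7-day SW scheme."""
    factors = np.ones(len(cmorph))
    for start in range(0, len(gauge), window):
        g = gauge[start:start + window].sum()
        s = cmorph[start:start + window].sum()
        factors[start:start + window] = g / s if s > 0 else 1.0
    return factors

def rmse(a, b):
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

# Two weeks of invented daily rainfall (mm): gauge truth vs. satellite estimate
gauge  = np.array([0., 5., 2., 0., 8., 1., 0., 3., 4., 0., 0., 6., 2., 1.])
cmorph = np.array([0., 3., 1., 0., 5., 2., 0., 5., 6., 0., 1., 9., 3., 2.])

corrected = cmorph * sw_bias_factor(gauge, cmorph, window=7)
assert rmse(gauge, corrected) < rmse(gauge, cmorph)  # correction reduces error
```

    In the study itself the factors are computed per station and spatially interpolated into a bias map before being applied to the CMORPH images.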

  3. Correcting satellite-based precipitation products from SMOS soil moisture data assimilation using two models of different complexity

    Science.gov (United States)

    Román-Cascón, Carlos; Pellarin, Thierry; Gibon, François

    2017-04-01

    Real-time precipitation information at the global scale is useful for many applications. However, real-time satellite-based precipitation products are known to be biased relative to values observed in situ. On the other hand, the information about precipitation contained in soil moisture data can be very useful for improving precipitation estimation, since the evolution of soil moisture in a given area is highly influenced by the amount of rainfall after a rain event. In this context, soil moisture data from the Soil Moisture and Ocean Salinity (SMOS) satellite are used to correct the precipitation provided by real-time satellite-based products such as CMORPH, TRMM-3B42RT, or PERSIANN. In this work, we test an algorithm based on the assimilation of SMOS measurements into two models of different complexity: a simple hydrological model (the Antecedent Precipitation Index (API)) and a state-of-the-art complex land-surface model (Surface Externalisée (SURFEX)). We show how the assimilation technique, based on a particle filter method, improves the correlation and root mean squared error (RMSE) of precipitation estimates, with slightly better results for the simpler (and computationally less expensive) API model. The methodology has been evaluated over six years at ten sites around the world with different features, showing its limitations in regions affected by mountainous terrain or by strong radio-frequency interference (RFI), which notably degrades the quality of the soil moisture retrievals from SMOS brightness temperatures. The presented results are promising for a potential near-real-time application at the global scale.
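
    A minimal sketch of the idea, assuming a toy API model and synthetic numbers (the particle count, perturbation model, and observation noise below are illustrative, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def api_step(w, p, gamma=0.9):
    """Antecedent Precipitation Index: exponential soil drying plus rainfall."""
    return gamma * w + p

# Invented scene: true rainfall of 6 mm, a biased satellite estimate of 4 mm,
# and a noisy "SMOS-like" soil-water observation consistent with the truth.
w_prev, p_true, p_sat = 10.0, 6.0, 4.0
w_obs = api_step(w_prev, p_true) + rng.normal(0.0, 0.1)

# Particle filter: perturb the satellite rainfall, propagate each particle
# through the model, weight by the soil-moisture likelihood, and average.
n_particles, sigma_obs = 500, 0.5
p_particles = p_sat * rng.lognormal(0.0, 0.5, n_particles)
w_particles = api_step(w_prev, p_particles)
weights = np.exp(-0.5 * ((w_particles - w_obs) / sigma_obs) ** 2)
weights /= weights.sum()
p_corrected = float(np.sum(weights * p_particles))

# The corrected rainfall should sit closer to the truth than the raw estimate.
assert abs(p_corrected - p_true) < abs(p_sat - p_true)
```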

  4. Correcting rainfall using satellite-based surface soil moisture retrievals: The Soil Moisture Analysis Rainfall Tool (SMART)

    Science.gov (United States)

    Crow, W. T.; van den Berg, M. J.; Huffman, G. J.; Pellarin, T.

    2011-08-01

    Recently, Crow et al. (2009) developed an algorithm for enhancing satellite-based land rainfall products via the assimilation of remotely sensed surface soil moisture retrievals into a water balance model. As a follow-up, this paper describes the benefits of modifying their approach to incorporate more complex data assimilation and land surface modeling methodologies. Specific modifications improving rainfall estimates are assembled into the Soil Moisture Analysis Rainfall Tool (SMART), and the resulting algorithm is applied outside the contiguous United States for the first time, with an emphasis on West African sites instrumented as part of the African Monsoon Multidisciplinary Analysis experiment. Results demonstrate that the SMART algorithm is superior to the Crow et al. baseline approach and is capable of broadly improving coarse-scale rainfall accumulation measurements with low risk of degradation. Comparisons with existing multisensor, satellite-based precipitation data products suggest that the introduction of soil moisture information from the Advanced Microwave Scanning Radiometer via SMART provides as much coarse-scale (3-day, 1°) rainfall accumulation information as thermal infrared satellite observations, and more information than monthly rain gauge observations in poorly instrumented regions.
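
    The core loop of this family of methods (not SMART's actual implementation) can be caricatured as: run a water balance forced by satellite rainfall, nudge its state toward soil-moisture retrievals, and map positive analysis increments back into rainfall corrections. The gain, decay factor, and series below are invented:

```python
import numpy as np

# Hypothetical daily series: satellite rainfall and rescaled soil-moisture
# observations; gamma (API decay) and gain (nudging strength) are invented.
gamma, gain = 0.9, 0.4
p_sat  = np.array([0.0, 5.0, 0.0, 2.0])      # satellite rainfall (mm/day)
sm_obs = np.array([9.0, 14.5, 12.5, 12.0])   # soil-moisture observations

w, p_corr = 10.0, []
for p, obs in zip(p_sat, sm_obs):
    w = gamma * w + p                  # background water-balance forecast
    incr = gain * (obs - w)            # analysis increment from soil moisture
    w += incr                          # updated (analysis) state
    p_corr.append(max(p + incr, 0.0))  # positive increments imply missed rain
p_corr = np.array(p_corr)
```

    A day on which the soil-moisture observation exceeds the forecast state gets its rainfall revised upward, and vice versa.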

  5. Quantitative SPECT reconstruction using CT-derived corrections

    Science.gov (United States)

    Willowson, Kathy; Bailey, Dale L.; Baldock, Clive

    2008-06-01

    A method for achieving quantitative single-photon emission computed tomography (SPECT) based upon corrections derived from x-ray computed tomography (CT) data is presented. A CT-derived attenuation map is used to perform transmission-dependent scatter correction (TDSC) in conjunction with non-uniform attenuation correction. The original CT data are also utilized to correct for partial volume effects in small volumes of interest. The accuracy of the quantitative technique has been evaluated with phantom experiments and clinical lung ventilation/perfusion SPECT/CT studies. A comparison of calculated values with the known total activities and concentrations in a mixed-material cylindrical phantom, and in liver and cardiac inserts within an anthropomorphic torso phantom, produced accurate results. The total activity in corrected ventilation-subtracted perfusion images was compared to the calibrated injected dose of [99mTc]-MAA (macro-aggregated albumin). The average difference over 12 studies between the known and calculated activities was found to be -1%, with a range of ±7%.
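
    A CT-derived attenuation map of the kind used here is typically obtained by a bilinear conversion from CT numbers to attenuation coefficients at the SPECT photon energy. A generic sketch, where the 140 keV water value and the bone slope are assumed illustrative constants rather than parameters from this paper:

```python
import numpy as np

def hu_to_mu(hu, mu_water=0.153):
    """Bilinear CT-number-to-attenuation conversion (cm^-1). The water value
    0.153 cm^-1 (~140 keV) and the 0.5 bone slope are assumed constants."""
    hu = np.asarray(hu, dtype=float)
    mu = np.where(hu <= 0,
                  mu_water * (1.0 + hu / 1000.0),        # air-water mixtures
                  mu_water * (1.0 + 0.5 * hu / 1000.0))  # water-bone mixtures
    return np.clip(mu, 0.0, None)

assert float(hu_to_mu(-1000)) == 0.0             # air attenuates ~nothing
assert abs(float(hu_to_mu(0)) - 0.153) < 1e-12   # water, by definition of HU
```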

  6. Error corrections for quantitative thermal neutron computed tomography

    Science.gov (United States)

    Shi, Liang

    A state-of-the-art, high-spatial-resolution neutron radioscopy imaging system, combining a two-mirror reflection design with a Li-6 scintillation screen and a cooled CCD camera, was designed and developed at the RSEC at Penn State. Radiation shielding was applied to the imaging system to achieve a higher spatial resolution. Modulation transfer function (MTF) analysis shows that a spatial resolution of 116 microns was achieved. The imaging system was successfully applied to diagnostic measurements of hydrogen fuel cells. A quantitative neutron computed tomography (NCT) model was developed which confirmed the fundamental computed tomography theory. The model justified the partial-volume neutron computed tomography water/ice mass evaluation technique which was designed and tested by Heller. The evaluation results for the water/ice mass using the NCT method were very close to the theoretical value. Sample and background neutron scattering effects were identified as one of the errors that influence the accuracy of quantitative measurement using the NCT method. The neutron scattering effect induced cupping artifacts that also contributed to the error in the measurement of water/ice mass using NCT. A method was developed to reduce the cupping artifacts in the reconstructed slice of the water/ice column. The geometric unsharpness, Ug, was demonstrated to be the predominant source of error for the accuracy of the 3-D water/ice mass evaluation technique. A unique method was established to reduce the geometric unsharpness Ug associated with the divergent neutron beam. Compared to the deconvolution algorithm used to de-blur the image projection, this method has the advantage of minimizing the unsharpness while keeping the degree of cupping through the water column the same. For 3-D water/ice mass evaluation purposes, this method is the better choice for error correction in the water quantification technique.
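
    Quantitative NCT rests on the Beer-Lambert law: the negative log of the flat-field-normalized transmission is the line integral of the attenuation coefficient, which the reconstruction inverts. A one-ray sketch with an assumed round-number attenuation coefficient for water (not a value from this thesis):

```python
import numpy as np

# One-ray Beer-Lambert sketch: the log-normalized transmission equals the
# line integral of mu, the quantity CT reconstruction inverts.
mu_water, thickness, I0 = 3.5, 0.4, 10000.0   # cm^-1, cm, open-beam counts
I = I0 * np.exp(-mu_water * thickness)        # attenuated beam
line_integral = -np.log(I / I0)
assert abs(line_integral - mu_water * thickness) < 1e-9
```

    Scattered neutrons add to I without having followed the straight-line path, which biases the log-transmission low in the object's center; that is the origin of the cupping artifacts discussed above.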

  7. Magnetic Resonance-based Motion Correction for Quantitative PET in Simultaneous PET-MR Imaging.

    Science.gov (United States)

    Rakvongthai, Yothin; El Fakhri, Georges

    2017-07-01

    Motion degrades image quality and quantitation of PET images, and is an obstacle to quantitative PET imaging. Simultaneous PET-MR offers a tool that can be used for correcting the motion in PET images by using anatomic information from MR imaging acquired concurrently. Motion correction can be performed by transforming a set of reconstructed PET images into the same frame or by incorporating the transformation into the system model and reconstructing the motion-corrected image. Several phantom and patient studies have validated that MR-based motion correction strategies have great promise for quantitative PET imaging in simultaneous PET-MR. Copyright © 2017 Elsevier Inc. All rights reserved.
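
    The first strategy mentioned, transforming reconstructed gated frames into a common reference before averaging, can be illustrated with a toy integer-pixel shift standing in for MR-derived motion fields:

```python
import numpy as np

def shift_image(img, dy, dx):
    """Integer-pixel rigid shift, a toy stand-in for an MR motion estimate."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# A hot spot imaged in three gates with known motion: mapping each gate back
# to the reference frame before averaging preserves quantitation, while a
# naive average smears the activity.
ref = np.zeros((8, 8)); ref[4, 4] = 90.0
motions = [(0, 0), (2, 0), (0, -2)]
gates = [shift_image(ref, dy, dx) for dy, dx in motions]

naive = np.mean(gates, axis=0)
corrected = np.mean([shift_image(g, -dy, -dx)
                     for g, (dy, dx) in zip(gates, motions)], axis=0)
assert naive[4, 4] < ref[4, 4]       # motion-blurred peak underestimates
assert np.allclose(corrected, ref)   # motion-corrected average recovers it
```

    The second strategy, folding the transformation into the system model during reconstruction, achieves the same goal without interpolating already-reconstructed images.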

  8. 14 CFR 141.91 - Satellite bases.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Satellite bases. 141.91 Section 141.91... OTHER CERTIFICATED AGENCIES PILOT SCHOOLS Operating Rules § 141.91 Satellite bases. The holder of a... assistant chief instructor is designated for each satellite base, and that assistant chief instructor is...

  9. Improving quantitative dosimetry in (177)Lu-DOTATATE SPECT by energy window-based scatter corrections

    DEFF Research Database (Denmark)

    de Nijs, Robin; Lagerburg, Vera; Klausen, Thomas L

    2014-01-01

    and the activity, which depends on the collimator type, the utilized energy windows and the applied scatter correction techniques. In this study, energy window subtraction-based scatter correction methods are compared experimentally and quantitatively. MATERIALS AND METHODS: (177)Lu SPECT images of a phantom...... technique, the measured ratio was close to the real ratio, and the differences between spheres were small. CONCLUSION: For quantitative (177)Lu imaging MEGP collimators are advised. Both energy peaks can be utilized when the ESSE correction technique is applied. The difference between the calculated...
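
    One widely used energy-window subtraction method of the kind compared in this study is the triple-energy-window (TEW) estimate, which interpolates the scatter under the photopeak from two narrow flanking windows (the counts and window widths below are invented, and TEW is named here as a canonical example rather than the record's preferred ESSE technique):

```python
def tew_scatter(c_lower, c_upper, w_scatter, w_peak):
    """Triple-energy-window estimate: trapezoidal interpolation of scatter
    under the photopeak from two narrow flanking windows (counts, keV)."""
    return (c_lower / w_scatter + c_upper / w_scatter) * w_peak / 2.0

# Invented counts for a 208 keV (177)Lu photopeak with 6 keV flanking windows
c_peak, c_low, c_up = 12000.0, 900.0, 300.0
scatter = tew_scatter(c_low, c_up, w_scatter=6.0, w_peak=42.0)
primary = c_peak - scatter   # scatter-corrected photopeak counts
```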

  10. Satellite-based laser windsounder

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, J.F.; Czuchlewski, S.J.; Quick, C.R. [and others]

    1997-08-01

    This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project's primary objective is to determine the technical feasibility of using satellite-based laser wind sensing systems for detailed study of winds, aerosols, and particulates around and downstream of suspected proliferation facilities. Extensive interactions with the relevant operational organization resulted in enthusiastic support and useful guidance with respect to measurement requirements and priorities. Four candidate wind sensing techniques were evaluated, and the incoherent Doppler technique was selected. A small satellite concept design study was completed to identify the technical issues inherent in a proof-of-concept small satellite mission. Use of a Mach-Zehnder interferometer instead of a Fabry-Perot would significantly simplify the optical train and could reduce weight, and possibly power, requirements with no loss of performance. A breadboard Mach-Zehnder interferometer-based system has been built to verify these predictions. Detailed plans were made for resolving other issues through construction and testing of a ground-based lidar system in collaboration with the University of Wisconsin, and through numerical lidar wind data assimilation studies.
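
    The wind-sounding principle behind all the candidate Doppler techniques is the same: a line-of-sight wind v shifts the backscattered frequency by 2v/λ, and the interferometer (Fabry-Perot or Mach-Zehnder) resolves that shift. Illustrative numbers only:

```python
# Doppler wind lidar principle with illustrative values.
wavelength = 355e-9                         # m (a common tripled Nd:YAG line)
v_los = 10.0                                # m/s line-of-sight wind
doppler_shift = 2.0 * v_los / wavelength    # Hz (~56 MHz for these values)
v_recovered = doppler_shift * wavelength / 2.0
assert abs(v_recovered - v_los) < 1e-9
```

    Resolving tens of MHz against an optical carrier of ~845 THz is what drives the demanding interferometer stability requirements.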

  11. Calibration and correction procedure for quantitative out-of-plane shearography

    OpenAIRE

    Zastavnik, Filip; Pyl, Lincy; Gu, Jun; Sol, Hugo; Kersemans, Mathias; van Paepegem, Wim

    2015-01-01

    Quantitative shearography applications continue to gain practical importance. However, a study of the errors inherent in shearography measurements, related to calibration of the instrument and correction of the results, is most often lacking. This paper proposes a calibration and correction procedure for the out-of-plane shearography with a Michelson interferometer. The calibration is based on the shearography measurement of known rigid-body rotations of a flat plate and accounts for the loca...

  12. Satellite-Based Quantum Communications

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard J [Los Alamos National Laboratory; Nordholt, Jane E [Los Alamos National Laboratory; McCabe, Kevin P [Los Alamos National Laboratory; Newell, Raymond T [Los Alamos National Laboratory; Peterson, Charles G [Los Alamos National Laboratory

    2010-09-20

    Single-photon quantum communications (QC) offers the attractive feature of 'future proof', forward security rooted in the laws of quantum physics. Ground-based quantum key distribution (QKD) experiments in optical fiber have attained transmission ranges in excess of 200 km, but for larger distances we proposed a methodology for satellite-based QC. Over the past decade we have devised solutions to the technical challenges of satellite-to-ground QC, and we now have a clear concept for how space-based QC could be performed and potentially utilized within a trusted QKD network architecture. Functioning as a trusted QKD node, a QC satellite ('QC-sat') could deliver secret keys to the key stores of ground-based trusted QKD network nodes, to each of which multiple users are connected by optical fiber or free-space QC. A QC-sat could thereby extend quantum-secured connectivity to geographically disjoint domains, separated by continental or inter-continental distances. In this paper we describe our system concept that makes QC feasible with low-earth orbit (LEO) QC-sats (200-2,000 km altitude orbits), and the results of link modeling of expected performance. Using the architecture that we have developed, LEO satellite-to-ground QKD will be feasible with secret bit yields of several hundred 256-bit AES keys per contact. With multiple ground sites separated by ≈100 km, mitigation of cloudiness over any single ground site would be possible, potentially allowing multiple contact opportunities each day. The essential next step is an experimental QC-sat. A number of LEO platforms would be suitable, ranging from a dedicated, three-axis stabilized small satellite, to a secondary experiment on an imaging satellite, to the ISS. With one or more QC-sats, low-latency quantum-secured communications could then be provided to ground-based users on a global scale. Air-to-ground QC would also be possible.
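
    The key material a QC-sat delivers ultimately comes from protocols such as BB84, in which roughly half of the transmitted qubits survive basis sifting before error correction and privacy amplification. A toy sifting sketch, with no channel loss, noise, or eavesdropping modeled:

```python
import random

random.seed(1)

# Toy BB84 sifting: the parties keep only the positions where their
# independently chosen bases agree -- about half on average.
n = 10000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
bob_bases   = [random.randint(0, 1) for _ in range(n)]

sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
          if a == b]
n_aes_keys = len(sifted) // 256   # 256-bit AES keys the sifted string seeds
assert abs(len(sifted) / n - 0.5) < 0.05
```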

  13. Effects of scatter and attenuation correction on quantitative assessment of regional cerebral blood flow with SPECT.

    Science.gov (United States)

    Iida, H; Narita, Y; Kado, H; Kashikura, A; Sugawara, S; Shoji, Y; Kinoshita, T; Ogawa, T; Eberl, S

    1998-01-01

    Appropriate corrections for scatter and attenuation are prerequisites for quantitative SPECT studies. However, in most cerebral SPECT studies, uniform attenuation in the head is assumed, and scatter is usually neglected. This study evaluated the effect of attenuation correction and scatter correction on quantitative values and image contrast. Studies were performed in six normal volunteers (ages 22-26 yr) following intravenous 123I-IMP administration using a rotating, dual-head gamma camera. A transmission scan was acquired with a 99mTc rod source (74 MBq) placed at the focus of a symmetrical fanbeam collimator. Data were reconstructed using two attenuation coefficient (mu) maps: a quantitative mu map from the transmission scan and a uniform mu map generated by edge detection of the reconstructed images. Narrow- and broad-beam mu values were used with and without scatter correction, respectively. Scatter was corrected with transmission-dependent convolution subtraction and triple-energy window techniques. Quantitative rCBF images were calculated by the previously validated IMP-autoradiographic technique, and they were compared with those obtained by (15)O-water and PET. SPECT and PET images were registered to MRI studies, and rCBF values were compared in 39 ROIs selected on MRI. Clear differences were observed in rCBF images between the measured and constant mu maps in the lower slices due to the airways and in the higher slices due to increased skull attenuation. However, values in white matter regions changed by 10%-20% after scatter correction, increasing the gray-to-white ratio to be close to that of the PET measurement. The rCBF values from the two scatter correction techniques were not significantly different, but the triple-energy window technique suffered from increased noise. After scatter correction, rCBF values were in good agreement with those measured by PET. This study shows that little loss in accuracy results from assuming a uniform mu map, whereas scatter correction remains necessary for quantitative accuracy.

  14. Recidivism, Disciplinary History, and Institutional Adjustment: A Quantitative Study Examining Correctional Education Programs

    Science.gov (United States)

    Flamer, Eric, Sr.

    2012-01-01

    Establishing college-degree programs for prison inmates is an evidence-based effective instructional strategy in reducing recidivism. Evaluating academic arenas as a resource to improve behavior and levels of functioning within correctional facilities is a necessary component of inmate academic programs. The purpose of this quantitative,…

  15. Calibration and correction procedure for quantitative out-of-plane shearography

    Science.gov (United States)

    Zastavnik, Filip; Pyl, Lincy; Gu, Jun; Sol, Hugo; Kersemans, Mathias; Van Paepegem, Wim

    2015-04-01

    Quantitative shearography applications continue to gain practical importance. However, a study of the errors inherent in shearography measurements, related to calibration of the instrument and correction of the results, is most often lacking. This paper proposes a calibration and correction procedure for out-of-plane shearography with a Michelson interferometer. The calibration is based on the shearography measurement of known rigid-body rotations of a flat plate and accounts for the local variability of the shearing distance. The correction procedure further compensates for the variability of the sensitivity vector and separates the two out-of-plane deformation gradients when they are coupled in the measurement. The correction procedure utilizes two shearography measurements of the same experiment with distinct shearing distances. The effectiveness of the proposed calibration procedure is demonstrated for the static deformation of a centrally loaded plate, where the discrepancy between experimental and finite element analysis results is minimized.
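
    The underlying out-of-plane shearography relation, δφ = (4π/λ)·δx·(∂w/∂x), shows why calibrating the local shearing distance δx is what makes the measurement quantitative. A numeric round trip with illustrative values (not the paper's setup):

```python
import numpy as np

# Out-of-plane shearography: dphi = (4*pi/lambda) * shear_dx * dw/dx, so an
# error in the calibrated shearing distance maps directly into dw/dx.
lam      = 532e-9   # laser wavelength (m)
shear_dx = 5e-3     # calibrated local shearing distance (m)
dw_dx    = 2e-5     # out-of-plane deformation gradient
dphi = 4.0 * np.pi / lam * shear_dx * dw_dx       # measured phase (rad)
recovered = dphi * lam / (4.0 * np.pi * shear_dx) # invert the relation
assert abs(recovered - dw_dx) < 1e-12
```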

  16. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies

    Energy Technology Data Exchange (ETDEWEB)

    Silva-Rodríguez, Jesús, E-mail: jesus.silva.rodriguez@sergas.es; Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es [Fundación Ramón Domínguez, Santiago de Compostela, Galicia (Spain); Servicio de Medicina Nuclear, Complexo Hospitalario Universidade de Santiago de Compostela (USC), 15782, Galicia (Spain); Grupo de Imaxe Molecular, Instituto de Investigación Sanitarias (IDIS), Santiago de Compostela, 15706, Galicia (Spain); Sánchez, Manuel; Mosquera, Javier; Luna-Vega, Víctor [Servicio de Radiofísica y Protección Radiológica, Complexo Hospitalario Universidade de Santiago de Compostela (USC), 15782, Galicia (Spain); Cortés, Julia; Garrido, Miguel [Servicio de Medicina Nuclear, Complexo Hospitalario Universitario de Santiago de Compostela, 15706, Galicia, Spain and Grupo de Imaxe Molecular, Instituto de Investigación Sanitarias (IDIS), Santiago de Compostela, 15706, Galicia (Spain); Pombar, Miguel [Servicio de Radiofísica y Protección Radiológica, Complexo Hospitalario Universitario de Santiago de Compostela, 15706, Galicia (Spain); Ruibal, Álvaro [Servicio de Medicina Nuclear, Complexo Hospitalario Universidade de Santiago de Compostela (USC), 15782, Galicia (Spain); Grupo de Imaxe Molecular, Instituto de Investigación Sanitarias (IDIS), Santiago de Compostela, 15706, Galicia (Spain); Fundación Tejerina, 28003, Madrid (Spain)

    2014-05-15

    Purpose: Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standardized uptake values (SUVs) for this effect in clinical routine. Methods: One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI-based method. In addition, the 50 patients with the highest extravasated doses were also evaluated using a threshold-based method. Results: Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate for the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%), with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify the fraction of patients that might be corrected for the paravenous injection effect. Conclusions: The authors propose the use of a manual ROI-based method for estimating the effectively administered FDG dose and then correcting SUV quantification in those patients fulfilling the proposed criterion.
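
    Once the extravasated fraction has been estimated from the ROI, the SUV correction itself is a simple rescaling by the effectively administered dose. A hypothetical example with invented numbers:

```python
# If a fraction f of the injected activity stays at the injection site, the
# effectively administered dose is D_inj * (1 - f), and SUVs computed with
# D_inj underestimate uptake by that factor.
d_inj, f_extrav = 300.0, 0.10        # MBq injected; 10% in the ROI estimate
suv_uncorrected = 5.0
d_eff = d_inj * (1.0 - f_extrav)     # effectively administered dose
suv_corrected = suv_uncorrected * d_inj / d_eff
assert abs(suv_corrected - 5.0 / 0.9) < 1e-9   # ~11% upward correction
```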

  17. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
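
    The low-order polynomial interference targeted by BCC-PLS can be illustrated with a plain least-squares baseline removal, a simplified stand-in for the constraint that BCC-PLS embeds directly in the PLS weight selection; the synthetic spectrum is invented:

```python
import numpy as np

def remove_baseline(spectrum, x, order=2):
    """Subtract a least-squares low-order polynomial baseline."""
    coeffs = np.polyfit(x, spectrum, order)
    return spectrum - np.polyval(coeffs, x)

# Synthetic ATR-FTIR-like spectrum: one analyte band on a quadratic baseline
x = np.linspace(0.0, 1.0, 200)
band = np.exp(-0.5 * ((x - 0.5) / 0.02) ** 2)
baseline = 1.0 + 2.0 * x - 1.5 * x ** 2
corrected = remove_baseline(band + baseline, x, order=2)
assert np.max(corrected) > 0.7      # analyte band largely preserved
assert abs(corrected[0]) < 0.2      # baseline largely removed
```

    A two-step detrend-then-regress pipeline like this leaves the baseline estimate uncertain; embedding the constraint in the calibration itself is what the BCC-PLS approach is designed to avoid.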

  18. Quantitative Evaluation of 2 Scatter-Correction Techniques for 18F-FDG Brain PET/MRI in Regard to MR-Based Attenuation Correction.

    Science.gov (United States)

    Teuho, Jarmo; Saunavaara, Virva; Tolvanen, Tuula; Tuokkola, Terhi; Karlsson, Antti; Tuisku, Jouni; Teräs, Mika

    2017-10-01

    In PET, corrections for photon scatter and attenuation are essential for visual and quantitative consistency. MR attenuation correction (MRAC) is generally conducted by image segmentation and assignment of discrete attenuation coefficients, which offer limited accuracy compared with CT attenuation correction. Potential inaccuracies in MRAC may affect scatter correction, because the attenuation image (μ-map) is used in single scatter simulation (SSS) to calculate the scatter estimate. We assessed the impact of MRAC to scatter correction using 2 scatter-correction techniques and 3 μ-maps for MRAC. Methods: The tail-fitted SSS (TF-SSS) and a Monte Carlo-based single scatter simulation (MC-SSS) algorithm implementations on the Philips Ingenuity TF PET/MR were used with 1 CT-based and 2 MR-based μ-maps. Data from 7 subjects were used in the clinical evaluation, and a phantom study using an anatomic brain phantom was conducted. Scatter-correction sinograms were evaluated for each scatter correction method and μ-map. Absolute image quantification was investigated with the phantom data. Quantitative assessment of PET images was performed by volume-of-interest and ratio image analysis. Results: MRAC did not result in large differences in scatter algorithm performance, especially with TF-SSS. Scatter sinograms and scatter fractions did not reveal large differences regardless of the μ-map used. TF-SSS showed slightly higher absolute quantification. The differences in volume-of-interest analysis between TF-SSS and MC-SSS were 3% at maximum in the phantom and 4% in the patient study. Both algorithms showed excellent correlation with each other with no visual differences between PET images. MC-SSS showed a slight dependency on the μ-map used, with a difference of 2% on average and 4% at maximum when a μ-map without bone was used. Conclusion: The effect of different MR-based μ-maps on the performance of scatter correction was minimal in non-time-of-flight 18F-FDG PET

  19. Using satellite-based measurements to explore ...

    Science.gov (United States)

    New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently spatially averaged) measurements of atmospheric conditions to diagnose the occurrence of NPF and NPF characteristics. We demonstrate the potential for using satellite-measurements of insolation (UV), trace gas concentrations (sulfur dioxide (SO2), nitrogen dioxide (NO2), ammonia (NH3), formaldehyde (HCHO), ozone (O3)), aerosol optical properties (aerosol optical depth (AOD), Ångström exponent (AE)), and a proxy of biogenic volatile organic compound emissions (leaf area index (LAI), temperature (T)) as predictors for NPF characteristics: formation rates, growth rates, survival probabilities, and ultrafine particle (UFP) concentrations at five locations across North America. NPF at all sites is most frequent in spring, exhibits a one-day autocorrelation, and is associated with low condensational sink (AOD×AE) and HCHO concentrations, and high UV. However, there are important site-to-site variations in NPF frequency and characteristics, and in which of the predictor variables (particularly gas concentrations) significantly contribute to the explanatory power of regression models built to predict those characteristics. This finding may provide a partial explanation for the reported spatia

  20. Correction

    CERN Multimedia

    2002-01-01

    The photo on the second page of the Bulletin n°48/2002, from 25 November 2002, illustrating the article «Spanish Visit to CERN» was published with a wrong caption. We would like to apologise for this mistake and so publish it again with the correct caption.   The Spanish delegation, accompanied by Spanish scientists at CERN, also visited the LHC superconducting magnet test hall (photo). From left to right: Felix Rodriguez Mateos of CERN LHC Division, Josep Piqué i Camps, Spanish Minister of Science and Technology, César Dopazo, Director-General of CIEMAT (Spanish Research Centre for Energy, Environment and Technology), Juan Antonio Rubio, ETT Division Leader at CERN, Manuel Aguilar-Benitez, Spanish Delegate to Council, Manuel Delfino, IT Division Leader at CERN, and Gonzalo León, Secretary-General of Scientific Policy to the Minister.

  1. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E; Bowsher, J; Thomas, A S; Sakhalkar, H; Dewhirst, M; Oldham, M [Department of Radiation Oncology, Duke University Medical Center, Durham, NC (United States)

    2008-10-07

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatin core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images, where the central fibre appeared ≈24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within ≈4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. Optical-CT transmission images yielded high-resolution 3D images of the

  2. Optical distortion correction in optical coherence tomography for quantitative ocular anterior segment by three-dimensional imaging.

    Science.gov (United States)

    Ortiz, Sergio; Siedlecki, Damian; Grulkowski, Ireneusz; Remon, Laura; Pascual, Daniel; Wojtkowski, Maciej; Marcos, Susana

    2010-02-01

    A method for three-dimensional (3-D) optical distortion (refraction) correction of anterior segment optical coherence tomography (OCT) images has been developed. The method consists of 3-D ray tracing through the different surfaces, following denoising, segmentation of the surfaces, Delaunay representation of the surfaces, and application of fan distortion correction. The correction has been applied theoretically to realistic computer eye models, and experimentally to OCT images of an artificial eye with a polymethyl methacrylate (PMMA) cornea and an intraocular lens (IOL), an enucleated porcine eye, and a human eye in vivo, obtained from two OCT laboratory set-ups (time domain and spectral). Data are analyzed in terms of surface radii of curvature and asphericity. Comparisons are established between the reference values for the surfaces (nominal values in the computer model; non-contact profilometric measurements for the artificial eye; Scheimpflug imaging for the real eyes, in vitro and in vivo). The OCT data were analyzed following the conventional approach of dividing the optical path by the refractive index, after application of 2-D optical correction, and after 3-D optical correction (in all cases after fan distortion correction). The application of 3-D optical distortion correction significantly increased the accuracy of the estimates of the radius of curvature and, particularly, of the asphericity of the surfaces, with respect to conventional methods of OCT image analysis. We found that the discrepancies of the radius of curvature estimates from 3-D optical distortion corrected OCT images are less than 1% with respect to nominal values. Optical distortion correction in 3-D is critical for quantitative analysis of OCT anterior segment imaging, and allows accurate topography of the internal surfaces of the eye.
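    The core operation in such 3-D ray tracing is refraction at each segmented surface. A minimal vector form of Snell's law is sketched below; this is a generic illustration, not the authors' code, and the surface normal and refractive indices are assumed to be supplied by the segmentation step.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector form of Snell's law: refract direction d at a surface with
    normal n (pointing against the incident ray), going from refractive
    index n1 into n2. Returns the refracted unit direction, or None on
    total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    r = n1 / n2
    cos_i = -np.dot(n, d)                 # cosine of the incidence angle
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None                       # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return r * d + (r * cos_i - cos_t) * n
```

Tracing each A-scan ray through cornea and lens surfaces with this rule, instead of simply dividing optical path length by the refractive index, is what removes the apparent distortion of the internal surfaces.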

  3. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift, a widespread phenomenon generated by fluctuations in laser energy, inhomogeneity of sample surfaces, and background noise, has aroused the interest of many researchers. Most prevalent algorithms need some key parameters to be preset, such as a suitable spline function or the fitting order, and thus lack adaptability. Based on the characteristics of LIBS signals, such as the sparsity of spectral peaks and the low-pass-filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The proposed technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process to ensure convergence. To validate the proposed method, the concentrations of chromium (Cr), manganese (Mn) and nickel (Ni) in 23 certified high-alloy steel samples are assessed using quantitative models with partial least squares (PLS) and support vector machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the proposed method achieves better accuracy in quantitative analysis than the compared methods and fully reflects its adaptive ability.
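    The convex model described here, combining a smoothness penalty with an asymmetric penalty so that sparse peaks sit above the estimated baseline, is closely related to asymmetric least squares (AsLS) baseline smoothing. The sketch below is a standard AsLS stand-in, not the paper's exact formulation; the smoothness weight `lam` and asymmetry `p` are tuning assumptions.

```python
import numpy as np

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate for a 1D spectrum y.
    Minimizes sum(w_i * (y_i - z_i)^2) + lam * ||D2 z||^2, where points
    above the baseline (peaks) get weight p and points below get 1 - p."""
    L = len(y)
    D = np.diff(np.eye(L), 2, axis=0)      # (L-2, L) second-difference operator
    DtD = lam * D.T @ D                    # curvature (smoothness) penalty
    w = np.ones(L)
    z = y
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + DtD, w * y)
        # asymmetric reweighting: peaks are down-weighted, baseline points kept
        w = p * (y > z) + (1 - p) * (y < z)
    return z
```

Subtracting the returned `z` from `y` leaves a baseline-corrected spectrum on which calibration models such as PLS or SVM can then be built.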

  4. The quantitative role of flexor sheath incision in correcting Dupuytren proximal interphalangeal joint contractures.

    Science.gov (United States)

    Blazar, P E; Floyd, E W; Earp, B E

    2016-07-01

    Controversy exists regarding intra-operative treatment of residual proximal interphalangeal joint contractures after Dupuytren's fasciectomy. We test the hypothesis that a simple release of the digital flexor sheath can correct residual fixed flexion contracture after subtotal fasciectomy. We prospectively enrolled 19 patients (22 digits) with Dupuytren's contracture of the proximal interphalangeal joint. The average pre-operative extension deficit of the proximal interphalangeal joints was 58° (range 30-90). The flexion contracture of the joint was corrected to an average of 28° after fasciectomy. In most digits (20 of 21), subsequent incision of the flexor sheath further corrected the contracture by an average of 23°, resulting in correction to an average flexion contracture of 4.7° (range 0-40). Our results support that contracture of the tendon sheath is a contributor to Dupuytren's contracture of the joint and that sheath release is a simple, low morbidity addition to correct Dupuytren's contractures of the proximal interphalangeal joint. Additional release of the proximal interphalangeal joint after fasciectomy, after release of the flexor sheath, is not necessary in many patients. IV (Case Series, Therapeutic). © The Author(s) 2015.

  5. A novel 3D absorption correction method for quantitative EDX-STEM tomography

    Energy Technology Data Exchange (ETDEWEB)

    Burdet, Pierre, E-mail: pierre.burdet@a3.epfl.ch [Department of Materials Science and Metallurgy, University of Cambridge, Charles Babbage Road 27, Cambridge CB3 0FS, Cambridgeshire (United Kingdom); Saghi, Z. [Department of Materials Science and Metallurgy, University of Cambridge, Charles Babbage Road 27, Cambridge CB3 0FS, Cambridgeshire (United Kingdom); Filippin, A.N.; Borrás, A. [Nanotechnology on Surfaces Laboratory, Materials Science Institute of Seville (ICMS), CSIC-University of Seville, C/ Americo Vespucio 49, 41092 Seville (Spain); Midgley, P.A. [Department of Materials Science and Metallurgy, University of Cambridge, Charles Babbage Road 27, Cambridge CB3 0FS, Cambridgeshire (United Kingdom)

    2016-01-15

    This paper presents a novel 3D method to correct for absorption in energy dispersive X-ray (EDX) microanalysis of heterogeneous samples of unknown structure and composition. By using STEM-based tomography coupled with EDX, an initial 3D reconstruction is used to extract the locations of generated X-rays as well as the X-ray paths through the sample to the surface. The absorption correction needed to retrieve the generated X-ray intensity is then calculated voxel-by-voxel, taking into account the different compositions encountered along each X-ray path. The method is applied to a core/shell nanowire containing carbon and oxygen, two elements generating highly absorbed low-energy X-rays. Absorption is shown to cause major reconstruction artefacts, in the form of an incomplete recovery of the oxide and an erroneous presence of carbon in the shell. By applying the correction method, these artefacts are greatly reduced. The accuracy of the method is assessed using reference X-ray lines with low absorption. - Highlights: • A novel 3D absorption correction method is proposed for 3D EDX-STEM tomography. • The absorption of X-rays along the path to the surface is calculated voxel-by-voxel. • The method is applied to highly absorbed X-rays emitted from a core/shell nanowire. • Absorption is shown to cause major artefacts in the reconstruction. • Using the absorption correction method, the reconstruction artefacts are greatly reduced.

  6. A Movable Phantom Design for Quantitative Evaluation of Motion Correction Studies on High Resolution PET Scanners

    DEFF Research Database (Denmark)

    Olesen, Oline Vinter; Svarer, C.; Sibomana, M.

    2010-01-01

    maximization algorithm with modeling of the point spread function (3DOSEM-PSF), and they were corrected for motions based on external tracking information using the Polaris Vicra real-time stereo motion-tracking system. The new automatic, movable phantom has a robust design and is a potential quality...

  7. Quantitative Measurement of Soot Concentration by Two-Wavelength Correction of Laser-Induced Incandescence Signals

    Energy Technology Data Exchange (ETDEWEB)

    Jung, J. S. [Korea Institute of Science and Technology, Seoul (Korea, Republic of)

    1997-05-01

    To quantify the LII signals from soot particles of flames in a diesel engine cylinder, a new method has been proposed for correcting the LII signal attenuated by soot particles between the measuring point and the detector. It has been verified by an experiment on a laminar jet ethylene-air diffusion flame. Because the ratio of LII signals at two different detection wavelengths depends on the attenuation, it can be used to correct the measured LII signal and obtain the unattenuated LII signal, from which the soot volume fraction in the flame can be estimated. Both the 1064-nm and frequency-doubled 532-nm beams from the Nd:YAG laser are used. Single-shot, one-dimensional (1-D) line images are recorded on the intensified CCD camera, with the rectangular-profile laser beam formed using a 1-mm-diameter pinhole. Two broadband optical interference filters having center wavelengths of 647 nm and 400 nm, respectively, and a bandwidth of 10 nm are used. This two-wavelength correction has been applied to the ethylene-air coannular laminar diffusion flame previously studied for soot formation by the laser extinction method in this laboratory. The results of the LII measurement technique and the conventional laser extinction method at a height of 40 mm above the jet exit agreed well with each other, except around the outside of the soot concentration peaks, where the soot concentration was relatively high and the resulting attenuation of the LII signal was large. The radial profile shape of the soot concentration did not change much, but the absolute value of the soot volume fraction near the outside edge changed from 4 ppm to 6.5 ppm at r=2.8 mm after correction. This means that the attenuation of the LII signal was approximately 40% at this point, which is higher than the average attenuation rate of this flame, 10-15%. (author). 15 refs., 8 figs.
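    The two-wavelength ratio correction can be sketched as follows, assuming soot extinction in the Rayleigh regime (optical depth scaling as 1/wavelength) and a known unattenuated signal ratio R0 between the two detection wavelengths. All symbols here are illustrative assumptions rather than the paper's exact calibration.

```python
import numpy as np

def correct_lii(S1, S2, lam1, lam2, R0):
    """Recover the unattenuated LII signal at wavelength lam1 from measured
    signals S1 (at lam1) and S2 (at lam2), assuming tau ~ 1/lambda and a
    known unattenuated ratio R0 = S0(lam1) / S0(lam2)."""
    R = S1 / S2
    dtau = np.log(R0 / R)                 # tau(lam1) - tau(lam2)
    # with tau = C / lambda:  tau1 - tau2 = tau1 * (1 - lam1 / lam2)
    tau1 = dtau / (1.0 - lam1 / lam2)
    return S1 * np.exp(tau1)              # undo the attenuation at lam1
```

The corrected signal can then be converted to a soot volume fraction via the usual LII calibration, which is where the 40% local attenuation quoted above would otherwise bias the result.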

  8. Disturbed Intracardiac Flow Organization After Atrioventricular Septal Defect Correction as Assessed With 4D Flow Magnetic Resonance Imaging and Quantitative Particle Tracing

    NARCIS (Netherlands)

    Calkoen, Emmeline E.; de Koning, Patrick J. H.; Blom, Nico A.; Kroft, Lucia J. M.; de Roos, Albert; Wolterbeek, Ron; Roest, Arno A. W.; Westenberg, Jos J. M.

    2015-01-01

    Objectives Four-dimensional (3 spatial directions and time) velocity-encoded flow magnetic resonance imaging with quantitative particle tracing analysis allows assessment of left ventricular (LV) blood flow organization. Corrected atrioventricular septal defect (AVSD) patients have an abnormal left

  9. Satellite-based AIS and its Comparison with LRIT

    Directory of Open Access Journals (Sweden)

    Yuli Chen

    2014-06-01

    The satellite-based Automatic Identification System (AIS) has been continuously developed by the shipping industry in recent years. This paper introduces satellite-based AIS, including the concept, the system structure, its development, and new applications. The Long Range Identification and Tracking (LRIT) system, which certain classes of ships engaged on international voyages are required to carry so as to report their position at least every six hours using onboard communication means, provides a ship-monitoring function similar to that of satellite-based AIS. After a basic introduction to the LRIT system, this paper presents a comprehensive comparison between satellite-based AIS and LRIT in terms of cost to the ship, communication scheme, monitoring coverage, information detail, and information credibility. The paper concludes that satellite-based AIS should be encouraged to play an effective complementary role to the LRIT system.

  10. A novel calibration strategy based on background correction for quantitative circular dichroism spectroscopy.

    Science.gov (United States)

    Zuo, Qi; Xiong, Shun; Chen, Zeng-Ping; Chen, Yao; Yu, Ru-Qin

    2017-11-01

    When using circular dichroism (CD) spectroscopy for quantitative analysis, the samples to be analyzed must be free of light-absorbing interferences. However, in real-world samples, the presence of background absorbers is practically unavoidable. The difference in matrices between the real-world samples to be analyzed and the standard samples (on which either a univariate or multivariate calibration model was built) results in systematic errors in the quantification results of CD spectroscopy. In this contribution, a novel calibration strategy for quantitative CD spectroscopic analysis was proposed. The main idea of the proposed calibration strategy is to project the CD spectra of both the standard samples and the real-world sample to be analyzed onto a projection space orthogonal to the space spanned by the background CD spectrum of the real-world sample, and then build a multivariate calibration model on the transformed CD spectra of the standard samples. The performance of the proposed calibration strategy was tested and compared with conventional univariate and multivariate calibration methods in the quantification of Pb(2+) in cosmetic samples using CD spectroscopy in combination with a G-quadruplex DNAzyme (e.g. PS2.M). Experimental results revealed that the proposed calibration strategy could mitigate the influence of the difference in matrices between the standard samples and cosmetic samples, and realized quantitative analysis of Pb(2+) in cosmetic samples with precision and accuracy comparable to atomic absorption spectroscopy. The proposed calibration strategy has the features of simplicity and effectiveness, and its combination with CD spectroscopic probes can realize accurate and precise quantification of analytes in complex samples using CD spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
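    The projection step described here can be sketched in a few lines: build the orthogonal projector onto the complement of the (normalized) background spectrum, apply it to both the standards and the sample, and fit a least-squares model on the projected spectra. This is a simplified stand-in for the multivariate model in the paper, and all variable names are assumptions.

```python
import numpy as np

def background_orthogonal_predict(X_std, y_std, bg, x_sample):
    """Predict the analyte level of x_sample after projecting all spectra
    onto the complement of the background spectrum bg.
    X_std: (n_standards, n_wavelengths) spectra; y_std: known levels."""
    b = bg / np.linalg.norm(bg)
    P = np.eye(b.size) - np.outer(b, b)   # orthogonal projector: P @ bg == 0
    Xp = X_std @ P                        # P is symmetric, so this projects rows
    xp = P @ x_sample
    # least-squares calibration on the projected standards
    coef, *_ = np.linalg.lstsq(Xp, y_std, rcond=None)
    return float(xp @ coef)
```

Because the sample's background component lies in the null space of `P`, it drops out of `xp` exactly, so a matrix mismatch between standards and sample no longer biases the prediction.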

  11. The Asian Correction Can Be Quantitatively Forecasted Using a Statistical Model of Fusion-Fission Processes.

    Directory of Open Access Journals (Sweden)

    Boon Kin Teh

    The Global Financial Crisis of 2007-2008 wiped out US$37 trillion across global financial markets, a value equivalent to the combined GDPs of the United States and the European Union in 2014. The defining moment of this crisis was the failure of Lehman Brothers, which precipitated the October 2008 crash and the Asian Correction (March 2009). Had the Federal Reserve seen these crashes coming, they might have bailed out Lehman Brothers, and prevented the crashes altogether. In this paper, we show that some of these market crashes (like the Asian Correction) can be predicted, if we assume that a large number of adaptive traders employ competing trading strategies. As the number of adherents of some strategies grows, others decline in the constantly changing strategy space. When a strategy group grows into a giant component, trader actions become increasingly correlated, and this is reflected in the stock price. The fragmentation of this giant component will lead to a market crash. In this paper, we also derive the mean-field market crash forecast equation based on a model of fusions and fissions in the trading strategy space. By fitting the continuous returns of 20 stocks traded on the Singapore Exchange to the market crash forecast equation, we obtain crash predictions ranging from end October 2008 to mid-February 2009, with early warning four to six months prior to the crashes.

  12. The Asian Correction Can Be Quantitatively Forecasted Using a Statistical Model of Fusion-Fission Processes.

    Science.gov (United States)

    Teh, Boon Kin; Cheong, Siew Ann

    2016-01-01

    The Global Financial Crisis of 2007-2008 wiped out US$37 trillion across global financial markets, a value equivalent to the combined GDPs of the United States and the European Union in 2014. The defining moment of this crisis was the failure of Lehman Brothers, which precipitated the October 2008 crash and the Asian Correction (March 2009). Had the Federal Reserve seen these crashes coming, they might have bailed out Lehman Brothers, and prevented the crashes altogether. In this paper, we show that some of these market crashes (like the Asian Correction) can be predicted, if we assume that a large number of adaptive traders employ competing trading strategies. As the number of adherents of some strategies grows, others decline in the constantly changing strategy space. When a strategy group grows into a giant component, trader actions become increasingly correlated, and this is reflected in the stock price. The fragmentation of this giant component will lead to a market crash. In this paper, we also derive the mean-field market crash forecast equation based on a model of fusions and fissions in the trading strategy space. By fitting the continuous returns of 20 stocks traded on the Singapore Exchange to the market crash forecast equation, we obtain crash predictions ranging from end October 2008 to mid-February 2009, with early warning four to six months prior to the crashes.

  13. Voxel spread function method for correction of magnetic field inhomogeneity effects in quantitative gradient-echo-based MRI.

    Science.gov (United States)

    Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Luo, Jie; Wang, Xiaoqi

    2013-11-01

    Macroscopic magnetic field inhomogeneities adversely affect different aspects of MRI images. In quantitative MRI, where the goal is to quantify biological tissue parameters, they bias and often corrupt such measurements. The goal of this article is to develop a method for correction of macroscopic field inhomogeneities that can be applied to a variety of quantitative gradient-echo-based MRI techniques. We have reanalyzed a basic theory of gradient echo MRI signal formation in the presence of background field inhomogeneities and derived equations that allow for correction of magnetic field inhomogeneity effects based on the phase and magnitude of gradient echo data. We verified our theory by mapping the effective transverse relaxation rate in computer-simulated, phantom, and in vivo human data collected with multi-gradient-echo sequences. The proposed technique takes into account voxel spread function effects and allowed us to obtain virtually artifact-free effective transverse relaxation rate maps for all simulated, phantom and in vivo data, except in edge areas with very steep field gradients. The voxel spread function method, allowing quantification of tissue-specific effective transverse relaxation rate-related properties, has the potential to yield new MRI biomarkers serving as surrogates for tissue biological properties, similar to the longitudinal and transverse relaxation rate constants widely used in clinical and research MRI. Copyright © 2012 Wiley Periodicals, Inc.

  14. Advancing Satellite-Based Flood Prediction in Complex Terrain Using High-Resolution Numerical Weather Prediction

    Science.gov (United States)

    Zhang, X.; Anagnostou, E. N.; Nikolopoulos, E. I.; Bartsotas, N. S.

    2015-12-01

    Floods constitute one of the most significant and frequent natural hazards in mountainous regions. Satellite-based precipitation products offer in many cases the only available source of QPE. However, satellite-based QPE over complex terrain suffers from significant bias that limits its applicability for hydrologic modeling. In this work we investigate the potential of a new correction procedure, which uses high-resolution numerical weather prediction (NWP) model simulations to adjust satellite QPE. The adjustment is based on pdf matching of the satellite and NWP (used as reference) precipitation distributions. The impact of the correction procedure on the simulated hydrologic response is examined for 15 storm events that generated floods over the mountainous Upper Adige region of Northern Italy. Atmospheric simulations were performed at 1-km resolution with a state-of-the-art atmospheric model (RAMS/ICLAMS). The proposed error correction procedure was then applied to the widely used TRMM 3B42 satellite precipitation product, and the evaluation of the correction was based on independent in situ precipitation measurements from a dense rain gauge network (1 gauge / 70 km2) available in the study area. Satellite QPE, before and after correction, is used to simulate flood response using ARFFS (Adige River Flood Forecasting System), a semi-distributed hydrologic model used for operational flood forecasting in the region. Results showed that bias in satellite QPE before correction was significant and had a tremendous impact on the simulation of the flood peak; however, the correction procedure was able to reduce the bias in QPE and therefore considerably improve the simulated flood hydrograph.
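    The pdf-matching adjustment amounts to empirical quantile (CDF) matching: each satellite value is mapped to the reference value occupying the same quantile of its climatology. A minimal sketch follows; the climatology arrays and function name are illustrative assumptions.

```python
import numpy as np

def cdf_match(values, sat_clim, ref_clim):
    """Adjust satellite precipitation values by matching their empirical CDF
    (built from sat_clim) to the reference CDF (built from ref_clim)."""
    sat_sorted = np.sort(np.asarray(sat_clim, float))
    ref_sorted = np.sort(np.asarray(ref_clim, float))
    p_sat = (np.arange(sat_sorted.size) + 0.5) / sat_sorted.size
    p_ref = (np.arange(ref_sorted.size) + 0.5) / ref_sorted.size
    # quantile of each new value under the satellite climatology ...
    q = np.interp(values, sat_sorted, p_sat)
    # ... mapped to the reference value at the same quantile
    return np.interp(q, p_ref, ref_sorted)
```

In the study's setting, `ref_clim` would come from the 1-km NWP simulations and `sat_clim` from the co-located TRMM 3B42 estimates, with the mapping then applied to the satellite QPE fed into the hydrologic model.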

  15. Determination of avermectins by the internal standard recovery correction - high performance liquid chromatography - quantitative Nuclear Magnetic Resonance method.

    Science.gov (United States)

    Zhang, Wei; Huang, Ting; Li, Hongmei; Dai, Xinhua; Quan, Can; He, Yajuan

    2017-09-01

    Quantitative nuclear magnetic resonance (qNMR) is widely used to determine the purity of organic compounds. For compounds of lower purity, especially those with molecular weight above 500, qNMR risks errors in the purity estimate because impurity peaks are likely to be incompletely separated from the peak of the major component. In this study, an offline ISRC-HPLC-qNMR (internal standard recovery correction - high performance liquid chromatography - qNMR) method was developed to overcome this problem. It is accurate because it excludes the influence of impurities; it is low-cost because it uses common mobile phases; and it extends the applicable scope of qNMR. In this method, a mixed solution of the sample and an internal standard is separated by HPLC with common mobile phases, and only the eluents of the analyte and the internal standard are collected in the same tube. After evaporation and re-dissolution, the fraction is determined by qNMR. A recovery correction factor is determined by comparison of the solutions before and after these procedures. After correction, the mass fraction of the analyte was constant, accurate, and precise, even when sample loss varied during these procedures or HPLC resolution was poor. Avermectin B1a, with a purity of ~93% and a molecular weight of 873, was analyzed. Moreover, the homologues of avermectin B1a were determined based on identification and quantitative analysis by tandem mass spectrometry and HPLC, and the results were consistent with those of the traditional mass balance method. The results showed that the method could be widely used for organic compounds, and could further promote qNMR as a primary method in international metrological systems. Copyright © 2017 Elsevier B.V. All rights reserved.
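    The underlying quantification is the standard internal-standard qNMR purity equation, here extended by a recovery correction factor that compares the analyte-to-standard signal ratio after versus before the HPLC step. The sketch below is a hedged illustration of that arithmetic; all argument names are assumptions, not the paper's notation.

```python
def qnmr_mass_fraction(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s, f_rec=1.0):
    """Standard qNMR purity equation with an ISRC-style recovery factor.
    I: integrated signal areas, N: number of protons per integrated signal,
    M: molar masses, m: weighed masses, P_s: purity of the internal standard,
    f_rec: recovery correction factor (ratio of the analyte/IS signal ratio
    after vs. before the separation; 1.0 means no loss imbalance)."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s / f_rec
```

Dividing by `f_rec` is what makes the result insensitive to variable sample loss during evaporation and re-dissolution: any differential loss between analyte and internal standard is measured and cancelled.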

  16. Is SPECT or CT Based Attenuation Correction More Quantitatively Accurate for Dedicated Breast SPECT Acquired with Non-Traditional Trajectories?

    Science.gov (United States)

    Perez, Kristy L.; Mann, Steve D.; Pachon, Jan H.; Madhav, Priti; Tornai, Martin P.

    2015-01-01

    Attenuation correction is necessary for SPECT quantification. There are a variety of methods to create attenuation maps. For dedicated breast SPECT imaging, it is unclear whether a SPECT- or CT-based attenuation map provides the most accurate quantification, and whether segmenting the different tissue types has an effect on the quantification. For these experiments, 99mTc diluted in methanol and water was filled into geometric and anthropomorphic breast phantoms and imaged with a dedicated dual-modality SPECT-CT scanner. SPECT images were collected using a compact CZT camera with various 3D acquisitions, including vertical and 30° tilted parallel beam, and complex sinusoidal trajectories. CT images were acquired using a quasi-monochromatic x-ray source and a CsI(Tl) flat panel digital detector in a half-cone beam geometry. Measured scatter corrections for SPECT and CT were implemented. To compare photon attenuation correction in the reconstructed SPECT images, various volumetric attenuation matrices were derived from 1) uniform SPECT, 2) uniform CT, and 3) segmented CT, populated with different attenuation coefficient values. Comparisons between attenuation masks using phantoms consisting of materials with different attenuation values show that at 140 keV the differences in attenuation between materials do not affect the quantification as much as the size and alignment of the attenuation map. The CT-based attenuation maps give quantitative values 30% below the actual value, but are consistent, while the SPECT-based attenuation maps can provide quantitative values accurate to within 10%, but are less consistent. PMID:25999683

  17. Correction of Gradient Nonlinearity Bias in Quantitative Diffusion Parameters of Renal Tissue with Intra Voxel Incoherent Motion.

    Science.gov (United States)

    Malyarenko, Dariya I; Pang, Yuxi; Senegas, Julien; Ivancevic, Marko K; Ross, Brian D; Chenevert, Thomas L

    2015-12-01

    Spatially non-uniform diffusion weighting bias due to gradient nonlinearity (GNL) causes substantial errors in apparent diffusion coefficient (ADC) maps for anatomical regions imaged distant from the magnet isocenter. Our previously described approach allowed effective removal of spatial ADC bias from three orthogonal DWI measurements for mono-exponential media of arbitrary anisotropy. The present work evaluates correction feasibility and performance for quantitative diffusion parameters of the two-component IVIM model for well-perfused and nearly isotropic renal tissue. Sagittal kidney DWI scans of a volunteer were performed on a clinical 3T MRI scanner near isocenter and offset superiorly. Spatially non-uniform diffusion weighting due to GNL resulted in both a shift and a broadening of perfusion-suppressed ADC histograms for off-center DWI relative to unbiased measurements close to isocenter. Direction-averaged DW-bias correctors were computed based on the known gradient design provided by the vendor. The computed bias maps were empirically confirmed by coronal DWI measurements of an isotropic gel-flood phantom. ADC bias for off-center measurements was effectively removed in both the phantom and renal tissue by applying pre-computed 3D correction maps. Comparable ADC accuracy was achieved for corrections of both b-maps and DWI intensities in the presence of IVIM perfusion. No significant bias impact was observed for the IVIM perfusion fraction.

  18. Quantitative FRET measurement using emission-spectral unmixing with independent excitation crosstalk correction.

    Science.gov (United States)

    Zhang, J; Li, H; Chai, L; Zhang, L; Qu, J; Chen, T

    2015-02-01

    Quantification of fluorescence resonance energy transfer (FRET) needs at least two external samples, an acceptor-only reference and a linked FRET reference, to calibrate the fluorescence signal. Furthermore, all measurements for references and FRET samples must be performed under the same instrumental conditions. Based on a novel notion of predetermining the acceptor-to-donor molar extinction coefficient ratio (RC) for the correction of acceptor excitation crosstalk, we present here a robust and independent emission-spectral unmixing FRET methodology, Iem-spFRET, which can simultaneously measure the E and RC of a FRET sample without any external references, such that Iem-spFRET circumvents the rigorous restriction of keeping the same imaging conditions for all FRET experiments and thus can be used for the direct measurement of FRET samples. We validate Iem-spFRET by measuring the absolute E and RC values of standard constructs with different acceptor-to-donor stoichiometry expressed in living cells. Our results demonstrate that Iem-spFRET is a simple and powerful tool for real-time monitoring of dynamic intermolecular interactions within single living cells. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
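    Emission-spectral unmixing itself reduces to expressing the measured emission spectrum as a linear combination of donor and acceptor reference spectra, for instance by least squares. The sketch below shows only that generic decomposition step, not the Iem-spFRET algorithm (which additionally handles excitation crosstalk via RC); all names are assumptions.

```python
import numpy as np

def unmix(spectrum, donor_ref, acceptor_ref):
    """Least-squares decomposition of a measured emission spectrum into
    donor and acceptor reference spectra. Returns (w_donor, w_acceptor)."""
    A = np.column_stack([donor_ref, acceptor_ref])
    w, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    return w
```

The recovered donor and acceptor weights are the raw inputs from which FRET efficiency estimates are subsequently computed.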

  19. Quantifying and correcting for the winner's curse in quantitative-trait association studies.

    Science.gov (United States)

    Xiao, Rui; Boehnke, Michael

    2011-04-01

    Quantitative traits (QT) are an important focus of human genetic studies both because of interest in the traits themselves and because of their role as risk factors for many human diseases. For large-scale QT association studies including genome-wide association studies, investigators usually focus on genetic loci showing significant evidence for SNP-QT association, and genetic effect size tends to be overestimated as a consequence of the winner's curse. In this paper, we study the impact of the winner's curse on QT association studies in which the genetic effect size is parameterized as the slope in a linear regression model. We demonstrate by analytical calculation that the overestimation in the regression slope estimate decreases as power increases. To reduce the ascertainment bias, we propose a three-parameter maximum likelihood method and then simplify this to a one-parameter method by excluding nuisance parameters. We show that both methods reduce the bias when power to detect association is low or moderate, and that the one-parameter model generally results in smaller variance in the estimate. © 2011 Wiley-Liss, Inc.
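    The effect is easy to reproduce by simulation: regress a quantitative trait on genotype in many replicates, then average the slope estimates only over replicates that reach significance. In this hypothetical setup (all parameter values are assumptions for illustration), the conditional estimate is several-fold larger than the true slope when power is low.

```python
import numpy as np

def winners_curse_sim(beta=0.1, n=100, maf=0.3, n_rep=2000, z_crit=2.576, seed=0):
    """Simulate SNP-QT association tests; return (true beta, mean |beta_hat|
    among replicates whose |z| exceeds z_crit), illustrating ascertainment bias."""
    rng = np.random.default_rng(seed)
    selected = []
    for _ in range(n_rep):
        g = rng.binomial(2, maf, n).astype(float)   # additive genotype 0/1/2
        y = beta * g + rng.standard_normal(n)       # trait = genetic effect + noise
        gc = g - g.mean()
        b_hat = gc @ y / (gc @ gc)                  # OLS slope (with intercept)
        resid = y - y.mean() - b_hat * gc
        se = np.sqrt(resid @ resid / (n - 2) / (gc @ gc))
        if abs(b_hat / se) > z_crit:                # "significant" replicates only
            selected.append(abs(b_hat))
    return beta, float(np.mean(selected))
```

Only estimates large enough to clear the significance threshold survive the selection, so the conditional mean is inflated well above the true slope, which is exactly the bias the paper's likelihood methods aim to remove.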

  20. Validation of satellite-based precipitation estimates over different African River Basins

    Science.gov (United States)

    Thiemig, V.; Rojas, R.; Levizzani, V.; De Roo, A.

    2012-04-01

    Satellite-based precipitation products have become increasingly available and accessible in near real-time, encouraging the scientific community to use these data to replace or supplement sparse ground observations. Six satellite-based rainfall estimates (SRFE), namely CMORPH, RFE 2.0, TRMM 3B42, GPROF 6.0, PERSIANN, and GSMaP-MKV, and one reanalysis product (ERA-Interim) are validated against rain gauge data over four partly sparsely-gauged African river basins (Zambezi, Volta, Juba-Shabelle and Baro-Akobo). The objective is to provide the scientific community that uses SRFE as input data for hydro-meteorological applications with an intercomparable validation study of these products over different hydro-climatological conditions in Africa. The validation focuses on the general ability of the SRFE products to reproduce daily and monthly rainfall and, particularly, on rainfall characteristics that are relevant to hydro-meteorological applications, such as annual catchment totals, spatial distribution patterns within the river basin, seasonality of precipitation, number of rainy days per year, and timing and amount of heavy rainfall events. The accuracy of these products is assessed using a ground observation network comprising 203 stations with daily records between 2003 and 2006 (data coverage: 75% of data for 38, 13, 18 and 31% of stations, respectively). Considering the time and space variability of the different rainfall characteristics as well as the conventional hydrological working units, the validation is done on three spatially-aggregated levels: point, subcatchment, and river basin. For the latter two, the ground observations are interpolated using kriging with external drift, where the drift is defined as the terrain elevation. The performance is measured using standard statistical measures (MAE, RMSE, pBIAS, r, and NSeff) as well as visual inspection. 
The examined products showed depending on the spatially-aggregated level they have been analyzed
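The skill scores named in the abstract (MAE, RMSE, pBIAS, r, NSeff) follow standard definitions; the helper below is an illustrative sketch (the function name is invented, not the authors' code):

```python
import math

def validation_stats(obs, sim):
    """Standard skill scores for validating rainfall estimates against
    gauge observations: mean absolute error, root-mean-square error,
    percent bias, Pearson correlation, Nash-Sutcliffe efficiency."""
    n = len(obs)
    mean_obs = sum(obs) / n
    mean_sim = sum(sim) / n
    mae = sum(abs(s - o) for o, s in zip(obs, sim)) / n
    rmse = math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / n)
    pbias = 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(obs, sim))
    var_o = sum((o - mean_obs) ** 2 for o in obs)
    var_s = sum((s - mean_sim) ** 2 for s in sim)
    r = cov / math.sqrt(var_o * var_s)
    # Nash-Sutcliffe efficiency: 1 for a perfect simulation
    nseff = 1.0 - sum((s - o) ** 2 for o, s in zip(obs, sim)) / var_o
    return {"MAE": mae, "RMSE": rmse, "pBIAS": pbias, "r": r, "NSeff": nseff}
```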

  1. Development and validation of satellite based estimates of surface visibility

    OpenAIRE

    Brunner, J.; R. B. Pierce; A. Lenzen

    2015-01-01

    A satellite based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorol...
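A multiple linear regression of the kind described — relating surface visibility to satellite predictors such as aerosol optical depth and fog/low-cloud properties — can be sketched with ordinary least squares via the normal equations. The data and function name below are hypothetical, not the operational GOES-R code:

```python
def fit_mlr(X, y):
    """Ordinary least squares for multiple linear regression, solved
    via the normal equations with Gaussian elimination.
    X: list of predictor rows; returns [intercept, coef1, coef2, ...]."""
    rows = [[1.0] + list(r) for r in X]        # prepend intercept column
    k = len(rows[0])
    # Normal equations: A beta = b, with A = X^T X and b = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda rr: abs(A[rr][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for rr in range(col + 1, k):
            f = A[rr][col] / A[col][col]
            for cc in range(col, k):
                A[rr][cc] -= f * A[col][cc]
            b[rr] -= f * b[col]
    # Back substitution
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta
```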

  2. Validation of an Innovative Satellite-Based UV Dosimeter

    Science.gov (United States)

    Morelli, Marco; Masini, Andrea; Simeone, Emilio; Khazova, Marina

    2016-08-01

    We present an innovative satellite-based UV (ultraviolet) radiation dosimeter with a mobile app interface that has been validated by exploiting both ground-based measurements and an in-vivo assessment of the erythemal effects on volunteers having a controlled exposure to solar radiation. Both validations showed that the satellite-based UV dosimeter has the accuracy and reliability needed for health-related applications. The app with this satellite-based UV dosimeter also includes other related functionalities, such as the provision of safe sun exposure time updated in real time and an end-of-exposure visual/sound alert. This app will be launched on the global market by siHealth Ltd in May 2016 under the name "HappySun", available both for Android and iOS devices (more info on http://www.happysun.co.uk). Extensive R&D activities are on-going for further improvement of the satellite-based UV dosimeter's accuracy.

  3. Quantitation of regional cerebral blood flow corrected for partial volume effect using O-15 water and PET

    DEFF Research Database (Denmark)

    Iida, H; Law, I; Pakkenberg, B

    2000-01-01

    Limited spatial resolution of positron emission tomography (PET) can cause significant underestimation in the observed regional radioactivity concentration (so-called partial volume effect or PVE) resulting in systematic errors in estimating quantitative physiologic parameters. The authors have...... in the corresponding ROIs. Reproducibility of the estimated parameters and sensitivity to various error sources were also evaluated. All models tested in the current study yielded PVE-corrected regional CBF values (approximately 0.8 mL x min(-1) x g(-1) for models with a term for gray matter tissue and 0.5 mL x min(-1...... that included two parallel tissue compartments demonstrated better results with regards to the agreement of tissue time-activity curve and the Akaike's Information Criteria. Error sensitivity analysis suggested the model that fits three parameters of the gray matter CBF, the gray matter fraction, and the white...

  4. Rapid in-focus corrections on quantitative amplitude and phase imaging using transport of intensity equation method.

    Science.gov (United States)

    Meng, X; Tian, X; Kong, Y; Sun, A; Yu, W; Qian, W; Song, X; Cui, H; Xue, L; Liu, C; Wang, S

    2017-06-01

    The transport of intensity equation (TIE) method can acquire sample phase distributions with high speed and accuracy, offering another perspective for cellular observations and measurements. However, incorrect focal-plane determination induces blurs and halos, decreasing resolution and accuracy in both the retrieved amplitude and phase information. To obtain highly accurate sample details, we propose a TIE-based in-focus correction technique for quantitative amplitude and phase imaging, which locates the focal plane and then retrieves both in-focus intensity and phase distributions by combining numerical wavefront extraction and propagation with physical image-recorder translation. Certified by both numerical simulations and practical measurements, the proposed method not only captures highly accurate in-focus sample information, but also provides a potential route to fast autofocusing in microscopic systems. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  5. An independent component analysis confounding factor correction framework for identifying broad impact expression quantitative trait loci.

    Science.gov (United States)

    Ju, Jin Hyun; Shenoy, Sushila A; Crystal, Ronald G; Mezey, Jason G

    2017-05-01

    Genome-wide expression Quantitative Trait Loci (eQTL) studies in humans have provided numerous insights into the genetics of both gene expression and complex diseases. While the majority of eQTL identified in genome-wide analyses impact a single gene, eQTL that impact many genes are particularly valuable for network modeling and disease analysis. To enable the identification of such broad impact eQTL, we introduce CONFETI: Confounding Factor Estimation Through Independent component analysis. CONFETI is designed to address two conflicting issues when searching for broad impact eQTL: the need to account for non-genetic confounding factors that can lower the power of the analysis or produce broad impact eQTL false positives, and the tendency of methods that account for confounding factors to model broad impact eQTL as non-genetic variation. The key advance of the CONFETI framework is the use of Independent Component Analysis (ICA) to identify variation likely caused by broad impact eQTL when constructing the sample covariance matrix used for the random effect in a mixed model. We show that CONFETI has better performance than other mixed model confounding factor methods when considering broad impact eQTL recovery from synthetic data. We also used the CONFETI framework and these same confounding factor methods to identify eQTL that replicate between matched twin pair datasets in the Multiple Tissue Human Expression Resource (MuTHER), the Depression Genes Networks study (DGN), the Netherlands Study of Depression and Anxiety (NESDA), and multiple tissue types in the Genotype-Tissue Expression (GTEx) consortium. These analyses identified both cis-eQTL and trans-eQTL impacting individual genes, and CONFETI had better or comparable performance to other mixed model confounding factor analysis methods when identifying such eQTL. 
In these analyses, we were able to identify and replicate a few broad impact eQTL although the overall number was small even when applying CONFETI. In

  6. Corrected coronary opacification decrease from coronary computed tomography angiography: Validation with quantitative 13N-ammonia positron emission tomography.

    Science.gov (United States)

    Benz, Dominik C; Gräni, Christoph; Ferro, Paola; Neumeier, Luis; Messerli, Michael; Possner, Mathias; Clerc, Olivier F; Gebhard, Catherine; Gaemperli, Oliver; Pazhenkottil, Aju P; Kaufmann, Philipp A; Buechel, Ronny R

    2017-07-06

    To assess the functional relevance of a coronary artery stenosis, the corrected coronary opacification (CCO) decrease derived from coronary computed tomography angiography (CCTA) has been proposed. The present study aims at validating the CCO decrease against quantitative 13N-ammonia positron emission tomography (PET) myocardial perfusion imaging (MPI). This retrospective study included 39 patients who underwent hybrid CCTA/PET-MPI. From CCTA, attenuation in the coronary lumen was measured before and after a stenosis and corrected to the aorta to calculate the CCO and its decrease. Relative flow reserve (RFR) was calculated by dividing the stress myocardial blood flow (MBF) of a vessel territory subtended by a stenotic coronary artery by the stress MBF of the reference territories without stenoses. RFR was abnormal in 11 vessel territories (27%). CCO decrease yielded a sensitivity, specificity, negative predictive value, positive predictive value, and accuracy for prediction of an abnormal RFR of 73%, 70%, 88%, 47%, and 70%, respectively. CCTA-derived CCO decrease has moderate diagnostic accuracy for predicting an abnormal RFR in PET-MPI. However, its high negative predictive value for ruling out the functional relevance of a given lesion may confer clinical implications in the diagnostic work-up of patients with a coronary stenosis.
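The two quantities compared in this study can be expressed compactly. The abstract does not give the exact CCO formula, so the normalization below (lumen attenuation divided by aortic attenuation, differenced across the stenosis) is an assumption for illustration; RFR follows the definition stated in the abstract:

```python
def cco(lumen_hu, aorta_hu):
    """Corrected coronary opacification: lumen attenuation normalized
    to the aortic attenuation at the same level (assumed form)."""
    return lumen_hu / aorta_hu

def cco_decrease(pre_lumen, pre_aorta, post_lumen, post_aorta):
    """Drop in CCO from proximal (pre-stenosis) to distal (post-stenosis)."""
    return cco(pre_lumen, pre_aorta) - cco(post_lumen, post_aorta)

def relative_flow_reserve(stress_mbf_stenotic, stress_mbf_reference):
    """RFR per the abstract: stress MBF of the stenotic territory divided
    by stress MBF of reference territories without stenoses."""
    return stress_mbf_stenotic / stress_mbf_reference
```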

  7. Quantitative micro-Raman analysis of volcanic glasses: influence and correction of matrix effects

    Science.gov (United States)

    Di Muro, Andrea

    2014-05-01

    Micro-Raman spectroscopy, though a very promising micro-analytical technique, is still not routinely used to quantify volatile elements dissolved in glasses. Following an original idea of Galeener and Mikkelsen (1981) for the quantification of hydroxyl (OH) in silica glass, several quantitative procedures have recently been proposed for the analysis of water, sulphur and carbon in natural glasses (obsidians, pumices, melt inclusions). The quantification of a single analyte requires calibrating the correlation between the intensity I (height or area) of the related Raman band, normalized or not to a reference band RB, and the analyte concentration. For the analysis of alumino-silicate glasses, RB corresponds to one of the two main envelopes (LF and HF) related to the vibration of the glass network. Calibrations are linear, provided the increase in the analyte concentration does not dramatically affect the RB intensity. Much attention has been paid to identifying the most appropriate spectral treatment (spectrum reduction, baseline subtraction, etc.) to achieve accurate measurement of band intensities. Here I show that the accuracy of Raman procedures for volatile quantification critically depends on the capability to predict and take into account the influence of multiple matrix effects, which are often correlated with the average polymerization degree of the glass network. A general model has been developed to predict matrix effects affecting micro-Raman analysis of natural glasses. The specific and critical influence of iron redox state and pressure are discussed. The approach has been extensively validated for the study of melt inclusions and matrices spanning a broad range of compositions and dissolved volatile contents. References Analytical procedures Mercier, M, Di Muro, A., Métrich, N., Giordano, D., Belhadj, O., Mandeville, C.W. (2010) Spectroscopic analysis (FTIR, Raman) of water in mafic and intermediate glasses and glass inclusions
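The linear calibration described — analyte concentration proportional to the normalized band intensity I/I(RB) — amounts to fitting a slope through the origin by least squares. A minimal sketch with invented numbers (not data from the paper):

```python
def calibrate_raman(intensity_ratios, known_conc):
    """Least-squares slope through the origin for a linear Raman
    calibration C = k * (I_band / I_RB), fitted on standards of
    known concentration. Hypothetical helper for illustration."""
    k = (sum(r * c for r, c in zip(intensity_ratios, known_conc))
         / sum(r * r for r in intensity_ratios))
    return k

def predict_conc(k, intensity_ratio):
    """Apply the calibration to an unknown sample."""
    return k * intensity_ratio
```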

  8. Comparison of Satellite-based Basal and Adjusted Evapotranspiration for Several California Crops

    Science.gov (United States)

    Johnson, L.; Lund, C.; Melton, F. S.

    2013-12-01

    There is a continuing need to develop new sources of information on agricultural crop water consumption in the arid Western U.S. Pursuant to the California Water Conservation Act of 2009, for instance, the stakeholder community has developed a set of quantitative indicators involving measurement of evapotranspiration (ET) or crop consumptive use (Calif. Dept. Water Resources, 2012). Fraction of reference ET (or crop coefficients) can be estimated from a biophysical description of the crop canopy involving green fractional cover (Fc) and height, as per the FAO-56 practice standard of Allen et al. (1998). The current study involved 19 fields in California's San Joaquin Valley and Central Coast during 2011-12, growing a variety of specialty and commodity crops: lettuce, raisin, tomato, almond, melon, winegrape, garlic, peach, orange, cotton, corn and wheat. Most crops were on surface or subsurface drip, though micro-jet, sprinkler and flood irrigation were represented as well. Fc was retrospectively estimated every 8-16 days from optical satellite data and interpolated to a daily timestep. Crop height was derived as a capped linear function of Fc using published guideline maxima. These variables were used to generate daily basal crop coefficients (Kcb) per field through most or all of each respective growth cycle using the density coefficient approach of Allen & Pereira (2009). A soil water balance model for both topsoil and root zone, based on FAO-56 and using on-site measurements of applied irrigation and precipitation, was used to develop daily soil evaporation and crop water stress coefficients (Ke, Ks). Key meteorological variables (wind speed, relative humidity) were extracted from the California Irrigation Management Information System (CIMIS) for climate correction. Basal crop ET (ETcb) was then derived from Kcb using CIMIS reference ET. Adjusted crop ET (ETc_adj) was estimated by the dual coefficient approach involving Kcb, Ke, and incorporating Ks. Cumulative ETc
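The FAO-56 dual-coefficient step described above combines the basal coefficient Kcb, the soil evaporation coefficient Ke, and the water-stress coefficient Ks with reference ET. A minimal sketch of the two ET estimates compared in the study (function names are mine, not the authors'):

```python
def et_basal(kcb, et0):
    """Basal crop ET (ETcb): basal crop coefficient times reference ET,
    i.e. transpiration under standard, unstressed conditions."""
    return kcb * et0

def et_dual_coefficient(kcb, ke, ks, et0):
    """Adjusted crop ET (ETc_adj) in the FAO-56 dual-coefficient form:
    stress-reduced basal transpiration plus soil evaporation,
    ETc_adj = (Ks*Kcb + Ke) * ET0."""
    return (ks * kcb + ke) * et0
```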

  9. Detecting weather radar clutter using satellite-based nowcasting products

    DEFF Research Database (Denmark)

    Jensen, Thomas B.S.; Gill, Rashpal S.; Overgaard, Søren

    2006-01-01

    This contribution presents the initial results from experiments with detection of weather radar clutter by information fusion with satellite based nowcasting products. Previous studies using information fusion of weather radar data and first generation Meteosat imagery have shown promising results...... for the detecting and removal of clutter. Naturally, the improved spatio-temporal resolution of the Meteosat Second Generation sensors, coupled with its increased number of spectral bands, is expected to yield even better detection accuracies. Weather radar data from three C-band Doppler weather radars...... of the Danish Meteorological Institute has been extracted for cases of severe to moderate cases of land and sea clutter. For comparison, cases of clutter free data has also been analyzed. The satellite-based dataset used is an operational meteorological product developed within the 'Nowcasting Satellite...

  10. Operational Testing of Satellite based Hydrological Model (SHM)

    Science.gov (United States)

    Gaur, Srishti; Paul, Pranesh Kumar; Singh, Rajendra; Mishra, Ashok; Gupta, Praveen Kumar; Singh, Raghavendra P.

    2017-04-01

    Incorporation of the concept of transposability in model testing is one of the prominent ways to check the credibility of a hydrological model. Successful testing ensures the ability of hydrological models to deal with changing conditions, along with their extrapolation capacity. For a newly developed model, a number of questions arise regarding its applicability; therefore, testing the model's credibility is essential to proficiently assess its strengths and limitations. This concept motivates the 'Hierarchical Operational Testing' of the Satellite based Hydrological Model (SHM), a newly developed surface water-groundwater coupled model, under the PRACRITI-2 program initiated by the Space Application Centre (SAC), Ahmedabad. SHM aims at sustainable water resources management using remote sensing data from Indian satellites. It consists of grid cells of 5 km x 5 km resolution and comprises five modules, namely: Surface Water (SW), Forest (F), Snow (S), Groundwater (GW) and Routing (ROU). The SW module (which functions in grid cells with land cover other than forest and snow) estimates surface runoff, soil moisture and evapotranspiration using the NRCS-CN method, a water balance, and the Hargreaves method, respectively. The hydrology of the F module depends entirely on sub-surface processes, and its water balance is calculated accordingly. The GW module generates baseflow (depending on water table variation relative to the level of water in streams) using the Boussinesq equation. The ROU module is grounded on a cell-to-cell routing technique based on the principle of the Time Variant Spatially Distributed Direct Runoff Hydrograph (SDDH) to route the runoff and baseflow generated by the different modules to the outlet. For this study the Subarnarekha river basin, a flood-prone zone of eastern India, has been chosen for the hierarchical operational testing scheme, which includes tests under stationary as well as transitory conditions. For this the basin has been divided into three sub-basins using three flow
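The NRCS-CN runoff step used by the SW module follows the standard curve-number equation; the sketch below is that textbook form, and treating it as SHM's exact implementation would be an assumption:

```python
def nrcs_cn_runoff(p_mm, cn, lambda_ia=0.2):
    """NRCS (SCS) curve-number direct runoff in mm for a storm depth
    p_mm and curve number cn. lambda_ia is the conventional initial
    abstraction ratio (0.2 in the classic formulation)."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = lambda_ia * s            # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```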

  11. On the ionospheric impact of recent storm events on satellite-based augmentation systems in middle and low-latitude sectors

    Science.gov (United States)

    Komjathy, Attila; Sparks, Lawrence; Mannucci, Anthony J.; Pi, Xiaoqing

    2003-01-01

    Ionospheric correction algorithms have been characterized extensively for the mid-latitude region of the ionosphere, where benign conditions usually exist. The United States Federal Aviation Administration's (FAA) Wide Area Augmentation System (WAAS) for civil aircraft navigation is focused primarily on the Conterminous United States (CONUS). Other Satellite-based Augmentation Systems (SBAS) include the European Geostationary Navigation Overlay Service (EGNOS) and the Japanese MTSAT Satellite-based Augmentation System (MSAS). Researchers face a more serious challenge in addressing the ionospheric impact on navigation using SBAS in other parts of the world, such as the South American region or India. At equatorial latitudes, geophysical conditions lead to the so-called Appleton (equatorial) anomaly, which results in significantly larger ionospheric range delays and range delay spatial gradients than are observed in the CONUS or European sectors. In this paper, we use GPS measurements from geomagnetic storm days to perform a quantitative assessment of WAAS-type ionospheric correction algorithms in other parts of the world, such as low-latitude Brazil and mid-latitude Europe. For the study, we access a world-wide network of 400+ dual-frequency GPS receivers.

  12. The Effects of Corrective Feedback on Chinese Learners' Writing Accuracy: A Quantitative Analysis in an EFL Context

    Science.gov (United States)

    Wang, Xin

    2017-01-01

    Scholars debate whether corrective feedback contributes to improving L2 learners' grammatical accuracy in writing performance. Some researchers take a stance on the ineffectiveness of corrective feedback based on the impracticality of providing detailed corrective feedback for all L2 learners and detached grammar instruction in language…

  13. The Future of Satellite-based Lightning Detection

    Science.gov (United States)

    Bocippio, Dennis J.; Christian, Hugh J.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The future of satellite-based optical lightning detection, beyond the end of the current TRMM mission, is discussed. Opportunities for new low-earth orbit missions are reviewed. The potential for geostationary observations is significant; such observations provide order-of-magnitude gains in sampling and data efficiency over existing satellite convective observations. The feasibility and performance (resolution, sensitivity) of geostationary measurements using current technology are discussed. In addition to direct and continuous hemispheric observation of lightning, geostationary measurements have the potential (through data assimilation) to dramatically improve short- and medium-range forecasts, offering benefits to prediction of NOx production and/or vertical transport.

  14. Satellite based wind resource assessment over the South China Sea

    DEFF Research Database (Denmark)

    Badger, Merete; Astrup, Poul; Hasager, Charlotte Bay

    2014-01-01

    modeling to develop procedures and best practices for satellite based wind resource assessment offshore. All existing satellite images from the Envisat Advanced SAR sensor by the European Space Agency (2002-12) have been collected over a domain in the South China Sea. Wind speed is first retrieved from...... description in order to calculate the mean wind climate at different levels up to 100 m. Time series from coarser-resolution satellite wind products i.e. the Special Sensor Microwave Imager (SSM/I) data are used to calculate the long-term temporal variability of the wind climate. This can be used...

  15. Impact of CT attenuation correction method on quantitative respiratory-correlated (4D) PET/CT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Nyflot, Matthew J., E-mail: nyflot@uw.edu [Department of Radiation Oncology, University of Washington, Seattle, Washington 98195-6043 (United States); Lee, Tzu-Cheng [Department of Bioengineering, University of Washington, Seattle, Washington 98195-6043 (United States); Alessio, Adam M.; Kinahan, Paul E. [Department of Radiology, University of Washington, Seattle, Washington 98195-6043 (United States); Wollenweber, Scott D.; Stearns, Charles W. [GE Healthcare, Waukesha, Wisconsin 53188 (United States); Bowen, Stephen R. [Department of Radiation Oncology, University of Washington, Seattle, Washington 98195-6043 and Department of Radiology, University of Washington, Seattle, Washington 98195-6043 (United States)

    2015-01-15

    Purpose: Respiratory-correlated (4D) PET/CT is used to mitigate errors from respiratory motion; however, the optimal CT attenuation correction (CTAC) method for 4D PET/CT is unknown. The authors performed a phantom study to evaluate the quantitative performance of CTAC methods for 4D PET/CT in a ground-truth setting. Methods: A programmable respiratory motion phantom with a custom movable insert designed to emulate a lung lesion and lung tissue was used for this study. The insert was driven by one of five waveforms: two sinusoidal waveforms or three patient-specific respiratory waveforms. 3D PET and 4D PET images of the phantom under motion were acquired and reconstructed with six CTAC methods: helical breath-hold (3DHEL), helical free-breathing (3DMOT), 4D phase-averaged (4DAVG), 4D maximum intensity projection (4DMIP), 4D phase-matched (4DMATCH), and 4D end-exhale (4DEXH) CTAC. Recovery of SUV{sub max}, SUV{sub mean}, SUV{sub peak}, and segmented tumor volume was evaluated as RC{sub max}, RC{sub mean}, RC{sub peak}, and RC{sub vol}, representing percent difference relative to the static ground-truth case. Paired Wilcoxon tests and Kruskal–Wallis ANOVA were used to test for significant differences. Results: For 4D PET imaging, the maximum intensity projection CTAC produced significantly more accurate recovery coefficients than all other CTAC methods (p < 0.0001 over all metrics). Over all motion waveforms, ratios of 4DMIP CTAC recovery were 0.2 ± 5.4, −1.8 ± 6.5, −3.2 ± 5.0, and 3.0 ± 5.9 for RC{sub max}, RC{sub peak}, RC{sub mean}, and RC{sub vol}. In comparison, recovery coefficients for phase-matched CTAC were −8.4 ± 5.3, −10.5 ± 6.2, −7.6 ± 5.0, and −13.0 ± 7.7 for RC{sub max}, RC{sub peak}, RC{sub mean}, and RC{sub vol}. When testing differences between phases over all CTAC methods and waveforms, end-exhale phases were significantly more accurate (p = 0.005). However, these differences were driven by
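The recovery coefficients above are defined as percent differences relative to the static ground-truth acquisition; a one-line helper makes the convention explicit (illustrative only, not the authors' analysis code):

```python
def recovery_coefficient(measured, static_truth):
    """Percent difference of a 4D PET metric (e.g. SUVmax or segmented
    volume) relative to the static ground-truth value; 0 means perfect
    recovery, positive means overestimation."""
    return 100.0 * (measured - static_truth) / static_truth
```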

  16. Satellite-based detection of volcanic sulphur dioxide from recent eruptions in Central and South America

    Directory of Open Access Journals (Sweden)

    D. Loyola

    2008-01-01

    Full Text Available Volcanic eruptions can emit large amounts of rock fragments and fine particles (ash) into the atmosphere, as well as several gases, including sulphur dioxide (SO2). These ejecta and emissions are a major natural hazard, not only to the local population, but also to the infrastructure in the vicinity of volcanoes and to aviation. Here, we describe a methodology to retrieve quantitative information about volcanic SO2 plumes from satellite-borne measurements in the UV/visible spectral range. The combination of a satellite-based SO2 detection scheme and a state-of-the-art 3D trajectory model enables us to confirm the volcanic origin of trace gas signals and to estimate the plume height and the effective emission height. This is demonstrated by case studies for four selected volcanic eruptions in South and Central America, using the GOME, SCIAMACHY and GOME-2 instruments.

  17. [Surveying a zoological facility through satellite-based geodesy].

    Science.gov (United States)

    Böer, M; Thien, W; Tölke, D

    2000-06-01

    In the course of a thesis submitted for a diploma degree at the Fachhochschule Oldenburg, the Serengeti Safaripark was surveyed in autumn and winter 1996/97, laying the planning foundations for licence applications to the controlling authorities. In view of the special way animals are kept in the Serengeti Safaripark (game ranching, spacious walk-through facilities), satellite-based geodesy was chosen for the survey. This technology relies on special antennas receiving signals from 24 satellites circling the globe. Processing the gathered data yields the exact position of the antenna in a coordinate system, which allows that point to be depicted on a map. The procedure was used both in a stationary mode (from a strictly defined point) and in motion (from a moving car). Conventional surveying procedures were used where satellite-based geodesy reached its limits. The result is a detailed map of the Serengeti Safaripark showing the position and size of stables and enclosures, the wood and water areas, and the sectors of the leisure park. Furthermore, the surveyed enclosure areas, together with an existing animal databank, feed into an information system with which the stock of animals can be managed on a per-enclosure basis.

  18. Influence of relativistic effects on satellite-based clock synchronization

    Science.gov (United States)

    Wang, Jieci; Tian, Zehua; Jing, Jiliang; Fan, Heng

    2016-03-01

    Clock synchronization between the ground and satellites is a fundamental issue in future quantum telecommunication, navigation, and global positioning systems. Here, we propose a scheme of near-Earth orbit satellite-based quantum clock synchronization with atmospheric dispersion cancellation by taking into account the spacetime background of the Earth. Two frequency entangled pulses are employed to synchronize two clocks, one at a ground station and the other at a satellite. The time discrepancy of the two clocks is introduced into the pulses by moving mirrors and is extracted by measuring the coincidence rate of the pulses in the interferometer. We find that the pulses are distorted due to effects of gravity when they propagate between the Earth and the satellite, resulting in remarkably affected coincidence rates. We also find that the precision of the clock synchronization is sensitive to the source parameters and the altitude of the satellite. The scheme provides a solution for satellite-based quantum clock synchronization with high precision, which can be realized, in principle, with current technology.

  19. Satellite-based entanglement distribution over 1200 kilometers.

    Science.gov (United States)

    Yin, Juan; Cao, Yuan; Li, Yu-Huai; Liao, Sheng-Kai; Zhang, Liang; Ren, Ji-Gang; Cai, Wen-Qi; Liu, Wei-Yue; Li, Bo; Dai, Hui; Li, Guang-Bing; Lu, Qi-Ming; Gong, Yun-Hong; Xu, Yu; Li, Shuang-Lin; Li, Feng-Zhi; Yin, Ya-Yun; Jiang, Zi-Qing; Li, Ming; Jia, Jian-Jun; Ren, Ge; He, Dong; Zhou, Yi-Lin; Zhang, Xiao-Xiang; Wang, Na; Chang, Xiang; Zhu, Zhen-Cai; Liu, Nai-Le; Chen, Yu-Ao; Lu, Chao-Yang; Shu, Rong; Peng, Cheng-Zhi; Wang, Jian-Yu; Pan, Jian-Wei

    2017-06-16

    Long-distance entanglement distribution is essential for both foundational tests of quantum physics and scalable quantum networks. Owing to channel loss, however, the previously achieved distance was limited to ~100 kilometers. Here we demonstrate satellite-based distribution of entangled photon pairs to two locations separated by 1203 kilometers on Earth, through two satellite-to-ground downlinks with a summed length varying from 1600 to 2400 kilometers. We observed a survival of two-photon entanglement and a violation of Bell inequality by 2.37 ± 0.09 under strict Einstein locality conditions. The obtained effective link efficiency is orders of magnitude higher than that of the direct bidirectional transmission of the two photons through telecommunication fibers. Copyright © 2017, American Association for the Advancement of Science.
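The reported Bell violation of 2.37 ± 0.09 refers to the CHSH parameter, whose local-hidden-variable bound is 2 (and quantum maximum 2√2). A minimal sketch of the CHSH combination; the correlator values in the test are the ideal quantum ones, not the experiment's data:

```python
def chsh_s(e_ab, e_abp, e_apb, e_apbp):
    """CHSH Bell parameter from four measured correlators E(a,b),
    E(a,b'), E(a',b), E(a',b'); |S| > 2 violates the classical bound."""
    return abs(e_ab - e_abp + e_apb + e_apbp)
```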

  20. SU-G-IeP1-06: Estimating Relative Tissue Density From Quantitative MR Images: A Novel Perspective for MRI-Only Heterogeneity Corrected Dose Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Soliman, A; Hashemi, M; Safigholi, H [Sunnybrook Research Institute, Toronto, ON (Canada); Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Tchistiakova, E [Sunnybrook Health Sciences Centre, Toronto, ON (Canada); University of Toronto, Toronto, ON (Canada); Song, W [Sunnybrook Research Institute, Toronto, ON (Canada); Sunnybrook Health Sciences Centre, Toronto, ON (Canada); University of Toronto, Toronto, ON (Canada)

    2016-06-15

    Purpose: To explore the feasibility of extracting relative tissue density from quantitative MRI measurements, and to estimate the correlation between the extracted measures and CT Hounsfield units. Methods: MRI can separate water and fat signals, producing a separate image for each component. By performing appropriate corrections on the separated images, quantitative measurements of water and fat mass density can be estimated. This work tests this hypothesis at 1.5T. Peanut oil was used as fat-representative and agar as water-representative. Gadolinium Chloride III and Sodium Chloride were added to the agar solution to adjust the relaxation times and the medium conductivity, respectively. Peanut oil was added to the agar solution in different percentages: 0%, 3%, 5%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90% and 100%. The phantom was scanned on a 1.5T GE Optima 450W with the body coil using a multi-gradient echo sequence. Water/fat separation was performed while correcting for main field (B0) inhomogeneity and T{sub 2}* relaxation time; B1+ inhomogeneities were ignored. The phantom was subsequently scanned on a Philips Brilliance CT Big Bore. The MR-corrected fat signal from all vials was normalized to the 100% fat signal. CT Hounsfield values were then compared to those obtained from the normalized MR-corrected fat values, as well as to the phantom preparation for validation. Results: Good agreement was found between CT HU and the MR-extracted fat values (R{sup 2} = 0.98). CT HU also showed excellent agreement with the prepared fat fractions (R{sup 2} = 0.99). Vials with 70%, 80%, and 90% fat showed inhomogeneous distributions; their results were nevertheless included for completeness. Conclusion: Quantitative MRI water/fat imaging can potentially be used to extract relative tissue density. Further in-vivo validation is required.
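The R² values quoted in the results are coefficients of determination for a linear relationship between the MR-derived fat fraction and CT HU; a small helper shows the computation (the numbers in the test are made up, not the phantom data):

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear relationship,
    computed as the squared Pearson correlation between x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)
```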

  1. South African Weather Service operational satellite based precipitation estimation technique: applications and improvements

    Directory of Open Access Journals (Sweden)

    E. de Coning

    2011-04-01

    Full Text Available Extreme weather related to heavy or more frequent precipitation events seems to be a likely possibility for the future of our planet. While precipitation measurements can be done by means of rain gauges, the obvious disadvantages of point measurements are driving meteorologists towards remotely sensed precipitation methods. In South Africa, more sophisticated and expensive nowcasting technology such as radar and lightning networks is available, supported by a fairly dense rain gauge network of about 1500 daily gauges. In the rest of southern Africa rainfall measurements are more difficult to obtain. The local version of the Unified Model and the Meteosat Second Generation satellite data are ideal components of precipitation estimation in data-sparse regions such as Africa. In South Africa, hourly accumulations of the Hydroestimator (originally from NOAA/NESDIS) are currently used as a satellite-based precipitation estimator for the South African Flash Flood Guidance system, especially in regions not covered by radar. In this study the Hydroestimator and the stratiform rainfall field from the Unified Model are both bias-corrected and then combined into a new precipitation field. The new product was tested over a two-year period and provides a more accurate and comprehensive input to the Flash Flood Guidance systems in data-sparse southern Africa. Future work will include updating the period over which the bias corrections were calculated.
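Bias correction of a satellite precipitation field against gauges can take many forms; the sketch below uses a simple multiplicative mean-field factor, which is an assumption for illustration — the abstract does not specify the exact SAWS scheme:

```python
def bias_correct(field, sat_at_gauges, gauge_values):
    """Multiplicative bias correction: scale every pixel of the
    satellite field by the ratio of mean gauge rainfall to mean
    satellite rainfall at the gauge locations."""
    factor = sum(gauge_values) / sum(sat_at_gauges)
    return [v * factor for v in field]
```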

  2. Satellite-based emission constraint for nitrogen oxides: Capability and uncertainty

    Science.gov (United States)

    Lin, J.; McElroy, M. B.; Boersma, F.; Nielsen, C.; Zhao, Y.; Lei, Y.; Liu, Y.; Zhang, Q.; Liu, Z.; Liu, H.; Mao, J.; Zhuang, G.; Roozendael, M.; Martin, R.; Wang, P.; Spurr, R. J.; Sneep, M.; Stammes, P.; Clemer, K.; Irie, H.

    2013-12-01

    Vertical column densities (VCDs) of tropospheric nitrogen dioxide (NO2) retrieved from satellite remote sensing have been employed widely to constrain emissions of nitrogen oxides (NOx). A major strength of satellite-based emission constraints is the analysis of emission trends and variability, while a crucial limitation is errors in both the satellite NO2 data and the model simulations relating NOx emissions to NO2 columns. Through a series of studies, we have explored these aspects over China. We separate anthropogenic from natural sources of NOx by exploiting their different seasonality. We infer trends of NOx emissions in recent years and the effects of a variety of socioeconomic events at different spatiotemporal scales, including general economic growth, the global financial crisis, Chinese New Year, and the Beijing Olympics. We further investigate the impact of growing NOx emissions on particulate matter (PM) pollution in China. As part of recent developments, we identify and correct errors in both the satellite NO2 retrieval and the model simulation that ultimately affect the NOx emission constraint. We improve the treatments of aerosol optical effects, clouds and surface reflectance in the NO2 retrieval process, using ground-based MAX-DOAS measurements as a reference to evaluate the improved retrieval results. We analyze the sensitivity of simulated NO2 to errors in the model representation of major meteorological and chemical processes, with a subsequent correction of model bias. Future studies will implement these improvements to re-constrain NOx emissions.

  3. Evaluating the hydrological consistency of satellite based water cycle components

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2016-06-15

    Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, it is generally not commensurate with the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrologically complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget imposes on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
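
The consistency check described above rests on the basin water budget: precipitation minus evaporation minus storage change should approximately equal runoff, which is near zero in closed arid basins. A minimal sketch, with invented monthly values standing in for the satellite retrievals:

```python
import numpy as np

# Illustrative monthly basin averages (mm/month) -- not real satellite data.
P  = np.array([10.0, 5.0, 0.0, 2.0])    # satellite rainfall
E  = np.array([ 8.0, 6.0, 3.0, 1.0])    # evaporation product under evaluation
dS = np.array([ 1.0, -2.0, -3.5, 0.5])  # GRACE terrestrial storage change

# Water-budget residual: in a closed arid basin this should be near zero,
# so persistent nonzero values flag hydrological inconsistency.
residual = P - E - dS
print(residual.round(1))
```

In practice the residual would be compared against the combined retrieval uncertainties before declaring a product inconsistent.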

  4. SAMIRA - SAtellite based Monitoring Initiative for Regional Air quality

    Science.gov (United States)

    Schneider, Philipp; Stebel, Kerstin; Ajtai, Nicolae; Diamandi, Andrei; Horalek, Jan; Nicolae, Doina; Stachlewska, Iwona; Zehner, Claus

    2016-04-01

    Here, we present a new ESA-funded project entitled Satellite based Monitoring Initiative for Regional Air quality (SAMIRA), which aims at improving regional and local air quality monitoring through synergistic use of data from present and upcoming satellites, traditionally used in situ air quality monitoring networks, and output from chemical transport models. Through collaborative efforts in four countries, namely Romania, Poland, the Czech Republic and Norway, all with existing air quality problems, SAMIRA intends to support the involved institutions and associated users in their national monitoring and reporting mandates as well as to generate novel research in this area. Despite considerable improvements in the past decades, Europe is still far from achieving levels of air quality that do not pose unacceptable hazards to humans and the environment. The main concerns in Europe are exceedances of particulate matter (PM), ground-level ozone, benzo(a)pyrene (BaP) and nitrogen dioxide (NO2). While overall sulfur dioxide (SO2) emissions have decreased in recent years, regional concentrations can still be high in some areas. The objectives of SAMIRA are to improve algorithms for the retrieval of hourly aerosol optical depth (AOD) maps from SEVIRI, and to develop robust methods for deriving column and near-surface PM maps for the study area by combining satellite AOD with information from regional models. The benefit to existing monitoring networks (in situ, models, satellite) of combining these datasets using data fusion methods will be tested for satellite-based NO2, SO2, and PM/AOD. Furthermore, SAMIRA will test and apply techniques for downscaling air quality-related EO products to a spatial resolution that is more in line with what is generally required for studying urban and regional scale air quality. This will be demonstrated for a set of study sites that include the capitals of the four countries and the highly polluted areas along the border of Poland and the

  5. Global root zone storage capacity from satellite-based evaporation

    Science.gov (United States)

    Wang-Erlandsson, Lan; Bastiaanssen, Wim G. M.; Gao, Hongkai; Jägermeyr, Jonas; Senay, Gabriel B.; van Dijk, Albert I. J. M.; Guerschman, Juan P.; Keys, Patrick W.; Gordon, Line J.; Savenije, Hubert H. G.

    2016-04-01

    This study presents an "Earth observation-based" method for estimating root zone storage capacity - a critical, yet uncertain parameter in hydrological and land surface modelling. By assuming that vegetation optimises its root zone storage capacity to bridge critical dry periods, we were able to use state-of-the-art satellite-based evaporation data computed with independent energy balance equations to derive gridded root zone storage capacity at global scale. This approach does not require soil or vegetation information, is model independent, and is in principle scale independent. In contrast to a traditional look-up table approach, our method captures the variability in root zone storage capacity within land cover types, including in rainforests where direct measurements of root depths otherwise are scarce. Implementing the estimated root zone storage capacity in the global hydrological model STEAM (Simple Terrestrial Evaporation to Atmosphere Model) improved evaporation simulation overall, and in particular during the least evaporating months in sub-humid to humid regions with moderate to high seasonality. Our results suggest that several forest types are able to create a large storage to buffer for severe droughts (with a very long return period), in contrast to, for example, savannahs and woody savannahs (medium length return period), as well as grasslands, shrublands, and croplands (very short return period). The presented method to estimate root zone storage capacity eliminates the need for poor resolution soil and rooting depth data that form a limitation for achieving progress in the global land surface modelling community.
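
The mass-balance idea behind the method above can be sketched in a few lines: root zone storage capacity is taken as the largest cumulative moisture deficit (evaporation exceeding precipitation) that vegetation must bridge during dry periods. The daily series below are made up purely for illustration.

```python
def root_zone_storage_capacity(P, E):
    """Largest cumulative deficit of E over P (same units as the inputs)."""
    deficit = 0.0
    max_deficit = 0.0
    for p, e in zip(P, E):
        deficit = max(0.0, deficit + e - p)   # deficit grows when E > P, resets when rain refills it
        max_deficit = max(max_deficit, deficit)
    return max_deficit

# Illustrative daily series (mm/day): a dry spell followed by heavy rain.
P = [0.0, 0.0, 5.0, 0.0, 0.0, 20.0]
E = [3.0, 3.0, 3.0, 3.0, 3.0, 3.0]
print(root_zone_storage_capacity(P, E))
```

The paper's actual implementation works with satellite evaporation and precipitation products and accounts for drought return periods; this sketch shows only the core deficit-accumulation logic.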

  6. Development and validation of satellite based estimates of surface visibility

    Science.gov (United States)

    Brunner, J.; Pierce, R. B.; Lenzen, A.

    2015-10-01

    A satellite-based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weather prediction forecasts to National Weather Service Automated Surface Observing System (ASOS) surface visibility measurements. Validation using independent ASOS measurements shows that the GOES-R ABI surface visibility retrieval (V) has an overall success rate of 64.5% for classifying Clear (V ≥ 30 km), Moderate (10 km ≤ V < 30 km), and lower visibility conditions. The GOES-R ABI visibility retrieval can be used to augment measurements from the United States Environmental Protection Agency (EPA) and National Park Service (NPS) Interagency Monitoring of Protected Visual Environments (IMPROVE) network, and provide useful information to the regional planning offices responsible for developing mitigation strategies required under the EPA's Regional Haze Rule, particularly during regional haze events associated with smoke from wildfires.
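
The multiple-linear-regression step can be sketched with synthetic data. The predictors, coefficients, and noise level below are invented for illustration; they are not the operational GOES-R regression.

```python
import numpy as np

# Synthetic "truth": visibility driven by AOD and relative humidity plus noise.
rng = np.random.default_rng(0)
n = 200
aod = rng.uniform(0.0, 1.0, n)           # aerosol optical depth
rh  = rng.uniform(20.0, 100.0, n)        # relative humidity (%)
vis = 40.0 - 25.0 * aod - 0.1 * rh + rng.normal(0.0, 0.5, n)

# Multiple linear regression via least squares: vis ~ b0 + b1*aod + b2*rh
X = np.column_stack([np.ones(n), aod, rh])
coef, *_ = np.linalg.lstsq(X, vis, rcond=None)
print(coef.round(1))   # roughly recovers the assumed [40, -25, -0.1]
```

The retrieved continuous visibility would then be thresholded into the Clear/Moderate/low categories for the skill-score evaluation the abstract describes.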

  7. Correction of hypervolaemic hypernatraemia by inducing negative Na+ and K+ balance in excess of negative water balance: a new quantitative approach.

    Science.gov (United States)

    Nguyen, Minhtri K; Kurtz, Ira

    2008-07-01

    Hypervolemic hypernatremia is caused by an increase in total exchangeable Na+ and K+ in excess of an increment in total body water (TBW). Unlike in patients with hypovolemic or euvolemic hypernatremia, treatment needs to target not only the elevated plasma Na+ concentration but also the increment in TBW, which imposes the additional requirement of achieving a negative H2O balance. Correction of hypervolemic hypernatremia can be attained by ensuring that the negative Na+ and K+ balance exceeds the negative H2O balance. These seemingly conflicting therapeutic goals are typically approached by administering intravenous 5% dextrose (IV D5W) and furosemide. Currently, there is no quantitative approach to predicting the volume of IV D5W (V_IVF) that needs to be administered to satisfy these requirements. Therefore, based on the principle of mass balance and the empirical relationship between exchangeable Na+, K+, TBW, and the plasma Na+ concentration, we have derived a new equation which calculates the volume of IV D5W (V_IVF) needed to lower the plasma Na+ concentration ([Na+]p1) to a targeted level ([Na+]p2) by achieving the desired amount of negative H2O balance (V_MB): V_IVF = {([Na+]p1 + 23.8)(TBW1) - ([Na+]p2 + 23.8)(TBW1 + V_MB) + 1.03([E]input × V_input - [E]output × V_output - [E]urine(V_input - V_output - V_MB))}/(1.03 × [E]urine), where [E] = [Na+ + K+] and input and output refer to non-infusate and non-renal input and output, respectively. This new formula is the first quantitative approach for correcting hypervolemic hypernatremia by achieving negative Na+ and K+ balance in excess of the negative H2O balance.
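
The equation above translates directly into code. The patient values used in the example call are hypothetical, chosen only to exercise the formula; the function itself follows the abstract's expression term by term.

```python
def d5w_volume(na_p1, na_p2, tbw1, v_mb,
               e_input, v_input, e_output, v_output, e_urine):
    """V_IVF from the article's formula.

    [E] = [Na+ + K+]; input/output are non-infusate and non-renal.
    v_mb is the desired net water balance (negative for net loss).
    """
    numerator = ((na_p1 + 23.8) * tbw1
                 - (na_p2 + 23.8) * (tbw1 + v_mb)
                 + 1.03 * (e_input * v_input
                           - e_output * v_output
                           - e_urine * (v_input - v_output - v_mb)))
    return numerator / (1.03 * e_urine)

# Hypothetical patient: lower plasma [Na+] from 160 to 150 mmol/L, TBW 40 L,
# desired net water balance -1 L, no other inputs/outputs, urine [Na+ + K+] 70 mmol/L.
v = d5w_volume(160, 150, 40.0, -1.0, 0, 0, 0, 0, 70)
print(round(v, 2))   # required D5W volume in litres
```

Note the formula returns 0 when no concentration change or water balance is requested, as expected from mass balance.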

  8. Mixed model phase evolution for correction of magnetic field inhomogeneity effects in 3D quantitative gradient echo-based MRI.

    Science.gov (United States)

    Fatnassi, Chemseddine; Boucenna, Rachid; Zaidi, Habib

    2017-07-01

    In 3D gradient echo magnetic resonance imaging (MRI), strong macroscopic field gradients (B0macro) are observed at air/tissue interfaces. At low spatial resolution in particular, these field gradients lead to an apparent increase in intravoxel dephasing and, subsequently, to signal loss or inaccurate R2* estimates. If the strong field gradients are measured, their influence can be removed by postprocessing. Conventional corrections usually assume a linear phase evolution with time. For high macroscopic gradient inhomogeneities near the edge of the brain and at the paranasal sinuses, however, this assumption often breaks down. Herein, we explored a novel model that considers both linear and stochastic dependences of the phase evolution on echo time in the presence of weak and strong macroscopic field inhomogeneities. We tested the performance of the model at large field gradients using simulation, phantom, and human in vivo studies. The performance of the proposed approach was markedly better than that of the standard correction method, providing a correction equivalent to the conventional approach in regions with high signal-to-noise ratio (SNR > 10) while being more robust in regions with low SNR (SNR < 4). The proposed technique shows promise for improving R2* measurements in regions of large susceptibility. Clinical and research applications still require further investigation. © 2017 American Association of Physicists in Medicine.
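
For context, the conventional (linear-phase) correction the paper improves upon can be sketched as follows: a linear through-voxel field gradient modulates the gradient-echo decay by a sinc factor, and ignoring it biases R2* upward. This is the standard textbook model, not the paper's mixed linear/stochastic model, and all numbers are illustrative.

```python
import numpy as np

gamma_dz_G = 30.0                  # assumed total phase spread rate across the voxel (rad/s)
r2s_true = 25.0                    # true R2* (1/s)
te = np.linspace(0.002, 0.04, 12)  # echo times (s)

# Signal with macroscopic dephasing: S(TE) = exp(-R2* TE) * |sinc(theta/2)|,
# theta = gamma_dz_G * TE. np.sinc(x) = sin(pi x)/(pi x), hence the /(2 pi).
sinc_factor = np.abs(np.sinc(gamma_dz_G * te / (2 * np.pi)))
signal = np.exp(-r2s_true * te) * sinc_factor

# Naive mono-exponential (log-linear) fit overestimates R2* ...
r2s_naive = -np.polyfit(te, np.log(signal), 1)[0]

# ... while dividing out the (measured) sinc factor recovers the true value.
corrected = signal / sinc_factor
r2s_corr = -np.polyfit(te, np.log(corrected), 1)[0]
print(round(r2s_naive, 1), round(r2s_corr, 1))
```

The paper's contribution is to replace the purely deterministic phase model implicit in this correction with one that also has a stochastic echo-time dependence, which matters where the field map itself is unreliable (low SNR).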

  9. Satellite-based monitoring of air quality within QUITSAT project

    Science.gov (United States)

    di Nicolantonio, W.

    2009-04-01

    Satellite remote sensing of both trace gas constituents and Particulate Matter (PM) can be profitably exploited in Air Quality (AQ) assessment. The potential role of satellite observations, combined with regional meteorological and Chemical Transport Models (CTM), is highlighted here in the context of air quality monitoring as experienced in the QUITSAT Project over Northern Italy (from 43:09 to 46:39 N, from 6:19 to 14:23 E). QUITSAT (2006-2009) is a pilot project funded by the Italian Space Agency (ASI) in the framework of its institutional priorities for the Natural and Technological disaster management programme. AQ monitoring is in general based on local ground measurements. In recent years, this issue has been placed in a broader framework, in which CTM have joined ground-based data and satellite observations to provide a better characterization of AQ monitoring, forecasting and planning on a regional scale. In particular, two satellite-based products arising from analysis methodologies developed in QUITSAT, relating to the significant pollutants PM2.5 and NO2, are presented in this work. The capability of the MODIS sensors (on the Terra and Aqua NASA platforms) to retrieve Aerosol Optical Properties (AOP) has been used in a semi-empirical approach to estimate PM2.5 content at the ground. At first, PM2.5 concentrations sampled at several sites over Northern Italy are employed to infer AOP-to-PM conversion parameters. A spatial-temporal coincidence procedure has been performed between EO and non-EO data. To take into account the aerosol columnar dispersion and the AOP dependence on relative humidity (RH), meteorological fields (Planetary Boundary Layer and RH) simulated by MM5 are considered.
MODIS aerosol level 2 products (MOD04 and MYD04, collection 5, 10x10 km2 spatial resolution) and PM2.5 samplings performed by Regional Environmental Agencies (ARPA Emilia Romagna and ARPA Lombardia) and carried out at a further 6 measurement sites (located in Milano
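
A semi-empirical AOD-to-PM2.5 conversion of the kind described above can be sketched as follows: divide the columnar AOD by the mixing-layer height and remove hygroscopic growth with an assumed f(RH). The functional form and all coefficients here are invented for the sketch; they are not the QUITSAT parameters.

```python
def pm25_from_aod(aod, pbl_height_m, rh_percent, k=120000.0):
    """Estimate surface PM2.5 (ug/m3) from column AOD.

    k is an assumed site-specific conversion coefficient; the RH growth
    curve below is likewise a placeholder for a fitted relationship.
    """
    f_rh = 1.0 / (1.0 - rh_percent / 100.0) ** 0.6   # assumed hygroscopic growth
    return k * aod / (pbl_height_m * f_rh)

# Example: moderate AOD over a shallow, humid boundary layer.
print(round(pm25_from_aod(aod=0.4, pbl_height_m=800.0, rh_percent=70.0), 1))
```

In the project, the coefficients would be inferred from co-located ARPA PM2.5 samples, with PBL height and RH taken from the MM5 fields.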

  10. Atmospheric Correction of Satellite GF-1/WFV Imagery and Quantitative Estimation of Suspended Particulate Matter in the Yangtze Estuary

    Directory of Open Access Journals (Sweden)

    Pei Shang

    2016-11-01

    Full Text Available The Multispectral Wide Field of View (WFV) camera on the Chinese GF-1 satellite, launched in 2013, has the advantages of high spatial resolution (16 m), short revisit period (4 days) and wide scene swath (800 km) compared to Landsat-8/OLI, which make it an ideal means of monitoring spatial-temporal changes of Suspended Particulate Matter (SPM) in large estuaries like the Yangtze Estuary. However, a lack of proper atmospheric correction methods has limited its application in water quality assessment. We propose an atmospheric correction method based on a look-up table coupling the atmospheric radiative transfer model (6S) and the water semi-empirical radiative transfer (SERT) model to invert water-leaving reflectance from GF-1 top-of-atmosphere radiance, and then retrieve the SPM concentration from the water-leaving reflectance of the Yangtze Estuary and its adjacent sea. Results are validated against Landsat-8/OLI imagery together with autonomous fixed station data, and the influences of human activities (e.g., waterway construction and shipping) on SPM distribution are analyzed.

  11. Effect of physiological heart rate variability on quantitative T2 measurement with ECG-gated Fast Spin Echo (FSE) sequence and its retrospective correction.

    Science.gov (United States)

    de Roquefeuil, Marion; Vuissoz, Pierre-André; Escanyé, Jean-Marie; Felblinger, Jacques

    2013-11-01

    Quantitative T2 measurement is applied in cardiac Magnetic Resonance Imaging (MRI) for the diagnosis and follow-up of myocardial pathologies. Standard Electrocardiogram (ECG)-gated fast spin echo pulse sequences can be used clinically for T2 assessment, with multiple breath-holds. However, heart rate is subject to physiological variability, which causes repetition time variations and affects the recovery of longitudinal magnetization between TR periods. The bias introduced by heart rate variability in quantitative T2 measurements is evaluated for the fast spin echo pulse sequence, and a retrospective correction based on an effective TR is proposed. Heart rate variations during breath-holds were provided by ECG recordings from healthy volunteers. T2 measurements were performed on a phantom with known T2 values, by synchronizing the sequence with the recorded ECG. Cardiac T2 measurements were performed twice on six volunteers. The impact of T1 on T2 is also studied. The maximum error in T2 is 26% for phantoms and 18% for myocardial measurement. It is reduced by the proposed compensation method to 20% for phantoms and 10% for in vivo measurements. Only approximate knowledge of T1 is needed for the T2 correction. Heart rate variability may cause a bias in T2 measurement with ECG-gated FSE; it needs to be taken into account to avoid a misleading diagnosis from the measurements. © 2013.
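
The bias mechanism and the effective-TR correction can be illustrated with a two-point T2 estimate. With ECG gating, TR follows the RR interval, so incomplete T1 recovery scales each acquisition differently; dividing out the saturation factor for the actual TR removes the bias. All numbers are illustrative, and this two-point sketch is a simplification of the paper's method.

```python
import math

T1, T2, M0 = 1000.0, 50.0, 1.0           # ms (assumed tissue values)

def signal(te, tr):
    """Saturation-recovery-weighted spin echo signal model."""
    return M0 * (1 - math.exp(-tr / T1)) * math.exp(-te / T2)

te1, te2 = 10.0, 60.0                     # two echo times (ms)
tr1, tr2 = 900.0, 700.0                   # heart rate changed between acquisitions

s1, s2 = signal(te1, tr1), signal(te2, tr2)
# Naive two-point estimate ignores the TR difference and is biased.
t2_naive = (te2 - te1) / math.log(s1 / s2)

# Retrospective correction: divide out each acquisition's saturation factor,
# which requires only approximate knowledge of T1.
c1 = s1 / (1 - math.exp(-tr1 / T1))
c2 = s2 / (1 - math.exp(-tr2 / T1))
t2_corr = (te2 - te1) / math.log(c1 / c2)
print(round(t2_naive, 1), round(t2_corr, 1))
```

The corrected estimate recovers the true T2 exactly in this noise-free model, while the naive estimate is biased by the TR change.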

  12. Improving satellite-based precipitation products using data assimilation and remotely-sensed soil moisture

    Science.gov (United States)

    Crow, W. T.

    2010-12-01

    Despite their obvious relationship, relatively little attention has been paid to the potential synergism between remotely-sensed surface soil moisture and precipitation products. Recent work in Crow et al. (J. Hydrometeor., 10(1), 199-212, 2009) develops an algorithm for enhancing satellite-based land rainfall products via the assimilation of remotely-sensed surface soil moisture retrievals into a land surface model. As a follow-up to this preliminary work, this presentation will describe the benefits of modifying their original approach to incorporate more complex data assimilation filtering and land surface modeling methodologies. Particular emphasis is placed on alternative data assimilation approaches that allow for a more complex representation of stochastic rainfall errors. Modifications associated with improved 3-day rainfall accumulation estimates are then assembled to create the Soil Moisture Analysis Rainfall Tool (SMART). Results demonstrate that the SMART algorithm is superior to the Crow et al. (2009) baseline algorithm and capable of uniformly improving coarse-scale rainfall accumulation estimates with little risk of degradation. Comparisons with TRMM Multi-satellite Precipitation Analysis (TMPA) data products suggest that the introduction of soil moisture information via SMART provides as much coarse-scale rainfall information as thermal satellite observations, and more information than gauge-based corrections acquired in lightly-instrumented regions like North Africa.
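
The core idea (soil moisture observations correcting rainfall) can be caricatured with a one-bucket model: when observed soil moisture rises more than the satellite-rainfall-forced model predicts, rainfall was underestimated, and vice versa. This is a deliberately crude toy, not the SMART filter; the bucket parameters, gains, and observations are all invented.

```python
def correct_rainfall(rain_sat, sm_obs, sm0, loss=0.1, gain=50.0, k=0.5):
    """Nudge a satellite rainfall series using soil moisture analysis increments.

    loss: fractional daily drainage/evaporation; gain: mm of rain per unit
    soil moisture change; k: nudging gain. All assumed for the toy.
    """
    corrected = []
    sm_model = sm0
    for r, sm in zip(rain_sat, sm_obs):
        sm_model = sm_model * (1 - loss) + r / gain   # simple bucket update
        increment = sm - sm_model                     # analysis increment
        corrected.append(max(0.0, r + k * gain * increment))
        sm_model += k * increment                     # nudge the state too
    return corrected

rain_sat = [0.0, 10.0, 0.0]     # satellite rainfall (mm/day)
sm_obs   = [0.30, 0.42, 0.38]   # retrieved surface soil moisture (m3/m3)
print([round(x, 2) for x in correct_rainfall(rain_sat, sm_obs, sm0=0.30)])
```

A positive increment on a dry day adds missed rainfall; a negative one after a storm trims an overestimate, which is the qualitative behaviour SMART exploits with a proper Kalman-type filter.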

  13. Statistical correction of the Winner's Curse explains replication variability in quantitative trait genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Cameron Palmer

    2017-07-01

    Full Text Available Genome-wide association studies (GWAS) have identified hundreds of SNPs responsible for variation in human quantitative traits. However, genome-wide-significant associations often fail to replicate across independent cohorts, in apparent inconsistency with their strong effects in discovery cohorts. This limited success of replication raises pervasive questions about the utility of the GWAS field. We identify all 332 studies of quantitative traits from the NHGRI-EBI GWAS Database with attempted replication. We find that the majority of studies provide insufficient data to evaluate replication rates. The remaining papers replicate significantly worse than expected (p < 10^-14), even when adjusting for the regression-to-the-mean of effect size between discovery and replication cohorts termed the Winner's Curse (p < 10^-16). We show this is due in part to misreporting replication cohort size as a maximum number rather than a per-locus one. In 39 studies accurately reporting per-locus cohort size for attempted replication of 707 loci in samples with similar ancestry, the replication rate matched expectation (predicted 458, observed 457, p = 0.94). In contrast, ancestry differences between replication and discovery (13 studies, 385 loci) cause the most highly-powered decile of loci to replicate worse than expected, due to differences in linkage disequilibrium.
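
The Winner's Curse the abstract adjusts for is easy to demonstrate by simulation: effect estimates that survive a significance threshold in discovery are inflated on average relative to the true effect, so replication at the same nominal power falls short. The effect size, standard error, and threshold below are illustrative.

```python
import random

random.seed(1)
true_beta, se, z_thresh = 0.05, 0.02, 2.807   # illustrative effect, SE, z cutoff

# Simulate many discovery scans of the same SNP and keep only "significant" ones.
kept = []
for _ in range(200000):
    est = random.gauss(true_beta, se)
    if abs(est / se) > z_thresh:              # passed the discovery threshold
        kept.append(est)

mean_kept = sum(kept) / len(kept)
# The conditional mean of the selected estimates exceeds the true effect.
print(round(mean_kept, 3), true_beta)
```

Correcting for this selection (e.g. via the conditional expectation of a truncated normal) shrinks discovery estimates back toward the truth, which is the adjustment step referenced in the abstract.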

  14. Quantitative X-ray fluorescence computed tomography for low-Z samples using an iterative absorption correction algorithm

    Directory of Open Access Journals (Sweden)

    Rong Huang

    2017-05-01

    Full Text Available X-ray fluorescence computed tomography is often used to measure trace element distributions within low-Z samples, using algorithms capable of X-ray absorption correction when sample self-absorption is not negligible. Its reconstruction is more complicated than that of transmission tomography, and it is therefore not widely used. We describe in this paper a very practical iterative method that uses widely available transmission tomography reconstruction software for fluorescence tomography. With this method, sample self-absorption can be corrected not only for the absorption within the measured layer but also for the absorption by material beyond that layer. By combining tomography with analysis for scanning X-ray fluorescence microscopy, absolute concentrations of trace elements can be obtained. By using widely shared software, we not only minimized coding and took advantage of the computational efficiency of the fast Fourier transform in transmission tomography software, but also gained access to the well-developed data processing tools that come with well-known and reliable software packages. The convergence of the iterations was also carefully studied for fluorescence of different attenuation lengths. As an example, fish eye lenses can provide valuable information about fish life history and the environmental conditions fish have endured. Given the lens's spherical shape and the sometimes short distance from sample to detector needed for detecting low-concentration trace elements, its tomography data are affected by absorption related to material beyond the measured layer but can be reconstructed well with our method. Fish eye lens tomography results are compared with sliced-lens 2D fluorescence mapping, showing good agreement, with tomography providing better spatial resolution.
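
The iterative self-absorption correction can be illustrated with a 1D toy: fluorescence from depth i is attenuated by the material above it, so measured counts underestimate concentration. Starting from the uncorrected estimate, one alternately computes attenuation factors from the current estimate and divides them out of the measurement until the estimate stops changing. The geometry and attenuation coefficient are invented; the real method works on full sinograms with transmission-tomography reconstructions in the loop.

```python
import math

mu_per_unit = 0.3                     # assumed attenuation per unit of concentration
true_c = [1.0, 2.0, 1.5, 0.5]         # true concentrations, surface -> depth

def measure(c):
    """Forward model: counts attenuated by everything shallower than the voxel."""
    out, overburden = [], 0.0
    for ci in c:
        out.append(ci * math.exp(-mu_per_unit * overburden))
        overburden += ci
    return out

meas = measure(true_c)
est = meas[:]                          # iteration 0: uncorrected estimate
for _ in range(50):
    # Attenuation factors implied by the current estimate ...
    atten = [m / c for m, c in zip(measure(est), est)]
    # ... divided out of the actual measurement to update the estimate.
    est = [m / a for m, a in zip(meas, atten)]

print([round(x, 3) for x in est])     # converges to the true concentrations
```

Each pass effectively propagates the correction one layer deeper, which mirrors the convergence behaviour the paper studies as a function of attenuation length.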

  15. Quantitative X-ray fluorescence computed tomography for low-Z samples using an iterative absorption correction algorithm

    Science.gov (United States)

    Huang, Rong; Limburg, Karin; Rohtla, Mehis

    2017-05-01

    X-ray fluorescence computed tomography is often used to measure trace element distributions within low-Z samples, using algorithms capable of X-ray absorption correction when sample self-absorption is not negligible. Its reconstruction is more complicated than that of transmission tomography, and it is therefore not widely used. We describe in this paper a very practical iterative method that uses widely available transmission tomography reconstruction software for fluorescence tomography. With this method, sample self-absorption can be corrected not only for the absorption within the measured layer but also for the absorption by material beyond that layer. By combining tomography with analysis for scanning X-ray fluorescence microscopy, absolute concentrations of trace elements can be obtained. By using widely shared software, we not only minimized coding and took advantage of the computational efficiency of the fast Fourier transform in transmission tomography software, but also gained access to the well-developed data processing tools that come with well-known and reliable software packages. The convergence of the iterations was also carefully studied for fluorescence of different attenuation lengths. As an example, fish eye lenses can provide valuable information about fish life history and the environmental conditions fish have endured. Given the lens's spherical shape and the sometimes short distance from sample to detector needed for detecting low-concentration trace elements, its tomography data are affected by absorption related to material beyond the measured layer but can be reconstructed well with our method. Fish eye lens tomography results are compared with sliced-lens 2D fluorescence mapping, showing good agreement, with tomography providing better spatial resolution.

  16. Groundwater Modelling For Recharge Estimation Using Satellite Based Evapotranspiration

    Science.gov (United States)

    Soheili, Mahmoud; (Tom) Rientjes, T. H. M.; (Christiaan) van der Tol, C.

    2017-04-01

    Groundwater movement is influenced by several factors and processes in the hydrological cycle, of which recharge is of high relevance. Since the amount of water extractable from an aquifer directly relates to the recharge amount, estimation of recharge is a prerequisite of groundwater resources management. Recharge is highly affected by water loss mechanisms, the major one being actual evapotranspiration (ETa). It is, therefore, essential to have a detailed assessment of the impact of ETa on groundwater recharge. The objective of this study was to evaluate how recharge was affected when satellite-based evapotranspiration was used instead of in-situ based ETa in the Salland area, the Netherlands. The Methodology for Interactive Planning for Water Management (MIPWA) model setup, which includes a groundwater model for the northern part of the Netherlands, was used for recharge estimation. The Surface Energy Balance Algorithm for Land (SEBAL) based actual evapotranspiration maps from Waterschap Groot Salland were also used. Comparison of SEBAL-based ETa estimates with in-situ based estimates in the Netherlands showed that the SEBAL estimates were not reliable; as such, the results could not serve for calibrating root zone parameters in the CAPSIM model. The annual cumulative ETa map produced by the model showed that the maximum amount of evapotranspiration occurs in mixed forest areas in the northeast and a portion of the central parts. Estimates ranged from 579 mm to a minimum of 0 mm in the highest elevated areas with woody vegetation in the southeast of the region. Variations in mean seasonal hydraulic head and groundwater level for each layer showed that the hydraulic gradient follows elevation in the Salland area from southeast (maximum) to northwest (minimum) of the region, which depicts the groundwater flow direction. The mean seasonal water balance in the CAPSIM part was evaluated to represent recharge estimation in the first layer. The highest recharge estimated flux was for autumn

  17. Operational Satellite-based Surface Oil Analyses (Invited)

    Science.gov (United States)

    Streett, D.; Warren, C.

    2010-12-01

    During the Deepwater Horizon spill, NOAA imagery analysts in the Satellite Analysis Branch (SAB) issued more than 300 near-real-time satellite-based oil spill analyses. These analyses were used by the oil spill response community for planning, issuing surface oil trajectories and tasking assets (e.g., oil containment booms, skimmers, overflights). SAB analysts used both Synthetic Aperture Radar (SAR) and high resolution visible/near-IR multispectral satellite imagery as well as a variety of ancillary datasets. Satellite imagery used included ENVISAT ASAR (ESA), TerraSAR-X (DLR), COSMO-SkyMed (ASI), ALOS (JAXA), Radarsat (MDA), ENVISAT MERIS (ESA), SPOT (SPOT Image Corp.), ASTER (NASA), MODIS (NASA), and AVHRR (NOAA). Ancillary datasets included ocean current information, wind information, locations of natural oil seeps and a variety of in situ oil observations. The analyses were made available as jpegs, pdfs, shapefiles and KML files, and also on a variety of websites including Geoplatform and ERMA. From the very first analysis issued just 5 hours after the rig sank through the final analysis issued in August, the complete archive is still publicly available on the NOAA/NESDIS website http://www.ssd.noaa.gov/PS/MPS/deepwater.html SAB personnel also served as the Deepwater Horizon International Disaster Charter Project Manager (at the official request of the USGS). The Project Manager's primary responsibility was to acquire and oversee the processing and dissemination of satellite data generously donated by numerous private companies and nations in support of the oil spill response, including some of the imagery described above. SAB has begun to address a number of goals that will improve our routine oil spill response as well as help assure that we are ready for the next spill of national significance. We hope to (1) secure a steady, abundant and timely stream of suitable satellite imagery even in the absence of large-scale emergencies such as

  18. Implementation of National Satellite Based Data Archive (NASABADA) in Turkey

    Science.gov (United States)

    Gökdemir, Orhan; Beşer, Özgür; Sürer, Serdar

    2010-05-01

    NASABADA is a unique platform aiming to develop the main geographical information system and remote sensing layers that are often used in areas such as agriculture, forestry, climate, hydrology, transportation, meteorology, and energy. Its establishment started in the second half of 2009 in Turkey by the Beray Engineering Company, focusing on the Turkey domain as an individual study area. In general, examples of satellite-based data-oriented production are usually on a global scale and focus on a specific satellite. The products in NASABADA, however, are consistent with Turkey's geographical conditions, and they are supported by ground information and a time-series evaluation as well. Moreover, in developing those algorithms the priority is to use national resources and to provide know-how for a national information infrastructure. Using different features of satellite data, blending them with ground data to develop products, and providing the results to end users is one of the main goals of the NASABADA Project. In the first stage of NASABADA, the development of 7 products is planned, but the number of products is intended to reach around 20 in the future. The pioneering 7 products, whose preliminary versions will be published in the near future, are as follows: albedo, snow cover, snow water equivalent, cloud, surface temperature, vegetation indices, and daily sun radiation maps. A unique architectural design and algorithms including fuzzy logic and ANN methods have been used for image processing and automatic analysis of large amounts of data on a high-tech file and web server hardware infrastructure. The final aim of NASABADA is to develop a data infrastructure for optimal access to these huge amounts of observational data by end users, with tools available for online processing so that only the required images, rather than raw data, are gathered. We discuss the development of the NASABADA data infrastructure, its current

  19. Goddard Satellite-Based Surface Turbulent Fluxes Climatology, Yearly Grid V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  20. Goddard Satellite-Based Surface Turbulent Fluxes Climatology, Monthly Grid V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  1. Goddard Satellite-Based Surface Turbulent Fluxes, Daily Grid F10 V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSUREs funded project led by Dr....

  2. Goddard Satellite-Based Surface Turbulent Fluxes Climatology, Seasonal Grid V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  3. Goddard Satellite-Based Surface Turbulent Fluxes, Daily Grid F08 V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSUREs funded project led by Dr....

  4. Quantitation of regional cerebral blood flow corrected for partial volume effect using O-15 water and PET: I. Theory, error analysis, and stereologic comparison

    DEFF Research Database (Denmark)

    Iida, H; Law, I; Pakkenberg, B

    2000-01-01

    Limited spatial resolution of positron emission tomography (PET) can cause significant underestimation in the observed regional radioactivity concentration (so-called partial volume effect or PVE) resulting in systematic errors in estimating quantitative physiologic parameters. The authors have...... in the corresponding ROIs. Reproducibility of the estimated parameters and sensitivity to various error sources were also evaluated. All models tested in the current study yielded PVE-corrected regional CBF values (approximately 0.8 mL x min(-1) x g(-1) for models with a term for gray matter tissue and 0.5 mL x min(-1...... that included two parallel tissue compartments demonstrated better results with regards to the agreement of tissue time-activity curve and the Akaike's Information Criteria. Error sensitivity analysis suggested the model that fits three parameters of the gray matter CBF, the gray matter fraction, and the white...
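The underestimation described in this record has a simple algebraic core: an ROI mean mixes gray- and white-matter concentrations in proportion to their volume fractions, so a known gray-matter fraction lets the gray value be recovered. A toy sketch of that two-compartment idea (the function name and all numbers are illustrative, not the authors' kinetic models):

```python
def pve_correct_gray(observed, f_gray, c_white):
    """Recover the gray-matter value from an ROI mean that mixes gray and
    white matter: observed = f_gray * c_gray + (1 - f_gray) * c_white."""
    return (observed - (1.0 - f_gray) * c_white) / f_gray

# Toy numbers (assumed): true gray CBF 0.8, white 0.2 mL/min/g, 60% gray ROI
observed = 0.6 * 0.8 + 0.4 * 0.2   # 0.56 -- the PVE-biased ROI mean
print(round(pve_correct_gray(observed, 0.6, 0.2), 3))  # -> 0.8
```

The observed value (0.56) underestimates the true gray-matter CBF (0.8), which is exactly the partial volume effect the paper corrects for.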

  5. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    Science.gov (United States)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgy quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out combined correction of plasma temperature and spectral intensity by using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.

  6. Quantitative Evaluation of Atlas-based Attenuation Correction for Brain PET in an Integrated Time-of-Flight PET/MR Imaging System.

    Science.gov (United States)

    Yang, Jaewon; Jian, Yiqiang; Jenkins, Nathaniel; Behr, Spencer C; Hope, Thomas A; Larson, Peder E Z; Vigneron, Daniel; Seo, Youngho

    2017-07-01

    Purpose To assess the patient-dependent accuracy of atlas-based attenuation correction (ATAC) for brain positron emission tomography (PET) in an integrated time-of-flight (TOF) PET/magnetic resonance (MR) imaging system. Materials and Methods Thirty recruited patients provided informed consent in this institutional review board-approved study. All patients underwent whole-body fluorodeoxyglucose PET/computed tomography (CT) followed by TOF PET/MR imaging. With use of TOF PET data, PET images were reconstructed with four different attenuation correction (AC) methods: PET with patient CT-based AC (CTAC), PET with ATAC (air and bone from an atlas), PET with ATACpatientBone (air and tissue from the atlas with patient bone), and PET with ATACboneless (air and tissue from the atlas without bone). For quantitative evaluation, PET mean activity concentration values were measured in 14 1-mL volumes of interest (VOIs) distributed throughout the brain and statistical significance was tested with a paired t test. Results The mean overall difference (±standard deviation) of PET with ATAC compared with PET with CTAC was -0.69 kBq/mL ± 0.60 (-4.0% ± 3.2) (P PET with ATACboneless (-9.4% ± 3.7) was significantly worse than that of PET with ATAC (-4.0% ± 3.2) (P PET with ATACpatientBone (-1.5% ± 1.5) improved over that of PET with ATAC (-4.0% ± 3.2) (P PET/MR imaging achieves similar quantification accuracy to that from CTAC by means of atlas-based bone compensation. However, patient-specific anatomic differences from the atlas cause bone attenuation differences and misclassified sinuses, which result in patient-dependent performance variation of ATAC. © RSNA, 2017 Online supplemental material is available for this article.

  7. Whole-Body PET/MR Imaging: Quantitative Evaluation of a Novel Model-Based MR Attenuation Correction Method Including Bone.

    Science.gov (United States)

    Paulus, Daniel H; Quick, Harald H; Geppert, Christian; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Faul, David; Boada, Fernando; Friedman, Kent P; Koesters, Thomas

    2015-07-01

    In routine whole-body PET/MR hybrid imaging, attenuation correction (AC) is usually performed by segmentation methods based on a Dixon MR sequence providing up to 4 different tissue classes. Because of the lack of bone information with the Dixon-based MR sequence, bone is currently considered as soft tissue. Thus, the aim of this study was to evaluate a novel model-based AC method that considers bone in whole-body PET/MR imaging. The new method ("Model") is based on a regular 4-compartment segmentation from a Dixon sequence ("Dixon"). Bone information is added using a model-based bone segmentation algorithm, which includes a set of prealigned MR image and bone mask pairs for each major body bone individually. Model was quantitatively evaluated on 20 patients who underwent whole-body PET/MR imaging. As a standard of reference, CT-based μ-maps were generated for each patient individually by nonrigid registration to the MR images based on PET/CT data. This step allowed for a quantitative comparison of all μ-maps based on a single PET emission raw dataset of the PET/MR system. Volumes of interest were drawn on normal tissue, soft-tissue lesions, and bone lesions; standardized uptake values were quantitatively compared. In soft-tissue regions with background uptake, the average bias of SUVs in background volumes of interest was 2.4% ± 2.5% and 2.7% ± 2.7% for Dixon and Model, respectively, compared with CT-based AC. For bony tissue, the -25.5% ± 7.9% underestimation observed with Dixon was reduced to -4.9% ± 6.7% with Model. In bone lesions, the average underestimation was -7.4% ± 5.3% and -2.9% ± 5.8% for Dixon and Model, respectively. For soft-tissue lesions, the biases were 5.1% ± 5.1% for Dixon and 5.2% ± 5.2% for Model. The novel MR-based AC method for whole-body PET/MR imaging, combining Dixon-based soft-tissue segmentation and model-based bone estimation, improves PET quantification in whole-body hybrid PET/MR imaging, especially in bony tissue and
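The core of the "Model" method above is adding bone information on top of a Dixon 4-class μ-map. A hedged sketch of that overlay step (the coefficient values at 511 keV and the single constant bone value are illustrative assumptions; the actual method assigns model-derived bone values from prealigned MR/bone-mask pairs per bone):

```python
import numpy as np

# Illustrative linear attenuation coefficients at 511 keV (1/cm); these are
# assumed values, not the vendor's calibration.
MU = {"air": 0.0, "lung": 0.018, "fat": 0.086, "soft": 0.096, "bone": 0.151}

def add_bone_to_dixon(dixon_mu, bone_mask):
    """Return a copy of the Dixon mu-map with voxels inside the registered
    bone mask overwritten by a bone attenuation value."""
    mu = dixon_mu.copy()
    mu[bone_mask] = MU["bone"]
    return mu

dixon = np.full((4, 4), MU["soft"])   # toy soft-tissue-only mu-map
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                 # toy registered bone mask
mu_map = add_bone_to_dixon(dixon, mask)
```

Without this step, bone voxels keep the soft-tissue coefficient, which is the source of the roughly -25% SUV underestimation the study reports for bony tissue.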

  8. Using R for analysing spatio-temporal datasets: a satellite-based precipitation case study

    Science.gov (United States)

    Zambrano-Bigiarini, Mauricio

    2017-04-01

    Increasing computer power and the availability of remote-sensing data measuring different environmental variables have led to unprecedented opportunities for the Earth sciences in recent decades. However, dealing with hundreds or thousands of files, usually in different vector and raster formats and measured at different temporal frequencies, imposes high computational challenges when taking full advantage of all the available data. R is a language and environment for statistical computing and graphics that includes several functions for data manipulation, calculation and graphical display, which are particularly well suited to the Earth sciences. In this work I describe how R was used to exhaustively evaluate seven state-of-the-art satellite-based rainfall estimate (SRE) products (TMPA 3B42v7, CHIRPSv2, CMORPH, PERSIANN-CDR, PERSIANN-CCS-adj, MSWEPv1.1 and PGFv3) over the complex topography and diverse climatic gradients of Chile. First, built-in functions were used to automatically download the satellite images in different raster formats and spatial resolutions and, where necessary, to clip them to the Chilean spatial extent. Second, the raster package was used to read, plot, and conduct an exploratory data analysis on selected files of each SRE product, in order to detect unexpected problems (rotated spatial domains, the order of variables in NetCDF files, etc.). Third, raster was used along with the hydroTSM package to aggregate SRE files into different temporal scales (daily, monthly, seasonal, annual). Finally, the hydroTSM and hydroGOF packages were used to carry out a point-to-pixel comparison between precipitation time series measured at 366 stations and the corresponding grid cell of each SRE. The modified Kling-Gupta index of model performance was used to identify possible sources of systematic errors in each SRE, while five categorical indices (PC, POD, FAR, ETS, fBIAS) were used to assess the ability of each SRE to correctly identify different precipitation intensities.
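The modified Kling-Gupta index decomposes model error into correlation, bias, and variability terms. A Python sketch of the definition (the study itself used R's hydroGOF package; `kge_modified` is an illustrative name):

```python
import numpy as np

def kge_modified(sim, obs):
    """Modified Kling-Gupta efficiency (Kling et al., 2012):
    KGE' = 1 - sqrt((r - 1)^2 + (beta - 1)^2 + (gamma - 1)^2),
    with r the Pearson correlation, beta = mean(sim)/mean(obs) the bias
    ratio, and gamma = CV(sim)/CV(obs) the variability ratio."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    beta = sim.mean() / obs.mean()
    gamma = (sim.std() / sim.mean()) / (obs.std() / obs.mean())
    return 1.0 - np.sqrt((r - 1.0)**2 + (beta - 1.0)**2 + (gamma - 1.0)**2)

obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
print(kge_modified(obs, obs))  # a perfect simulation scores 1.0
```

Because each of the three components is compared to its ideal value of 1, the decomposition makes it easy to tell whether an SRE's error is dominated by timing (r), volume bias (beta), or variability (gamma).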

  9. Correctional Education and the Reduction of Recidivism: A Quantitative Study of Offenders' Educational Attainment and Success upon Re-Entry into Society

    Science.gov (United States)

    Tanguay, Daniel T.

    2014-01-01

    Research has shown that correctional education has long been associated with prison reform, from the early years of Pennsylvania's Eastern State Penitentiary to the modern correctional systems of today. However, as a result of increased prison populations and costs, correctional education leadership has been challenged to validate the need for these…

  10. Satellite-Based actual evapotranspiration over drying semiarid terrain in West-Africa

    NARCIS (Netherlands)

    Schuttemeyer, D.; Schillings, Ch.; Moene, A.F.; Bruin, de H.A.R.

    2007-01-01

    A simple satellite-based algorithm for estimating actual evaporation based on Makkink's equation is applied to a seasonal cycle in 2002 at three test sites in Ghana, West Africa: one location in the humid tropical southern region and two in the drier northern region. The required input for the

  11. Estimating crop yield using a satellite-based light use efficiency model

    DEFF Research Database (Denmark)

    Yuan, Wenping; Chen, Yang; Xia, Jiangzhou

    2016-01-01

    Satellite-based techniques that provide temporally and spatially continuous information over vegetated surfaces have become increasingly important in monitoring the global agriculture yield. In this study, we examine the performance of a light use efficiency model (EC-LUE) for simulating the gross...

  12. Beat-to-beat respiratory motion correction with near 100% efficiency: a quantitative assessment using high-resolution coronary artery imaging☆

    Science.gov (United States)

    Scott, Andrew D.; Keegan, Jennifer; Firmin, David N.

    2011-01-01

    This study quantitatively assesses the effectiveness of retrospective beat-to-beat respiratory motion correction (B2B-RMC) at near 100% efficiency using high-resolution coronary artery imaging. Three-dimensional (3D) spiral images were obtained in a coronary respiratory motion phantom with B2B-RMC and navigator gating. In vivo, targeted 3D coronary imaging was performed in 10 healthy subjects using B2B-RMC spiral and navigator gated balanced steady-state free-precession (nav-bSSFP) techniques. Vessel diameter and sharpness in proximal and mid arteries were used as a measure of respiratory motion compensation effectiveness and compared between techniques. Phantom acquisitions with B2B-RMC were sharper than those acquired with navigator gating (B2B-RMC vs. navigator gating: 1.01±0.02 mm−1 vs. 0.86±0.08 mm−1, PB2B-RMC respiratory efficiency was significantly and substantially higher (99.7%±0.5%) than nav-bSSFP (44.0%±8.9%, PB2B-RMC vs. nav-bSSFP, proximal: 1.00±0.14 mm−1 vs. 1.08±0.11 mm−1, mid: 1.01±0.11 mm−1 vs. 1.05±0.12 mm−1; both P=not significant [ns]). Mid vessel diameters were not significantly different (2.85±0.39 mm vs. 2.80±0.35 mm, P=ns), but proximal B2B-RMC diameters were slightly higher (2.85±0.38 mm vs. 2.70±0.34 mm, PB2B-RMC is less variable and significantly higher than navigator gating. Phantom and in vivo vessel sharpness and diameter values suggest that respiratory motion compensation is equally effective. PMID:21292418

  13. Evaluation of the GPM IMERG satellite-based precipitation products and the hydrological utility

    Science.gov (United States)

    Wang, Zhaoli; Zhong, Ruida; Lai, Chengguang; Chen, Jiachao

    2017-11-01

    Evaluation of the latest generation of satellite-based precipitation products (SPPs) is an essential step before their large-scale use. Taking the Beijiang River Basin as the case study, we used nine statistical evaluation indices and the Variable Infiltration Capacity (VIC) distributed hydrological model to quantitatively evaluate the performance and the hydrological utility of three Global Precipitation Measurement (GPM) Integrated Multi-satellitE Retrievals for GPM (IMERG) products: the near-real-time "Early" run and "Late" run IMERG products (IMERG-E and IMERG-L), and the post-real-time "Final" run IMERG product (IMERG-F), over south China during 2014-2015, with the last-generation Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42-V7 product as comparison. The IMERG-F presents satisfactory accuracy, with a high correlation coefficient (CC = 0.63) and low relative bias (0.92%), while the IMERG-E and IMERG-L perform relatively poorly, featuring low correlation (CC of 0.49 and 0.52, respectively) with the ground observations. All three IMERG products present a notably higher probability of detection (POD, 0.64-0.67) but also a higher false alarm ratio (FAR, ≥ 0.14) than the 3B42-V7. The hydrological simulation under scenario I (model calibrated by the gauge observations) shows that the IMERG-F, with a high Nash-Sutcliffe coefficient of efficiency (NSCE) of 0.742, presents better hydrological performance than the 3B42-V7; the IMERG-E and IMERG-L perform poorly over the whole simulation period, with NSCE lower than 0.35 and relative bias higher than 28%, but perform satisfactorily during the flood season, with notably higher NSCE of 0.750 and 0.733, respectively. The hydrological simulation under scenario II (model calibrated by the 3B42-V7) shows that the performance of all the IMERG products was significantly improved. Generally, the IMERG-F has high accuracy and good hydrological utility, while the
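Two of the measures used in this evaluation have compact standard definitions; a hedged Python sketch (the study's own implementation is not described, and the rain/no-rain threshold here is an assumption):

```python
import numpy as np

def nsce(sim, obs):
    """Nash-Sutcliffe coefficient of efficiency (1.0 = perfect simulation)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def pod_far(sat, gauge, threshold=0.1):
    """Probability of detection and false alarm ratio for rain/no-rain
    events at an intensity threshold (0.1 mm/day assumed for illustration)."""
    sat_rain = np.asarray(sat, float) >= threshold
    obs_rain = np.asarray(gauge, float) >= threshold
    hits = np.sum(sat_rain & obs_rain)
    misses = np.sum(~sat_rain & obs_rain)
    false_alarms = np.sum(sat_rain & ~obs_rain)
    return hits / (hits + misses), false_alarms / (hits + false_alarms)
```

The abstract's pattern of high POD together with high FAR means the IMERG products detect most gauge-observed rain events but also flag rain where the gauges see none.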

  14. CT-based attenuation correction in the calculation of semi-quantitative indices of [{sup 18}F]FDG uptake in PET

    Energy Technology Data Exchange (ETDEWEB)

    Visvikis, D.; Costa, D.C.; Croasdale, I.; Bomanji, J.; Gacinovic, S.; Ell, P.J. [Institute of Nuclear Medicine, Royal Free and University College Medical School, Middlesex Hospital, Mortimer Street, W1T 3AA, London (United Kingdom); Lonn, A.H.R. [GE Medical Systems, Slough (United Kingdom)

    2003-03-01

    The introduction of combined PET/CT systems has a number of advantages, including the utilisation of CT images for PET attenuation correction (AC). The potential advantage compared with existing methodology is less noisy transmission maps within shorter times of acquisition. The objective of our investigation was to assess the accuracy of CT attenuation correction (CTAC) and to study resulting bias and signal to noise ratio (SNR) in image-derived semi-quantitative uptake indices. A combined PET/CT system (GE Discovery LS) was used. Different size phantoms containing variable density components were used to assess the inherent accuracy of a bilinear transformation in the conversion of CT images to 511 keV attenuation maps. This was followed by a phantom study simulating tumour imaging conditions, with a tumour to background ratio of 5:1. An additional variable was the inclusion of contrast agent at different concentration levels. A CT scan was carried out followed by 5 min emission with 1-h and 3-min transmission frames. Clinical data were acquired in 50 patients, who had a CT scan under normal breathing conditions (CTAC{sub nb}) or under breath-hold with inspiration (CTAC{sub insp}) or expiration (CTAC{sub exp}), followed by a PET scan of 5 and 3 min per bed position for the emission and transmission scans respectively. Phantom and patient studies were reconstructed using segmented AC (SAC) and CTAC. In addition, measured AC (MAC) was performed for the phantom study using the 1-h transmission frame. Comparing the attenuation coefficients obtained using the CT- and the rod source-based attenuation maps, differences of 3% and <6% were recorded before and after segmentation of the measured transmission maps. Differences of up to 6% and 8% were found in the average count density (SUV{sub avg}) between the phantom images reconstructed with MAC and those reconstructed with CTAC and SAC respectively. In the case of CTAC, the difference increased up to 27% with the
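The bilinear transformation assessed in this study maps CT numbers to 511 keV attenuation coefficients, with one slope for air-water mixtures and a shallower one for water-bone mixtures (bone attenuates relatively less at 511 keV than at CT energies). A one-dimensional sketch with assumed coefficients, not the calibrated values of the GE Discovery LS:

```python
def hu_to_mu511(hu):
    """Bilinear conversion of a CT number (HU) to linear attenuation (1/cm)
    at 511 keV. Below 0 HU the voxel is treated as an air-water mixture;
    above 0 HU as a water-bone mixture with a reduced slope. Both
    coefficients are illustrative assumptions."""
    MU_WATER = 0.096      # water at 511 keV, 1/cm
    BONE_SLOPE = 5.1e-5   # 1/cm per HU above water (assumed)
    if hu <= 0:
        return max(0.0, MU_WATER * (1000.0 + hu) / 1000.0)
    return MU_WATER + BONE_SLOPE * hu

print(round(hu_to_mu511(-1000), 3))  # air -> 0.0
print(round(hu_to_mu511(0), 3))      # water -> 0.096
```

Contrast agent is precisely the case this piecewise model handles badly: it raises HU without the corresponding bone-like attenuation at 511 keV, consistent with the biases the phantom study investigates.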

  15. Beat-to-beat respiratory motion correction with near 100% efficiency: a quantitative assessment using high-resolution coronary artery imaging.

    Science.gov (United States)

    Scott, Andrew D; Keegan, Jennifer; Firmin, David N

    2011-05-01

    This study quantitatively assesses the effectiveness of retrospective beat-to-beat respiratory motion correction (B2B-RMC) at near 100% efficiency using high-resolution coronary artery imaging. Three-dimensional (3D) spiral images were obtained in a coronary respiratory motion phantom with B2B-RMC and navigator gating. In vivo, targeted 3D coronary imaging was performed in 10 healthy subjects using B2B-RMC spiral and navigator gated balanced steady-state free-precession (nav-bSSFP) techniques. Vessel diameter and sharpness in proximal and mid arteries were used as a measure of respiratory motion compensation effectiveness and compared between techniques. Phantom acquisitions with B2B-RMC were sharper than those acquired with navigator gating (B2B-RMC vs. navigator gating: 1.01±0.02 mm(-1) vs. 0.86±0.08 mm(-1), PB2B-RMC respiratory efficiency was significantly and substantially higher (99.7%±0.5%) than nav-bSSFP (44.0%±8.9%, PB2B-RMC vs. nav-bSSFP, proximal: 1.00±0.14 mm(-1) vs. 1.08±0.11 mm(-1), mid: 1.01±0.11 mm(-1) vs. 1.05±0.12 mm(-1); both P=not significant [ns]). Mid vessel diameters were not significantly different (2.85±0.39 mm vs. 2.80±0.35 mm, P=ns), but proximal B2B-RMC diameters were slightly higher (2.85±0.38 mm vs. 2.70±0.34 mm, PB2B-RMC is less variable and significantly higher than navigator gating. Phantom and in vivo vessel sharpness and diameter values suggest that respiratory motion compensation is equally effective. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. B1 mapping for bias-correction in quantitative T1 imaging of the brain at 3T using standard pulse sequences.

    Science.gov (United States)

    Boudreau, Mathieu; Tardif, Christine L; Stikov, Nikola; Sled, John G; Lee, Wayne; Pike, G Bruce

    2017-12-01

    B1 mapping is important for many quantitative imaging protocols, particularly those that include whole-brain T1 mapping using the variable flip angle (VFA) technique. However, B1 mapping sequences are not typically available on many magnetic resonance imaging (MRI) scanners. The aim of this work was to demonstrate that B1 mapping implemented using standard scanner product pulse sequences can produce B1 (and VFA T1) maps comparable in quality and acquisition time to advanced techniques. Six healthy subjects were scanned at 3.0T. An interleaved multislice spin-echo echo planar imaging double-angle (EPI-DA) B1 mapping protocol, using a standard product pulse sequence, was compared to two alternative methods (actual flip angle imaging, AFI, and Bloch-Siegert shift, BS). Single-slice spin-echo DA B1 maps were used as a reference for comparison (Ref. DA). VFA flip angles were scaled using each B1 map prior to fitting T1; the nominal flip angle case was also compared. The pooled-subject voxelwise correlations (ρ) for B1 maps (BS/AFI/EPI-DA) relative to the reference B1 scan (Ref. DA) were ρ = 0.92/0.95/0.98. VFA T1 correlations using these maps were ρ = 0.86/0.88/0.96, much better than without B1 correction (ρ = 0.53). The relative error for each B1 map (BS/AFI/EPI-DA/Nominal) had 95th percentiles of 5/4/3/13%. Our findings show that B1 mapping implemented using product pulse sequences can provide excellent quality B1 (and VFA T1) maps, comparable to other custom techniques. This fast whole-brain measurement (∼2 min) can serve as an excellent alternative for researchers without access to advanced B1 pulse sequences. Level of Evidence: 1. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1673-1682. © 2017 International Society for Magnetic Resonance in Medicine.
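The double-angle (DA) method used as the reference here has a simple closed form: two images acquired with nominal excitation angles α and 2α satisfy S2/(2·S1) = cos(α_actual), so the relative B1 is α_actual/α_nominal. A sketch under the usual idealized-refocusing assumption (function name and the 60° prescription are illustrative):

```python
import numpy as np

def b1_map_double_angle(s1, s2, alpha_nominal_deg=60.0):
    """Double-angle B1 mapping: relative B1 = alpha_actual / alpha_nominal,
    with alpha_actual = arccos(S2 / (2 * S1))."""
    ratio = np.clip(s2 / (2.0 * s1), -1.0, 1.0)
    alpha_actual = np.degrees(np.arccos(ratio))
    return alpha_actual / alpha_nominal_deg

# Simulated signals for a true flip angle of 66 deg at a 60 deg prescription
true = np.radians(66.0)
s1, s2 = np.sin(true), np.sin(2.0 * true)
print(round(float(b1_map_double_angle(s1, s2)), 3))  # -> 1.1
```

The resulting B1 map (here a scalar, in practice a per-voxel array) is exactly the scale factor applied to the VFA flip angles before T1 fitting.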

  17. Implementing earth observation and advanced satellite based atmospheric sounders for water resource and climate modelling

    DEFF Research Database (Denmark)

    Boegh, E.; Dellwik, Ebba; Hahmann, Andrea N.

    for effective land surface representation in water resource modeling” (2009- 2012). The purpose of the new research project is to develop remote sensing based model tools capable of quantifying the relative effects of site-specific land use change and climate variability at different spatial scales......This paper discusses preliminary remote sensing (MODIS) based hydrological modelling results for the Danish island Sjælland (7330 km2) in relation to project objectives and methodologies of a new research project “Implementing Earth observation and advanced satellite based atmospheric sounders....... For this purpose, a) internal catchment processes will be studied using a Distributed Temperature Sensing (DTS) system, b) Earth observations will be used to upscale from field to regional scales, and c) at the largest scale, satellite based atmospheric sounders and meso-scale climate modelling will be used...

  18. Development and validation of satellite-based estimates of surface visibility

    OpenAIRE

    Brunner, J.; R. B. Pierce; A. Lenzen

    2016-01-01

    A satellite-based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weat...

  19. Simulation of large-scale soil water systems using groundwater data and satellite based soil moisture

    Science.gov (United States)

    Kreye, Phillip; Meon, Günter

    2016-04-01

    Complex concepts for the physically correct depiction of dominant processes in the hydrosphere are increasingly at the forefront of hydrological modelling. Many scientific issues in hydrological modelling demand additional system variables besides simulated runoff, such as groundwater recharge or soil moisture conditions. Models that include soil water simulations are either very simplified or require a high number of parameters. Against this backdrop there is a heightened demand for observations that can be used to calibrate the model. A reasonable integration of groundwater data or remote sensing data into calibration procedures, as well as the identifiability of physically plausible sets of parameters, is subject to research in the field of hydrology. Since such data are often combined with conceptual models, the given interfaces are not suitable for these demands. Furthermore, the application of automated optimisation procedures is generally associated with conceptual models, whose (fast) computing times allow many iterations of the optimisation in an acceptable time frame. One of the main aims of this study is to reduce the discrepancy between scientific and practical applications in the field of hydrological modelling. Therefore, the soil model DYVESOM (DYnamic VEgetation SOil Model) was developed as one of the primary components of the hydrological modelling system PANTA RHEI. DYVESOM's structure provides the required interfaces for calibration against runoff, satellite-based soil moisture and groundwater levels. The model considers spatially and temporally differentiated feedback of the development of the vegetation on the soil system. In addition, small-scale heterogeneities of soil properties (subgrid variability) are parameterized by varying van Genuchten parameters according to distribution functions. Different sets of parameters are operated simultaneously while interacting with each other.
The developed soil model is innovative regarding concept
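The van Genuchten parameters varied in the subgrid parameterization above belong to the standard retention curve relating pressure head to water content. A sketch of that curve (parameter values are generic loam-like assumptions, not DYVESOM's):

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten (1980) water retention curve: volumetric water content
    as a function of pressure head h (negative when unsaturated), with
    residual/saturated contents theta_r/theta_s, alpha in 1/m, h in m,
    and m = 1 - 1/n."""
    if h >= 0.0:
        return theta_s                              # saturated
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)      # effective saturation
    return theta_r + (theta_s - theta_r) * se

# Illustrative loam-like parameters (assumed)
print(van_genuchten_theta(0.0, 0.08, 0.43, 3.6, 1.56))  # saturated -> 0.43
```

Drawing alpha and n per subgrid cell from distribution functions, as the abstract describes, yields an ensemble of such curves rather than a single homogeneous soil response.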

  20. Source mass eruption rate retrieved from satellite-based data using statistical modelling

    Science.gov (United States)

    Gouhier, Mathieu; Guillin, Arnaud; Azzaoui, Nourddine; Eychenne, Julia; Valade, Sébastien

    2015-04-01

    Ash clouds emitted during volcanic eruptions have long been recognized as a major hazard, likely to have dramatic consequences for aircraft, the environment and people. Thus, the International Civil Aviation Organization (ICAO) established nine Volcanic Ash Advisory Centers (VAACs) around the world, whose mission is to forecast the location and concentration of ash clouds over hours to days, using volcanic ash transport and dispersion models (VATDs). Those models use input parameters such as plume height (PH), particle size distribution (PSD), and mass eruption rate (MER), the latter being a key parameter as it directly controls the amount of ash injected into the atmosphere. The MER can be obtained rather accurately from detailed ground deposit studies, but this method does not match the operational requirements of a volcanic crisis. Thus, VAACs use empirical laws to determine the MER from an estimate of the plume height. In some cases, this method can be difficult to apply, either because plume height data are not available or because the uncertainties related to this method are too large. We propose here an alternative method based on satellite data to assess the MER at the source during explosive eruptions. Satellite-based techniques allow the fine-ash cloud loading to be retrieved quantitatively far from the source vent. Those measurements can be carried out in a systematic and real-time fashion, using geostationary satellites in particular. We tested the relationship likely to exist between the amount of fine ash dispersed in the atmosphere and that of coarser tephra deposited on the ground, the sum of both contributions yielding an estimate of the MER. For this purpose we examined in detail 19 eruptions (of known duration) for which both (i) the amount of fine ash dispersed in the atmosphere and (ii) the mass of tephra deposited on the ground have been estimated and published. We combined these data with contextual information that may
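One widely cited example of the plume-height empirical laws the VAACs rely on is the Mastin et al. (2009) relation, H = 2.00 · V^0.241, with H the plume height above the vent (km) and V the dense-rock-equivalent volume flux (m³/s). Inverting it gives the MER; a sketch (the magma density is an assumed round value):

```python
def mer_from_plume_height(h_km, rho_dre=2500.0):
    """Mass eruption rate (kg/s) from plume height via the inverted
    Mastin et al. (2009) fit H = 2.00 * V**0.241; rho_dre is an assumed
    dense-rock-equivalent magma density in kg/m^3."""
    v = (h_km / 2.00) ** (1.0 / 0.241)   # volume flux, m^3/s
    return rho_dre * v

print("%.1e" % mer_from_plume_height(10.0))  # ~2e6 kg/s for a 10 km plume
```

The strong power-law exponent (1/0.241 ≈ 4.1) is why modest plume-height errors translate into large MER uncertainties, motivating the satellite-based alternative this record proposes.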

  1. Education and Public Outreach for the PICASSO-CENA Satellite-Based Research Mission: K-12 Students Use Sun Photometers to Assist Scientists in Validating Atmospheric Data

    Science.gov (United States)

    Robinson, D. Q.

    2001-05-01

    Hampton University, a historically black university, is leading the Education and Public Outreach (EPO) portion of the PICASSO-CENA satellite-based research mission. Currently scheduled for launch in 2004, PICASSO-CENA will use LIDAR (LIght Detection and Ranging) to study earth's atmosphere. The PICASSO-CENA Outreach program works with scientists, teachers, and students to better understand the effects of clouds and aerosols on earth's atmosphere. This program actively involves students nationwide in NASA research by having them obtain sun photometer measurements from their schools and homes for comparison with data collected by the PICASSO-CENA mission. Students collect data from their classroom ground observations and report the data via the Internet. Scientists will use the data from the PICASSO-CENA research and the student ground-truthing observations to improve predictions about climatic change. The two-band passive remote sensing sun photometer is designed for student use as a stand-alone instrument to study atmospheric turbidity or in conjunction with satellite data to provide ground-truthing. The instrument will collect measurements of column optical depth from the ground level. These measurements will not only give the students an appreciation for atmospheric turbidity, but will also provide quantitative correlative information to the PICASSO-CENA mission on ground-level optical depth. Student data obtained in this manner will be sufficiently accurate for scientists to use as ground truth. Thus, students will have the opportunity to be involved with a NASA satellite-based research mission.
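The column optical depth a sun photometer measures follows from the Beer-Lambert law: V = V0 · exp(-τ·m), where V0 is the instrument's extraterrestrial calibration constant and m the airmass. A sketch with assumed instrument readings (the plane-parallel airmass approximation is adequate at moderate solar zenith angles):

```python
import math

def optical_depth(v, v0, solar_zenith_deg):
    """Total column optical depth from a sun photometer reading v, the
    calibration constant v0, and the solar zenith angle, via
    v = v0 * exp(-tau * m) with airmass m ~ 1 / cos(SZA)."""
    m = 1.0 / math.cos(math.radians(solar_zenith_deg))
    return math.log(v0 / v) / m

# Assumed calibration constant V0 = 1.0 V and a reading of 0.7 V at 30 deg
print(round(optical_depth(0.7, 1.0, 30.0), 3))  # -> 0.309
```

In practice V0 is obtained by a Langley calibration (extrapolating readings over a range of airmasses to m = 0), which is itself a feasible classroom exercise.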

  2. Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 4. Operational Description and Qualitative Assessment.

    Science.gov (United States)

    1974-02-01

    The volume presents a description of how the Satellite-Based Advanced Air Traffic Management System (SAATMS) operates and a qualitative assessment of the system. The operational description includes the services, functions, and tasks performed by the...

  3. Using satellite-based measurements to explore spatiotemporal scales and variability of drivers of new particle formation

    Science.gov (United States)

    New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently ...

  4. Differences in estimating terrestrial water flux from three satellite-based Priestley-Taylor algorithms

    Science.gov (United States)

    Yao, Yunjun; Liang, Shunlin; Yu, Jian; Zhao, Shaohua; Lin, Yi; Jia, Kun; Zhang, Xiaotong; Cheng, Jie; Xie, Xianhong; Sun, Liang; Wang, Xuanyu; Zhang, Lilin

    2017-04-01

    Accurate estimates of terrestrial latent heat flux (LE) for different biomes are essential to assess energy, water and carbon cycles. Different satellite-based Priestley-Taylor (PT) algorithms have been developed to estimate LE in different biomes. However, there are still large uncertainties in the LE estimates of the different PT algorithms. In this study, we evaluated differences in estimated terrestrial water flux across biomes for three satellite-based PT algorithms, using ground-observed data from eight eddy covariance (EC) flux towers in China. The results reveal that, relative to the EC measurements, large differences exist in the daily LE estimates of the three PT algorithms across the eight ecosystem types. At the forest (CBS) site, all algorithms demonstrate high performance, with a low root mean square error (RMSE) (less than 16 W/m2) and a high squared correlation coefficient (R2) (more than 0.9). At the village (HHV) site, the ATI-PT algorithm has the lowest RMSE (13.9 W/m2), with a bias of 2.7 W/m2 and an R2 of 0.66. At the irrigated crop (HHM) site, almost all algorithms underestimate LE, indicating that they may not capture wet-soil evaporation through their soil moisture parameterization. In contrast, the SM-PT algorithm shows high values of R2 (comparable to those of ATI-PT and VPD-PT) at most of the other (grass, wetland, desert and Gobi) biomes. There are no obvious differences in seasonal LE estimation using MODIS NDVI and LAI at most sites. However, all meteorological or satellite-based water-related parameters used in the PT algorithms carry uncertainties in optimizing the water constraints. This analysis highlights the need to improve PT algorithms with regard to water constraints.
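All three algorithms share the Priestley-Taylor core, LE = f · α · Δ/(Δ+γ) · (Rn − G), and differ chiefly in how the water constraint f is parameterized (ATI-, SM- or VPD-based). A generic sketch with f reduced to a plain [0, 1] multiplier (function name and constants are illustrative, not any of the compared algorithms):

```python
import math

def priestley_taylor_le(rn, g, t_celsius, alpha_pt=1.26, f_water=1.0):
    """Priestley-Taylor latent heat flux (W/m^2):
    LE = f * alpha * delta / (delta + gamma) * (Rn - G), where Rn is net
    radiation and G the soil heat flux (both W/m^2), and f in [0, 1] is a
    generic stand-in for the algorithms' water constraints."""
    # Slope of the saturation vapour pressure curve (Tetens form, kPa/degC)
    es = 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))
    delta = 4098.0 * es / (t_celsius + 237.3) ** 2
    gamma = 0.066  # psychrometric constant near sea level, kPa/degC
    return f_water * alpha_pt * delta / (delta + gamma) * (rn - g)

print(round(priestley_taylor_le(400.0, 0.0, 20.0)))  # ~ 346 W/m^2
```

Because f multiplies everything else, errors in the water-constraint parameterization scale the whole flux, which is consistent with the underestimation the study reports at the irrigated crop site.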

  5. Quantitative atom column position analysis at the incommensurate interfaces of a (PbS)1.14NbS2 misfit layered compound with aberration-corrected HRTEM

    Energy Technology Data Exchange (ETDEWEB)

    Garbrecht, M., E-mail: mag@technion.ac.il [Microanalysis of Materials, Institute of Materials Science, University of Kiel, 24143 Kiel (Germany); Spiecker, E., E-mail: erdmann.spiecker@ww.uni-erlangen.de [Microanalysis of Materials, Institute of Materials Science, University of Kiel, 24143 Kiel (Germany); Tillmann, K. [Ernst Ruska-Centre and Institute for Solid State Research, Research Centre Juelich GmbH, 52425 Juelich (Germany); Jaeger, W. [Microanalysis of Materials, Institute of Materials Science, University of Kiel, 24143 Kiel (Germany)

    2011-02-15

    Aberration-corrected HRTEM is applied to explore the potential of NCSI contrast imaging for quantitative analysis of the complex atomic structure of misfit layered compounds and their incommensurate interfaces. Using the (PbS)1.14NbS2 misfit layered compound as a model system, it is shown that atom column position analyses at the incommensurate interfaces can be performed with a statistical precision of ±6 pm. The procedure adopted for these studies compares experimental images, taken from compound regions free of defects and interface modulations, with a structure model derived from XRD experiments and with multi-slice image simulations for the corresponding NCSI contrast conditions. The high precision achievable in such experiments is confirmed by a detailed quantitative analysis of the atom column positions at the incommensurate interfaces, proving a tetragonal distortion of the monochalcogenide sublattice. Research Highlights: Quantitative aberration-corrected HRTEM analysis of atomic column positions in the (PbS)1.14NbS2 misfit layered compound reveals a tetragonal distortion of the PbS subsystem. Detailed comparison of multi-slice simulations with the experimental NCSI contrast imaging results leads to high precision (better than 10 pm) in determining atom positions. Precision in obtaining local structural information at the atomic scale is demonstrated that may not be accessible by X-ray and neutron diffraction analysis.

  6. A Novel Sampling Method for Satellite-Based Offshore Wind Resource Estimation

    DEFF Research Database (Denmark)

    Badger, Merete; Badger, Jake; Hasager, Charlotte Bay

    An advantage of satellite-based offshore wind resource assessment is the spatial information gained at high resolution from the satellite imagery. Limitations include the low sampling rate and the fixed acquisition times of satellite data. A new satellite scene is typically available every 3-4 days over … wind class is used to weight the satellite-derived winds for the calculation of wind climatology. The applied selection and analysis of images should improve the reliability of wind climate estimates compared to random selection of images, each given equal weighting.

  7. Satellite-Based Mapping of Ground PM2.5 Concentration Using Generalized Additive Modeling

    OpenAIRE

    Bin Zou; Jingwen Chen; Liang Zhai; Xin Fang; Zhong Zheng

    2016-01-01

    Satellite-based PM2.5 concentration estimation is growing as a popular solution for mapping the PM2.5 spatial distribution, given the insufficient coverage of ground-based monitoring stations. However, such applications usually rest on the simplifying assumption that the influencing factors are linearly correlated with PM2.5 concentrations, even though non-linear mechanisms exist in their interactions. Taking the Beijing-Tianjin-Hebei (BTH) region in China as a case, this study developed a generalized additive model …

  8. Evaluation of Clear Sky Models for Satellite-Based Irradiance Estimates

    Energy Technology Data Exchange (ETDEWEB)

    Sengupta, Manajit [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gotseff, Peter [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    This report describes an intercomparison of three popular broadband clear sky solar irradiance models against measured data, as well as a comparison of satellite-based clear sky model results with measured clear sky data. The authors conclude that one of the popular clear sky models (the Bird clear sky model developed by Richard Bird and Roland Hulstrom) could serve as a more accurate replacement for current satellite-model clear sky estimates. Additionally, analysis of the model results with respect to the input parameters indicates that higher-time-resolution inputs, rather than climatological, annual, or monthly mean values, improve overall clear sky model performance.

  9. Bias in the Cq value observed with hydrolysis probe based quantitative PCR can be corrected with the estimated PCR efficiency value

    NARCIS (Netherlands)

    Tuomi, Jari Michael; Voorbraak, Frans; Jones, Douglas L.; Ruijter, Jan M.

    2010-01-01

    For real-time monitoring of PCR amplification of DNA, quantitative PCR (qPCR) assays use various fluorescent reporters. DNA-binding molecules and hybridization reporters (primers and probes) fluoresce only when bound to DNA and result in a non-cumulative increase in observed fluorescence. …
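The correction named in the record's title rests on the exponential model of PCR: the fluorescence threshold is reached after Cq cycles of amplification at efficiency E, so the relative starting amount is N0 = N_threshold / E^Cq. Assuming perfect doubling (E = 2) when the true, estimated efficiency is lower biases the result, and substituting the estimated E removes that bias. A minimal sketch under those assumptions (function names are ours):

```python
def starting_concentration(cq, efficiency, n_threshold=1.0):
    """Back-calculate the relative starting amount N0 from a quantification
    cycle Cq, using N0 = N_threshold / E**Cq (E = 2 means perfect doubling)."""
    return n_threshold / efficiency ** cq

def fold_change(cq_target, e_target, cq_ref, e_ref):
    """Efficiency-corrected expression ratio of a target assay relative to a
    reference assay, each evaluated with its own estimated efficiency."""
    return (starting_concentration(cq_target, e_target)
            / starting_concentration(cq_ref, e_ref))
```

For example, a Cq of 10 at E = 2 corresponds to a 1/1024 relative starting amount; evaluating the same Cq at E = 1.9 gives a noticeably different value, which is the bias the paper describes.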

  10. Correction for Partial Volume Effect Is a Must, Not a Luxury, to Fully Exploit the Potential of Quantitative PET Imaging in Clinical Oncology

    DEFF Research Database (Denmark)

    Alavi, Abass; Werner, Thomas J; Høilund-Carlsen, Poul Flemming

    2017-01-01

    The partial volume effect (PVE) is considered one of the major degrading factors impacting image quality and hampering the accuracy of quantitative PET imaging in clinical oncology. This effect is the consequence of the limited spatial resolution of whole-body PET scanners, which results …

  12. MT and spillover correction for quantitative steady-state pulsed CEST-MRI at 3T using the reciprocal Z-spectrum

    CERN Document Server

    Zaiss, Moritz; Bachert, Peter

    2013-01-01

    Endogenous chemical exchange saturation transfer (CEST) effects are always diluted by competing effects such as direct water proton saturation (spillover) and macromolecular magnetization transfer (MT). This leads to T2- and MT-shine-through effects in the actual biochemical contrast of CEST. A simple evaluation algorithm that corrects the CEST signal was therefore sought. By employing a recent eigenspace theory valid for spinlock and continuous wave (cw) CEST, we predict that the inverse Z-spectrum is beneficial compared to the Z-spectrum itself. Based on this we propose a new spillover- and MT-corrected magnetization transfer ratio (MTRRex) yielding Rex, the exchange-dependent relaxation rate in the rotating frame. For verification, the amine proton exchange of creatine in solutions with different agar concentrations was studied experimentally at the clinical field strength of 3 T. In contrast to the standard evaluation for pulsed CEST experiments, MTRasym, our approach shows no T2 or MT shine-through effect. …
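The contrast between the two evaluations can be sketched from the normalized Z-spectrum values at the label (+Δω) and reference (-Δω) offsets: conventional asymmetry analysis subtracts them directly, while the proposed metric subtracts their reciprocals, which is what removes the multiplicative spillover/MT dilution. A minimal sketch (function and variable names are ours, and the convention 0 < Z ≤ 1 is assumed):

```python
import numpy as np

def mtr_asym(z_label, z_ref):
    """Conventional asymmetry analysis: MTR_asym = Z(-dw) - Z(+dw)."""
    return np.asarray(z_ref, float) - np.asarray(z_label, float)

def mtr_rex(z_label, z_ref):
    """Spillover/MT-corrected metric built on the reciprocal Z-spectrum:
    MTR_Rex = 1/Z(+dw) - 1/Z(-dw), with Z(+dw) < Z(-dw) for a CEST effect."""
    z_label = np.asarray(z_label, float)
    z_ref = np.asarray(z_ref, float)
    return 1.0 / z_label - 1.0 / z_ref
```

Because the dilution acts multiplicatively on Z, it cancels in the difference of reciprocals but not in the plain difference, which is the qualitative point of the record's evaluation.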

  13. Satellite-based assessment of yield variation and its determinants in smallholder African systems.

    Science.gov (United States)

    Burke, Marshall; Lobell, David B

    2017-02-28

    The emergence of satellite sensors that can routinely observe millions of individual smallholder farms raises possibilities for monitoring and understanding agricultural productivity in many regions of the world. Here we demonstrate the potential to track smallholder maize yield variation in western Kenya, using a combination of 1-m Terra Bella imagery and intensive field sampling on thousands of fields over two years. We find that agreement between satellite-based and traditional field-survey-based yield estimates depends significantly on the quality of the field-based measures, with agreement highest (R2 up to 0.4) when using precise field measures of plot area and when using larger fields, for which rounding errors are smaller. We further show that satellite-based measures are able to detect positive yield responses to fertilizer and hybrid seed inputs and that the inferred responses are statistically indistinguishable from estimates based on survey-based yields. These results suggest that high-resolution satellite imagery can be used to make predictions of smallholder agricultural productivity that are roughly as accurate as the survey-based measures traditionally used in research and policy applications, and they indicate a substantial near-term potential to quickly generate useful datasets on productivity in smallholder systems, even with minimal or no field training data. Such datasets could rapidly accelerate learning about which interventions in smallholder systems have the most positive impact, thus enabling more rapid transformation of rural livelihoods.

  14. An intercomparison and validation of satellite-based surface radiative flux estimates over the Arctic

    Science.gov (United States)

    Riihelä, Aku; Key, Jeffrey; Fokke Meirink, Jan; Kuipers Munneke, Peter; Palo, Timo; Karlsson, Karl-Göran

    2017-04-01

    Accurate determination of radiative energy fluxes over the Arctic is of crucial importance for understanding atmosphere-surface interactions, melt and refreezing cycles of the snow and ice cover, and the role of the Arctic in the global energy budget. Satellite-based estimates can provide comprehensive spatiotemporal coverage, but the accuracy and comparability of the existing datasets must be ascertained to facilitate their use. Here we compare radiative flux estimates from CERES SYN/EBAF, GEWEX SRB and our own experimental Fluxnet-CLARA data against in situ observations over Arctic sea ice and the Greenland Ice Sheet during the summer of 2007. In general, CERES SYN1deg flux estimates agree best with in situ measurements, although with two particular limitations: (1) over sea ice, the upwelling shortwave flux in CERES SYN1deg appears to be underestimated because of an underestimated surface albedo, and (2) the CERES SYN1deg upwelling longwave flux over sea ice saturates during midsummer. The AVHRR-based GEWEX and Fluxnet-CLARA flux estimates generally show a larger range in retrieval errors relative to CERES, with contrasting tendencies relative to each other. The largest source of retrieval error in the Fluxnet-CLARA downwelling shortwave flux is shown to be an overestimated cloud optical thickness. The results illustrate that satellite-based flux estimates over the Arctic are not yet homogeneous and further efforts are necessary to investigate the differences in the surface and cloud properties which lead to disagreements in flux retrievals.

  15. Coordinated ground-based and geosynchronous satellite-based measurements of auroral pulsations

    Energy Technology Data Exchange (ETDEWEB)

    Suszcynsky, David M.; Borovsky, Joseph E.; Thomsen, Michelle F.; McComas, David J.; Belian, Richard D.

    1996-09-01

    We describe a technique that uses a ground-based all-sky video camera and geosynchronous satellite-based plasma and energetic particle detectors to study ionosphere-magnetosphere coupling as it relates to the aurora. The video camera system was deployed in Eagle, Alaska for a seven-month period at the foot of the magnetic field line that threads geosynchronous satellite 1989-046. Since 1989-046 corotates with the earth, its footprint remains nearly fixed in the vicinity of Eagle, allowing routine continuous monitoring of an auroral field line at its intersections with the ground and with geosynchronous orbit. As an example of the utility of this technique, we present coordinated ground-based and satellite-based observations during periods of auroral pulsations and compare these data to the predictions of both the relaxation oscillator theory and the flow cyclotron maser theory for the generation of pulsating aurorae. The plasma and energetic particle characteristics observed at geosynchronous orbit during pulsating aurora displays are found to be in agreement with the predictions of both theories, lending further support to the idea that a cyclotron resonance mechanism is responsible for auroral pulsations.

  17. Ecological change, sliding baselines and the importance of historical data: lessons from combining observational and quantitative data on a temperate reef over 70 years.

    Science.gov (United States)

    Gatti, Giulia; Bianchi, Carlo Nike; Parravicini, Valeriano; Rovere, Alessio; Peirano, Andrea; Montefalcone, Monica; Massa, Francesco; Morri, Carla

    2015-01-01

    Understanding the effects of environmental change on ecosystems requires the identification of baselines that may act as reference conditions. However, the continuous change of these references challenges our ability to define the true natural status of ecosystems. The so-called sliding baseline syndrome can be overcome through the analysis of quantitative time series, which are, however, extremely rare. Here we show how combining historical quantitative data with descriptive 'naturalistic' information arranged in a chronological chain allows highlighting long-term trends and can be used to inform present conservation schemes. We analysed the long-term change of a coralligenous reef, a marine habitat endemic to the Mediterranean Sea. The coralligenous assemblages of Mesco Reef (Ligurian Sea, NW Mediterranean) have been studied, although discontinuously, since 1937, thus making available both detailed descriptive information and scanty quantitative data: while the former was useful for understanding the natural history of the ecosystem, the analysis of the latter was of paramount importance to provide a formal measure of change over time. Epibenthic assemblages remained comparatively stable until the 1990s, when species replacement, invasion by alien algae, and biotic homogenisation occurred within a few years, leading to a new and completely different ecosystem state. The shift experienced by the coralligenous assemblages of Mesco Reef was probably induced by a combination of seawater warming and local human pressures, the latter mainly resulting in increased water turbidity; in turn, cumulative stress may have favoured the establishment of alien species. This study showed that the combined analysis of quantitative and descriptive historical data represents precious knowledge for understanding ecosystem trends over time and helps identify baselines for ecological management.

  18. Quantitative carotid PET/MR imaging: clinical evaluation of MR-Attenuation correction versus CT-Attenuation correction in (18)F-FDG PET/MR emission data and comparison to PET/CT.

    Science.gov (United States)

    Bini, Jason; Robson, Philip M; Calcagno, Claudia; Eldib, Mootaz; Fayad, Zahi A

    2015-01-01

    Current PET/MR systems employ segmentation of MR images and subsequent assignment of empirical attenuation coefficients for quantitative PET reconstruction. In this study we examine the differences in the quantification of (18)F-FDG uptake in the carotid arteries between PET/MR and PET/CT scanners. Five comparisons were performed to assess differences in PET quantification: i) PET/MR MR-based AC (MRAC) versus PET/MR CTAC, ii) PET/MR MRAC versus PET/CT, iii) PET/MR MRAC with carotid coil versus PET/MR MRAC without coil, iv) PET/MR MRAC scan 2 versus PET/MR MRAC scan 1, and v) PET/MR CTAC versus PET/CT. Standardized uptake value (SUV) mean and SUV maximum were calculated for six regions of interest: left and right carotid arteries, left and right lungs, spine and muscle. Pearson's correlation and Bland-Altman plots were used to compare SUV mean and maximum within each ROI of each patient. PET/MR emission data reconstructed with MRAC versus with CTAC had percent differences of SUV mean ranging from -2.0% (absolute difference, -0.02) to 7.4% (absolute difference, 0.06). Percent differences within the carotid arteries agreed well, with differences of SUV mean of 5.4% (absolute difference, 0.07) in the left carotid and 2.7% (absolute difference, 0.03) in the right carotid. Pearson's correlation and Bland-Altman analysis of PET/MR with MRAC versus PET/MR with CTAC showed high correlation between SUV means (R2=0.80, mean difference 0.03 ± 0.18 SUV, p=0.3382), demonstrating excellent agreement within the ROIs analyzed. The results of this study support the use of (18)F-FDG PET/MR for quantitative measurement of inflammation in the carotid arteries.
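The comparisons in this record reduce to simple arithmetic on paired SUV measurements: a percent difference relative to the CTAC reference, and a Bland-Altman bias with limits of agreement. A minimal sketch (function names are ours, not from the paper):

```python
import numpy as np

def percent_difference(suv_mrac, suv_ctac):
    """Percent difference of SUV between the two attenuation corrections,
    relative to the CTAC reference value."""
    a = np.asarray(suv_mrac, float)
    b = np.asarray(suv_ctac, float)
    return 100.0 * (a - b) / b

def bland_altman(a, b):
    """Bland-Altman summary: mean difference (bias) and the 1.96-SD
    limits of agreement for two paired measurement series."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = float(diff.mean())
    half_width = 1.96 * float(diff.std(ddof=1))  # sample SD
    return bias, (bias - half_width, bias + half_width)
```

For instance, an MRAC SUV mean of 1.07 against a CTAC value of 1.00 gives the 7% scale of difference reported above.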

  1. Quantitative defect in staphylococcal enterotoxin A binding and presentation by HLA-DM-deficient T2.Ak cells corrected by transfection of HLA-DM genes.

    Science.gov (United States)

    Albert, L J; Denzin, L K; Ghumman, B; Bangia, N; Cresswell, P; Watts, T H

    1998-01-10

    HLA-DM facilitates peptide acquisition by MHC class II proteins within the endosomes of APC by facilitating the release of invariant chain peptide intermediates (CLIP) from the class II molecules. T2 cells have a deletion in the MHC class II region that removes the HLA-DM and MHC class II genes. T2 cells transfected with MHC class II proteins are defective in protein presentation, a defect that is corrected by HLA-DM transfection. Here we show that T2 cells transfected with Ak are also impaired in binding and presentation of the superantigen staphylococcal enterotoxin A (SEA) and that HLA-DM transfection corrects this defect. The poor ability of SEA to bind to Ak on DM-deficient cells is somewhat surprising, since Ak has a low affinity for CLIP and is not predominantly occupied with CLIP on T2 cells compared with wild-type APC. These data suggest an influence of HLA-DM on the structure or composition of the Ak/peptide complex beyond its role in the release of invariant chain peptides.

  2. Assessing the performance of satellite-based precipitation products over the Mediterranean region

    Science.gov (United States)

    Xaver, Angelika; Dorigo, Wouter; Brocca, Luca; Ciabatta, Luca

    2017-04-01

    Detailed knowledge about the spatial and temporal patterns and quantities of precipitation is of high importance. This applies especially in the Mediterranean region, where water demand for agricultural, industrial and touristic needs is growing and climate projections foresee a decrease in precipitation amounts and an increase in variability. In this region, ground-based rain gauges are available only in limited numbers, particularly in northern Africa and the Middle East, and fail to capture the high spatio-temporal variability of precipitation over large areas. This has motivated the development of a large number of remote sensing products for monitoring rainfall. Satellite-based precipitation products are based on various observation principles and retrieval approaches, i.e. from thermal infra-red and microwaves. Although many individual validation studies on the performance of these precipitation datasets exist, they mostly examine only one or a few of these rainfall products at a time and are not targeted at the Mediterranean basin as a whole. Here, we present an extensive comparative study of seven different satellite-based precipitation products, namely CMORPH 30-minutes, CMORPH 3-hourly, GPCP, PERSIANN, SM2Rain CCI, TRMM TMPA 3B42, and TRMM TMPA 3B42RT, focusing on the whole Mediterranean region and on individual Mediterranean catchments. The time frame of investigation is restricted by the common availability of all precipitation products and covers the period 2000-2013. We assess the skill of the satellite products against gridded gauge-based data provided by GPCC and E-OBS. Apart from common characteristics like biases and temporal correlations, we evaluate several sophisticated dataset properties that are of particular interest for Mediterranean hydrology, including the capability of the remotely sensed products to capture extreme events and trends. A clear seasonal dependency of the correlation results can be observed for the whole Mediterranean …
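The "common characteristics" mentioned above (biases and temporal correlations) amount to standard skill scores of a satellite product against gauge-based reference rainfall. A minimal sketch (function name and dictionary keys are illustrative):

```python
import numpy as np

def validation_scores(sat, gauge):
    """Mean bias, RMSE, and Pearson correlation of a satellite rainfall
    series against a collocated gauge-based reference series."""
    sat = np.asarray(sat, float)
    gauge = np.asarray(gauge, float)
    return {
        "bias": float(np.mean(sat - gauge)),          # systematic offset
        "rmse": float(np.sqrt(np.mean((sat - gauge) ** 2))),
        "r": float(np.corrcoef(sat, gauge)[0, 1]),    # temporal correlation
    }
```

Computing these per season, per product, and per catchment reproduces the kind of seasonal-dependency analysis the study describes.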

  3. Examining the utility of satellite-based wind sheltering estimates for lake hydrodynamic modeling

    Science.gov (United States)

    Van Den Hoek, Jamon; Read, Jordan S.; Winslow, Luke A.; Montesano, Paul; Markfort, Corey D.

    2015-01-01

    Satellite-based measurements of vegetation canopy structure have been in common use for the last decade but have never been used to estimate the canopy's impact on wind sheltering of individual lakes. Wind sheltering is caused by slower winds in the wake of topography and shoreline obstacles (e.g. forest canopy) and influences heat loss and the flux of wind-driven mixing energy into lakes, which control lake temperatures and indirectly structure lake ecosystem processes, including carbon cycling and thermal habitat partitioning. Lakeshore wind sheltering has often been parameterized by lake surface area, but such empirical relationships are based only on forested lakeshores and overlook the contributions of local land cover and terrain to wind sheltering. This study is the first to examine the utility of satellite imagery-derived broad-scale estimates of wind sheltering across a diversity of land covers. Using 30 m spatial resolution ASTER GDEM2 elevation data, the mean sheltering height, hs, being the combination of local topographic rise and canopy height above the lake surface, is calculated within 100 m-wide buffers surrounding 76,000 lakes in the U.S. state of Wisconsin. Uncertainty of GDEM2-derived hs is compared to SRTM-, high-resolution G-LiHT lidar-, and ICESat-derived estimates of hs; the respective influences of land cover type and buffer width on hs are examined; and the effect of including satellite-based hs on the accuracy of a statewide lake hydrodynamic model is discussed. Though GDEM2 hs uncertainty was comparable to or better than other satellite-based measures of hs, its higher spatial resolution and broader spatial coverage allowed more lakes to be included in modeling efforts. GDEM2 was shown to offer superior utility for estimating hs compared to other satellite-derived data, but was limited by its consistent underestimation of hs, inability to detect within-buffer hs variability, and differing accuracy across land cover types. Nonetheless …
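The sheltering metric described, hs as the combination of local topographic rise above the lake surface and canopy height within the 100 m shoreline buffer, can be sketched as follows (function name and flat-list inputs are illustrative; real buffers would be extracted from DEM and canopy rasters):

```python
import numpy as np

def mean_sheltering_height(dem_buffer, canopy_buffer, lake_elevation):
    """Mean sheltering height hs (m) within a shoreline buffer: local
    topographic rise above the lake surface (clipped at zero) plus
    canopy height at each buffer cell."""
    dem = np.asarray(dem_buffer, float)
    canopy = np.asarray(canopy_buffer, float)
    rise = np.maximum(dem - lake_elevation, 0.0)  # terrain below lake level shelters nothing
    return float(np.mean(rise + canopy))
```

For example, two buffer cells at 301 m and 302 m around a 300 m lake, carrying 10 m and 20 m of canopy, give hs = 16.5 m.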

  4. Quantitative evaluation of automated skull-stripping methods applied to contemporary and legacy images: effects of diagnosis, bias correction, and slice location

    DEFF Research Database (Denmark)

    Fennema-Notestine, Christine; Ozyurt, I Burak; Clark, Camellia P

    2006-01-01

    Performance of automated methods to isolate brain from nonbrain tissues in magnetic resonance (MR) structural images may be influenced by MR signal inhomogeneities, type of MR image set, regional anatomy, and age and diagnosis of subjects studied. The present study compared the performance of four methods: Brain Extraction Tool (BET; Smith [2002]: Hum Brain Mapp 17:143-155); 3dIntracranial (Ward [1999] Milwaukee: Biophysics Research Institute, Medical College of Wisconsin; in AFNI); a Hybrid Watershed algorithm (HWA, Segonne et al. [2004] Neuroimage 22:1060-1075; in FreeSurfer); and Brain Surface Extractor (BSE, Sandor and Leahy [1997] IEEE Trans Med Imag 16:41-54; Shattuck et al. [2001] Neuroimage 13:856-876) to manually stripped images. The methods were applied to uncorrected and bias-corrected datasets; Legacy and Contemporary T1-weighted image sets; and four diagnostic groups (depressed …

  5. Engineering satellite-based navigation and timing global navigation satellite systems, signals, and receivers

    CERN Document Server

    Betz, J

    2016-01-01

    This book describes the design and performance analysis of satnav systems, signals, and receivers. It also provides succinct descriptions and comparisons of all the world's satnav systems. Its comprehensive and logical structure addresses all satnav signals and systems in operation and in development. Engineering Satellite-Based Navigation and Timing: Global Navigation Satellite Systems, Signals, and Receivers provides the technical foundation for designing and analyzing satnav signals, systems, and receivers. Its contents and structure address all satnav systems and signals: legacy, modernized, and new. It combines qualitative information with detailed techniques and analyses, providing a comprehensive set of insights and engineering tools for this complex multidisciplinary field. Part I describes system and signal engineering, including orbital mechanics and constellation design, signal design principles and underlying considerations, link budgets, quantifying receiver performance in interference, and e…

  6. Ground-and satellite-based evidence of the biophysical mechanisms behind the greening Sahel

    DEFF Research Database (Denmark)

    Brandt, Martin Stefan; Mbow, Cheikh; Diouf, Abdoul A.

    2015-01-01

    … remain speculative. Our aim is to bridge these gaps and give specifics on the biophysical background factors of the re-greening Sahel. Therefore, a trend analysis was applied to long time series (1987-2013) of satellite-based vegetation and rainfall data, as well as to ground observations of leaf biomass of woody species, herb biomass, and woody species abundance in different ecosystems located in the Sahel zone of Senegal. We found that the positive trend observed in satellite vegetation time series (+36%) is caused by an increment of in situ measured biomass (+34%), which is highly controlled … We conclude that the observed greening in the Senegalese Sahel is primarily related to an increasing tree cover that caused satellite-driven vegetation indices to increase with rainfall reversal.

  7. Bias adjustment of satellite-based precipitation estimation using gauge observations: A case study in Chile

    Science.gov (United States)

    Yang, Zhongwen; Hsu, Kuolin; Sorooshian, Soroosh; Xu, Xinyi; Braithwaite, Dan; Verbist, Koen M. J.

    2016-04-01

    Satellite-based precipitation estimates (SPEs) are promising alternative precipitation data for climatic and hydrological applications, especially for regions where ground-based observations are limited. However, existing satellite-based rainfall estimations are subject to systematic biases. This study aims to adjust the biases in the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Cloud Classification System (PERSIANN-CCS) rainfall data over Chile, using gauge observations as reference. A novel bias adjustment framework, termed QM-GW, is proposed based on the nonparametric quantile mapping approach and a Gaussian weighting interpolation scheme. The PERSIANN-CCS precipitation estimates (daily, 0.04°×0.04°) over Chile are adjusted for the period 2009-2014. The historical data (satellite and gauge) for 2009-2013 are used to calibrate the methodology; nonparametric cumulative distribution functions of satellite and gauge observations are estimated at every 1°×1° box region. One year (2014) of gauge data was used for validation. The results show that the biases of the PERSIANN-CCS precipitation data are effectively reduced. The spatial patterns of the adjusted satellite rainfall show high consistency with the gauge observations, with reduced root-mean-square errors and mean biases. The systematic biases of the PERSIANN-CCS precipitation time series, at both monthly and daily scales, are removed. The extended validation also verifies that the proposed approach can be applied to adjust SPEs into the future, without further need for ground-based measurements. This study serves as a valuable reference for the bias adjustment of existing SPEs using gauge observations worldwide.
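    The nonparametric quantile mapping at the core of a QM-GW-style correction can be sketched as follows: each satellite value is converted to a non-exceedance probability under the satellite calibration-period empirical CDF, then that probability is inverted through the gauge empirical CDF. This is a minimal illustration under assumed per-box empirical CDFs; the function and variable names are mine, and the Gaussian weighting interpolation step of the actual framework is omitted.

```python
import numpy as np

def quantile_map(sat, gauge_cal, sat_cal):
    """Adjust satellite values by mapping them through the empirical CDF
    of the satellite calibration sample onto the gauge distribution."""
    sat_sorted = np.sort(sat_cal)
    # Non-exceedance probability of each satellite value under the
    # satellite calibration-period empirical CDF
    probs = np.searchsorted(sat_sorted, sat, side="right") / len(sat_sorted)
    # Invert the gauge empirical CDF at those probabilities
    return np.quantile(np.sort(gauge_cal), np.clip(probs, 0.0, 1.0))

# Toy example: the satellite systematically doubles the gauge values
gauge_cal = np.arange(100.0)          # calibration-period gauge sample
sat_cal = 2.0 * gauge_cal             # biased satellite sample
adjusted = quantile_map(np.array([100.0]), gauge_cal, sat_cal)
```

    After the mapping, a raw satellite value of 100 (twice the gauge scale) falls back near 50, the corresponding gauge quantile, which is how the systematic bias is removed distribution-wide rather than by a single scaling factor.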

  8. Advancing land surface model development with satellite-based Earth observations

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Trigo, Isabel F.; Balsamo, Gianpaolo

    2017-04-01

    The land surface forms an essential part of the climate system. It interacts with the atmosphere through the exchange of water and energy and hence influences weather and climate, as well as their predictability. Correspondingly, the land surface model (LSM) is an essential part of any weather forecasting system. LSMs rely on partly poorly constrained parameters, due to sparse land surface observations. With the use of newly available land surface temperature observations, we show in this study that novel satellite-derived datasets help to improve LSM configuration, and hence can contribute to improved weather predictability. We use the Hydrology Tiled ECMWF Scheme of Surface Exchanges over Land (HTESSEL) and validate it comprehensively against an array of Earth observation reference datasets, including the new land surface temperature product. This reveals satisfactory model performance in terms of hydrology, but poor performance in terms of land surface temperature. This is due to inconsistencies of process representations in the model as identified from an analysis of perturbed parameter simulations. We show that HTESSEL can be more robustly calibrated with multiple instead of single reference datasets as this mitigates the impact of the structural inconsistencies. Finally, performing coupled global weather forecasts we find that a more robust calibration of HTESSEL also contributes to improved weather forecast skills. In summary, new satellite-based Earth observations are shown to enhance the multi-dataset calibration of LSMs, thereby improving the representation of insufficiently captured processes, advancing weather predictability and understanding of climate system feedbacks. Orth, R., E. Dutra, I. F. Trigo, and G. Balsamo (2016): Advancing land surface model development with satellite-based Earth observations. Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-628

  9. An intercomparison and validation of satellite-based surface radiative energy flux estimates over the Arctic

    Science.gov (United States)

    Riihelä, Aku; Key, Jeffrey R.; Meirink, Jan Fokke; Kuipers Munneke, Peter; Palo, Timo; Karlsson, Karl-Göran

    2017-05-01

    Accurate determination of radiative energy fluxes over the Arctic is of crucial importance for understanding atmosphere-surface interactions, melt and refreezing cycles of the snow and ice cover, and the role of the Arctic in the global energy budget. Satellite-based estimates can provide comprehensive spatiotemporal coverage, but the accuracy and comparability of the existing data sets must be ascertained to facilitate their use. Here we compare radiative flux estimates from Clouds and the Earth's Radiant Energy System (CERES) Synoptic 1-degree (SYN1deg)/Energy Balanced and Filled, Global Energy and Water Cycle Experiment (GEWEX) surface energy budget, and our own experimental FluxNet / Satellite Application Facility on Climate Monitoring cLoud, Albedo and RAdiation (CLARA) data against in situ observations over Arctic sea ice and the Greenland Ice Sheet during summer of 2007. In general, CERES SYN1deg flux estimates agree best with in situ measurements, although with two particular limitations: (1) over sea ice the upwelling shortwave flux in CERES SYN1deg appears to be underestimated because of an underestimated surface albedo and (2) the CERES SYN1deg upwelling longwave flux over sea ice saturates during midsummer. The Advanced Very High Resolution Radiometer-based GEWEX and FluxNet-CLARA flux estimates generally show a larger range in retrieval errors relative to CERES, with contrasting tendencies relative to each other. The largest source of retrieval error in the FluxNet-CLARA downwelling shortwave flux is shown to be an overestimated cloud optical thickness. The results illustrate that satellite-based flux estimates over the Arctic are not yet homogeneous and that further efforts are necessary to investigate the differences in the surface and cloud properties which lead to disagreements in flux retrievals.

  10. Development of a combined SEM and ICP-MS approach for the qualitative and quantitative analyses of metal nano and microparticles in food products [corrected].

    Science.gov (United States)

    Beltrami, D; Calestani, D; Maffini, M; Suman, M; Melegari, B; Zappettini, A; Zanotti, L; Casellato, U; Careri, M; Mangia, A

    2011-09-01

    An integrated approach based on the use of inductively coupled plasma mass spectrometry (ICP-MS) and scanning electron microscopy (SEM) for the qualitative and quantitative analyses of metal particles in foods was devised and validated. Different raw materials and food products, such as wheat, durum wheat, wheat flour, semolina, cookies, and pasta, were considered. Attention was paid to the development of sample treatment protocols for each type of sample to avoid potential artifacts such as aggregation or agglomeration. The analytical protocols developed, followed by ICP-MS and SEM investigations, allowed the quantitative determination and the morphological and dimensional characterization of metal nano- and microparticles isolated from the raw materials and finished food products considered. The ICP-MS method was validated in terms of linearity (0.8-80 μg/g and 0.09-9 μg/g for Fe and Ti, respectively), quantification limits (0.73 μg/g for Fe and 0.09 μg/g for Ti), repeatability (relative standard deviation (RSD) equal to 10% for Fe and 20% for Ti in a wheat matrix as an example), and extraction recoveries (93 ± 2-101 ± 2%). Validation of the scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDS) measurements was performed working in a dimensional range from 1 to 100 μm with an estimated error in the size determination equal to 0.5 μm. ICP-MS data as well as SEM measurements showed a decrease in the concentration of metal particles from wheat to flour and from durum wheat to semolina samples, thus indicating an external contamination of grains by metal particles. These findings were confirmed by environmental SEM analysis, which allowed investigation of particles of lower dimensions. Generally, the largest number of particles was found in the case of iron and titanium, whereas particles of copper and zinc were only occasionally found, without any possibility of quantifying their number.

  11. Application of Fuzzy Set Theory to Quantitative Analysis of Correctness of the Mathematical Model Based on the ADI Method during Solidification

    Directory of Open Access Journals (Sweden)

    Xiaofeng Niu

    2013-01-01

    The explicit finite difference (EFD) method is used to calculate the casting temperature field during the solidification process. Because of its limited time step, the computational efficiency of the EFD method is lower than that of the alternating direction implicit (ADI) method. A model based on the equivalent specific heat method and the ADI method that improves computational efficiency is established. The error of temperature field simulation comes from model simplification, the acceptable hypotheses and calculation errors caused by different time steps, and the different mesh numbers that are involved in the process of numerical simulation. This paper quantitatively analyzes the degree of similarity between simulated and experimental results by the Hamming distance (HD). For a thick-walled position, the time step influences the simulation results of the temperature field and the number of casting meshes has little influence on the simulation results of the temperature field. For a thin-walled position, the time step has minimal influence on the simulation results of the temperature field and the number of casting meshes has a larger influence on the simulation results of the temperature field.
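    The fuzzy-set similarity measure named above is conventionally the relative Hamming distance d(A, B) = (1/n) Σ|μ_A(x_i) − μ_B(x_i)| between membership vectors, with similarity 1 − d. A minimal sketch, assuming the simulated and measured temperatures have already been rescaled into membership degrees in [0, 1] (that normalization scheme is my assumption, not taken from the paper):

```python
def hamming_similarity(mu_sim, mu_exp):
    """Similarity degree 1 - d(A, B), where d is the mean absolute
    difference between two fuzzy membership vectors in [0, 1]."""
    if len(mu_sim) != len(mu_exp):
        raise ValueError("membership vectors must have equal length")
    d = sum(abs(a - b) for a, b in zip(mu_sim, mu_exp)) / len(mu_sim)
    return 1.0 - d

# Temperatures rescaled to [0, 1] before comparison (illustrative values)
sim = [0.90, 0.75, 0.60, 0.50]
exp = [0.85, 0.70, 0.65, 0.50]
score = hamming_similarity(sim, exp)   # ≈ 0.96, i.e. high agreement
```

    A score near 1 indicates that the simulated temperature curve tracks the experimental one closely; identical curves score exactly 1.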

  12. GIO-EMS and International Collaboration in Satellite based Emergency Mapping

    Science.gov (United States)

    Kucera, Jan; Lemoine, Guido; Broglia, Marco

    2013-04-01

    During the last decade, satellite based emergency mapping has developed into a mature operational stage. The European Union's GMES Initial Operations - Emergency Management Service (GIO-EMS) has been operational since April 2012. Its setup differs from other mechanisms (for example, the International Charter "Space and Major Disasters") in that it extends fast satellite tasking and delivery with value-adding map production as a single service, which is available, free of charge, to the authorized users of the service. Maps and vector datasets with standard characteristics and formats, ranging from post-disaster damage assessment to recovery and disaster prevention, are covered by this initiative. The main users of the service are European civil protection authorities and international organizations active in humanitarian aid. All non-sensitive outputs of the service are accessible to the public. The European Commission's in-house science service, the Joint Research Centre (JRC), is the technical and administrative supervisor of the GIO-EMS. The EC's DG ECHO Monitoring and Information Centre acts as the service's focal point and DG ENTR is responsible for overall service governance. GIO-EMS also aims to contribute to synergy with similar existing mechanisms at national and international level. The usage of satellite data for emergency mapping has increased in recent years and this trend is expected to continue because of easier access to suitable satellite and other relevant data in the near future. Furthermore, the data and analyses coming from volunteer emergency mapping communities are expected to further enrich the content of such cartographic products. In the case of major disasters the parallel activity of multiple providers is likely to generate non-optimal use of resources, e.g. unnecessary duplication; whereas coordination may lead to reduced time needed to cover the disaster area. Furthermore the abundant number of geospatial products of different

  13. The Satellite based Monitoring Initiative for Regional Air quality (SAMIRA): Project summary and first results

    Science.gov (United States)

    Schneider, Philipp; Stebel, Kerstin; Ajtai, Nicolae; Diamandi, Andrei; Horalek, Jan; Nemuc, Anca; Stachlewska, Iwona; Zehner, Claus

    2017-04-01

    We present a summary and some first results of a new ESA-funded project entitled Satellite based Monitoring Initiative for Regional Air quality (SAMIRA), which aims at improving regional and local air quality monitoring through synergetic use of data from present and upcoming satellite instruments, traditionally used in situ air quality monitoring networks, and output from chemical transport models. Through collaborative efforts in four countries, namely Romania, Poland, the Czech Republic and Norway, all with existing air quality problems, SAMIRA intends to support the involved institutions and associated users in their national monitoring and reporting mandates as well as to generate novel research in this area. The primary goal of SAMIRA is to demonstrate the usefulness of existing and future satellite products of air quality for improving monitoring and mapping of air pollution at the regional scale. A total of six core activities are being carried out in order to achieve this goal: Firstly, the project is developing and optimizing algorithms for the retrieval of hourly aerosol optical depth (AOD) maps from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) onboard Meteosat Second Generation. As a second activity, SAMIRA aims to derive particulate matter (PM2.5) estimates from AOD data by developing robust algorithms for AOD-to-PM conversion with support from model and lidar data. In a third activity, we evaluate the added value of satellite products of atmospheric composition for operational European-scale air quality mapping using geostatistics and auxiliary datasets. The additional benefit of satellite-based monitoring over existing monitoring techniques (in situ, models) is tested by combining these datasets using geostatistical methods and demonstrated for nitrogen dioxide (NO2), sulphur dioxide (SO2), and aerosol optical depth/particulate matter. As a fourth activity, the project is developing novel algorithms for downscaling coarse

  14. Impacts of Satellite-Based Snow Albedo Assimilation on Offline and Coupled Land Surface Model Simulations.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Seasonal snow cover in the Northern Hemisphere is the largest component of the terrestrial cryosphere and plays a major role in the climate system through strong positive feedbacks related to albedo. The snow-albedo feedback is invoked as an important cause for the polar amplification of ongoing and projected climate change, and its parameterization across models is an important source of uncertainty in climate simulations. Here, instead of developing a physical snow albedo scheme, we use a direct insertion approach to assimilate satellite-based surface albedo during the snow season (hereafter snow albedo assimilation) into the land surface model ORCHIDEE (ORganizing Carbon and Hydrology In Dynamic EcosystEms) and assess the influences of such assimilation on offline and coupled simulations. Our results have shown that snow albedo assimilation in both ORCHIDEE and ORCHIDEE-LMDZ (a general circulation model of Laboratoire de Météorologie Dynamique) improves the simulation accuracy of mean seasonal (October through May) snow water equivalent over the region north of 40 degrees. The sensitivity of snow water equivalent to snow albedo assimilation is more pronounced in the coupled simulation than the offline simulation, since the feedback of albedo on air temperature is allowed in ORCHIDEE-LMDZ. We have also shown that simulations of air temperature at 2 meters in ORCHIDEE-LMDZ due to snow albedo assimilation are significantly improved during the spring, in particular over the eastern Siberia region. This is a result of the fact that high amounts of shortwave radiation during the spring can maximize the snow albedo feedback, which is also supported by the finding that the spatial sensitivity of temperature change to albedo change is much larger during the spring than during the autumn and winter. In addition, the radiative forcing at the top of the atmosphere induced by snow albedo assimilation during the spring is estimated to be -2.50 W m-2, the

  15. A Satellite-Based Model for Simulating Ecosystem Respiration in the Tibetan and Inner Mongolian Grasslands

    Directory of Open Access Journals (Sweden)

    Rong Ge

    2018-01-01

    It is important to accurately evaluate ecosystem respiration (RE) in the alpine grasslands of the Tibetan Plateau and the temperate grasslands of the Inner Mongolian Plateau, as it serves as a sensitivity indicator of regional and global carbon cycles. Here, we combined flux measurements taken between 2003 and 2013 from 16 grassland sites across northern China and the corresponding MODIS land surface temperature (LST), enhanced vegetation index (EVI), and land surface water index (LSWI) to build a satellite-based model to estimate RE at a regional scale. First, the dependencies of both spatial and temporal variations of RE on these biotic and climatic factors were examined explicitly. We found that plant productivity and moisture, but not temperature, can best explain the spatial pattern of RE in northern China’s grasslands; while temperature plays a major role in regulating the temporal variability of RE in the alpine grasslands, and moisture is equally as important as temperature in the temperate grasslands. However, the moisture effect on RE and the explicit representation of the spatial variation process are often lacking in most of the existing satellite-based RE models. On this basis, we developed a model by comprehensively considering moisture, temperature, and productivity effects on both temporal and spatial processes of RE, and then we evaluated the model performance. Our results showed that the model well explained the observed RE in both the alpine (R2 = 0.79, RMSE = 0.77 g C m−2 day−1) and temperate grasslands (R2 = 0.75, RMSE = 0.60 g C m−2 day−1). The inclusion of the LSWI as the water-limiting factor substantially improved the model performance in arid and semi-arid ecosystems, and the spatialized basal respiration rate as an indicator for spatial variation largely determined the regional pattern of RE. Finally, the model accurately reproduced the seasonal and inter-annual variations and spatial variability of RE, and it avoided

  16. Impacts of Satellite-Based Snow Albedo Assimilation on Offline and Coupled Land Surface Model Simulations.

    Science.gov (United States)

    Wang, Tao; Peng, Shushi; Krinner, Gerhard; Ryder, James; Li, Yue; Dantec-Nédélec, Sarah; Ottlé, Catherine

    2015-01-01

    Seasonal snow cover in the Northern Hemisphere is the largest component of the terrestrial cryosphere and plays a major role in the climate system through strong positive feedbacks related to albedo. The snow-albedo feedback is invoked as an important cause for the polar amplification of ongoing and projected climate change, and its parameterization across models is an important source of uncertainty in climate simulations. Here, instead of developing a physical snow albedo scheme, we use a direct insertion approach to assimilate satellite-based surface albedo during the snow season (hereafter snow albedo assimilation) into the land surface model ORCHIDEE (ORganizing Carbon and Hydrology In Dynamic EcosystEms) and assess the influences of such assimilation on offline and coupled simulations. Our results have shown that snow albedo assimilation in both ORCHIDEE and ORCHIDEE-LMDZ (a general circulation model of Laboratoire de Météorologie Dynamique) improves the simulation accuracy of mean seasonal (October through May) snow water equivalent over the region north of 40 degrees. The sensitivity of snow water equivalent to snow albedo assimilation is more pronounced in the coupled simulation than the offline simulation, since the feedback of albedo on air temperature is allowed in ORCHIDEE-LMDZ. We have also shown that simulations of air temperature at 2 meters in ORCHIDEE-LMDZ due to snow albedo assimilation are significantly improved during the spring, in particular over the eastern Siberia region. This is a result of the fact that high amounts of shortwave radiation during the spring can maximize the snow albedo feedback, which is also supported by the finding that the spatial sensitivity of temperature change to albedo change is much larger during the spring than during the autumn and winter. In addition, the radiative forcing at the top of the atmosphere induced by snow albedo assimilation during the spring is estimated to be -2.50 W m-2, the magnitude of

  17. Efficient all-solid-state UV source for satellite-based lidar applications

    Science.gov (United States)

    Armstrong, Darrell J.; Smith, Arlee V.

    2003-12-01

    A satellite-based UV-DIAL measurement system would allow continuous global monitoring of ozone concentration in the upper atmosphere. However, such systems remain difficult to implement because aerosol-scattering return signals for satellite-based lidars are very weak. A suitable system must produce high-energy UV pulses at multiple wavelengths with very high efficiency. For example, a nanosecond system operating at 10 Hz must generate approximately 1 J per pulse at 308-320 nm. An efficient space-qualified wavelength-agile system based on a single UV source that can meet this requirement is probably not available using current laser technology. As an alternative, we're pursuing a multi-source approach employing all-solid-state modules that individually generate 300-320 nm light with pulse energies in the range of 50-200 mJ, with transform-limited bandwidths and good beam quality. Pulses from the individual sources can be incoherently summed to obtain the required single-pulse energy. These sources use sum-frequency mixing of the 532 nm second harmonic of an Nd:YAG pump laser with 731-803 nm light derived from a recently-developed, state-of-the-art, nanosecond optical parametric oscillator. Two source configurations are under development, one using extra-cavity sum-frequency mixing, and the other intra-cavity sum-frequency mixing. In either configuration, we hope to obtain sum-frequency mixing efficiency approaching 60% by carefully matching the spatial and temporal properties of the laser and OPO pulses. This ideal balance of green and near-IR photons requires an injection-seeded Nd:YAG pump laser with very high beam quality, and an OPO exhibiting unusually high conversion efficiency and exceptional signal beam quality. The OPO employs a singly-resonant high-Fresnel-number image-rotating self-injection-seeded nonplanar-ring cavity that achieves pump depletion > 65% and produces signal beams with M2 ~ 3 at pulse energies exceeding 50 mJ. Pump beam requirements can be
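    The 308-320 nm output range quoted above follows from energy conservation in sum-frequency mixing: photon energies add, so inverse wavelengths add, 1/λ₃ = 1/λ₁ + 1/λ₂. A quick numerical check of the wavelengths named in the abstract:

```python
def sum_frequency_wavelength(l1_nm, l2_nm):
    """Wavelength generated by sum-frequency mixing of two inputs:
    photon energies add, so inverse wavelengths add."""
    return 1.0 / (1.0 / l1_nm + 1.0 / l2_nm)

# Mixing the 532 nm second harmonic with the 731-803 nm OPO output
low = sum_frequency_wavelength(532.0, 731.0)   # ≈ 308 nm
high = sum_frequency_wavelength(532.0, 803.0)  # ≈ 320 nm
```

    The two endpoints reproduce the stated 308-320 nm UV-DIAL band, confirming that the OPO tuning range was chosen to cover the ozone measurement wavelengths.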

  18. Satellite based radar interferometry to estimate large-scale soil water depletion from clay shrinkage: possibilities and limitations

    NARCIS (Netherlands)

    Brake, te B.; Hanssen, R.F.; Ploeg, van der M.J.; Rooij, de G.H.

    2013-01-01

    Satellite-based radar interferometry is a technique capable of measuring small surface elevation changes at large scales and with a high resolution. In vadose zone hydrology, it has been recognized for a long time that surface elevation changes due to swell and shrinkage of clayey soils can serve as

  19. Providing satellite-based early warnings of fires to reduce fire flashovers on South Africa’s transmission lines

    CSIR Research Space (South Africa)

    Frost, PE

    2007-07-01

    The Advanced Fire Information System (AFIS) is the first near-real-time operational satellite-based fire monitoring system of its kind in Africa. The main aim of AFIS is to provide information regarding the prediction, detection and assessment...

  20. Temporal and spatial evaluation of satellite-based rainfall estimates across the complex topographical and climatic gradients of Chile

    Science.gov (United States)

    Zambrano-Bigiarini, Mauricio; Nauditt, Alexandra; Birkel, Christian; Verbist, Koen; Ribbe, Lars

    2017-03-01

    Accurate representation of the real spatio-temporal variability of catchment rainfall inputs is currently severely limited. Moreover, spatially interpolated catchment precipitation is subject to large uncertainties, particularly in developing countries and regions which are difficult to access. Recently, satellite-based rainfall estimates (SREs) have provided an unprecedented opportunity for a wide range of hydrological applications, from water resources modelling to monitoring of extreme events such as droughts and floods. This study attempts to exhaustively evaluate - for the first time - the suitability of seven state-of-the-art SRE products (TMPA 3B42v7, CHIRPSv2, CMORPH, PERSIANN-CDR, PERSIANN-CCS-Adj, MSWEPv1.1, and PGFv3) over the complex topography and diverse climatic gradients of Chile. Different temporal scales (daily, monthly, seasonal, annual) are used in a point-to-pixel comparison between precipitation time series measured at 366 stations (from sea level to 4600 m a.s.l. in the Andean Plateau) and the corresponding grid cell of each SRE (rescaled to a 0.25° grid if necessary). The modified Kling-Gupta efficiency was used to identify possible sources of systematic errors in each SRE. In addition, five categorical indices (PC, POD, FAR, ETS, fBIAS) were used to assess the ability of each SRE to correctly identify different precipitation intensities. Results revealed that most SRE products performed better for the humid South (36.4-43.7° S) and Central Chile (32.18-36.4° S), in particular at low- and mid-elevation zones (0-1000 m a.s.l.), compared to the arid northern regions and the Far South. Seasonally, all products performed best during the wet seasons (autumn and winter; MAM-JJA) compared to summer (DJF) and spring (SON). In addition, all SREs were able to correctly identify the occurrence of no-rain events, but they presented a low skill in classifying precipitation intensities during rainy days. Overall, PGFv3 exhibited the best performance everywhere
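    The modified Kling-Gupta efficiency decomposes agreement into a correlation term r, a bias ratio β = μ_sim/μ_obs, and a variability ratio γ of the coefficients of variation, combined as KGE' = 1 − sqrt((r−1)² + (β−1)² + (γ−1)²). A sketch of this standard form, together with two of the categorical indices named above (POD and FAR); the function names are mine:

```python
import numpy as np

def kge_modified(sim, obs):
    """Modified Kling-Gupta efficiency:
    1 - sqrt((r-1)^2 + (beta-1)^2 + (gamma-1)^2)."""
    r = np.corrcoef(sim, obs)[0, 1]                               # correlation
    beta = sim.mean() / obs.mean()                                # bias ratio
    gamma = (sim.std() / sim.mean()) / (obs.std() / obs.mean())   # variability ratio
    return 1.0 - np.sqrt((r - 1)**2 + (beta - 1)**2 + (gamma - 1)**2)

def pod_far(sim_rain, obs_rain):
    """Probability of detection and false-alarm ratio from boolean
    rain/no-rain event series of equal length."""
    hits = np.sum(sim_rain & obs_rain)
    misses = np.sum(~sim_rain & obs_rain)
    false_alarms = np.sum(sim_rain & ~obs_rain)
    return hits / (hits + misses), false_alarms / (hits + false_alarms)
```

    A perfect estimate gives KGE' = 1, POD = 1 and FAR = 0; the three KGE components separate correlation, mean-bias, and variability errors, which is why the study uses it to diagnose *sources* of systematic error rather than a single lumped score.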

  1. Impact of ubiquitous inhibitors on the GUS gene reporter system: evidence from the model plants Arabidopsis, tobacco and rice and correction methods for quantitative assays of transgenic and endogenous GUS

    Directory of Open Access Journals (Sweden)

    Gerola Paolo D

    2009-12-01

    Abstract Background The β-glucuronidase (GUS) gene reporter system is one of the most effective and widely employed techniques in the study of gene regulation in plant molecular biology. Improved protocols for GUS assays have rendered the original method described by Jefferson amenable to various requirements and conditions, but the serious limitation caused by inhibitors of the enzyme activity in plant tissues has thus far been underestimated. Results We report that inhibitors of GUS activity are ubiquitous in organ tissues of Arabidopsis, tobacco and rice, and significantly bias quantitative assessment of GUS activity in plant transformation experiments. Combined with previous literature reports on non-model species, our findings suggest that inhibitors may be common components of plant cells, with variable affinity towards the E. coli enzyme. The reduced inhibitory capacity towards the plant endogenous GUS discredits the hypothesis of a regulatory role of these compounds in plant cells, and their effect on the bacterial enzyme is better interpreted as a side effect due to their interaction with GUS during the assay. This is likely to have a bearing also on histochemical analyses, leading to inaccurate evaluations of GUS expression. Conclusions In order to achieve reliable results, inhibitor activity should be routinely tested during quantitative GUS assays. Two separate methods to correct the measured activity of the transgenic and endogenous GUS are presented.
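    The abstract does not detail the two correction methods, but a common way to correct an enzyme assay for matrix inhibitors is a spike-recovery factor: a known amount of purified enzyme is spiked into the extract, and the sample reading is divided by the fraction of the spike that is recovered. The following is a hypothetical sketch of that general approach, not the authors' protocol; all names and numbers are illustrative.

```python
def correct_activity(measured, spike_measured, spike_expected):
    """Correct an enzyme activity reading for inhibition by the extract.

    spike_expected: activity the spiked purified enzyme shows in buffer;
    spike_measured: activity the same spike shows in the plant extract.
    """
    recovery = spike_measured / spike_expected   # fraction surviving inhibition
    if recovery <= 0:
        raise ValueError("spike recovery must be positive")
    return measured / recovery

# If the extract recovers only 40% of a 100-unit spike, a reading of 50
# corresponds to about 125 units of true activity
corrected = correct_activity(50.0, 40.0, 100.0)   # ≈ 125
```

    The same recovery factor must be determined per tissue and species, since the abstract reports that inhibitor strength varies between them.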

  2. The satellite based augmentation system – EGNOS for non-precision approach global navigation satellite system

    Directory of Open Access Journals (Sweden)

    Andrzej FELLNER

    2012-01-01

    The first tests in Poland of the EGNOS SIS (Signal in Space) were conducted on 5 October 2007 during a flight inspection with SPAN (Synchronized Position Attitude Navigation) technology at the Mielec airfield. This was an introduction to a test campaign of the EGNOS-based satellite navigation system for air traffic. The advanced studies will be performed within the framework of the EGNOS-APV project in 2011. The implementation of the EGNOS system for APV-I precision approach operations is conducted according to ICAO requirements in Annex 10. Definition of the usefulness and certification of EGNOS as an SBAS (Satellite Based Augmentation System) in aviation requires thorough analyses of the accuracy, integrity, continuity and availability of SIS. The project will also try to exploit the excellent accuracy performance of EGNOS to analyze the implementation of GLS (GNSS Landing System) approaches (Cat I-like approaches using SBAS), with a decision height of 200 ft. The EGNOS monitoring station in Rzeszów, located near the Polish-Ukrainian border and at the eastern edge of the planned EGNOS coverage for ECAC states, is very useful for SIS tests in this area. According to the current EGNOS programme schedule, the project activities will be carried out with EGNOS system v2.2, which is the version released for civil aviation certification. Therefore, the project will demonstrate the feasibility of the certifiable EGNOS version for civil applications.

  3. Fundamentals of Inertial Navigation, Satellite-based Positioning and their Integration

    CERN Document Server

    Noureldin, Aboelmagd; Georgy, Jacques

    2013-01-01

    Fundamentals of Inertial Navigation, Satellite-based Positioning and their Integration is an introduction to the field of integrated navigation systems. It serves as an excellent reference for working engineers as well as a textbook for beginners and students new to the area. The book is easy to read and understand with minimum background knowledge. The authors explain the derivations in great detail. The intermediate steps are thoroughly explained so that a beginner can easily follow the material. The book shows a step-by-step implementation of navigation algorithms and provides all the necessary details. It provides detailed illustrations for easy comprehension. The book also demonstrates real field experiments and in-vehicle road test results with professional discussions and analysis. This work is unique in discussing the different INS/GPS integration schemes in an easy to understand and straightforward way. Those schemes include loosely vs tightly coupled, open loop vs closed loop, and many more.

  4. Satellite-based forecasts for seeing and photometric quality at the European Extremely Large Telescope site

    Science.gov (United States)

    Cavazzani, S.; Ortolani, S.; Zitelli, V.

    2017-11-01

    In this article, we describe a new algorithm for short-time satellite-based forecasts for seeing and photometric quality at the European Extremely Large Telescope (E-ELT) site (Armazones), and we analyse the correlation between the Paranal and Armazones sites. The algorithm uses data from the polar satellite Aqua's Moderate Resolution Imaging Spectroradiometer (MODIS) and the Geostationary Operational Environmental Satellite (GOES 13). We have analysed 13 years (2003-2015) of cloud coverage data from MODIS in order to obtain the cyclical perturbations through Fourier analysis. Then we have developed the forecast model using GOES 13 data (2015). Monthly calibration atmospheric-layer temperature thresholds have been achieved through the daily temperature range detected by the satellite. The algorithm works through conditional probability. This allowed us to extrapolate the main frequency of the cloud-coverage perturbations, achieving three results: there are two major seasonal meteorological frequencies at Armazones and a short one of 14 days. This result allows us to improve the rate of the prediction algorithm by introducing a new threshold function. The correlation of 98 per cent found between the pixel above Paranal and the pixel above Armazones allows us to use the Paranal ground data to validate the prediction model. We analysed the 2015 data at Armazones and reached a correlation of 97 per cent for the short-time photometry and seeing quality forecast.
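    Extracting a dominant period such as the 14-day cloud-coverage cycle via Fourier analysis can be sketched as follows; this is a minimal illustration on a synthetic daily series, and it does not reproduce the operational algorithm's calibration thresholds or conditional-probability stage.

```python
import numpy as np

def dominant_period_days(series, dt_days=1.0):
    """Return the period (in days) of the strongest non-DC Fourier
    component of an evenly sampled time series."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                       # remove the mean (DC) level
    amplitude = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=dt_days)
    k = np.argmax(amplitude[1:]) + 1       # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic daily cloud-cover anomaly with a built-in 14-day cycle
t = np.arange(140)
cover = 0.5 + 0.3 * np.sin(2 * np.pi * t / 14.0)
period = dominant_period_days(cover)       # ≈ 14 days
```

    On real cloud-cover records the spectrum would show the two seasonal peaks alongside the short 14-day one, and the recovered frequencies are what drive the threshold function of the prediction algorithm.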

  5. An Exploitation of Satellite-based Observation for Health Information: The UFOS Project

    Energy Technology Data Exchange (ETDEWEB)

    Mangin, A.; Morel, M.; Fanton d' Andon, O

    2000-07-01

    Short, medium and long-term trends of UV intensity levels are of crucial importance both for assessing effective biological impacts on the human population and for implementing adequate preventive behaviours. Better information on a large spatial scale and increased public awareness of short-term variations in UV values will help to support health agencies' goals of educating the public on UV risks. The Ultraviolet Forecast Operational Service (UFOS) project, financed in part by the European Commission/DG Information Society (TEN-TELECOM programme), aims to exploit satellite-based observations and to supply a set of UV products directly useful to health care. The short-term objective is to demonstrate the technical and economic feasibility and benefits that such a system could bring. UFOS is carried out by ACRI, with the support of an Advisory Group chaired by WHO and involving representation from the sectors of Health (WHO, INTERSUN collaborating centres, ZAMBON), Environment (WMO, IASB), and Telecommunications (EURECOM, IMET). (author)

  6. Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool

    Science.gov (United States)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  7. Development and validation of satellite-based estimates of surface visibility

    Science.gov (United States)

    Brunner, J.; Pierce, R. B.; Lenzen, A.

    2016-02-01

    A satellite-based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weather prediction forecasts to National Weather Service Automated Surface Observing System (ASOS) surface visibility measurements. Validation using independent ASOS measurements shows that the GOES-R ABI surface visibility retrieval (V) has an overall success rate of 64.5 % for classifying clear (V ≥ 30 km), moderate (10 km ≤ V < 30 km) and poorer visibility conditions. The GOES-R ABI visibility retrieval can be used to augment measurements from the United States Environmental Protection Agency (EPA) and National Park Service (NPS) Interagency Monitoring of Protected Visual Environments (IMPROVE) network and provide useful information to the regional planning offices responsible for developing mitigation strategies required under the EPA's Regional Haze Rule, particularly during regional haze events associated with smoke from wildfires.
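The multiple-linear-regression core of such a retrieval can be sketched as below. The predictor names, coefficients and synthetic data are purely illustrative assumptions; they are not the published GOES-R ABI regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Illustrative predictors: aerosol optical depth, fog/low-cloud
# probability, and one meteorological variable (relative humidity)
aod = rng.uniform(0.0, 1.0, n)
fog_prob = rng.uniform(0.0, 1.0, n)
rel_hum = rng.uniform(20.0, 100.0, n)

# Synthetic "ASOS" visibility (km) with a known linear dependence
vis = (40.0 - 25.0 * aod - 10.0 * fog_prob - 0.1 * rel_hum
       + rng.normal(0.0, 1.0, n))

# Ordinary least squares: vis ~ b0 + b1*aod + b2*fog_prob + b3*rel_hum
X = np.column_stack([np.ones(n), aod, fog_prob, rel_hum])
coef, *_ = np.linalg.lstsq(X, vis, rcond=None)
predicted = X @ coef  # retrieved visibility for each sample
```

Classification into clear, moderate and poorer categories would then simply threshold `predicted` at the visibility bounds given in the abstract.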

  8. Application of Satellite-Based Spectrally-Resolved Solar Radiation Data to PV Performance Studies

    Directory of Open Access Journals (Sweden)

    Ana Maria Gracia Amillo

    2015-04-01

    Full Text Available In recent years, satellite-based solar radiation data resolved in spectral bands have become available. This has for the first time made it possible to produce maps of the geographical variation in the solar spectrum. It also makes it possible to estimate the influence of these variations on the performance of photovoltaic (PV) modules. Here, we present a study showing the magnitude of the spectral influence on PV performance over Europe and Africa. The method has been validated using measurements of a CdTe module in Ispra, Italy, showing that the method predicts the spectral influence to within ±2% on a monthly basis and 0.1% over a 19-month period. Application of the method to measured spectral responses of crystalline silicon, CdTe and single-junction amorphous silicon (a-Si) modules shows that the spectral effect is smallest over desert areas for all module types, higher in temperate Europe and highest in tropical Africa, where CdTe modules would be expected to yield +6% and single-junction a-Si modules up to +10% more energy due to spectral effects. In contrast, the effect for crystalline silicon modules is less than ±1% in nearly all of Africa and Southern Europe, rising to +1% or +2% in Northern Europe.
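The size of such a spectral effect can be estimated with a simple spectral-factor calculation: the spectral-response-weighted fraction of an actual spectrum relative to that of a reference spectrum. The response curve and spectra below are crude illustrative assumptions (a flat reference, an exponential "blue-rich" spectrum), not the satellite band data used in the study.

```python
import numpy as np

def spectral_factor(sr, g_actual, g_ref):
    """Ratio of the useful (response-weighted) fraction of the actual
    spectrum to that of the reference spectrum. Values > 1 mean the
    actual spectrum favours this device; assumes one common,
    uniformly spaced wavelength grid for all arrays."""
    useful_actual = (sr * g_actual).sum() / g_actual.sum()
    useful_ref = (sr * g_ref).sum() / g_ref.sum()
    return useful_actual / useful_ref

wl = np.linspace(300.0, 1200.0, 901)       # wavelength grid, nm
sr = np.exp(-((wl - 550.0) / 120.0) ** 2)  # narrow, a-Si-like response
g_ref = np.ones_like(wl)                   # flat stand-in reference spectrum
g_blue = np.exp(-(wl - 300.0) / 400.0)     # blue-shifted spectrum

factor = spectral_factor(sr, g_blue, g_ref)  # > 1: blue-rich spectrum helps
```

A spectrally flat device yields a factor of exactly 1 regardless of the spectrum, consistent with the study's finding that broad-response crystalline silicon is far less sensitive to spectral shifts than narrow-response a-Si.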

  9. PlumeSat: A Micro-Satellite Based Plume Imagery Collection Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ledebuhr, A.G.; Ng, L.C.

    2002-06-30

    This paper describes a technical approach to cost-effectively collect plume imagery of boosting targets using a novel micro-satellite-based platform operating in low earth orbit (LEO). The plume collection micro-satellite, or PlumeSat for short, will be capable of carrying an array of multi-spectral (UV through LWIR) passive and active (imaging LADAR) sensors and of maneuvering with a lateral divert propulsion system to different observation altitudes (100 to 300 km) and different closing geometries to achieve a range of aspect angles (15 to 60 degrees), in order to simulate a variety of boost phase intercept missions. The PlumeSat will be a cost-effective platform to collect boost phase plume imagery from within 1 to 10 km ranges, resulting in 0.1 to 1 meter resolution imagery of a variety of potential target missiles, with a goal of demonstrating reliable plume-to-hardbody handover algorithms for future boost phase intercept missions. Once deployed on orbit, the PlumeSat would perform a series of phenomenology collection experiments until it expends its on-board propellants. The baseline PlumeSat concept is sized to provide 5 to 7 separate fly-by data collects of boosting targets. The total number of data collects will depend on the orbital basing altitude and the accuracy in delivering the boosting target vehicle to the nominal PlumeSat fly-by volume.

  10. Intrusion of coastal waters into the pelagic eastern Mediterranean: in situ and satellite-based characterization

    Directory of Open Access Journals (Sweden)

    S. Efrati

    2013-05-01

    Full Text Available A combined dataset of near-real-time multi-satellite observations and in situ measurements from a high-resolution survey is used for characterizing physical-biogeochemical properties of a patch stretching from the coast to the open sea in the Levantine Basin (LB) of the eastern Mediterranean (EM). Spatial analysis of the combined dataset indicates that the patch is a semi-enclosed system, bounded within the mixed layer and separated from ambient waters by transport barriers induced by horizontal stirring. As such, the patch is characterized by physical-biogeochemical properties that significantly differ from those of the waters surrounding it, with lower salinity and higher temperatures, concentrations of silicic acid and chlorophyll a, and abundance of Synechococcus and picoeukaryote cells. Based on estimates of patch dimensions (∼40 km width and ∼25 m depth) and propagation speed (∼0.09 m s−1), the volume flux associated with the patch is found to be on the order of 0.1 Sv. Our observations suggest that horizontal stirring by surface currents is likely to have an important impact on the ultra-oligotrophic Levantine Basin ecosystem, through (1) transport of nutrients and coastally derived material, and (2) formation of local, dynamically isolated niches. In addition, this work provides a satellite-based framework for planning and executing high-resolution sampling strategies in the interface between the coast and the open sea.

  11. Characterization of absorbing aerosol types using ground- and satellite-based observations over an urban environment

    Science.gov (United States)

    Bibi, Samina; Alam, Khan; Chishtie, Farrukh; Bibi, Humera

    2017-02-01

    In this paper, for the first time, an effort has been made to seasonally characterize the absorbing aerosols into different types using ground- and satellite-based observations. For this purpose, optical properties of aerosol retrieved from the AErosol RObotic NETwork (AERONET) and the Ozone Monitoring Instrument (OMI) were utilized over Karachi for the period 2012 to 2014. Firstly, OMI AODabs was validated against AERONET AODabs and found to have a high degree of correlation. Then, based on this validation, characterization was conducted by analyzing the aerosol Fine Mode Fraction (FMF), Angstrom Exponent (AE), Absorption Angstrom Exponent (AAE), Single Scattering Albedo (SSA) and Aerosol Index (AI) and their mutual correlations, to identify the absorbing aerosol types and to examine the variability in their seasonal distribution. The absorbing aerosols were characterized into Mostly Black Carbon (BC), Mostly Dust, and Mixed BC & Dust. The results revealed that Mostly BC aerosols contributed dominantly during winter and post-monsoon, whereas Mostly Dust aerosols were dominant during summer and pre-monsoon. These absorbing aerosol types were also confirmed with MODerate resolution Imaging Spectroradiometer (MODIS) and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) observations.

  13. Use of Satellite-based Remote Sensing to inform Evapotranspiration parameters in Cropping System Models

    Science.gov (United States)

    Dhungel, S.; Barber, M. E.

    2016-12-01

    The objectives of this paper are to use an automated satellite-based remote sensing evapotranspiration (ET) model to assist in parameterization of a cropping system model (CropSyst) and to examine the variability of consumptive water use of various crops across the watershed. The remote sensing model is a modified version of the Mapping Evapotranspiration at high Resolution with Internalized Calibration (METRIC™) energy balance model. We present the application of an automated python-based implementation of METRIC to estimate ET as consumptive water use for agricultural areas in three watersheds in Eastern Washington - Walla Walla, Lower Yakima and Okanogan. We used these ET maps with USDA crop data to identify the variability of crop growth and water use for the major crops in these three watersheds. Some crops, such as grapes and alfalfa, showed high variability in water use in the watershed while others, such as corn, had comparatively less variability. The results helped us to estimate the range and variability of various crop parameters that are used in CropSyst. The paper also presents a systematic approach to estimate parameters of CropSyst for a crop in a watershed using METRIC results. Our initial application of this approach was used to estimate irrigation application rate for CropSyst for a selected farm in Walla Walla and was validated by comparing crop growth (as Leaf Area Index - LAI) and consumptive water use (ET) from METRIC and CropSyst. This coupling of METRIC with CropSyst will allow for more robust parameters in CropSyst and will enable accurate predictions of changes in irrigation practices and crop rotation, which are a challenge in many cropping system models.

  14. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    Science.gov (United States)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  15. Long-term change analysis of satellite-based evapotranspiration over Indian vegetated surface

    Science.gov (United States)

    Gupta, Shweta; Bhattacharya, Bimal K.; Krishna, Akhouri P.

    2016-05-01

    In the present study, the trend of satellite-based annual evapotranspiration (ET) and the natural forcing factors responsible for it were analyzed. Thirty years (1981-2010) of ET data at 0.08° grid resolution, generated over the Indian region from optical-thermal observations from NOAA PAL and MODIS AQUA satellites, were used. Long-term data on gridded (0.5° x 0.5°) annual rainfall (RF), annual mean surface soil moisture (SSM) from the ERS scatterometer at 25 km resolution and annual mean incoming shortwave radiation from MERRA-2D reanalysis were also analyzed. Mann-Kendall tests were performed on the time series data for trend analysis. Mean annual ET loss from the Indian agro-ecosystem was found to be almost double (1100 cubic km) that of the Indian forest ecosystem (550 cubic km). Rainfed vegetation systems such as forest, rainfed cropland and grassland showed declining ET trends of -4.8, -0.6 and -0.4 cubic km yr-1, respectively, over the 30 years. Irrigated cropland initially showed an ET decline up to 1995 of -0.8 cubic km yr-1, possibly due to solar dimming, followed by an ET increase of 0.9 cubic km yr-1 after 1995. A cross-over point between the forest ET decline and the ET increase in irrigated cropland was detected in 2008. During 2001-2010, the four agriculturally important Indian states (eastern, central, western and southern) showed significantly increasing ET trends with S-scores of 15-25 and Z-scores of 1.09-2.9. Increasing ET in the western and southern states was found to be coupled with increases in annual rainfall and SSM. In the eastern and central states, however, no significant trend in rainfall was observed even though a significant increase in ET was noticed. The study recommends investigating the influence of anthropogenic factors, such as the increase in area under irrigation, increased use of ground water pumping for irrigation, and changes in cropping pattern and cultivars, on the increasing ET.
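The Mann-Kendall statistics reported above (S and Z scores) can be sketched in a few lines. The synthetic 30-year ET series below is an assumption for illustration; the no-tie variance formula is the standard one.

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: S score and normal-approximation
    Z score (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0
    for i in range(n - 1):
        # compare x[i] with every later value and accumulate the signs
        s += int(np.sign(x[i + 1:] - x[i]).sum())
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Synthetic 30-year annual ET series (cubic km) with an upward trend
rng = np.random.default_rng(1)
et = 1000.0 + 2.0 * np.arange(30) + rng.normal(0.0, 5.0, 30)
s_score, z_score = mann_kendall(et)  # |z| > 1.96 => significant at 5 %
```

A positive S with Z above 1.96 corresponds to the "significantly increasing" trends quoted in the abstract.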

  16. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    Science.gov (United States)

    Myint, S. W.; Yuan, M.; Cerveny, R. S.; Giri, C.

    2008-07-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be potentially complicated due to on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment provides an interesting new perspective and investigative aid to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damages, and in particular for this case study, tornado damages.
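Of the spatial indices mentioned, Geary's C is the simplest to sketch. The version below assumes rook (4-neighbour) contiguity with binary weights on a regular grid, which is one common convention, not necessarily the exact configuration used in the study; values below 1 indicate positive spatial autocorrelation (smooth fields), values above 1 negative autocorrelation.

```python
import numpy as np

def gearys_c(grid):
    """Geary's C for a 2-D grid with rook contiguity and binary
    weights: C = (N-1) * sum_ij w_ij (x_i - x_j)^2
                 / (2 * W * sum_i (x_i - mean)^2)."""
    z = np.asarray(grid, dtype=float)
    n = z.size
    denom = ((z - z.mean()) ** 2).sum()
    num = 0.0
    w_total = 0
    # horizontal and vertical neighbour pairs; the factor 2 counts
    # both directions, since w_ij is symmetric
    for a, b in ((z[:, :-1], z[:, 1:]), (z[:-1, :], z[1:, :])):
        num += 2.0 * ((a - b) ** 2).sum()
        w_total += 2 * a.size
    return (n - 1) * num / (2.0 * w_total * denom)

smooth = np.add.outer(np.arange(8.0), np.arange(8.0))  # gentle gradient
checker = np.indices((8, 8)).sum(axis=0) % 2           # alternating field
c_smooth = gearys_c(smooth)    # well below 1: positive autocorrelation
c_checker = gearys_c(checker)  # above 1: negative autocorrelation
```

Computing such an index within a moving local window over the damage-path imagery yields the per-pixel spatial statistics that the study regresses against F-scale categories.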

  18. Evaluating Texas NOx emissions using satellite-based observations and model simulations

    Science.gov (United States)

    Frost, G. J.; Kim, S.; McKeen, S.; Cooper, O.; Hsie, E.; Trainer, M.; Heckel, A.; Richter, A.; Burrows, J.; Gleason, J.

    2008-12-01

    Anthropogenic NOx is produced primarily from fossil fuel combustion by motor vehicles, power generation, and industrial processes. Satellite-based measurements have been used to assess NOx emission trends on regional to global spatial scales and daily to annual temporal scales. The small horizontal footprints of current satellite-borne instruments, including SCIAMACHY and OMI, can be used to detect NO2 resulting from NOx emitted by isolated point sources and metropolitan areas in the western US. In this study we examine NOx emissions in the state of Texas by comparing NO2 vertical columns retrieved from these satellite instruments to those predicted by a regional chemical-transport model. Comparisons of satellite-derived and model-calculated NO2 columns over US power plants, where in-stack emission monitoring is carried out, enables a critical evaluation of the key parameters leading to uncertainties in the satellite and model data products. By using the satellite retrieval algorithms and model configurations that produce the best agreement in NO2 columns over power plants in northeastern Texas and elsewhere in the western US, satellite-model comparisons of NO2 columns over Texas cities in turn allow urban NOx emission inventories to be assessed. This work focuses on two large Texas metropolitan areas: Dallas/Fort Worth, where NOx is emitted predominantly by mobile and area-wide sources; and Houston, which, like Dallas, has typical urban sources, but also contains large industrial point sources emitting significant amounts of NOx. Year-to-year and day-of-week changes in the satellite data are used to infer NOx emission trends from point and mobile sources and to evaluate the effectiveness of NOx controls on some of these sources.

  19. Characterization of Satellite-Based Carbon Monoxide Surface Retrievals from MOPITT

    Science.gov (United States)

    Martinez-Alonso, S.; Deeter, M. N.; Barré, J.; Worden, H. M.

    2016-12-01

    Terra-MOPITT CO retrievals are routinely validated using airborne and satellite data. While MOPITT's performance in the mid- and upper-troposphere is well understood, surface retrievals are still not fully characterized. CO sources are mostly at the surface; thus, understanding the accuracy and limitations of MOPITT and other satellite-based surface retrievals is key if they are to be used in air quality monitoring and climate studies. A previous comparison of MOPITT surface CO retrievals to true values (ground and airborne measurements) provided mixed results: biases between retrievals and measurements varied greatly from site to site. The low-density coverage (spatially and temporally) of the true datasets was insufficient to explain MOPITT's mixed performance. Here we present a complementary comparison between a CO dataset produced with the GEOS-5 model and the synthetic MOPITT dataset derived from it. Both describe tropospheric CO composition over the contiguous USA during 2006, at 6-hour and 0.5° resolution. We applied to them the analysis we formulated in our previous comparison. We estimated atmospheric surface layer thickness via the VCL (vertical correlation length), derived from the statistics of vertical profiles acquired at each site of interest. We determined the vertical resolution of MOPITT at the surface by measuring the FWHM (full width at half maximum) of its surface averaging kernels. We hypothesize that surface CO can be resolved if FWHM ≤ VCL. The spatial and temporal distribution of resolvable sites was then mapped. This ideal framework allows us to investigate spatial and temporal patterns in surface CO bias and relate them to relevant parameters (e.g., surface thermal contrast, planetary boundary layer height, degrees of freedom for signal at the surface). The ultimate goal is to predict under what circumstances MOPITT surface retrievals are accurate and, conversely, to understand what physical factors can hinder the surface retrieval.
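The resolvability criterion described here (FWHM ≤ VCL) can be sketched as follows; the Gaussian-shaped averaging-kernel row, the altitude grid and the 4 km correlation length are illustrative assumptions, not MOPITT values.

```python
import numpy as np

def fwhm(levels, kernel):
    """Full width at half maximum of a single-peaked averaging-kernel
    row sampled on the given altitude grid (km), with linear
    interpolation of the two half-maximum crossings."""
    k = np.asarray(kernel, dtype=float)
    half = k.max() / 2.0
    above = np.where(k >= half)[0]
    lo, hi = above[0], above[-1]
    # interpolate the half-maximum crossing on each flank
    left = np.interp(half, [k[lo - 1], k[lo]],
                     [levels[lo - 1], levels[lo]])
    right = np.interp(half, [k[hi + 1], k[hi]],
                      [levels[hi + 1], levels[hi]])
    return right - left

levels = np.linspace(0.0, 10.0, 201)                 # altitude grid, km
kernel = np.exp(-0.5 * ((levels - 3.0) / 1.0) ** 2)  # illustrative row
width = fwhm(levels, kernel)                         # ~2.355 km for sigma = 1 km
vcl = 4.0                                            # assumed VCL, km
resolvable = width <= vcl                            # criterion: FWHM <= VCL
```

Applying this test per site and per time step yields the map of resolvable sites described in the abstract.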

  20. Global investigations of the satellite-based Fugro OmniSTAR HP service

    Science.gov (United States)

    Pflugmacher, Andreas; Heister, Hansbert; Heunecke, Otto

    2009-12-01

    OmniSTAR is one of the world's leading suppliers of satellite-based augmentation services for onshore and offshore GNSS applications. OmniSTAR currently offers three services: VBS, HP and XP. OmniSTAR VBS is the code-based service, suitable for sub-metre positioning accuracy. The HP and XP services provide sub-decimetre accuracy, with the HP service based on a precise differential methodology and the XP service on precise absolute positioning. The sub-decimetre HP and XP services both have distinctive convergence behaviour, and the positioning task is essentially a time-dependent process during which the accuracy of the estimated coordinates continuously improves over time. To validate the capabilities of the OmniSTAR services, and in particular the HP (High Performance) service, globally distributed measurement campaigns were performed. The results of these investigations confirm that the HP service satisfies its high accuracy specification, but only after a sufficient initialisation phase. Two kinds of disturbance can handicap HP operation: lack of GNSS observations and outages of the augmentation signal. The more serious of the two is the former: within a few seconds the achieved convergence level is completely lost. Outages in the reception of augmentation data merely affect the relevant period of the outage - the accuracy during the outage is degraded. Only longer interruptions lead to a loss of the HP solution. When HP convergence is lost, the HP process has to be re-initialized. If known points (so-called “seed points”) are available, a shortened “kick-start” initialization is possible. With the aid of seed points it only takes a few minutes to restore convergence.

  1. Air traffic management system design using satellite based geo-positioning and communications assets

    Science.gov (United States)

    Horkin, Phil

    1995-01-01

    The current FAA and ICAO FANS vision of Air Traffic Management will transition the functions of Communications, Navigation, and Surveillance to satellite-based assets in the 21st century. Fundamental to widespread acceptance of this vision is a geo-positioning system that can provide worldwide access with best-case differential GPS performance, but without the associated problems. A robust communications capability linking aircraft and towers to meet the voice and data requirements is also essential. The current GPS constellation does not provide continuous global coverage with a sufficient number of satellites to meet the precision landing requirements as set by the world community. Periodic loss of the minimum number of satellites in view creates an integrity problem, which prevents GPS from becoming the primary system for navigation. Furthermore, there is reluctance on the part of many countries to depend on assets like GPS and GLONASS, which are controlled by military communities. This paper addresses these concerns and provides a system solving the key issues associated with navigation, automatic dependent surveillance, and flexible communications. It contains an independent GPS-like navigation system with 27 satellites providing global coverage with a minimum of six in view at all times. Robust communications is provided by a network of TDMA/FDMA communications payloads contained on these satellites. This network can support simultaneous communications for up to 30,000 links, nearly enough to simultaneously support three times the current global fleet of jumbo air passenger aircraft. All of the required hardware is directly traceable to existing designs.

  2. Comparison of Historical Satellite-Based Estimates of Solar Radiation Resources with Recent Rotating Shadowband Radiometer Measurements: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Myers, D. R.

    2009-03-01

    The availability of rotating shadow band radiometer measurement data at several new stations provides an opportunity to compare historical satellite-based estimates of solar resources with measurements. We compare mean monthly daily total (MMDT) solar radiation data from eight years of NSRDB and 22 years of NASA hourly global horizontal and direct beam solar estimates with measured data from three stations, collected after the end of the available resource estimates.

  3. On the ionospheric impact of recent storm events on satellite-based augmentation systems in middle and low-latitude sectors

    Science.gov (United States)

    Komjathy, A.; Sparks, L.; Mannucci, A. J.; Pi, X.

    2003-01-01

    In this paper, we use GPS measurements of geomagnetic storm days to perform a quantitative assessment of WAAS-type ionospheric correction algorithms in other parts of the world such as the low-latitude Brazil and mid-latitude Europe.

  4. Utility and Value of Satellite-Based Frost Forecasting for Kenya's Tea Farming Sector

    Science.gov (United States)

    Morrison, I.

    2016-12-01

    Frost damage regularly inflicts millions of dollars of crop losses in the tea-growing highlands of western Kenya, a problem that the USAID/NASA Regional Visualization and Monitoring System (SERVIR) program is working to mitigate through a frost monitoring and forecasting product that uses satellite-based temperature and soil moisture data to generate up to three days of advance warning before frost events. This paper presents the findings of a value of information (VOI) study assessing the value of this product based on Kenyan tea farmers' experiences with frost and frost-damage mitigation. Value was calculated based on historic trends of frost frequency, severity, and extent; likelihood of warning receipt and response; and subsequent frost-related crop-loss aversion. Quantification of these factors was derived through inferential analysis of survey data from 400 tea-farming households across the tea-growing regions of Kericho and Nandi, supplemented with key informant interviews with decision-makers at large estate tea plantations, historical frost incident and crop-loss data from estate tea plantations and agricultural insurance companies, and publicly available demographic and economic data. At this time, the product provides a forecasting window of up to three days, and no other frost-prediction methods are used by the large or small-scale farmers of Kenya's tea sector. This represents a significant opportunity for preemptive loss reduction via Earth observation data. However, the tea-growing community has only two realistic options for frost-damage mitigation: preemptive harvest of available tea leaves to minimize losses, or skiving (light pruning) to facilitate fast recovery from frost damage. Both options are labor-intensive and require a minimum of three days of warning to be viable.
As a result, the frost forecasting system has a very narrow margin of usefulness, making its value highly dependent on rapid access to the warning messages and flexible access

  5. Validity of Five Satellite-Based Latent Heat Flux Algorithms for Semi-arid Ecosystems

    Directory of Open Access Journals (Sweden)

    Fei Feng

    2015-12-01

    Full Text Available Accurate estimation of latent heat flux (LE) is critical in characterizing semiarid ecosystems. Many LE algorithms have been developed during the past few decades. However, the algorithms have not been directly compared, particularly over global semiarid ecosystems. In this paper, we evaluated the performance of five LE models over semiarid ecosystems such as grassland, shrub, and savanna using the Fluxnet dataset of 68 eddy covariance (EC) sites during the period 2000–2009. We also used a Modern-Era Retrospective Analysis for Research and Applications (MERRA) dataset; the Normalized Difference Vegetation Index (NDVI) and Fractional Photosynthetically Active Radiation (FPAR) from the Moderate Resolution Imaging Spectroradiometer (MODIS) products; the leaf area index (LAI) from the Global Land Surface Satellite (GLASS) products; and the digital elevation model (DEM) from the Shuttle Radar Topography Mission (SRTM30) dataset to generate LE at regional scale during the period 2003–2006. The models were the Moderate Resolution Imaging Spectroradiometer LE (MOD16) algorithm, the revised remote-sensing-based Penman–Monteith LE algorithm (RRS), the Priestley–Taylor LE algorithm of the Jet Propulsion Laboratory (PT-JPL), the modified satellite-based Priestley–Taylor LE algorithm (MS-PT), and the semi-empirical Penman LE algorithm (UMD). Direct comparison with ground-measured LE showed that the PT-JPL and MS-PT algorithms had relatively high performance over semiarid ecosystems, with the coefficient of determination (R2) ranging from 0.6 to 0.8 and a root mean squared error (RMSE) of approximately 20 W/m2. Empirical parameters in the algorithm structures of MOD16 and RRS, and calibrated coefficients of the UMD algorithm, may be the cause of the reduced performance of these LE algorithms, with R2 ranging from 0.5 to 0.7 and RMSE ranging from 20 to 35 W/m2 for MOD16, RRS and UMD. Sensitivity analysis showed that radiation and vegetation terms were the dominant variables affecting LE
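The evaluation above scores each LE algorithm by R2 and RMSE against eddy-covariance measurements. As a minimal sketch of those two metrics (the LE values below are made up for illustration, not Fluxnet data):

```python
import math

def r2_rmse(observed, modeled):
    """Coefficient of determination (R^2) and root-mean-squared error (RMSE)
    between observed and modeled series of equal length."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - m) ** 2 for o, m in zip(observed, modeled))   # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)             # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    return r2, rmse

# Hypothetical half-hourly LE values (W/m2) from an EC tower and a model
obs = [120.0, 95.0, 160.0, 80.0, 140.0]
mod = [110.0, 100.0, 150.0, 90.0, 150.0]
r2, rmse = r2_rmse(obs, mod)
```

Algorithms whose residuals shrink relative to the spread of the observations score higher R2 and lower RMSE, which is the sense in which PT-JPL and MS-PT outperform MOD16, RRS and UMD here.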

  6. Multiple satellite-based analysis reveals complex climate effects of temperate forests and related energy budget

    Science.gov (United States)

    Ma, Wei; Jia, Gensuo; Zhang, Anzhi

    2017-04-01

    Forest conversion-driven biophysical processes have been examined in various case studies that largely depend on sensitivity analysis of climate modeling. However, much remains unknown in the real world due to the complicated processes and the uncertainty in magnitude, especially in temperate bioclimate regions. This study applied satellite-based observation to investigate the biophysical climate response to potential forest conversion in China, focusing on the spatial and temporal patterns and underlying mechanisms. We evaluated the differences in land surface temperature (ΔLST) between adjacent forest and cropland, in terms of latitudinal and seasonal patterns. Compared to cropland, the temperate forest to the south of 40°N showed a cooling effect of -0.61 ± 0.02°C (95% confidence interval, and hereafter), while it presented a warming effect of 0.48 ± 0.06°C to the north of 48°N (the transition zone was between 40°N and 48°N). Seasonal analysis further demonstrated that the cooling effects of temperate forest in China in spring (March, April, May), summer (June, July, August), and autumn (September, October, November) were -0.53 ± 0.02°C, -0.55 ± 0.02°C, and -0.30 ± 0.02°C, respectively, while the forest caused a warming effect of 0.10 ± 0.04°C in winter (December, January, February). However, the biophysical climate response to forest conversion in temperate regions was complex and showed high spatial and temporal heterogeneity. We further assessed the role of two major biophysical processes, i.e., albedo and evapotranspiration (ET), in shaping land surface temperature from a surface energy budget perspective. Results showed that the latitudinal, seasonal, and spatiotemporal patterns of ΔLST were determined by the net effect of ET-induced latent heat changes and albedo-induced solar radiation absorption changes.
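The "mean ± 95% confidence interval" values quoted above (e.g. -0.61 ± 0.02°C) follow the usual normal-approximation form. A minimal sketch with invented forest-minus-cropland LST differences (not the study's data):

```python
import math

def mean_ci95(samples):
    """Sample mean and half-width of the 95% confidence interval
    (normal approximation: 1.96 * standard error of the mean)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # unbiased sample variance
    half_width = 1.96 * math.sqrt(var / n)
    return mean, half_width

# Hypothetical forest-minus-cropland LST differences (deg C) at paired pixels
delta_lst = [-0.7, -0.5, -0.6, -0.8, -0.4, -0.6, -0.7, -0.5]
mean, hw = mean_ci95(delta_lst)  # reported as mean ± hw, here about -0.6 ± 0.09
```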

  7. The development of potassium tantalate niobate thin films for satellite-based pyroelectric detectors

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, Hilary B.B. [Univ. of California, Berkeley, CA (United States). Dept. of Materials Science and Mineral Engineering

    1997-05-01

    Potassium tantalate niobate (KTN) pyroelectric detectors are expected to provide detectivities of 3.7 x 10^11 cm Hz^1/2 W^-1 for satellite-based infrared detection at 90 K. The background-limited detectivity for a room-temperature thermal detector is 1.8 x 10^10 cm Hz^1/2 W^-1. KTN is a unique ferroelectric for this application because the temperature of its pyroelectric response can be tailored by adjusting its ratio of tantalum to niobium. The ability to fabricate high-quality KTN thin films on Si-based substrates is crucial to the development of KTN pyroelectric detectors. SixNy membranes created on the Si substrate will provide the weak thermal link necessary to reach background-limited detectivities. The device dimensions obtainable by thin-film processing are expected to increase the ferroelectric response by 20 times over bulk-fabricated KTN detectors. In addition, microfabrication techniques allow for easier array development. This is the first reported attempt at growth of KTN films on Si-based substrates. Pure-phase perovskite films were grown by pulsed laser deposition on SrRuO3/Pt/Ti/SixNy/Si and SrRuO3/SixNy/Si structures; room-temperature dielectric permittivities for the KTN films were 290 and 2.5, respectively. The dielectric permittivity for bulk-grown, single-crystal KTN is ~380. In addition to depressed dielectric permittivities, no ferroelectric hysteresis was found between 80 and 300 K for either structure. RBS, AES, TEM and multi-frequency dielectric measurements were used to investigate the origin of this apparent lack of ferroelectricity. Other issues addressed by this dissertation include: the role of oxygen and target density during pulsed laser deposition of KTN thin films; the use of YBCO, LSC and Pt as direct-contact bottom electrodes to the KTN films; and the adhesion of the bottom

  8. Satellite-based PM concentrations and their application to COPD in Cleveland, OH

    Science.gov (United States)

    Kumar, Naresh; Liang, Dong; Comellas, Alejandro; Chu, Allen D.; Abrams, Thad

    2014-01-01

    A hybrid approach is proposed to estimate exposure to fine particulate matter (PM2.5) at a given location and time. This approach builds on satellite-based aerosol optical depth (AOD), air pollution data from sparsely distributed Environmental Protection Agency (EPA) sites, and local time–space Kriging, an optimal interpolation technique. Given the daily global coverage of AOD data, we can develop daily estimates of air quality at any given location and time. This assures the unprecedented spatial coverage needed for air quality surveillance and management and for epidemiological studies. In this paper, we developed an empirical relationship between the 2 km AOD and PM2.5 data from EPA sites. Extrapolating this relationship to the study domain resulted in 2.3 million predictions of PM2.5 between 2000 and 2009 in the Cleveland Metropolitan Statistical Area (MSA). We developed local time–space Kriging to compute exposure at a given location and time using the predicted PM2.5. Daily estimates of PM2.5 were developed for the Cleveland MSA between 2000 and 2009 at 2.5 km spatial resolution; 1.7 million (~79.8%) of the 2.13 million predictions required for the multiyear geographic domain were robust. In the epidemiological application of the hybrid approach, admissions for an acute exacerbation of chronic obstructive pulmonary disease (AECOPD) were examined with respect to time–space lagged PM2.5 exposure. Our analysis suggests that the risk of AECOPD increases 2.3% with a unit increase in PM2.5 exposure within 9-day and 0.05° (~5 km) distance lags. In the aggregated analysis, the exposed groups (who experienced exposure to PM2.5 >15.4 μg/m3) were 54% more likely to be admitted for AECOPD than the reference group. The hybrid approach offers greater spatiotemporal coverage and more reliable characterization of ambient concentration than conventional in situ monitoring-based approaches. Thus, this approach can potentially reduce exposure misclassification errors in the conventional
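The first step of the hybrid approach is an empirical AOD-to-PM2.5 relationship fitted at collocated EPA sites. A minimal ordinary-least-squares sketch of such a relationship (all AOD and PM2.5 values below are invented; the paper's actual fit may differ in form):

```python
def fit_aod_pm25(aod, pm25):
    """Ordinary least-squares fit of pm25 ~ a + b * aod, the kind of
    empirical relationship used to predict PM2.5 from satellite AOD."""
    n = len(aod)
    mean_x = sum(aod) / n
    mean_y = sum(pm25) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(aod, pm25))
    var = sum((x - mean_x) ** 2 for x in aod)
    b = cov / var              # slope
    a = mean_y - b * mean_x    # intercept
    return a, b

# Hypothetical collocated AOD retrievals and EPA PM2.5 (ug/m3)
aod = [0.1, 0.2, 0.3, 0.4, 0.5]
pm25 = [6.0, 10.0, 14.0, 18.0, 22.0]
a, b = fit_aod_pm25(aod, pm25)  # here the points lie on pm25 = 2 + 40 * aod
```

Extrapolating the fitted line to every AOD pixel yields the dense PM2.5 predictions that the time–space Kriging step then interpolates.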

  9. Satellite-Based Spatiotemporal Trends in PM2.5 Concentrations: China, 2004–2013

    Science.gov (United States)

    Ma, Zongwei; Hu, Xuefei; Sayer, Andrew M.; Levy, Robert; Zhang, Qiang; Xue, Yingang; Tong, Shilu; Bi, Jun; Huang, Lei; Liu, Yang

    2015-01-01

    Ma Z, Hu X, Sayer AM, Levy R, Zhang Q, Xue Y, Tong S, Bi J, Huang L, Liu Y. 2016. Satellite-based spatiotemporal trends in PM2.5 concentrations: China, 2004–2013. Environ Health Perspect 124:184–192; http://dx.doi.org/10.1289/ehp.1409481 PMID:26220256

  10. Correct Models

    OpenAIRE

    Blacher, René

    2010-01-01

    This report complements the two previous reports and gives a simpler explanation of the earlier results, namely a proof that the sequences obtained are random. In previous reports, we have shown how to transform a text $y_n$ into a random sequence by using Fibonacci functions $T_q$. Now, in this report, we obtain a clearer result by proving that $T_q(y_n)$ has the IID model as correct model. But it is necessary to define a correct model properly. Then, we also study this pro...

  11. Blending Model Output with satellite-based and in-situ observations to produce high-resolution estimates of population exposure to wildfire smoke

    Science.gov (United States)

    Lassman, William

    In the western US, emissions from wildfires and prescribed fire have been associated with degradation of regional air quality. Whereas atmospheric aerosol particles with aerodynamic diameters less than 2.5 μm (PM2.5) have known impacts on human health, there is uncertainty in how particle composition, concentrations, and exposure duration affect the associated health response. Due to changes in climate and land management, wildfires have increased in frequency and severity, and this trend is expected to continue. Consequently, wildfires are expected to become an increasingly important source of PM2.5 in the western US. While the composition and source of the aerosol are thought to be important factors in the resulting human health effects, this is currently not well understood; therefore, there is a need to develop a quantitative understanding of wildfire-smoke-specific health effects. A necessary step in this process is to determine who was exposed to wildfire smoke, the concentration of the smoke during exposure, and the duration of the exposure. Three different tools are commonly used to assess exposure to wildfire smoke: in-situ measurements, satellite-based observations, and chemical-transport model (CTM) simulations; each of these exposure-estimation tools has associated strengths and weaknesses. In this thesis, we investigate the utility of blending these tools together to produce highly accurate estimates of smoke exposure during the 2012 fire season in Washington for use in an epidemiological case study. For blending, we use a ridge regression model, as well as a geographically weighted ridge regression model. We evaluate the performance of the three individual exposure-estimate techniques and the two blended techniques using Leave-One-Out Cross-Validation.
Due to the number of in-situ monitors present during this time period, we find that predictions based on in-situ monitors were more accurate for this particular fire season than the CTM simulations and
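The blending idea can be sketched as a closed-form ridge regression over two exposure estimates. Everything below is illustrative (the two-predictor no-intercept design, the penalty value, and all data are assumptions, not the thesis implementation):

```python
def ridge_blend(x1, x2, y, lam=1.0):
    """Blend two exposure estimates (e.g. a kriged in-situ field and a CTM
    field) into one prediction y ~ w1*x1 + w2*x2 via ridge regression.
    Solves the normal equations (X'X + lam*I) w = X'y for a 2-column X."""
    a11 = sum(v * v for v in x1) + lam
    a22 = sum(v * v for v in x2) + lam
    a12 = sum(u * v for u, v in zip(x1, x2))
    b1 = sum(u * v for u, v in zip(x1, y))
    b2 = sum(u * v for u, v in zip(x2, y))
    det = a11 * a22 - a12 * a12          # 2x2 determinant; lam > 0 keeps it positive
    w1 = (b1 * a22 - b2 * a12) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

# Hypothetical daily PM2.5 (ug/m3): kriged monitor field, CTM field, and truth
kriged = [10.0, 20.0, 30.0]
ctm = [12.0, 18.0, 33.0]
truth = [11.0, 19.0, 31.0]
w1, w2 = ridge_blend(kriged, ctm, truth)
```

The ridge penalty `lam` stabilizes the weights when the two estimates are highly correlated, which is exactly the situation when both tools track the same smoke plume; the geographically weighted variant refits these weights locally.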

  12. Validation of satellite-based noontime UVI with NDACC ground-based instruments: influence of topography, environment and satellite overpass time

    Directory of Open Access Journals (Sweden)

    C. Brogniez

    2016-12-01

    Full Text Available Spectral solar UV radiation measurements are performed in France using three spectroradiometers located at very different sites. One is installed in Villeneuve d'Ascq, in the north of France (VDA). It is an urban site in a topographically flat region. Another instrument is installed in Observatoire de Haute-Provence, located in the southern French Alps (OHP). It is a rural mountainous site. The third instrument is installed in Saint-Denis, Réunion Island (SDR). It is a coastal urban site on a small mountainous island in the southern tropics. The three instruments are affiliated with the Network for the Detection of Atmospheric Composition Change (NDACC) and carry out routine measurements to monitor the spectral solar UV radiation and enable derivation of the UV index (UVI). The ground-based UVI values observed at solar noon are compared to similar quantities derived from the Ozone Monitoring Instrument (OMI, onboard the Aura satellite) and the second Global Ozone Monitoring Experiment (GOME-2, onboard the Metop-A satellite) measurements for validation of these satellite-based products. The present study concerns the period from 2009 to September 2012, the date of the implementation of a new OMI processing tool. The new version (v1.3) introduces a correction for absorbing aerosols that were not considered in the old version (v1.2). Both versions of the OMI UVI products were available before September 2012 and are used to assess the improvement of the new processing tool. On average, estimates from satellite instruments always overestimate surface UVI at solar noon. Under cloudless conditions, the satellite-derived estimates of UVI compare satisfactorily with ground-based data: the median relative bias is less than 8 % at VDA and 4 % at SDR for both OMI v1.3 and GOME-2, and about 6 % for OMI v1.3 and 2 % for GOME-2 at OHP. The correlation between satellite-based and ground-based data is better at VDA and OHP (about 0.99) than at SDR (0.96) for both space

  13. Validation of satellite-based noontime UVI with NDACC ground-based instruments: influence of topography, environment and satellite overpass time

    Science.gov (United States)

    Brogniez, Colette; Auriol, Frédérique; Deroo, Christine; Arola, Antti; Kujanpää, Jukka; Sauvage, Béatrice; Kalakoski, Niilo; Riku Aleksi Pitkänen, Mikko; Catalfamo, Maxime; Metzger, Jean-Marc; Tournois, Guy; Da Conceicao, Pierre

    2016-12-01

    Spectral solar UV radiation measurements are performed in France using three spectroradiometers located at very different sites. One is installed in Villeneuve d'Ascq, in the north of France (VDA). It is an urban site in a topographically flat region. Another instrument is installed in Observatoire de Haute-Provence, located in the southern French Alps (OHP). It is a rural mountainous site. The third instrument is installed in Saint-Denis, Réunion Island (SDR). It is a coastal urban site on a small mountainous island in the southern tropics. The three instruments are affiliated with the Network for the Detection of Atmospheric Composition Change (NDACC) and carry out routine measurements to monitor the spectral solar UV radiation and enable derivation of UV index (UVI). The ground-based UVI values observed at solar noon are compared to similar quantities derived from the Ozone Monitoring Instrument (OMI, onboard the Aura satellite) and the second Global Ozone Monitoring Experiment (GOME-2, onboard the Metop-A satellite) measurements for validation of these satellite-based products. The present study concerns the period 2009-September 2012, date of the implementation of a new OMI processing tool. The new version (v1.3) introduces a correction for absorbing aerosols that were not considered in the old version (v1.2). Both versions of the OMI UVI products were available before September 2012 and are used to assess the improvement of the new processing tool. On average, estimates from satellite instruments always overestimate surface UVI at solar noon. Under cloudless conditions, the satellite-derived estimates of UVI compare satisfactorily with ground-based data: the median relative bias is less than 8 % at VDA and 4 % at SDR for both OMI v1.3 and GOME-2, and about 6 % for OMI v1.3 and 2 % for GOME-2 at OHP. The correlation between satellite-based and ground-based data is better at VDA and OHP (about 0.99) than at SDR (0.96) for both space-borne instruments. 
For all
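The headline statistic in both records above is the median relative bias of satellite UVI against ground-based UVI. A minimal sketch of that metric (the UVI pairs below are invented, not NDACC data):

```python
def median_relative_bias(satellite, ground):
    """Median of (satellite - ground) / ground over matched pairs, in percent.
    The median is preferred over the mean because it resists outliers
    (e.g. residual cloud contamination in single overpasses)."""
    rel = sorted((s - g) / g * 100.0 for s, g in zip(satellite, ground))
    n = len(rel)
    mid = n // 2
    return rel[mid] if n % 2 else 0.5 * (rel[mid - 1] + rel[mid])

# Hypothetical noontime UVI pairs (satellite estimate, ground measurement)
sat = [5.2, 7.9, 3.1, 9.6, 6.4]
grd = [5.0, 7.5, 3.0, 9.0, 6.0]
bias = median_relative_bias(sat, grd)  # positive: satellite overestimates
```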

  14. Japanese Global Precipitation Measurement (GPM) mission status and application of satellite-based global rainfall map

    Science.gov (United States)

    Kachi, Misako; Shimizu, Shuji; Kubota, Takuji; Yoshida, Naofumi; Oki, Riko; Kojima, Masahiro; Iguchi, Toshio; Nakamura, Kenji

    2010-05-01

    . Collaboration with GCOM-W is not limited to its participation in the GPM constellation but also includes coordination in the areas of algorithm development and validation in Japan. Generation of high-temporal-resolution, high-accuracy global rainfall maps is one of the targets of the GPM mission. As a prototype for the GPM era, JAXA has developed and operated the Global Precipitation Map algorithm in near-real time since October 2008; hourly, 0.1-degree-resolution binary data and images are available at http://sharaku.eorc.jaxa.jp/GSMaP/ four hours after observation. The algorithms are based on outcomes from the Global Satellite Mapping for Precipitation (GSMaP) project, which was sponsored by the Japan Science and Technology Agency (JST) under the Core Research for Evolutional Science and Technology (CREST) framework between 2002 and 2007 (Okamoto et al., 2005; Aonashi et al., 2009; Ushio et al., 2009). The target of the GSMaP project is to produce global rainfall maps that are highly accurate and of high temporal and spatial resolution through the development of rain-rate retrieval algorithms based on reliable precipitation physical models, using several microwave radiometer data sets together with precipitation radar and geostationary infrared imager data. Near-real-time GSMaP data are distributed via the Internet and utilized by end users worldwide for a broad range of purposes: science research (model validation, data assimilation, typhoon studies, etc.), weather forecasting and services, flood warning and rain analysis over river basins, oceanographic condition forecasting, agriculture, and education. Toward the GPM era, operational applications should be further emphasized alongside science applications.
JAXA continues collaboration with hydrological communities to utilize satellite-based precipitation data as inputs to future flood prediction and warning systems, as well as with meteorological agencies to advance data utilization in numerical weather prediction

  15. GLEAM v3: satellite-based land evaporation and root-zone soil moisture

    Science.gov (United States)

    Martens, Brecht; Miralles, Diego G.; Lievens, Hans; van der Schalie, Robin; de Jeu, Richard A. M.; Fernández-Prieto, Diego; Beck, Hylke E.; Dorigo, Wouter A.; Verhoest, Niko E. C.

    2017-05-01

    The Global Land Evaporation Amsterdam Model (GLEAM) is a set of algorithms dedicated to the estimation of terrestrial evaporation and root-zone soil moisture from satellite data. Ever since its development in 2011, the model has been regularly revised, aiming at the optimal incorporation of new satellite-observed geophysical variables, and improving the representation of physical processes. In this study, the next version of this model (v3) is presented. Key changes relative to the previous version include (1) a revised formulation of the evaporative stress, (2) an optimized drainage algorithm, and (3) a new soil moisture data assimilation system. GLEAM v3 is used to produce three new data sets of terrestrial evaporation and root-zone soil moisture, including a 36-year data set spanning 1980-2015, referred to as v3a (based on satellite-observed soil moisture, vegetation optical depth and snow-water equivalent, reanalysis air temperature and radiation, and a multi-source precipitation product), and two satellite-based data sets. The latter share most of their forcing, except for the vegetation optical depth and soil moisture, which are based on observations from different passive and active C- and L-band microwave sensors (European Space Agency Climate Change Initiative, ESA CCI) for the v3b data set (spanning 2003-2015) and observations from the Soil Moisture and Ocean Salinity (SMOS) satellite in the v3c data set (spanning 2011-2015). Here, these three data sets are described in detail, compared against analogous data sets generated using the previous version of GLEAM (v2), and validated against measurements from 91 eddy-covariance towers and 2325 soil moisture sensors across a broad range of ecosystems. 
Results indicate that the quality of the v3 soil moisture is consistently better than that of v2: average correlations against in situ surface soil moisture measurements increase from 0.61 to 0.64 in the case of the v3a data set, and the representation of soil

  16. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25 x 0.25 deg, Monthly Grid V3 (GSSTFM) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  17. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25x0.25 deg, Daily Grid, V3, (GSSTF), at GES DISC V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 (GSSTF3) Dataset recently produced through a MEaSUREs funded project led by Dr....

  18. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25 x 0.25 deg, Daily Grid V3 (GSSTF) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 (GSSTF3) Dataset recently produced through a MEaSUREs funded project led by Dr....

  19. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25x0.25 deg, Monthly Grid, V3, (GSSTFM), at GES DISC V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  20. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25 x 0.25 deg, Daily Grid F15 V3 (GSSTF_F15) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  1. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25 x 0.25 deg, Daily Grid F13 V3 (GSSTF_F13) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  2. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25 x 0.25 deg, Daily Grid F08 V3 (GSSTF_F08) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  3. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25x0.25 deg, Daily Grid, V3, (GSSTF_F13) V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  4. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25x0.25 deg, Daily Grid, V3, (GSSTF_11) V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  5. Goddard Satellite-Based Surface Turbulent Fluxes, 0.25x0.25 deg, Daily Grid, V3, (GSSTF_F14) V3

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are part of the Goddard Satellite-based Surface Turbulent Fluxes Version 3 (GSSTF3) Dataset recently produced through a MEaSURES funded project led by Dr....

  6. Goddard Satellite-Based Surface Turbulent Fluxes Climatology, 0.25 x 0.25 deg, Seasonal Grid V3 (GSSTFSC) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — These data are the Goddard Satellite-based Surface Turbulent Fluxes Version-3 Dataset recently produced through a MEaSUREs funded project led by Dr. Chung-Lin Shie...

  7. Impact of spatial proxies on the representation of bottom-up emission inventories: A satellite-based analysis

    Science.gov (United States)

    Geng, Guannan; Zhang, Qiang; Martin, Randall V.; Lin, Jintai; Huo, Hong; Zheng, Bo; Wang, Siwen; He, Kebin

    2017-03-01

    Spatial proxies used in bottom-up emission inventories to derive the spatial distributions of emissions are usually empirical and involve additional levels of uncertainty. Although uncertainties in current emission inventories have been discussed extensively, uncertainties resulting from improper spatial proxies have rarely been evaluated. In this work, we investigate the impact of spatial proxies on the representation of gridded emissions by comparing six gridded NOx emission datasets over China developed from the same magnitude of emissions but different spatial proxies. GEOS-Chem-modeled tropospheric NO2 vertical columns simulated from the different gridded emission inventories are compared with satellite-based columns. The results show that differences between modeled and satellite-based NO2 vertical columns are sensitive to the spatial proxies used in the gridded emission inventories. Total population density is less suitable for allocating NOx emissions than nighttime light data because population density tends to allocate more emissions to rural areas. Determining the exact locations of large emission sources could significantly strengthen the correlation between modeled and observed NO2 vertical columns. Using vehicle population and an updated road network for the on-road transport sector could substantially enhance urban emissions and improve the model performance. When further applying industrial gross domestic product (IGDP) values for the industrial sector, modeled NO2 vertical columns could better capture pollution hotspots in urban areas and exhibited the best performance of the six cases compared to satellite-based NO2 vertical columns (slope = 1.01 and R2 = 0.85). This analysis provides a framework for using satellite observations to inform bottom-up inventory development. In the future, more effort should be devoted to the representation of spatial proxies to improve spatial patterns in bottom-up emission inventories.
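The core of proxy-based gridding is a mass-conserving allocation: a regional emission total is split across grid cells in proportion to the proxy value in each cell (population, nighttime lights, IGDP, etc.). A minimal sketch with made-up numbers:

```python
def allocate_emissions(total, proxy):
    """Distribute a regional emission total across grid cells in proportion
    to a spatial proxy, preserving the regional total exactly."""
    weight_sum = sum(proxy)
    return [total * w / weight_sum for w in proxy]

# Hypothetical: 100 kt of NOx allocated over four cells by proxy weight.
# Swapping the proxy (e.g. population for nighttime lights) changes only
# the per-cell shares, never the total -- which is why different proxies
# yield different gridded maps from the same inventory magnitude.
grid = allocate_emissions(100.0, [5.0, 1.0, 3.0, 1.0])  # [50.0, 10.0, 30.0, 10.0]
```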

  8. Publisher Correction

    DEFF Research Database (Denmark)

    Flachsbart, Friederike; Dose, Janina; Gentschew, Liljana

    2018-01-01

    The original version of this Article contained an error in the spelling of the author Robert Häsler, which was incorrectly given as Robert Häesler. This has now been corrected in both the PDF and HTML versions of the Article....

  9. Strategies for satellite-based monitoring of CO2 from distributed area and point sources

    Science.gov (United States)

    Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David

    2014-05-01

    and sensor provides the full range of temporal sampling needed to characterize distributed area and point source emissions. For instance, point source emission patterns will vary with source strength, wind speed and direction. Because wind speed, direction and other environmental factors change rapidly, short-term variability should be sampled. For detailed target selection and pointing verification, important lessons have already been learned and strategies devised during JAXA's GOSAT mission (Schwandner et al., 2013). The fact that competing spatial and temporal requirements drive satellite remote sensing sampling strategies dictates a systematic, multi-factor consideration of potential solutions. Factors to consider include vista, revisit frequency, integration times, spatial resolution, and spatial coverage. No single satellite-based remote sensing solution can address this problem at all scales. It is therefore of paramount importance for the international community to develop and maintain a constellation of atmospheric CO2 monitoring satellites that complement each other in their temporal and spatial observation capabilities: polar sun-synchronous orbits (fixed local solar time, no diurnal information) with agile pointing allow global sampling of known distributed area and point sources such as megacities, power plants and volcanoes, with daily to weekly revisits and moderate to high spatial resolution. Extensive targeting of distributed area and point sources comes at the expense of reduced mapping or spatial coverage, and of the important contextual information that comes with large-scale contiguous spatial sampling. Polar sun-synchronous orbits with push-broom swath-mapping but limited pointing agility may allow mapping of individual source plumes and their spatial variability, but will depend on fortuitous environmental conditions during the observing period.
These solutions typically have longer times between revisits, limiting their ability to resolve

  10. Air-sea fluxes and satellite-based estimation of water masses formation

    Science.gov (United States)

    Sabia, Roberto; Klockmann, Marlene; Fernandez-Prieto, Diego; Donlon, Craig

    2015-04-01

    Recent work linking satellite-based measurements of sea surface salinity (SSS) and sea surface temperature (SST) with traditional physical oceanography has demonstrated the capability of routinely generating satellite-derived surface T-S diagrams [1] and analyzing the distribution/dynamics of SSS and the related surface density with respect to in-situ measurements. Even more recently [2,3], this framework has been extended by exploiting these T-S diagrams as a diagnostic tool to derive water mass formation rates and areas. A water mass describes a water body with physical properties distinct from the surrounding water, formed at the ocean surface under specific conditions which determine its temperature and salinity. The SST and SSS (and thus also density) at the ocean surface are largely determined by fluxes of heat and freshwater. The surface density flux is a function of the latter two and describes the change of the density of seawater at the surface. Observations of water mass formation are of great interest, since they serve as indirect observations of the thermohaline circulation. The SSS data that have become available through the SMOS [4] and Aquarius [5] satellite missions provide the possibility of also studying the effect of temporally-varying SSS fields on water mass formation. In the present study, the formation of water masses as a function of SST and SSS is derived from the surface density flux by integrating the latter over a specific area and time period in bins of SST and SSS and then taking the derivative of the total density flux with respect to density. This study presents a test case using SMOS SSS, OSTIA SST, as well as Argo ISAS SST and SSS for comparison, heat fluxes from the NOCS Surface Flux Data Set v2.0, OAFlux evaporation and CMORPH precipitation. The study area, initially referred to the North Atlantic, is extended over two additional ocean basins and the study period covers the 2011-2012 timeframe. 
Yearly, seasonal
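
The integration step described in the abstract — binning the surface density flux in SST-SSS space before differentiating with respect to density — can be sketched in a few lines. The sample values, bin edges, and uniform cell area below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical surface samples over one area/time window (values are illustrative)
sst = np.array([10.0, 12.0, 14.0, 12.5, 11.0])         # SST, degC
sss = np.array([35.0, 35.2, 35.5, 35.1, 34.9])         # SSS, psu
dflux = np.array([1e-6, 2e-6, 1.5e-6, 0.5e-6, 1e-6])   # surface density flux, kg m^-2 s^-1
cell_area = 1e10                                        # m^2, assumed uniform

# Integrate the density flux over the area in bins of SST and SSS;
# the water mass formation rate then follows from a derivative of this
# binned total with respect to density.
t_edges = np.arange(9.0, 16.0, 1.0)
s_edges = np.arange(34.8, 35.7, 0.2)
binned_flux, _, _ = np.histogram2d(sst, sss, bins=[t_edges, s_edges],
                                   weights=dflux * cell_area)
```

Summing the binned field recovers the total area-integrated flux, which is a useful sanity check on the binning.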

  11. Sensitivity of a Floodplain Hydrodynamic Model to Satellite-Based DEM Scale and Accuracy: Case Study—The Atchafalaya Basin

    Directory of Open Access Journals (Sweden)

    Hahn Chul Jung

    2015-06-01

    Full Text Available The hydrodynamics of low-lying riverine floodplains and wetlands play a critical role in hydrology and ecosystem processes. Because small topographic features affect floodplain storage and flow velocity, a hydrodynamic model setup of these regions imposes more stringent requirements on the input Digital Elevation Model (DEM) compared to upland regions with comparatively high slopes. This current study provides a systematic approach to evaluate the required relative vertical accuracy and spatial resolution of current and future satellite-based altimeters within the context of DEM requirements for 2-D floodplain hydrodynamic models. A case study is presented for the Atchafalaya Basin with a model domain of 1190 km2. The approach analyzes the sensitivity of modeled floodplain water elevation and velocity to typical satellite-based DEM grid-box scale and vertical error, using a previously calibrated version of the physically-based flood inundation model (LISFLOOD-ACC). Results indicate a trade-off relationship between DEM relative vertical error and grid-box size. Higher resolution models are the most sensitive to vertical accuracy, but the impact diminishes at coarser resolutions because of spatial averaging. The results provide guidance to engineers and scientists when defining the observation scales of future altimetry missions such as the Surface Water and Ocean Topography (SWOT) mission from the perspective of numerical modeling requirements for large floodplains of O[10^3] km2 and greater.

  12. Satellite-Based Evidence of Wavelength-Dependent Aerosol Absorption in Biomass Burning Smoke Inferred from Ozone Monitoring Instrument

    Science.gov (United States)

    Jethva, H.; Torres, O.

    2012-01-01

    We provide satellite-based evidence of the spectral dependence of absorption in biomass burning aerosols over South America using near-UV measurements made by the Ozone Monitoring Instrument (OMI) during 2005-2007. In the current near-UV OMI aerosol algorithm (OMAERUV), it is implicitly assumed that the only absorbing component in carbonaceous aerosols is black carbon whose imaginary component of the refractive index is wavelength independent. With this assumption, OMI-derived aerosol optical depth (AOD) is found to be significantly over-estimated compared to that of AERONET at several sites during intense biomass burning events (August-September). Other well-known sources of error affecting the near-UV method of aerosol retrieval do not explain the large observed AOD discrepancies between the satellite and the ground-based observations. A number of studies have revealed strong spectral dependence in carbonaceous aerosol absorption in the near-UV region suggesting the presence of organic carbon in biomass burning generated aerosols. A sensitivity analysis examining the importance of accounting for the presence of wavelength-dependent aerosol absorption in carbonaceous particles in satellite-based remote sensing was carried out in this work. The results convincingly show that the inclusion of spectrally-dependent aerosol absorption in the radiative transfer calculations leads to a more accurate characterization of the atmospheric load of carbonaceous aerosols.

  13. Corrective Jaw Surgery

    Medline Plus

    Full Text Available Orthognathic surgery is performed to correct the misalignment of jaws ...

  14. Correction note.

    Science.gov (United States)

    2014-12-01

    Correction note for Sanders, M., Calam, R., Durand, M., Liversidge, T. and Carmont, S. A. (2008), Does self-directed and web-based support for parents enhance the effects of viewing a reality television series based on the Triple P - Positive Parenting Programme?. Journal of Child Psychology and Psychiatry, 49: 924-932. doi: 10.1111/j.1469-7610.2008.01901.x. © 2014 Association for Child and Adolescent Mental Health.

  15. Monitoring Crop Yield in USA Using a Satellite-Based Climate-Variability Impact Index

    Science.gov (United States)

    Zhang, Ping; Anderson, Bruce; Tan, Bin; Barlow, Mathew; Myneni, Ranga

    2011-01-01

    A quantitative index is applied to monitor crop growth and predict agricultural yield in the continental USA. The Climate-Variability Impact Index (CVII), defined as the monthly contribution to overall anomalies in growth during a given year, is derived from 1-km MODIS Leaf Area Index. The growing-season integrated CVII can provide an estimate of the fractional change in overall growth during a given year. In turn, these estimates can provide fine-scale and aggregated information on yield for various crops. Trained from historical records of crop production, a statistical model is used to predict crop yield during the growing season based upon the strong positive relationship between crop yield and the CVII. By examining the model prediction as a function of time, it is possible to determine when the in-season predictive capability plateaus and which months provide the greatest predictive capacity.
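
A minimal sketch of such a trained statistical model, assuming a simple linear yield-CVII relationship; the historical CVII and yield values below are purely illustrative, not the study's data:

```python
import numpy as np

# Hypothetical historical record: growing-season integrated CVII vs. yield (t/ha)
cvii = np.array([-0.10, -0.05, 0.00, 0.04, 0.08, 0.12])
yields = np.array([2.1, 2.4, 2.8, 3.0, 3.3, 3.6])

# Fit the linear yield model from the historical record
slope, intercept = np.polyfit(cvii, yields, 1)

# In-season prediction from the current integrated-CVII estimate
predicted_yield = slope * 0.05 + intercept
```

Re-fitting this model month by month as the integrated CVII accumulates is one way to see when the in-season predictive skill plateaus.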

  16. Using NASA's Giovanni Web Portal to Access and Visualize Satellite-based Earth Science Data in the Classroom

    Science.gov (United States)

    Lloyd, Steven; Acker, James G.; Prados, Ana I.; Leptoukh, Gregory G.

    2008-01-01

    One of the biggest obstacles for the average Earth science student today is locating and obtaining satellite-based remote sensing data sets in a format that is accessible and optimal for their data analysis needs. At the Goddard Earth Sciences Data and Information Services Center (GES-DISC) alone, on the order of hundreds of Terabytes of data are available for distribution to scientists, students and the general public. The single biggest and time-consuming hurdle for most students when they begin their study of the various datasets is how to slog through this mountain of data to arrive at a properly sub-setted and manageable data set to answer their science question(s). The GES DISC provides a number of tools for data access and visualization, including the Google-like Mirador search engine and the powerful GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) web interface.

  17. Costs and benefits of satellite-based tools for irrigation management

    Directory of Open Access Journals (Sweden)

    Francesco eVuolo

    2015-07-01

    Full Text Available This paper presents the results of a collaborative work with farmers and a cost-benefit analysis of geospatial technologies applied to irrigation water management in the semi-arid agricultural area in Lower Austria. We use Earth observation (EO) data to estimate crop evapotranspiration (ET) and webGIS technologies to deliver maps and irrigation advice to farmers. The study reports the technical and qualitative evaluation performed during a demonstration phase in 2013 and provides an outlook to future developments. The calculation of the benefits is based on a comparison of the irrigation volumes estimated from satellite vs. the irrigation supplied by the farmers. In most cases, the amount of water supplied was equal to the maximum amount of water required by crops. At the same time, high variability was observed for the different irrigation units and crop types. Our data clearly indicate that economic benefits could be achieved by reducing irrigation volumes, especially for water-intensive crops. Regarding the qualitative evaluation, most of the farmers expressed a very positive interest in the provided information. In particular, information related to crop ET was appreciated, as this helps to make better informed decisions on irrigation. The majority of farmers (54%) also expressed a general willingness to pay, either directly or via cost sharing, for such a service. Based on different cost scenarios, we calculated the cost of the service. Considering 20,000 ha of regularly irrigated land, the advisory service would cost between 2.5 and 4.3 €/ha per year depending on the type of satellite data used. For comparison, irrigation costs range between 400 and 1000 €/ha per year for a typical irrigation volume of 2,000 cubic meters per ha. With correct irrigation application, more than 10% of the water and energy could be saved in water-intensive crops, which is equivalent to an economic benefit of 40-100 €/ha per year.
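
The cost-benefit arithmetic can be reproduced directly from the figures quoted in the abstract (service cost, irrigation cost range, and saving fraction); only the worst/best-case pairing of the ranges is an assumption here:

```python
# Figures from the abstract (EUR/ha per year unless noted)
service_cost = (2.5, 4.3)          # advisory service, depending on satellite data
irrigation_cost = (400.0, 1000.0)  # typical irrigation cost for ~2,000 m3/ha
saving_fraction = 0.10             # >10 % water/energy saving, water-intensive crops

# Economic benefit of the saving, matching the 40-100 EUR/ha/yr in the text
benefit = tuple(c * saving_fraction for c in irrigation_cost)

# Worst- and best-case net benefit after paying for the service
net_benefit = (benefit[0] - service_cost[1], benefit[1] - service_cost[0])
```

Even in the worst case (smallest saving, most expensive satellite data), the net benefit stays well above the service cost.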

  18. Satellite-based high-resolution mapping of rainfall over southern Africa

    Science.gov (United States)

    Meyer, Hanna; Drönner, Johannes; Nauss, Thomas

    2017-06-01

    A spatially explicit mapping of rainfall is necessary for southern Africa for eco-climatological studies or nowcasting, but accurate estimates remain a challenging task. This study presents a method to estimate hourly rainfall based on data from the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI). Rainfall measurements from about 350 weather stations from 2010-2014 served as ground truth for calibration and validation. SEVIRI and weather station data were used to train neural networks that allowed the estimation of rainfall area and rainfall quantities over all times of the day. The results revealed that 60 % of recorded rainfall events were correctly classified by the model (probability of detection, POD). However, the false alarm ratio (FAR) was high (0.80), leading to a Heidke skill score (HSS) of 0.18. Hourly rainfall quantities were estimated with an average hourly correlation of ρ = 0.33 and a root mean square error (RMSE) of 0.72. The correlation increased with temporal aggregation to 0.52 (daily), 0.67 (weekly) and 0.71 (monthly). The main weakness was the overestimation of rainfall events. The model results were compared to the Integrated Multi-satellitE Retrievals for GPM (IMERG) of the Global Precipitation Measurement (GPM) mission. Despite being a comparably simple approach, the presented MSG-based rainfall retrieval outperformed GPM IMERG in terms of rainfall area detection: GPM IMERG had a considerably lower POD. The HSS was not significantly different compared to the MSG-based retrieval due to a lower FAR of GPM IMERG. There were no further significant differences between the MSG-based retrieval and GPM IMERG in terms of correlation with the observed rainfall quantities. The MSG-based retrieval, however, provides rainfall in a higher spatial resolution. Though estimating rainfall from satellite data remains challenging, especially at high temporal resolutions, this study showed promising results.
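
The verification scores quoted above are standard dichotomous skill metrics computed from a rain/no-rain contingency table; a sketch with hypothetical counts chosen to roughly reproduce the reported values (POD 0.60, FAR 0.80, HSS 0.18):

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Dichotomous verification metrics for rain/no-rain classification."""
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    # Heidke skill score: skill relative to random chance
    numerator = 2.0 * (hits * correct_negatives - misses * false_alarms)
    denominator = ((hits + misses) * (misses + correct_negatives)
                   + (hits + false_alarms) * (false_alarms + correct_negatives))
    hss = numerator / denominator
    return pod, far, hss

# Hypothetical hourly counts, not the study's actual contingency table
pod, far, hss = skill_scores(hits=60, misses=40, false_alarms=240,
                             correct_negatives=700)
```

Note how a high POD can coexist with a low HSS when false alarms dominate, which is exactly the pattern the abstract reports.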

  19. A New Algorithm for the Satellite-Based Retrieval of Solar Surface Irradiance in Spectral Bands

    Directory of Open Access Journals (Sweden)

    Annette Hammer

    2012-03-01

    Full Text Available Accurate solar surface irradiance data is a prerequisite for efficient planning and operation of solar energy systems. Further, it is essential for climate monitoring and analysis. Recently, the demand for information about spectrally resolved solar surface irradiance has grown. As surface measurements are rare, satellite-derived information with high accuracy might fill this gap. This paper describes a new approach for the retrieval of spectrally resolved solar surface irradiance from satellite data. The method combines an eigenvector-hybrid look-up table (LUT) approach for the clear sky case with satellite-derived cloud transmission (Heliosat method). The eigenvector LUT approach is already used to retrieve the broadband solar surface irradiance of data sets provided by the Climate Monitoring Satellite Application Facility (CM-SAF). This paper describes the extension of this approach to wavelength bands and the combination with spectrally resolved cloud transmission values derived with radiative transfer corrections of the broadband cloud transmission. Thus, the new approach is based on radiative transfer modeling and enables the use of extended information about the atmospheric state, among others, to resolve the effect of water vapor and ozone absorption bands. The method is validated with spectrally resolved measurements from two sites in Europe and by comparison with radiative transfer calculations. The validation results demonstrate the ability of the method to retrieve accurate spectrally resolved irradiance from satellites. The accuracy is in the range of the uncertainty of surface measurements, with the exception of the UV and NIR (≥ 1200 nm) parts of the spectrum, where higher deviations occur.

  20. Validation and Application of the Modified Satellite-Based Priestley-Taylor Algorithm for Mapping Terrestrial Evapotranspiration

    Directory of Open Access Journals (Sweden)

    Yunjun Yao

    2014-01-01

    Full Text Available Satellite-based vegetation indices (VIs) and Apparent Thermal Inertia (ATI) derived from temperature change provide valuable information for estimating evapotranspiration (LE) and detecting the onset and severity of drought. The modified satellite-based Priestley-Taylor (MS-PT) algorithm that we developed earlier, coupling both VI and ATI, is validated based on observed data from 40 flux towers distributed across the world on all continents. The validation results illustrate that the daily LE can be estimated with the Root Mean Square Error (RMSE) varying from 10.7 W/m2 to 87.6 W/m2, and with the square of the correlation coefficient (R2) from 0.41 to 0.89 (p < 0.01). Compared with the Priestley-Taylor-based LE (PT-JPL) algorithm, the MS-PT algorithm improves the LE estimates at most flux tower sites. Importantly, the MS-PT algorithm is also satisfactory in reproducing the inter-annual variability at flux tower sites with at least five years of data. The R2 between measured and predicted annual LE anomalies is 0.42 (p = 0.02). The MS-PT algorithm is then applied to detect the variations of long-term terrestrial LE over the Three-North Shelter Forest Region of China and to monitor global land surface drought. The MS-PT algorithm described here demonstrates the ability to map regional terrestrial LE and identify global soil moisture stress, without requiring precipitation information.
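
For context, the Priestley-Taylor core that both PT-JPL and MS-PT build on estimates LE from available energy alone; MS-PT then modulates it with VI and ATI constraints. A minimal sketch of the classic core (FAO-56 form of the saturation-slope term; the input values are illustrative):

```python
import math

def priestley_taylor_le(rn, g, t_air, alpha=1.26, gamma=0.066):
    """Classic Priestley-Taylor latent heat flux LE (W/m2).

    rn, g : net radiation and soil heat flux (W/m2)
    t_air : near-surface air temperature (degC)
    gamma : psychrometric constant (kPa/degC)
    """
    # Slope of the saturation vapour pressure curve (kPa/degC), FAO-56 form
    es = 0.6108 * math.exp(17.27 * t_air / (t_air + 237.3))
    delta = 4098.0 * es / (t_air + 237.3) ** 2
    return alpha * delta / (delta + gamma) * (rn - g)

le = priestley_taylor_le(rn=400.0, g=50.0, t_air=25.0)
```

The MS-PT modification essentially replaces the fixed alpha-weighting with terms driven by VIs and ATI, removing the need for humidity or precipitation inputs.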

  1. Using satellite-based measurements to explore spatiotemporal scales and variability of drivers of new particle formation

    Science.gov (United States)

    Sullivan, R. C.; Crippa, P.; Hallar, A. G.; Clarisse, L.; Whitburn, S.; Van Damme, M.; Leaitch, W. R.; Walker, J. T.; Khlystov, A.; Pryor, S. C.

    2016-10-01

    New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently spatially averaged) measurements of atmospheric conditions to diagnose the occurrence of NPF and NPF characteristics. We demonstrate the potential for using satellite-based measurements of insolation (UV), trace gas concentrations (sulfur dioxide (SO2), nitrogen dioxide (NO2), ammonia (NH3), formaldehyde (HCHO), and ozone (O3)), aerosol optical properties (aerosol optical depth (AOD) and Ångström exponent (AE)), and a proxy of biogenic volatile organic compound emissions (leaf area index (LAI) and temperature (T)) as predictors for NPF characteristics: formation rates, growth rates, survival probabilities, and ultrafine particle (UFP) concentrations at five locations across North America. NPF at all sites is most frequent in spring, exhibits a one-day autocorrelation, and is associated with low condensational sink (AOD × AE) and HCHO concentrations, and high UV. However, there are important site-to-site variations in NPF frequency and characteristics, and in which of the predictor variables (particularly gas concentrations) significantly contribute to the explanatory power of regression models built to predict those characteristics. This finding may provide a partial explanation for the reported spatial variability in skill of simple generalized nucleation schemes in reproducing observed NPF. In contrast to more simple proxies developed in prior studies (e.g., based on AOD, AE, SO2, and UV), use of additional predictors (NO2, NH3, HCHO, LAI, T, and O3) increases the explained temporal variance of UFP concentrations at all sites.
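
The regression models referred to above are, in essence, multi-predictor linear fits of NPF characteristics on satellite-derived variables. A sketch with synthetic data standing in for the predictors (the predictor names, coefficients, and noise level are assumptions, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic standardized predictors: [condensational sink (AOD x AE), SO2, UV, NH3]
X = rng.normal(size=(n, 4))
# Synthetic UFP response: low condensational sink and high UV favour NPF
ufp = (2.0 - 1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2]
       + rng.normal(scale=0.3, size=n))

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, ufp, rcond=None)
```

Fitting such a model separately per site, and comparing which coefficients are significant, mirrors the site-to-site variability analysis described in the abstract.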

  2. A comparison of two satellite-based evapotranspiration models in the Nagqu river basin of the Tibetan Plateau

    Science.gov (United States)

    Zou, Mijun; Zhong, Lei; Ma, Yaoming; Su, Bob; Ma, Weiqiang

    2017-04-01

    Evapotranspiration (ET), the combination of surface evaporation and vegetation transpiration, is the most uncertain component of eco-hydrological systems because it is constrained by a large number of controlling factors. The measurements obtained from the eddy-flux towers that have been set up on the Tibetan Plateau are still insufficient for providing accurate estimations of ET over a heterogeneous area, while satellite-based ET approaches have become more feasible for determining ET at multiple scales. In this study, the ET estimated using two satellite-based models, the topographically enhanced surface energy balance system (TESEBS) and a Priestley-Taylor (PT) based approach, was validated and inter-compared in the Nagqu river basin under cloudless conditions. Remote sensing data (SPOT Vegetation data and TERRA MODIS data) and meteorological data in 2003 were used for 10-day ET estimation. As input parameters for the ET calculation, broadband albedo and downward shortwave radiation flux (SWD) were improved. NDVI was reconstructed before being coupled into the models. The ET determined by the combinatory method, which is based on surface layer gradient measurements, was treated as the actual ET and used to validate the model results. The results showed that: (1) ET determined from both the TESEBS and PT models corresponded well with the actual ET, with correlation coefficients of 0.882 and 0.817, respectively. (2) However, TESEBS showed better performance than the PT model, with a lower mean bias error (-0.021 mm/h) and root mean square error (0.079 mm/h). (3) Although the PT approach is simple in computation and requires fewer parameters, the high weight of NDVI can lead to overestimation, especially in the monsoon season.

  3. Validation and in vivo assessment of an innovative satellite-based solar UV dosimeter for a mobile app dedicated to skin health.

    Science.gov (United States)

    Morelli, M; Masini, A; Simeone, E; Khazova, M

    2016-08-31

    We present an innovative satellite-based solar UV (ultraviolet) radiation dosimeter with a mobile app interface that has been validated by exploiting both ground-based measurements and an in vivo assessment of the erythemal effects on volunteers having controlled exposure to solar radiation. The app with this satellite-based UV dosimeter also includes related functionalities such as the provision of safe sun exposure times updated in real time and an end-of-exposure visual/sound alert. Both validations showed that the system has the accuracy and reliability needed for health-related applications. This app will be launched on the market by siHealth Ltd in May 2016 under the name "HappySun" and is available for both Android and iOS devices (more info on ). Extensive R&D activities are on-going for the further improvement of the satellite-based UV dosimeter's accuracy.
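
A safe-exposure-time calculation of the kind such an app performs can be sketched as below; the minimal-erythemal-dose (MED) values per Fitzpatrick skin type and the irradiance figure are illustrative assumptions, not siHealth's actual parameters:

```python
# Illustrative minimal erythemal dose (MED) per Fitzpatrick skin type, J/m2
MED_J_PER_M2 = {1: 200.0, 2: 250.0, 3: 350.0, 4: 450.0}

def safe_exposure_minutes(erythemal_irradiance, skin_type):
    """Minutes until one MED is accumulated at a constant
    erythemally-weighted irradiance (W/m2)."""
    return MED_J_PER_M2[skin_type] / erythemal_irradiance / 60.0

# E.g. ~0.2 W/m2 erythemal irradiance (roughly UV index 8), skin type II
minutes = safe_exposure_minutes(0.2, 2)
```

In the real system, the irradiance would come from the satellite-based retrieval and be integrated over time rather than assumed constant.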

  4. Satellite-based high-resolution mapping of rainfall over southern Africa

    Directory of Open Access Journals (Sweden)

    H. Meyer

    2017-06-01

    Full Text Available A spatially explicit mapping of rainfall is necessary for southern Africa for eco-climatological studies or nowcasting, but accurate estimates remain a challenging task. This study presents a method to estimate hourly rainfall based on data from the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI). Rainfall measurements from about 350 weather stations from 2010–2014 served as ground truth for calibration and validation. SEVIRI and weather station data were used to train neural networks that allowed the estimation of rainfall area and rainfall quantities over all times of the day. The results revealed that 60 % of recorded rainfall events were correctly classified by the model (probability of detection, POD). However, the false alarm ratio (FAR) was high (0.80), leading to a Heidke skill score (HSS) of 0.18. Hourly rainfall quantities were estimated with an average hourly correlation of ρ = 0.33 and a root mean square error (RMSE) of 0.72. The correlation increased with temporal aggregation to 0.52 (daily), 0.67 (weekly) and 0.71 (monthly). The main weakness was the overestimation of rainfall events. The model results were compared to the Integrated Multi-satellitE Retrievals for GPM (IMERG) of the Global Precipitation Measurement (GPM) mission. Despite being a comparably simple approach, the presented MSG-based rainfall retrieval outperformed GPM IMERG in terms of rainfall area detection: GPM IMERG had a considerably lower POD. The HSS was not significantly different compared to the MSG-based retrieval due to a lower FAR of GPM IMERG. There were no further significant differences between the MSG-based retrieval and GPM IMERG in terms of correlation with the observed rainfall quantities. The MSG-based retrieval, however, provides rainfall in a higher spatial resolution. Though estimating rainfall from satellite data remains challenging, especially at high temporal resolutions, this study showed

  5. Incorporating quantitative single photon emission computed tomography into radiation therapy treatment planning for lung cancer: impact of attenuation and scatter correction on the single photon emission computed tomography-weighted mean dose and functional lung segmentation.

    Science.gov (United States)

    Yin, Lingshu; Shcherbinin, Sergey; Celler, Anna; Thompson, Anna; Fua, Tsien-Fei; Liu, Mitchell; Duzenli, Cheryl; Gill, Brad; Sheehan, Finbar; Powe, John; Worsley, Daniel; Marks, Lawrence; Moiseenko, Vitali

    2010-10-01

    To assess the impact of attenuation and scatter corrections on the calculation of single photon emission computed tomography (SPECT)-weighted mean dose (SWMD) and functional volume segmentation as applied to radiation therapy treatment planning for lung cancer. Nine patients with lung cancer underwent a SPECT lung perfusion scan. For each scan, four image sets were reconstructed using the ordered subsets expectation maximization method with attenuation and scatter corrections ranging from none to a most comprehensive combination of attenuation corrections and direct scatter modeling. Functional volumes were segmented in each reconstructed image using 10%, 20%, …, 90% of maximum SPECT intensity as a threshold. Systematic effects of SPECT reconstruction methods on treatment planning using functional volume were studied by calculating size and spatial agreements of functional volumes, and V(20) for functional volume from actual treatment plans. The SWMD was calculated for radiation beams with a variety of possible gantry angles and field sizes. Functional volume segmentation is sensitive to the particular method of SPECT reconstruction used. Large variations in functional volumes, as high as >50%, were observed in SPECT images reconstructed with different attenuation/scatter corrections. However, SWMD was less sensitive to the type of scatter corrections. SWMD was consistent within 2% for all reconstructions as long as computed tomography-based attenuation correction was used. When using perfusion SPECT images during treatment planning optimization/evaluation, the SWMD may be the preferred figure of merit, as it is less affected by reconstruction technique, compared with threshold-based functional volume segmentation. 2010 Elsevier Inc. All rights reserved.
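
The two quantities compared in the study — threshold-based functional volume and the SPECT-weighted mean dose (SWMD) — can be sketched on a toy perfusion/dose grid; the arrays below are illustrative, not patient data:

```python
import numpy as np

def functional_volume_mask(spect, threshold_frac):
    """Voxels at or above threshold_frac of the maximum SPECT intensity."""
    return spect >= threshold_frac * spect.max()

def swmd(spect, dose):
    """SPECT-weighted mean dose: planned dose averaged with perfusion weight."""
    return float((spect * dose).sum() / spect.sum())

spect = np.array([[0.1, 0.5], [1.0, 0.2]])    # toy perfusion intensities
dose = np.array([[10.0, 20.0], [30.0, 5.0]])  # toy planned dose (Gy)
mask20 = functional_volume_mask(spect, 0.2)   # 20 % threshold
```

The sketch makes the study's finding intuitive: the hard threshold in `functional_volume_mask` changes discontinuously as reconstruction alters voxel intensities, whereas the SWMD varies smoothly because every voxel contributes in proportion to its perfusion.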

  6. Global Monitoring RSEM System for Crop Production by Incorporating Satellite-based Photosynthesis Rates and Anomaly Data of Sea Surface Temperature

    Science.gov (United States)

    Kaneko, D.; Sakuma, H.

    2014-12-01

    The first author has been developing the RSEM crop-monitoring system using satellite-based assessment of photosynthesis, incorporating meteorological conditions. Crop production comprises several stages and plural mechanisms based on leaf photosynthesis, surface energy balance, and the maturing of grains after fixation of CO2, along with water exchange through soil-vegetation-atmosphere transfer. Grain production in prime countries appears to be randomly perturbed regionally and globally. The weather experienced by crop plants reflects turbulent phenomena of convective and advective flows in the atmosphere and surface boundary layer. It has been difficult for scientists to simulate and forecast weather correctly over terms long enough to reach crop harvesting. However, severely poor harvests related to continental events must originate from a consistent mechanism of abnormal energetic flow in the atmosphere through both land and oceans. It should be remembered that the oceans store more than 100 times the energy of the atmosphere, and ocean currents represent gigantic energy flows that strongly affect climate. Anomalies of Sea Surface Temperature (SST), globally known as El Niño, the Indian Ocean dipole, the Atlantic Niño, etc., affect the seasonal climate on a continental scale. The authors aim to combine monitoring and seasonal forecasting, considering such mechanisms through land-ocean biosphere transfer. The present system produces assessments for all continents, specifically monitoring agricultural fields of the main crops. Historical regions of poor and good harvests are compared with distributions of SST anomalies, which are provided by NASA GSFC. Those comparisons suggest that the worst harvest in 1993 and the best in 1994 relate to the offshore distribution of low-temperature anomalies and high gaps in ocean surface temperatures. However, high-temperature anomalies supported good harvests because of sufficient solar radiation for photosynthesis, and poor harvests because

  7. Using NASA's Giovanni Web Portal to Access and Visualize Satellite-Based Earth Science Data in the Classroom

    Science.gov (United States)

    Lloyd, S. A.; Acker, J. G.; Prados, A. I.; Leptoukh, G. G.

    2008-12-01

    One of the biggest obstacles for the average Earth science student today is locating and obtaining satellite-based remote sensing datasets in a format that is accessible and optimal for their data analysis needs. At the Goddard Earth Sciences Data and Information Services Center (GES-DISC) alone, on the order of hundreds of Terabytes of data are available for distribution to scientists, students and the general public. The single biggest and time-consuming hurdle for most students when they begin their study of the various datasets is how to slog through this mountain of data to arrive at a properly sub-setted and manageable dataset to answer their science question(s). The GES DISC provides a number of tools for data access and visualization, including the Google-like Mirador search engine and the powerful GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) web interface. Giovanni provides a simple way to visualize, analyze and access vast amounts of satellite-based Earth science data. Giovanni's features and practical examples of its use will be demonstrated, with an emphasis on how satellite remote sensing can help students understand recent events in the atmosphere and biosphere. Giovanni is actually a series of sixteen similar web-based data interfaces, each of which covers a single satellite dataset (such as TRMM, TOMS, OMI, AIRS, MLS, HALOE, etc.) or a group of related datasets (such as MODIS and MISR for aerosols, SeaWIFS and MODIS for ocean color, and the suite of A-Train observations co-located along the CloudSat orbital path). Recently, ground-based datasets have been included in Giovanni, including the Northern Eurasian Earth Science Partnership Initiative (NEESPI), and EPA fine particulate matter (PM2.5) for air quality. Model data such as the Goddard GOCART model and MERRA meteorological reanalyses (in process) are being increasingly incorporated into Giovanni to facilitate model-data intercomparison. 
A full suite of data

  8. Satellite-based evidence of wavelength-dependent aerosol absorption in biomass burning smoke inferred from Ozone Monitoring Instrument

    Directory of Open Access Journals (Sweden)

    H. Jethva

    2011-10-01

    Full Text Available We provide satellite-based evidence of the spectral dependence of absorption in biomass burning aerosols over South America using near-UV measurements made by the Ozone Monitoring Instrument (OMI) during 2005–2007. In the current near-UV OMI aerosol algorithm (OMAERUV), it is implicitly assumed that the only absorbing component in carbonaceous aerosols is black carbon whose imaginary component of the refractive index is wavelength independent. With this assumption, OMI-derived aerosol optical depth (AOD) is found to be significantly over-estimated compared to that of AERONET at several sites during intense biomass burning events (August–September). Other well-known sources of error affecting the near-UV method of aerosol retrieval do not explain the large observed AOD discrepancies between the satellite and the ground-based observations. A number of studies have revealed strong spectral dependence in carbonaceous aerosol absorption in the near-UV region suggesting the presence of organic carbon in biomass burning generated aerosols. A sensitivity analysis examining the importance of accounting for the presence of wavelength-dependent aerosol absorption in carbonaceous particles in satellite-based remote sensing was carried out in this work. The results convincingly show that the inclusion of spectrally-dependent aerosol absorption in the radiative transfer calculations leads to a more accurate characterization of the atmospheric load of carbonaceous aerosols. The use of a new set of aerosol models assuming wavelength-dependent aerosol absorption in the near-UV region (Absorption Angstrom Exponent of λ^−2.5 to λ^−3.0) improved the OMAERUV retrieval results by significantly reducing the AOD bias observed when gray aerosols were assumed. In addition, the new retrieval of single-scattering albedo is in better agreement with those of AERONET within the uncertainties (ΔSSA = ±0.03). The new colored carbonaceous aerosol model was also found to
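
The wavelength dependence adopted in the new aerosol models follows a power law in λ with an Absorption Angstrom Exponent (AAE) between 2.5 and 3.0; a sketch of the scaling (the reference wavelength and AAOD value below are illustrative assumptions):

```python
def absorption_optical_depth(aaod_ref, wavelength_nm, ref_nm=388.0, aae=2.75):
    """Power-law spectral absorption:
    AAOD(lambda) = AAOD_ref * (lambda / lambda_ref) ** (-AAE).

    The abstract reports AAE in the 2.5-3.0 range for brown-carbon smoke,
    versus roughly wavelength-independent absorption for pure black carbon.
    """
    return aaod_ref * (wavelength_nm / ref_nm) ** (-aae)

# Absorption strengthens sharply toward shorter near-UV wavelengths
aaod_354 = absorption_optical_depth(0.05, 354.0)
```

The steep negative exponent is what lets the organic-carbon models absorb much more strongly in the near-UV than a gray (black-carbon-only) model with the same mid-visible absorption.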

  9. Relation between Ocean SST Dipoles and Downwind Continental Croplands Assessed for Early Management Using Satellite-based Photosynthesis Models

    Science.gov (United States)

    Kaneko, Daijiro

    2015-04-01

Crop-monitoring systems using the unit of carbon-dioxide sequestration, for environmental issues related to climate adaptation to global warming, have been improved using satellite-based photosynthesis and meteorological conditions. Early management of crop status is desirable for grain production, stockbreeding, and bio-energy, provided that seasonal climate forecasts are sufficiently accurate. Incorrect seasonal forecasts of crop production can disrupt global social activities when the anticipated conditions do not materialize. One cause of poor forecasting is related to the atmospheric dynamics at the Earth's surface, which reflect the energy budget through the land surface, especially the oceans and atmosphere. Recognition of the relation between SST anomalies (e.g. ENSO, the Atlantic Niño, Indian Ocean dipoles, and the Ningaloo Niño) and crop production, as expressed precisely by photosynthesis or the sequestered-carbon rate, is necessary to elucidate the mechanisms behind poor production. Solar radiation, surface air temperature, and water stress all directly affect grain vegetation photosynthesis. All affect stomatal opening, which is related to the water balance, defined by the ratio of Penman potential evaporation to actual transpiration. Regarding stomata, present data and reanalysis data give overestimated values of stomatal opening because they are extrapolated from wet models of forests rather than the semi-arid regions commonly associated with wheat, maize, and soybean. This study applies a complementary model based on energy conservation for semi-arid zones instead of the conventional Penman-Monteith method. Partitioning of the integrated Net PSN enables precise estimation of crop yields by modifying the semi-closed stomatal opening, and predicts production more accurately using the cropland distribution already classified using satellite data. Seasonal crop forecasting should include near-real-time monitoring using satellite-based process crop models to avoid

  10. Assessing the temporal and spatial performance of satellite-based rainfall estimates across the complex topographical and climatic gradients of Chile

    Science.gov (United States)

    Zambrano-Bigiarini, Mauricio; Nauditt, Alexandra; Birkel, Christian; Verbist, Koen; Ribbe, Lars

    2017-04-01

    Accurate representation of the real spatio-temporal variability of catchment rainfall inputs is currently severely limited. Moreover, spatially interpolated catchment precipitation is subject to large uncertainties, particularly in developing countries and regions that are difficult to access. Recently, satellite-based rainfall estimates (SREs) have come to provide an unprecedented opportunity for a wide range of hydrological applications, from water resources modelling to the monitoring of extreme events such as droughts and floods. This study attempts to exhaustively evaluate, for the first time, the suitability of seven state-of-the-art SRE products (TMPA 3B42v7, CHIRPSv2, CMORPH, PERSIANN-CDR, PERSIANN-CCS-adj, MSWEPv1.1 and PGFv3) over the complex topography and diverse climatic gradients of Chile. Different temporal scales (daily, monthly, seasonal, annual) are used in a point-to-pixel comparison between precipitation time series measured at 366 stations (from sea level to 4600 m a.s.l. in the Andean Plateau) and the corresponding grid cell of each SRE (rescaled to a 0.25° grid if necessary). The modified Kling-Gupta efficiency was used to identify possible sources of systematic errors in each SRE. In addition, five categorical indices (PC, POD, FAR, ETS, fBIAS) were used to assess the ability of each SRE to correctly identify different precipitation intensities. Results revealed that most SRE products performed better over the humid South (36.4-43.7°S) and Central Chile (32.18-36.4°S), in particular at low- and mid-elevation zones (0-1000 m a.s.l.), compared to the arid northern regions and the Far South. Seasonally, all products performed best during the wet seasons, autumn and winter (MAM-JJA), compared to summer (DJF) and spring (SON). In addition, all SREs were able to correctly identify the occurrence of no-rain events, but they showed low skill in classifying precipitation intensities during rainy days. Overall, PGFv3 exhibited the best performance everywhere and
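The evaluation metrics named above are standard and easy to reproduce. Below is a minimal Python sketch of the modified Kling-Gupta efficiency and two of the categorical indices (POD, FAR); the synthetic data and the 0.1 mm/day wet threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

def kge_modified(sim, obs):
    """Modified Kling-Gupta efficiency (Kling et al., 2012): combines
    correlation (r), bias ratio (beta) and variability ratio (gamma,
    based on coefficients of variation) into a single score; 1 = perfect."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    beta = sim.mean() / obs.mean()
    gamma = (sim.std() / sim.mean()) / (obs.std() / obs.mean())
    return 1.0 - np.sqrt((r - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)

def pod_far(sim, obs, wet_threshold=0.1):
    """Probability of detection (POD) and false alarm ratio (FAR)
    from a wet/dry contingency table at `wet_threshold` (mm/day)."""
    sim_wet = np.asarray(sim) >= wet_threshold
    obs_wet = np.asarray(obs) >= wet_threshold
    hits = np.sum(sim_wet & obs_wet)
    misses = np.sum(~sim_wet & obs_wet)
    false_alarms = np.sum(sim_wet & ~obs_wet)
    return hits / (hits + misses), false_alarms / (hits + false_alarms)

obs = np.array([0.0, 2.0, 5.0, 0.0, 1.0, 8.0])   # gauge (mm/day), synthetic
sim = np.array([0.0, 1.5, 6.0, 0.4, 0.0, 7.0])   # SRE grid cell, synthetic
```

The three KGE components isolate correlation, mean bias, and variability bias separately, which is what makes the metric useful for identifying *sources* of systematic error rather than just overall misfit.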

  11. Cross-validation Methodology between Ground and GPM Satellite-based Radar Rainfall Product over Dallas-Fort Worth (DFW) Metroplex

    Science.gov (United States)

    Chen, H.; Chandrasekar, V.; Biswas, S.

    2015-12-01

    Over the past two decades, a large number of rainfall products have been developed based on satellite, radar, and/or rain gauge observations. However, producing optimal rainfall estimates for a given region is still challenging due to the space-time variability of rainfall at many scales and the differing spatial and temporal sampling of different rainfall instruments. In order to produce high-resolution rainfall products for urban flash flood applications and improve weather sensing capability in the urban environment, the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA), in collaboration with the National Weather Service (NWS) and the North Central Texas Council of Governments (NCTCOG), has developed an urban radar remote sensing network in the DFW Metroplex. DFW is the largest inland metropolitan area in the U.S. and experiences a wide range of natural weather hazards such as flash floods and hailstorms. The DFW urban remote sensing network, centered on the deployment of eight dual-polarization X-band radars and an NWS WSR-88DP radar, is expected to provide impact-based warnings and forecasts for the benefit of public safety and the economy. High-resolution quantitative precipitation estimation (QPE) is one of the major goals of the development of this urban test bed. In addition to ground radar-based rainfall estimation, satellite-based rainfall products for this area are also of interest for this study. A typical example is the rainfall rate product produced by the Dual-frequency Precipitation Radar (DPR) onboard the Global Precipitation Measurement (GPM) Core Observatory satellite. Therefore, cross-comparison between ground- and space-based rainfall estimation is critical to building an optimal regional rainfall system, which can take advantage of the sampling differences of different sensors. This paper presents the real-time high-resolution QPE system developed for the DFW urban radar network, which is based upon the combination of S-band WSR-88DP and X

  12. A comparison between satellite-based and drone-based remote sensing technology to achieve sustainable development: a review

    Directory of Open Access Journals (Sweden)

    Babankumar Bansod

    2017-12-01

Precision agriculture is a way to manage crop resources such as water, fertilizers, soil, and seeds in order to increase production, quality, and profit and to reduce waste, so that the existing system becomes eco-friendly. The main target of precision agriculture is to match resources and practices to the crop and climate in order to improve on current practice. Global Positioning System, Geographic Information System, remote sensing technologies, and various sensors are used in precision farming to identify variability in the field and to deal with it using different methods. Satellite-based remote sensing is used to study variability in crop and ground but suffers from various disadvantages such as restricted availability, high price, long revisit intervals, and poor resolution due to great height. The Unmanned Aerial Vehicle (UAV) is an alternative option for application in precision farming. UAVs overcome the drawbacks of ground-based systems, i.e., inaccessibility to muddy and very dense regions. Hovering at a height of 500-1000 m is good enough to offer various advantages in image acquisition, such as high spatial and temporal resolution, full flexibility, and low cost. Recent studies of the application of UAVs in precision farming address advanced UAV design, enhancement of georeferencing and image mosaicking, and the analysis and extraction of the information required for supplying a true end product to farmers. This paper also discusses the various UAV platforms used in farming applications, their technical constraints, privacy rights, reliability, and safety.

  13. Improving Snow Process Modeling with Satellite-Based Estimation of Near-Surface-Air-Temperature Lapse Rate

    Science.gov (United States)

    Wang, Lei

    2017-04-01

    In distributed hydrological modeling, surface air temperature (Tair) is of great importance in simulating cold-region processes, while the near-surface-air-temperature lapse rate (NLR) is crucial for preparing Tair (when interpolating Tair from site observations to model grids). In this study, a distributed biosphere hydrological model with improved snow physics (WEB-DHM-S) was rigorously evaluated in a typical cold, large river basin (the upper Yellow River basin), given mean monthly NLRs. Based on the validated model, we examined the influence of the NLR on the simulated snow processes and streamflows. We found that the NLR has a large effect on the simulated streamflows, with a maximum difference greater than 24 % among the NLR scenarios considered. To supplement the insufficient number of near-surface air temperature monitoring sites in developing or undeveloped mountain regions, the nighttime MODIS LST is used as an alternative to derive the approximate NLR at a finer spatial scale (e.g., at different elevation bands, land covers, aspects, and snow conditions). Using satellite-based estimation of the NLR, the modeling of snow processes has been greatly refined. Results show that both the determination of rainfall/snowfall and the snowpack process were significantly improved, contributing to reduced summer evapotranspiration and thus an improved streamflow simulation.
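The lapse-rate estimation step can be illustrated with a least-squares fit of LST against elevation within one elevation band. The sketch below uses synthetic, noise-free pixels; the function name and all values are illustrative, not taken from the study.

```python
import numpy as np

def lapse_rate(elev_m, lst_c):
    """Least-squares near-surface lapse rate (deg C per km) from pixel
    elevations and nighttime LST samples within one elevation band.
    A negative slope means temperature decreases with altitude."""
    slope, _intercept = np.polyfit(np.asarray(elev_m) / 1000.0,
                                   np.asarray(lst_c), 1)
    return slope

# synthetic pixels following an exact -6.0 deg C/km lapse
elev = np.array([1500.0, 2000.0, 2500.0, 3000.0, 3500.0])
lst = 20.0 - 6.0 * elev / 1000.0
```

Grouping pixels by land cover, aspect, or snow condition before fitting, as the abstract describes, is just a matter of applying this fit to each subset separately.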

  14. Assessment of the aerosol optical depths measured by satellite-based passive remote sensors in the Alberta oil sands region

    Science.gov (United States)

    Sioris, Christopher E.; McLinden, Chris A.; Shephard, Mark W.; Fioletov, Vitali E.; Abboud, Ihab

    2017-02-01

    Several satellite aerosol optical depth (AOD) products are assessed in terms of their data quality in the Alberta oil sands region. The instruments consist of MODIS (Moderate Resolution Imaging Spectroradiometer), POLDER (Polarization and Directionality of Earth Reflectances), MISR (Multi-angle Imaging SpectroRadiometer), and AATSR (Advanced Along-Track Scanning Radiometer). The AOD data products are examined in terms of multiplicative and additive biases determined using local Aerosol Robotic Network (AERONET) (AEROCAN) stations. Correlation with ground-based data is used to assess whether the satellite-based AODs capture day-to-day, month-to-month, and spatial variability. The ability of the satellite AOD products to capture interannual variability is assessed at Albian mine and Shell Muskeg River, two neighbouring sites in the northern mining region where a statistically significant positive trend (2002-2015) in PM2.5 mass density exists. An increasing trend of similar amplitude (~5 % year-1) is observed in this northern mining region using some of the satellite AOD products.
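The multiplicative and additive biases mentioned above can be obtained from an ordinary least-squares fit of satellite AOD against coincident AERONET AOD. A minimal sketch with synthetic data (the values are illustrative, not the study's):

```python
import numpy as np

def aod_biases(satellite_aod, aeronet_aod):
    """Multiplicative (slope) and additive (offset) bias of a satellite
    AOD product against coincident AERONET AOD, from an ordinary
    least-squares fit: satellite = slope * aeronet + offset."""
    slope, offset = np.polyfit(aeronet_aod, satellite_aod, 1)
    return slope, offset

# synthetic matchups: product reads 20 % high with a +0.05 offset
aeronet = np.linspace(0.05, 0.60, 12)
satellite = 1.2 * aeronet + 0.05
slope, offset = aod_biases(satellite, aeronet)
```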

  15. Exploring the uncertainty associated with satellite-based estimates of premature mortality due to exposure to fine particulate matter

    Directory of Open Access Journals (Sweden)

    B. Ford

    2016-03-01

The negative impacts of fine particulate matter (PM2.5) exposure on human health are a primary motivator for air quality research. However, estimates of the air pollution health burden vary considerably and strongly depend on the data sets and methodology. Satellite observations of aerosol optical depth (AOD) have been widely used to overcome limited coverage from surface monitoring and to assess the global population exposure to PM2.5 and the associated premature mortality. Here we quantify the uncertainty in determining the burden of disease using this approach, discuss different methods and data sets, and explain sources of discrepancies among values in the literature. For this purpose we primarily use MODIS satellite observations in concert with the GEOS-Chem chemical transport model. We contrast results in the United States and China for the years 2004–2011. Using the Burnett et al. (2014) integrated exposure response function, we estimate that in the United States exposure to PM2.5 accounts for approximately 2 % of total deaths, compared to 14 % in China (using satellite-based exposure), which falls within the range of previous estimates. The difference in estimated mortality burden based solely on a global model vs. that derived from satellite is approximately 14 % for the US and 2 % for China on a nationwide basis, although regionally the differences can be much greater. This difference is overshadowed by the uncertainty in the methodology for deriving the PM2.5 burden from satellite observations, which we quantify to be on the order of 20 % due to uncertainties in the AOD-to-surface-PM2.5 relationship, 10 % due to satellite observational uncertainty, and 30 % or greater uncertainty associated with the application of concentration response functions to estimated exposure.
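The attributable-mortality arithmetic can be sketched as follows. Note the simplification: a log-linear concentration-response curve stands in for the Burnett et al. (2014) integrated exposure-response function, and the coefficient and threshold values are illustrative assumptions only.

```python
import math

def attributable_deaths(pm25, baseline_deaths, beta=0.006, threshold=5.8):
    """Premature deaths attributable to PM2.5 exposure.

    Uses a simplified log-linear concentration-response relative risk,
    RR = exp(beta * (pm25 - threshold)), in place of the Burnett et al.
    integrated exposure-response function, then the standard
    attributable fraction AF = (RR - 1) / RR."""
    rr = math.exp(beta * max(0.0, pm25 - threshold))
    af = (rr - 1.0) / rr
    return af * baseline_deaths
```

Below the assumed threshold the attributable burden is zero; above it, the burden grows sublinearly in concentration because AF saturates toward 1, which is one reason burden estimates are so sensitive to the chosen response function.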

  16. Real-Time Global Flood Estimation Using Satellite-Based Precipitation and a Coupled Land Surface and Routing Model

    Science.gov (United States)

    Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian

    2014-01-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
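The Nash-Sutcliffe coefficient used for the streamflow validation above is straightforward to compute; a minimal sketch with synthetic data (names and values are illustrative):

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect simulation, 0 matches
    the skill of simply predicting the observed mean, and negative
    values are worse than the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs_q = np.array([10.0, 12.0, 9.0, 14.0, 11.0])  # synthetic gauge flows
```

This zero-vs-positive benchmark is why the abstract reports the count of stations with *positive* daily coefficients: any positive value already beats a climatological-mean forecast.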

  17. Satellite-based observations of rain-induced NOx emissions from soils around Lake Chad in the Sahel

    Science.gov (United States)

    Zörner, Jan; Penning de Vries, Marloes; Dörner, Steffen; Sihler, Holger; Beirle, Steffen; Wagner, Thomas

    2017-04-01

    Rain-induced emission pulses of NOx (≡ NO + NO2) from soils have been observed in many semi-arid regions across the world. They are induced by the first precipitation of the wet season and are mainly caused by the sudden re-activation of microbes in the soil releasing reactive nitrogen. In this study, a single intense event of pulsed NOx emissions from soils around Lake Chad is investigated. This is achieved by analysing daily tropospheric NO2 vertical column densities (VCDs) observed by the satellite-based OMI instrument together with other satellite and model data on precipitation, lightning, fire, and wind. The study region of Lake Chad and its ecosystems are indispensable to life in the Sahel region. Climate variability and unsustainable water utilization, however, caused a drastic decrease in the lake's surface area, which in turn led to extensive land cover changes converting former lake area to shrub land and fertile farm land. The results indicate that the region of Lake Chad not only shows consistent enhancements in average NO2 VCDs in the early months of the wet season compared to the surrounding desert but also exhibits particularly strong NOx emissions shortly after a single large-scale precipitation event in June 2007. NO2 VCDs measured 14 hours after this precipitation event show strong enhancements (2.5 × 10^15 molecules cm-2) compared to the seasonal background VCDs and, moreover, represent the highest detected NO2 VCDs of the entire year. Detailed analysis of potential contributors to the observed NO2 VCDs strongly indicates that fire, lightning, and retrieval artefacts cannot explain the NO2 pulse. The estimated emission flux from the soil, calculated based on mass balance, amounts to about 32.3 ng N m-2 s-1, which corresponds to about 65 tonnes of nitrogen released to the atmosphere within one day.
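The mass-balance idea can be approximated by converting the column enhancement into a nitrogen mass accumulated since the rain event. The sketch below is a naive no-loss version: it ignores the NOx lifetime and transport terms a full mass-balance accounts for, so it yields a lower bound rather than the 32.3 ng N m-2 s-1 reported above.

```python
AVOGADRO = 6.022e23   # molecules per mole
M_N = 14.007          # grams per mole of nitrogen

def nitrogen_flux(delta_vcd_cm2, accumulation_s):
    """Nitrogen flux (ng N m-2 s-1) implied by a tropospheric NO2
    column enhancement (molecules cm-2), assuming the enhancement
    accumulated over `accumulation_s` seconds with no chemical loss
    or transport out of the column."""
    grams_per_cm2 = delta_vcd_cm2 * M_N / AVOGADRO
    ng_per_m2 = grams_per_cm2 * 1e9 * 1e4   # g -> ng, cm-2 -> m-2
    return ng_per_m2 / accumulation_s

# enhancement of 2.5e15 molecules cm-2 seen 14 h after the rain
flux = nitrogen_flux(2.5e15, 14 * 3600)
```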

  18. Hydrological real-time modelling in the Zambezi river basin using satellite-based soil moisture and rainfall data

    Directory of Open Access Journals (Sweden)

    P. Meier

    2011-03-01

Reliable real-time forecasts of discharge can provide valuable information for the management of a river basin system. Even discharge forecasts with moderate accuracy can be beneficial for the management of ecological releases. Sequential data assimilation using the Ensemble Kalman Filter provides a tool that is both efficient and robust for a real-time modelling framework. One key parameter in a hydrological system is soil moisture, which can recently be characterized by satellite-based measurements. A forecasting framework for the prediction of discharge is developed and applied to three different sub-basins of the Zambezi River Basin. The model is based solely on remote sensing data providing soil moisture and rainfall estimates. The soil moisture product used is based on the back-scattering intensity of a radar signal measured by a radar scatterometer. These soil moisture data correlate well with the measured discharge of the corresponding watershed if the data are shifted by a time lag that depends on the size and the dominant runoff process of the catchment. This time lag is the basis for the applicability of the soil moisture data to hydrological forecasts. The conceptual model developed is based on two storage compartments; the processes modelled include evaporation losses, infiltration, and percolation. The application of this model in a real-time modelling framework yields good results in watersheds where soil storage is an important factor. The lead time of the forecast depends on the size and the retention capacity of the watershed. For the largest watershed, a forecast over 40 days can be provided; however, the quality of the forecast increases significantly with decreasing prediction time. In a watershed with little soil storage and a quick response to rainfall events, the performance is relatively poor and the lead time is only 10 days.
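The Ensemble Kalman Filter analysis step at the core of such a framework can be sketched in a few lines. The two-component state (a soil storage and a groundwater storage), the observation operator, and all numbers below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_err_var, H):
    """One Ensemble Kalman Filter analysis step with perturbed
    observations.

    ensemble : (n_members, n_state) forecast ensemble
    obs      : scalar observation (e.g. satellite soil moisture)
    H        : (n_state,) linear observation operator
    """
    n, _ = ensemble.shape
    Hx = ensemble @ H                        # ensemble in obs space
    A = ensemble - ensemble.mean(axis=0)     # state anomalies
    HA = Hx - Hx.mean()                      # obs-space anomalies
    P_xy = A.T @ HA / (n - 1)                # state-obs covariance
    P_yy = HA @ HA / (n - 1) + obs_err_var   # innovation variance
    K = P_xy / P_yy                          # Kalman gain (vector)
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_err_var), n)
    return ensemble + np.outer(perturbed - Hx, K)

# forecast ensemble around (soil=0.2, groundwater=0.5); observe soil only
H = np.array([1.0, 0.0])
ens = rng.normal(loc=[0.2, 0.5], scale=0.05, size=(100, 2))
analysis = enkf_update(ens, 0.4, 1e-4, H)
```

With an accurate observation (small `obs_err_var`), the analysed soil storage is pulled strongly toward the observed value, while the unobserved groundwater component is adjusted only through its sampled covariance with the soil state.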

  19. A New Temperature-Vegetation Triangle Algorithm with Variable Edges (TAVE) for Satellite-Based Actual Evapotranspiration Estimation

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2016-09-01

The estimation of spatially-variable actual evapotranspiration (AET) is a critical challenge to regional water resources management. We propose a new remote sensing method, the Triangle Algorithm with Variable Edges (TAVE), to generate daily AET estimates based on satellite-derived land surface temperature and the vegetation index NDVI. TAVE captures heterogeneity in AET across elevation zones and permits variability in determining local values of the wet and dry end-member classes (known as edges). Compared to traditional triangle methods, TAVE introduces three unique features: (i) the discretization of the domain as overlapping elevation zones; (ii) a variable wet edge that is a function of elevation zone; and (iii) variable values of a combined-effect parameter (accounting for aerodynamic and surface resistance, vapor pressure gradient, and soil moisture availability) along both the wet and dry edges. With these features, TAVE effectively addresses the combined influence of terrain and water stress on AET estimates in semi-arid environments. We demonstrate the effectiveness of this method in one of the driest countries in the world, Jordan, and compare it to a traditional triangle method (TA) and a global AET product (MOD16) over different land use types. In irrigated agricultural lands, TAVE matched the results of the single crop coefficient model (−3%), in contrast to substantial overestimation by TA (+234%) and underestimation by MOD16 (−50%). In forested (non-irrigated), water-consuming regions, TA and MOD16 produced average AET deviations 15.5 times and −3.5 times those based on TAVE. As TAVE has a simple structure and low data requirements, it provides an efficient means to satisfy the increasing need for evapotranspiration estimation in data-scarce semi-arid regions. This study constitutes a much needed step towards the satellite-based quantification of agricultural water consumption in Jordan.
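A fixed-edge version of the triangle idea (a linear dry edge in LST-NDVI space and a constant wet edge; TAVE's elevation-zone discretization and variable edges are omitted) can be sketched as follows. All edge parameters and inputs are illustrative assumptions.

```python
import numpy as np

def evaporative_fraction(ts, ndvi, t_dry0=320.0, t_dry_slope=-15.0,
                         t_wet=295.0):
    """Triangle-method evaporative fraction.

    The dry edge is a line in LST-NDVI space,
    t_dry(ndvi) = t_dry0 + t_dry_slope * ndvi, and the wet edge is
    constant. EF runs from 0 at the dry edge to 1 at the wet edge."""
    t_dry = t_dry0 + t_dry_slope * np.asarray(ndvi)
    ef = (t_dry - np.asarray(ts)) / (t_dry - t_wet)
    return np.clip(ef, 0.0, 1.0)

def daily_aet(ef, available_energy_mj, latent_heat=2.45):
    """Daily AET (mm/day): EF times available energy (MJ m-2 day-1)
    divided by the latent heat of vaporization (2.45 MJ per kg of
    water, i.e. per mm of depth)."""
    return ef * available_energy_mj / latent_heat
```

In TAVE the constants `t_dry0`, `t_dry_slope`, and `t_wet` would instead vary with elevation zone, which is precisely the "variable edges" refinement described above.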

  20. Comparison of Different Machine Learning Approaches for Monthly Satellite-Based Soil Moisture Downscaling over Northeast China

    Directory of Open Access Journals (Sweden)

    Yangxiaoyue Liu

    2017-12-01

Although numerous satellite-based soil moisture (SM) products can provide spatiotemporally continuous worldwide datasets, they can hardly be employed in characterizing fine-grained regional land surface processes, owing to their coarse spatial resolution. In this study, we proposed a machine-learning-based method to enhance SM spatial accuracy and improve the availability of SM data. Four machine learning algorithms, including classification and regression trees (CART), K-nearest neighbors (KNN), Bayesian (BAYE), and random forests (RF), were implemented to downscale the monthly European Space Agency Climate Change Initiative (ESA CCI) SM product from 25-km to 1-km spatial resolution. During the regression, land surface temperature (daytime, nighttime, and diurnal fluctuation temperature), the normalized difference vegetation index, surface reflectances (red, blue, NIR, and MIR bands), and a digital elevation model were taken as explanatory variables to produce fine-spatial-resolution SM. We chose Northeast China as the study area and acquired corresponding SM data from 2003 to 2012 in unfrozen seasons. The reconstructed SM datasets were validated against in-situ measurements. The results showed that the RF-downscaled results had superior matching performance to both ESA CCI SM and in-situ measurements, and responded positively to precipitation variation. Additionally, RF was less affected by its parameters, revealing its robustness. Both CART and KNN ranked second: CART had a relatively close correlation with the validation data, whereas KNN showed preferable precision. BAYE ranked last, with significantly abnormal regression values.
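Of the four regressors, KNN is the simplest to sketch with nothing beyond NumPy. The predictor set and synthetic data below are illustrative assumptions; the study's actual feature engineering and training setup are not reproduced here.

```python
import numpy as np

def knn_downscale(train_X, train_sm, fine_X, k=5):
    """K-nearest-neighbour regression for downscaling: z-score the
    predictors (e.g. LST, NDVI, elevation), then assign each fine
    pixel the mean soil moisture of its k closest training samples."""
    mu, sd = train_X.mean(axis=0), train_X.std(axis=0)
    tX = (train_X - mu) / sd
    fX = (fine_X - mu) / sd
    d = np.linalg.norm(fX[:, None, :] - tX[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return train_sm[nearest].mean(axis=1)

# synthetic training set: three predictors, SM linear in the first
x = np.linspace(0.0, 1.0, 50)
train_X = np.column_stack([x, x ** 2, 1.0 - x])
train_sm = 0.10 + 0.30 * x              # "coarse" soil moisture labels
fine_X = np.array([[0.5, 0.25, 0.5]])   # one hypothetical 1-km pixel
pred = knn_downscale(train_X, train_sm, fine_X)
```

The same train/predict interface applies to the other three regressors, which is what makes this kind of side-by-side comparison straightforward.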

  1. Correcting false positive medium-chain acyl-CoA dehydrogenase deficiency results from newborn screening; synthesis, purification, and standardization of branched-chain C8 acylcarnitines for use in their selective and accurate absolute quantitation by UHPLC-MS/MS.

    Science.gov (United States)

    Minkler, Paul E; Stoll, Maria S K; Ingalls, Stephen T; Hoppel, Charles L

    2017-04-01

    While selectively quantifying acylcarnitines in thousands of patient samples using UHPLC-MS/MS, we have occasionally observed unidentified branched-chain C8 acylcarnitines. Such observations are not possible using tandem MS methods, which generate pseudo-quantitative acylcarnitine "profiles". Since these "profiles" select for mass alone, they cannot distinguish authentic signal from isobaric and isomeric interferences. For example, some of the samples containing branched-chain C8 acylcarnitines were, in fact, expanded newborn screening false positive "profiles" for medium-chain acyl-CoA dehydrogenase deficiency (MCADD). Using our fast, highly selective, and quantitatively accurate UHPLC-MS/MS acylcarnitine determination method, we corrected the false positive tandem MS results and reported the sample results as normal for octanoylcarnitine (the marker for MCADD). From instances such as these, we decided to further investigate the presence of branched-chain C8 acylcarnitines in patient samples. To accomplish this, we synthesized and chromatographically characterized several branched-chain C8 acylcarnitines (in addition to valproylcarnitine): 2-methylheptanoylcarnitine, 6-methylheptanoylcarnitine, 2,2-dimethylhexanoylcarnitine, 3,3-dimethylhexanoylcarnitine, 3,5-dimethylhexanoylcarnitine, 2-ethylhexanoylcarnitine, and 2,4,4-trimethylpentanoylcarnitine. We then compared their behavior with the branched-chain C8 acylcarnitines observed in patient samples and demonstrated our ability to chromatographically resolve, and thus distinguish, octanoylcarnitine from branched-chain C8 acylcarnitines, correcting false positive MCADD results from expanded newborn screening.

  2. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Science.gov (United States)

    2013-12-12

    ... cross-references, correcting grammatical errors, revising language for clarity and consistency, and... practice. Specifically, these amendments are to correct grammatical errors and to revise cross-references.... The final rule contained minor errors in grammar, punctuation, and referencing. This document corrects...

  3. A Machine Learning and Cross-Validation Approach for the Discrimination of Vegetation Physiognomic Types Using Satellite Based Multispectral and Multitemporal Data

    OpenAIRE

    Ram C. Sharma; Keitarou Hara; Hidetake Hirayama

    2017-01-01

    This paper presents the performance and evaluation of a number of machine learning classifiers for the discrimination between vegetation physiognomic classes using satellite-based time series of surface reflectance data. Discrimination of six vegetation physiognomic classes, Evergreen Coniferous Forest, Evergreen Broadleaf Forest, Deciduous Coniferous Forest, Deciduous Broadleaf Forest, Shrubs, and Herbs, was dealt with in the research. Rich-feature data were prepared from time-se...

  4. Quantitation of regional cerebral blood flow corrected for partial volume effect using O-15 water and PET: II. Normal values and gray matter blood flow response to visual activation

    DEFF Research Database (Denmark)

    Law, I; Iida, H; Holm, S

    2000-01-01

    One of the most limiting factors for the accurate quantification of physiologic parameters with positron emission tomography (PET) is the partial volume effect (PVE). To assess the magnitude of this contribution to the measurement of regional cerebral blood flow (rCBF), the authors have formulated... matter flow. Furthermore, rCBF based on the autoradiographic method was measured. The goals of the study were to determine the following in normal humans: (1) the optimal model, (2) the optimal length of fit, (3) the model parameters and their reproducibility, and (4) the effects of data acquisition (2D... CBF in normal humans. The potential use of this method is to cost-effectively deliver PVE-corrected measures of rCBF and tissue volumes without reference to imaging modalities other than PET...

  5. Corrective Jaw Surgery

    Medline Plus

    ... Cleft Lip/Palate and Craniofacial Surgery A cleft lip may require one or more ... Corrective Jaw Surgery Orthognathic surgery is performed to correct the misalignment ...

  6. Quantitative easing

    OpenAIRE

    Faustino, Rui Alexandre Rodrigues Veloso

    2012-01-01

    A Work Project, presented as part of the requirements for the Award of a Masters Degree in Economics from the NOVA – School of Business and Economics. Since November 2008, the Federal Reserve of the United States has pursued a series of large-scale asset purchases, known as Quantitative Easing. In this Work Project, I describe the context, the objectives, and the implementation of Quantitative Easing. Additionally, I discuss its expected effects. Finally, I present empirical evidence of the ...

  7. Calibration of a large-scale hydrological model using satellite-based soil moisture and evapotranspiration products

    Science.gov (United States)

    López López, Patricia; Sutanudjaja, Edwin H.; Schellekens, Jaap; Sterk, Geert; Bierkens, Marc F. P.

    2017-06-01

    A considerable number of river basins around the world lack sufficient ground observations of hydro-meteorological data for effective water resources assessment and management. Several approaches can be developed to increase the quality and availability of data in these poorly gauged or ungauged river basins; among them, the use of Earth observation products has recently become promising. Earth observations of various environmental variables can potentially be used to increase knowledge about the hydrological processes in the basin and to improve streamflow model estimates, via assimilation or calibration. The present study aims to calibrate the large-scale hydrological model PCRaster GLOBal Water Balance (PCR-GLOBWB) using satellite-based products of evapotranspiration and soil moisture for the Moroccan Oum er Rbia River basin. Daily simulations at a spatial resolution of 5 × 5 arcmin are performed with varying parameter values for the 32-year period 1979-2010. Five different calibration scenarios are inter-compared: (i) a reference scenario using the hydrological model with the standard parameterization, (ii) calibration using in situ-observed discharge time series, (iii) calibration using the Global Land Evaporation Amsterdam Model (GLEAM) actual evapotranspiration time series, (iv) calibration using ESA Climate Change Initiative (CCI) surface soil moisture time series, and (v) step-wise calibration using GLEAM actual evapotranspiration and ESA CCI surface soil moisture time series. The impact of precipitation on discharge estimates, in comparison with model parameter calibration, is investigated using three global precipitation products: ERA-Interim (EI), the WATCH Forcing methodology applied to ERA-Interim reanalysis data (WFDEI), and Multi-Source Weighted-Ensemble Precipitation data merging gauge, satellite, and reanalysis data (MSWEP). Results show that GLEAM evapotranspiration and ESA CCI soil moisture may be used for model calibration resulting in

  8. Segmentation-based MR attenuation correction including bones also affects quantitation in brain studies: an initial result of 18F-FP-CIT PET/MR for patients with parkinsonism.

    Science.gov (United States)

    Choi, Hongyoon; Cheon, Gi Jeong; Kim, Han-Joon; Choi, Seung Hong; Lee, Jae Sung; Kim, Yong-Il; Kang, Keon Wook; Chung, June-Key; Kim, E Edmund; Lee, Dong Soo

    2014-10-01

    Attenuation correction (AC) with an ultrashort echo time (UTE) sequence has recently been used in combination with segmentation for cortical bone identification in brain PET/MR studies. The purpose of this study was to evaluate the quantification of (18)F-fluoropropyl-carbomethoxyiodophenylnortropane ((18)F-FP-CIT) binding in brain PET/MR, particularly focusing on the effects of UTE-based AC including bone segmentation. Sixteen patients with initially suspected parkinsonism were prospectively enrolled. An emission scan was acquired 110 min after (18)F-FP-CIT injection on a dedicated PET/MR scanner, immediately followed by another emission scan on a PET/CT scanner 120 min after the injection. A UTE-based attenuation map was used to classify the voxels into 3 tissues: bone, soft tissue, and air. All PET images were spatially normalized, and a specific-to-nonspecific dopamine transporter (DAT) binding ratio (BR) was calculated using statistical probabilistic anatomic mapping. The level of agreement was assessed with intraclass correlation coefficients (ICCs). Voxelwise comparison between PET images acquired from PET/MR and PET/CT was performed. We compared non-attenuation-corrected images to analyze UTE-based AC effects on DAT quantification. BR in the putamen obtained from PET/MR and PET/CT showed low interequipment variability, whereas BR in the caudate nucleus showed significant variability (ICC = 0.967 and 0.682 for putamen and caudate nucleus, respectively). BR in the caudate nucleus was significantly underestimated by PET/MR compared with PET/CT (mean difference of BR = 0.66). PET/MR also showed significantly low BR in the periventricular regions, which was caused by a misclassification of the ventricle as air on the attenuation map. We also compared non-AC images, revealing low interequipment variability even in the caudate nucleus (ICC = 0.937 and 0.832 for putamen and caudate nucleus, respectively). Our data demonstrate spatial bias of the DAT BR on (18)F …
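
    The agreement statistic used above can be reproduced on hypothetical data. The sketch below implements the common two-way, single-measure consistency form of the intraclass correlation coefficient, ICC(3,1); the binding-ratio values are made up for illustration, and the abstract does not specify which ICC variant the authors used.

```python
import numpy as np

def icc_consistency(x, y):
    """Two-way mixed, single-measure consistency ICC, i.e. ICC(3,1),
    for two measurement methods applied to the same subjects."""
    data = np.column_stack([x, y])
    n, k = data.shape
    mean_subj = data.mean(axis=1)
    mean_meas = data.mean(axis=0)
    grand = data.mean()
    ss_subj = k * np.sum((mean_subj - grand) ** 2)
    ss_meas = n * np.sum((mean_meas - grand) ** 2)
    ss_err = np.sum((data - grand) ** 2) - ss_subj - ss_meas
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical binding ratios for the same subjects from two scanners:
pet_ct = np.array([2.1, 1.8, 2.5, 1.2, 2.9, 1.5])
pet_mr = np.array([2.0, 1.7, 2.6, 1.1, 3.0, 1.4])
print(round(icc_consistency(pet_ct, pet_mr), 3))  # high agreement, close to 1
```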

  9. Aberration corrected Lorentz scanning transmission electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    McVitie, S., E-mail: stephen.mcvitie@glasgow.ac.uk; McGrouther, D.; McFadzean, S.; MacLaren, D.A.; O’Shea, K.J.; Benitez, M.J.

    2015-05-15

    We present results from an aberration corrected scanning transmission electron microscope which has been customised for high resolution quantitative Lorentz microscopy with the sample located in a magnetic field free or low field environment. We discuss the innovations in microscope instrumentation and additional hardware that underpin the imaging improvements in resolution and detection with a focus on developments in differential phase contrast microscopy. Examples from materials possessing nanometre scale variations in magnetisation illustrate the potential for aberration corrected Lorentz imaging as a tool to further our understanding of magnetism on this lengthscale. - Highlights: • Demonstration of nanometre scale resolution in magnetic field free environment using aberration correction in the scanning transmission electron microscope (STEM). • Implementation of differential phase contrast mode of Lorentz microscopy in aberration corrected STEM with improved sensitivity. • Quantitative imaging of magnetic induction of nanostructures in amorphous and cross-section samples.

  10. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  11. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    … proteins involved in Ang-(1-7) signaling, we performed a mass spectrometry-based, time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide …

  12. Quantitative Literacy.

    Science.gov (United States)

    Daniele, Vincent A.

    1993-01-01

    Quantitative literacy for students with deafness is addressed, noting work by the National Council of Teachers of Mathematics to establish curriculum standards for grades K-12. The standards stress problem solving, communication, reasoning, making mathematical connections, and the need for educators of the deaf to pursue mathematics literacy with…

  13. Do agrometeorological data improve optical satellite-based estimations of the herbaceous yield in Sahelian semi-arid ecosystems?

    DEFF Research Database (Denmark)

    Diouf, Abdoul Aziz; Hiernaux, Pierre; Brandt, Martin Stefan

    2016-01-01

    Quantitative estimates of forage availability at the end of the growing season in rangelands are helpful for pastoral livestock managers and for local, national and regional stakeholders in natural resource management. For this reason, remote sensing data such as the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) … in relative inter-annual variation. In particular, the additional use of agrometeorological information mitigated the saturation effects that characterize the plant indices of areas with high plant productivity. In addition, the date of the onset of the growing season derived from smoothed FAPAR seasonal …

  14. Non-Gaussian data assimilation of satellite-based leaf area index observations with an individual-based dynamic global vegetation model

    Science.gov (United States)

    Arakida, Hazuki; Miyoshi, Takemasa; Ise, Takeshi; Shima, Shin-ichiro; Kotsuki, Shunji

    2017-09-01

    We developed a data assimilation system based on a particle filter approach with the spatially explicit individual-based dynamic global vegetation model (SEIB-DGVM). We first performed an idealized observing system simulation experiment to evaluate the impact of assimilating the leaf area index (LAI) data every 4 days, simulating the satellite-based LAI. Although we assimilated only LAI as a whole, the tree and grass LAIs were estimated separately with high accuracy. Uncertain model parameters and other state variables were also estimated accurately. Therefore, we extended the experiment to the real world using the real Moderate Resolution Imaging Spectroradiometer (MODIS) LAI data and obtained promising results.
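
    The assimilation step can be illustrated with a toy particle filter, assuming a hypothetical one-parameter logistic LAI growth model in place of SEIB-DGVM: particles carry an uncertain growth parameter, are weighted by how well their LAI matches each 4-day observation, and are resampled.

```python
import numpy as np

rng = np.random.default_rng(42)

def step(lai, growth):
    """Toy vegetation model: logistic LAI growth toward a carrying capacity of 6."""
    return lai + growth * lai * (1.0 - lai / 6.0)

n_particles, n_steps, obs_err = 500, 40, 0.2
true_growth, lai_true = 0.2, 0.5
growth_p = rng.uniform(0.05, 0.5, n_particles)  # uncertain parameter per particle
lai_p = np.full(n_particles, 0.5)

for t in range(n_steps):
    lai_true = step(lai_true, true_growth)
    lai_p = step(lai_p, growth_p)
    if t % 4 == 0:  # a synthetic LAI observation every 4 steps
        obs = lai_true + rng.normal(0.0, obs_err)
        w = np.exp(-0.5 * ((lai_p - obs) / obs_err) ** 2)
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)  # resample whole particles
        growth_p, lai_p = growth_p[idx], lai_p[idx]

print(growth_p.mean())  # posterior mean of the growth parameter, near 0.2
```

    As in the abstract, assimilating only the observable (LAI) also constrains the unobserved model parameter, because parameter and state are resampled together.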

  15. Attenuation correction for small animal PET tomographs

    Energy Technology Data Exchange (ETDEWEB)

    Chow, Patrick L [David Geffen School of Medicine at UCLA, Crump Institute for Molecular Imaging, University of California, 700 Westwood Plaza, Los Angeles, CA 90095 (United States); Rannou, Fernando R [Departamento de Ingenieria Informatica, Universidad de Santiago de Chile (USACH), Av. Ecuador 3659, Santiago (Chile); Chatziioannou, Arion F [David Geffen School of Medicine at UCLA, Crump Institute for Molecular Imaging, University of California, 700 Westwood Plaza, Los Angeles, CA 90095 (United States)

    2005-04-21

    Attenuation correction is one of the important corrections required for quantitative positron emission tomography (PET). This work compares the quantitative accuracy of attenuation correction using a simple global scale factor with traditional transmission-based methods acquired either with a small animal PET or a small animal x-ray computed tomography (CT) scanner. Two phantoms (one mouse-sized and one rat-sized) and two animal subjects (one mouse and one rat) were scanned in CTI Concorde Microsystems' microPET® Focus™ for emission and transmission data and in ImTek's MicroCAT™ II for transmission data. PET emission image values were calibrated against a scintillation well counter. Results indicate that the scale factor method of attenuation correction places the average measured activity concentration about the expected value, without correcting for the cupping artefact from attenuation. Noise analysis in the phantom studies with the PET-based method shows that noise in the transmission data increases the noise in the corrected emission data. The CT-based method was accurate and delivered low-noise images suitable for both PET data correction and PET tracer localization.
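
    The size dependence that a single global scale factor cannot capture is visible in the attenuation correction factor (ACF) itself: for a coincidence pair, attenuation depends only on the total path length L through the object, so ACF = exp(μL). A small sketch for water cylinders of roughly mouse and rat diameter (0.096 cm⁻¹ is an approximate linear attenuation coefficient of water at 511 keV):

```python
import math

MU_WATER_511KEV = 0.096  # 1/cm, approximate linear attenuation of water at 511 keV

def acf_through_cylinder(diameter_cm, mu=MU_WATER_511KEV):
    """Attenuation correction factor for a line of response crossing a
    uniform water cylinder along its full diameter: ACF = exp(mu * L)."""
    return math.exp(mu * diameter_cm)

print(round(acf_through_cylinder(3.0), 2))  # mouse-sized phantom: ~1.33
print(round(acf_through_cylinder(6.0), 2))  # rat-sized phantom: ~1.78
```

    A global scale factor applies one such ACF to every line of response, which can place the average activity at the right level (as the abstract reports) but cannot reproduce the LOR-to-LOR variation responsible for the cupping artefact.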

  16. NWS Corrections to Observations

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Form B-14 is the National Weather Service form entitled 'Notice of Corrections to Weather Records.' The forms are used to make corrections to observations on forms...

  17. Corrective Jaw Surgery

    Medline Plus

    Full Text Available Corrective jaw, or orthognathic, surgery is performed by … your treatment. Correction of Common Dentofacial Deformities: The information provided here is not intended as a substitute …

  18. The Aerosol Index and Land Cover Class Based Atmospheric Correction Aerosol Optical Depth Time Series 1982–2014 for the SMAC Algorithm

    Directory of Open Access Journals (Sweden)

    Emmihenna Jääskeläinen

    2017-10-01

    Full Text Available Atmospheric effects, especially aerosols, are a significant source of uncertainty for optical remote sensing of surface parameters, such as albedo. Also, to achieve a homogeneous surface albedo time series, the atmospheric correction has to be homogeneous. However, a global homogeneous aerosol optical depth (AOD) time series covering several decades did not previously exist. Therefore, we have constructed an AOD time series 1982–2014 using aerosol index (AI) data from the satellite measurements of the Total Ozone Mapping Spectrometer (TOMS) and the Ozone Monitoring Instrument (OMI), together with the solar zenith angle and land use classification data. It is used as input for the Simplified Method for Atmospheric Correction (SMAC) algorithm when processing the surface albedo time series CLARA-A2 SAL (the Surface ALbedo from the Satellite Application Facility on Climate Monitoring project cLoud, Albedo and RAdiation data record, second release). The surface reflectance simulations using the SMAC algorithm for different sets of satellite-based AOD data show that the aerosol-effect correction using the constructed TOMS/OMI-based AOD data is comparable to using other satellite-based AOD data available for a shorter time range. Moreover, using the constructed TOMS/OMI-based AOD as input for the atmospheric correction typically produces surface reflectance values closer to those obtained using in situ AOD values than when using other satellite-based AOD data.

  19. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens

    2013-09-01

    Full Text Available This study presents a method for adjusting long-term climate data records (CDRs) for the integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year long (1983–2005) continuous SIS CDR has been generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards) were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground-station mean, taking statistical significance into account. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005 the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of …
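
    The SNHT step can be sketched as follows, assuming a synthetic satellite-minus-station difference series with an artificial mean shift (not the CM SAF data). The statistic T(k) compares the standardized means before and after each candidate break; the mean-shift correction then aligns the two segments:

```python
import numpy as np

def snht(series):
    """Standard Normal Homogeneity Test statistic for each candidate break.
    Returns the T(k) curve and the most likely break index."""
    x = np.asarray(series, dtype=float)
    z = (x - x.mean()) / x.std()
    n = len(z)
    t = np.empty(n - 1)
    for k in range(1, n):
        t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return t, int(np.argmax(t)) + 1

# Synthetic satellite-minus-station difference series with a 0.8 shift at index 120.
rng = np.random.default_rng(1)
diff = rng.normal(0.0, 0.3, 200)
diff[120:] += 0.8
t_curve, brk = snht(diff)

# Mean-shift correction: align the later segment with the earlier one.
corrected = diff.copy()
corrected[brk:] -= corrected[brk:].mean() - corrected[:brk].mean()
print(brk)  # detected break near 120
```

    In the study the detected shifts are additionally screened by significance and visual inspection before any correction is applied.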

  20. Timing jitter correction for THz-TDS measurements of graphene

    DEFF Research Database (Denmark)

    Whelan, Patrick Rebsdorf; Iwaszczuk, Krzysztof; Bøggild, Peter

    2016-01-01

    We discuss how noncontact, quantitative large-area mapping of the conductance of thin films requires delicate corrections in order to deduce electrical properties such as graphene mobility from THz-TDS measurements. …

  1. Long-term analysis of aerosol optical depth over Northeast Asia using a satellite-based measurement: MI Yonsei Aerosol Retrieval Algorithm (YAER)

    Science.gov (United States)

    Kim, Mijin; Kim, Jhoon; Yoon, Jongmin; Chung, Chu-Yong; Chung, Sung-Rae

    2017-04-01

    In 2010, the Korean geostationary earth orbit (GEO) satellite, the Communication, Ocean, and Meteorological Satellite (COMS), was launched, including the Meteorological Imager (MI). The MI measures atmospheric conditions over Northeast Asia (NEA) using a single visible channel centered at 0.675 μm and four IR channels at 3.75, 6.75, 10.8 and 12.0 μm. The visible measurement can also be utilized for the retrieval of aerosol optical properties (AOPs). Since GEO satellite measurement has an advantage for continuous monitoring of AOPs, we can analyze the spatiotemporal variation of aerosol using the MI observations over NEA. We therefore developed an algorithm to retrieve aerosol optical depth (AOD) from the MI visible observations, named the MI Yonsei Aerosol Retrieval Algorithm (YAER). In this study, we investigated the accuracy of MI YAER AOD by comparing the values with the long-term products of AERONET sun photometers. The result showed that the MI AODs were significantly overestimated relative to the AERONET values over bright surfaces in low-AOD cases. Because the MI visible channel is centered in the red spectral range, the aerosol contribution to the measured reflectance is relatively low compared with the surface contribution. Therefore, the AOD error in low-AOD cases over bright surfaces can be a fundamental limitation of the algorithm. Meanwhile, the assumption of a background aerosol optical depth (BAOD) could also contribute to the retrieval uncertainty. To estimate the surface reflectance considering polluted air conditions over the NEA, we estimated the BAOD from the MODIS dark target (DT) aerosol products by pixel. The satellite-based AOD retrieval, however, largely depends on the accuracy of the surface reflectance estimation, especially in low-AOD cases, and thus the BAOD could include the uncertainty in surface reflectance estimation of the satellite-based retrieval. Therefore, we re-estimated the BAOD using ground-based sun-photometer measurements, and …

  2. Satellite-Based Thermophysical Analysis of Volcaniclastic Deposits: A Terrestrial Analog for Mantled Lava Flows on Mars

    Directory of Open Access Journals (Sweden)

    Mark A. Price

    2016-02-01

    Full Text Available Orbital thermal infrared (TIR) remote sensing is an important tool for characterizing geologic surfaces on Earth and Mars. However, deposition of material from volcanic or eolian activity results in bedrock surfaces becoming significantly mantled over time, hindering the accuracy of TIR compositional analysis. Moreover, the interplay between particle size, albedo, composition and surface roughness adds complexity to these interpretations. Apparent Thermal Inertia (ATI) is a measure of the resistance to temperature change and has been used to determine parameters such as grain/block size, density/mantling, and the presence of subsurface soil moisture/ice. Our objective is to document the quantitative relationship between ATI derived from orbital visible/near infrared (VNIR) and thermal infrared (TIR) data and tephra fall mantling of the Mono Craters and Domes (MCD) in California, which were chosen as an analog for partially mantled flows observed at Arsia Mons volcano on Mars. The ATI data were created from two images collected ~12 h apart by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument. The results were validated with a quantitative framework developed using fieldwork that was conducted at 13 pre-chosen sites. These sites ranged in grain size from ash-sized particles to meter-scale blocks and were all rhyolitic in composition. Block size and mantling were directly correlated with ATI. Areas with ATI under 2.3 × 10−2 were well-mantled, with average grain size below 4 cm, whereas values greater than 3.0 × 10−2 corresponded to mantle-free surfaces. Correlation was less accurate where checkerboard-style mixing between mantled and non-mantled surfaces occurred below the pixel scale, as well as in locations where strong shadowing occurred. However, the results validate that the approach is viable for a large majority of mantled surfaces on Earth and Mars. This is relevant for determining the volcanic history of Mars, for …
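
    The quantities involved can be sketched directly. ATI is commonly computed as (1 − albedo) divided by the diurnal temperature difference; the classification thresholds below are the ones reported in the abstract, while the input values are hypothetical:

```python
def apparent_thermal_inertia(albedo, t_day_k, t_night_k):
    """ATI = (1 - albedo) / diurnal surface temperature difference."""
    return (1.0 - albedo) / (t_day_k - t_night_k)

def classify_mantling(ati):
    """Thresholds as reported for the Mono Craters and Domes study area."""
    if ati < 2.3e-2:
        return "well-mantled (average grain size < 4 cm)"
    if ati > 3.0e-2:
        return "mantle-free"
    return "mixed/transitional"

# Hypothetical ASTER-like inputs: albedo 0.15, day/night surface temperatures in K.
ati = apparent_thermal_inertia(0.15, 305.0, 270.0)
print(round(ati, 4), classify_mantling(ati))  # 0.0243 mixed/transitional
```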

  3. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    Science.gov (United States)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
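
    The visualization idea, using the confidence factor as a per-pixel blending weight in false-color imagery, can be sketched as follows (a schematic, not the operational DEBRA code; the yellow dust tint is an arbitrary choice):

```python
import numpy as np

def dust_enhanced_rgb(rgb, confidence, dust_color=(1.0, 1.0, 0.0)):
    """Blend a dust tint into background imagery, weighted per pixel by a
    detection confidence in [0, 1] (a DEBRA-style visualization sketch)."""
    c = np.clip(confidence, 0.0, 1.0)[..., None]  # add a channel axis
    return (1.0 - c) * np.asarray(rgb) + c * np.asarray(dust_color)

# 2x2 scene: gray background, confident dust detection only in one pixel.
rgb = np.full((2, 2, 3), 0.4)
conf = np.array([[0.0, 0.0], [0.0, 1.0]])
out = dust_enhanced_rgb(rgb, conf)
print(out[1, 1])  # fully tinted pixel: [1. 1. 0.]
print(out[0, 0])  # untouched background: [0.4 0.4 0.4]
```

    Because low-confidence pixels pass the background through unchanged, the dust signal appears in the context of the local meteorology and topography, as described above.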

  4. Wi-Fi and Satellite-Based Location Techniques for Intelligent Agricultural Machinery Controlled by a Human Operator

    Directory of Open Access Journals (Sweden)

    Domagoj Drenjanac

    2014-10-01

    Full Text Available In the new agricultural scenarios, the interaction between autonomous tractors and a human operator is important when they jointly perform a task. Obtaining and exchanging accurate localization information between autonomous tractors and the human operator, working as a team, is critical to maintaining safety, synchronization and efficiency during the execution of a mission. An advanced localization system for both entities involved in the joint work, i.e., the autonomous tractors and the human operator, provides a basis for meeting the task requirements. In this paper, different localization techniques for a human operator and an autonomous tractor in a field environment were tested. First, we compared the localization performances of two global navigation satellite system (GNSS) receivers carried by the human operator: (1) an internal GNSS receiver built into a handheld device; and (2) an external DGNSS receiver with centimeter-level accuracy. To investigate autonomous tractor localization, a real-time kinematic (RTK)-based localization system installed on an autonomous tractor developed for agricultural applications was evaluated. Finally, a hybrid localization approach, which combines distance estimates obtained using a wireless scheme with the position of an autonomous tractor obtained using an RTK-GNSS system, is proposed. The hybrid solution is intended for user localization in unstructured environments in which the GNSS signal is obstructed. The hybrid localization approach has two components: (1) a localization algorithm based on the received signal strength indication (RSSI) from the wireless environment; and (2) the acquisition of the tractor RTK coordinates when the human operator is near the tractor. In five RSSI tests, the best result achieved was an average localization error of 4 m. In tests of real-time position correction between rows, an RMS error of 2.4 cm demonstrated that the passes were straight, as was desired for the …
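
    An RSSI-based range estimate of the kind used in the hybrid approach is commonly obtained from a log-distance path-loss model. The sketch below assumes hypothetical calibration constants (reference RSSI at 1 m and a path-loss exponent), which in practice are environment-specific and not given in the abstract:

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exp=2.7):
    """Invert the log-distance path-loss model to a range estimate in meters.
    Both defaults are assumed calibration constants, not values from the paper."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exp))

print(round(distance_from_rssi(-40.0), 2))  # 1.0 m at the reference distance
print(round(distance_from_rssi(-67.0), 2))  # 10.0 m for a 27 dB drop
```

    The meter-level errors reported for RSSI (vs. centimeter-level RTK) reflect how sensitive this inversion is to fading and to the assumed exponent.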

  5. Hydrological modeling of the Peruvian–Ecuadorian Amazon Basin using GPM-IMERG satellite-based precipitation dataset

    Directory of Open Access Journals (Sweden)

    R. Zubieta

    2017-07-01

    Full Text Available In the last two decades, rainfall estimates provided by the Tropical Rainfall Measurement Mission (TRMM) have proven applicable in hydrological studies. The Global Precipitation Measurement (GPM) mission, which provides the new generation of rainfall estimates, is now considered a global successor to TRMM. The usefulness of GPM data in hydrological applications, however, has not yet been evaluated over the Andean and Amazonian regions. This study uses GPM data provided by the Integrated Multi-satellite Retrievals (IMERG) (product/final run) as input to a distributed hydrological model for the Amazon Basin of Peru and Ecuador for a 16-month period (from March 2014 to June 2015) when all datasets are available. TRMM products (TMPA V7 and TMPA RT datasets) and a gridded precipitation dataset processed from observed rainfall are used for comparison. The results indicate that precipitation data derived from GPM-IMERG correspond more closely to TMPA V7 than TMPA RT datasets, but both GPM-IMERG and TMPA V7 precipitation data tend to overestimate, compared to observed rainfall (by 11.1 and 15.7 %, respectively). In general, GPM-IMERG, TMPA V7 and TMPA RT correlate with observed rainfall, with a similar number of rain events correctly detected (∼ 20 %). Statistical analysis of modeled streamflows indicates that GPM-IMERG is as useful as TMPA V7 or TMPA RT datasets in southern regions (Ucayali Basin). GPM-IMERG, TMPA V7 and TMPA RT do not properly simulate streamflows in northern regions (Marañón and Napo basins), probably because of the lack of adequate rainfall estimates in northern Peru and the Ecuadorian Amazon.
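
    The two evaluation measures quoted above, overall volume bias and the fraction of rain events correctly detected, can be sketched on hypothetical daily series (the numbers are illustrative, not the study's data):

```python
import numpy as np

def relative_bias_percent(est, obs):
    """Overall volume bias of a precipitation product vs. a gauge reference."""
    return 100.0 * (est.sum() - obs.sum()) / obs.sum()

def prob_of_detection(est, obs, thresh=1.0):
    """Fraction of observed rain events (obs >= thresh mm) that the product
    also reports as rain."""
    events = obs >= thresh
    return np.sum(events & (est >= thresh)) / np.sum(events)

# Hypothetical daily series (mm); the product slightly overestimates totals.
obs = np.array([0.0, 5.0, 0.2, 12.0, 0.0, 3.0, 0.0, 8.0])
sat = np.array([0.5, 6.5, 0.0, 13.0, 1.2, 0.8, 0.0, 9.5])
print(round(relative_bias_percent(sat, obs), 1))  # 11.7 (% overestimation)
print(prob_of_detection(sat, obs))                # 0.75
```

    The event threshold (1 mm here) is an assumption; the abstract does not state which threshold defines a detected rain event.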

  6. Hydrological modeling of the Peruvian-Ecuadorian Amazon Basin using GPM-IMERG satellite-based precipitation dataset

    Science.gov (United States)

    Zubieta, Ricardo; Getirana, Augusto; Carlo Espinoza, Jhan; Lavado-Casimiro, Waldo; Aragon, Luis

    2017-07-01

    In the last two decades, rainfall estimates provided by the Tropical Rainfall Measurement Mission (TRMM) have proven applicable in hydrological studies. The Global Precipitation Measurement (GPM) mission, which provides the new generation of rainfall estimates, is now considered a global successor to TRMM. The usefulness of GPM data in hydrological applications, however, has not yet been evaluated over the Andean and Amazonian regions. This study uses GPM data provided by the Integrated Multi-satellite Retrievals (IMERG) (product/final run) as input to a distributed hydrological model for the Amazon Basin of Peru and Ecuador for a 16-month period (from March 2014 to June 2015) when all datasets are available. TRMM products (TMPA V7 and TMPA RT datasets) and a gridded precipitation dataset processed from observed rainfall are used for comparison. The results indicate that precipitation data derived from GPM-IMERG correspond more closely to TMPA V7 than TMPA RT datasets, but both GPM-IMERG and TMPA V7 precipitation data tend to overestimate, compared to observed rainfall (by 11.1 and 15.7 %, respectively). In general, GPM-IMERG, TMPA V7 and TMPA RT correlate with observed rainfall, with a similar number of rain events correctly detected ( ˜ 20 %). Statistical analysis of modeled streamflows indicates that GPM-IMERG is as useful as TMPA V7 or TMPA RT datasets in southern regions (Ucayali Basin). GPM-IMERG, TMPA V7 and TMPA RT do not properly simulate streamflows in northern regions (Marañón and Napo basins), probably because of the lack of adequate rainfall estimates in northern Peru and the Ecuadorian Amazon.

  7. SAT-MAP-CLIMATE project results [SATellite-based bio-geophysical parameter MAPping and aggregation modelling for CLIMATE models]

    Energy Technology Data Exchange (ETDEWEB)

    Bay Hasager, C.; Woetmann Nielsen, N.; Soegaard, H.; Boegh, E.; Hesselbjerg Christensen, J.; Jensen, N.O.; Schultz Rasmussen, M.; Astrup, P.; Dellwik, E.

    2002-08-01

    Earth Observation (EO) data from imaging satellites are analysed with respect to albedo, land and sea surface temperatures, land cover types and vegetation parameters such as the Normalized Difference Vegetation Index (NDVI) and the leaf area index (LAI). The observed parameters are used in the DMI-HIRLAM-D05 weather prediction model in order to improve the forecasting. The effect of introducing actual sea surface temperatures from NOAA AVHRR, compared to climatological mean values, shows a more pronounced land-sea breeze effect, which is also observable in field observations. The albedo maps from NOAA AVHRR are rather similar to the climatological mean values, so for the HIRLAM model this is insignificant, yet most likely of some importance in the HIRHAM regional climate model. Land cover type maps are assigned local roughness values determined from meteorological field observations. Only maps with a spatial resolution around 25 m can adequately map the roughness variations of the typical patch size distribution in Denmark. A roughness map covering Denmark is aggregated (i.e., area-averaged non-linearly) by a microscale aggregation model that takes the non-linear turbulent response of each roughness step change between patches in an arbitrary pattern into account. The effective roughnesses are calculated on a 15 km by 15 km grid for the HIRLAM model. The effect of hedgerows is included as an added roughness effect as a function of hedge density mapped from a digital vector map. Introducing the new effective roughness maps into the HIRLAM model appears to remedy the seasonal wind speed bias over land and sea in spring. A new parameterisation of the effective roughness for scalar surface fluxes is developed and tested on synthetic data. Further, a method for estimating evapotranspiration from albedo, surface temperatures and NDVI is successfully compared to field observations. The HIRLAM predictions of water vapour at 12 GMT are used for atmospheric correction of …

  8. Quantitative Computertomographie

    Directory of Open Access Journals (Sweden)

    Engelke K

    2002-01-01

    Full Text Available Quantitative computed tomography (QCT), alongside dual X-ray absorptiometry (DXA), is a standard method in osteodensitometry. The most important measurement sites, for which commercial solutions also exist, are the lumbar spine and the distal forearm; measurements of the tibial or femoral shaft are of minor importance. Lumbar spine examinations are performed on clinical whole-body tomographs, for which dedicated acquisition and evaluation protocols exist. For QCT measurements at peripheral sites (pQCT), in particular the distal forearm, compact CT scanners have been developed that are now offered as tabletop devices. The decisive advantages of QCT compared with DXA are the exact three-dimensional localization of the measurement volume, the isolated assessment of this volume without superposition of the surrounding tissue, and the separation of trabecular and cortical bone. QCT determines the concentration of bone mineral within a defined region of interest (ROI). This concentration is typically called bone mineral density (BMD) and is given in g/cm3. By contrast, the projective DXA method determines only an areal concentration in g/cm2, which, by analogy with QCT, is called areal density. The difference between density (QCT) and areal density (DXA) is, however, mostly neglected in the literature.
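
    The distinction between volumetric and areal density can be made concrete with a one-line model: a projective measurement integrates density along the beam path, so its g/cm2 reading grows with bone size even when the true volumetric BMD is identical (a schematic illustration, not a DXA algorithm):

```python
def dxa_areal_density(volumetric_bmd_g_cm3, bone_depth_cm):
    """A projective measurement integrates density along the beam, so the
    g/cm^2 reading scales with bone size; volumetric QCT BMD does not."""
    return volumetric_bmd_g_cm3 * bone_depth_cm

# Two bones with identical true volumetric BMD (0.25 g/cm^3), different size:
small = dxa_areal_density(0.25, 3.0)
large = dxa_areal_density(0.25, 4.0)
print(small, large)  # 0.75 1.0 -> DXA readings differ although QCT BMD is equal
```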

  9. Quantitative Analysen

    Science.gov (United States)

    Hübner, Philipp

    The holy grail of all analytics is to be able to determine the true value. This requires quantitative measurement methods, which have now been available in molecular analytics for some time. The general problem with quantification is that we usually neither know nor can determine the true value! For this reason, we make do with approximations of the true value, either by calculating the median or the (robust) mean from interlaboratory comparison studies, or by calculating an expected value based on how the sample material was prepared. In these attempts to approximate the true value, the analytics is deliberately standardized, either according to the democratic principle that the majority decides, or by providing suitable certified reference material. We must therefore be aware that although this procedure guarantees that the majority of analytical laboratories measure alike, we do not know whether they all measure equally well or, for that matter, equally poorly.

  10. Teaching in Correctional Settings

    Science.gov (United States)

    de Koning, Mireille; Striedinger, Angelika

    2009-01-01

    In early 2009, Education International conducted a study amongst its member organisations on education in correctional settings in their respective countries. Findings reveal that education in correctional settings and the conditions of teachers working in them vary greatly between regions. Generally speaking, in most regions, but specifically in…

  11. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  12. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  13. Corrective Jaw Surgery

    Medline Plus

Full Text Available ... can also invite bacteria that lead to gum disease. Click here to find out more. Who We ... Corrective Jaw ...

  14. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... surgery, orthognathic surgery is performed to correct functional problems. Jaw Surgery can have a dramatic effect on ... without straining Chronic mouth breathing Sleep apnea (breathing problems when sleeping, including snoring) Your dentist, orthodontist and ...

  15. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... anesthesia, all forms of sedation and general anesthesia. Click here to find out more. Cleft Lip/Palate ... depending on the extent of the repair needed. Click here to find out more. Corrective Jaw Surgery ...

  16. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... for corrective jaw surgery: Difficulty chewing, or biting food Difficulty swallowing Chronic jaw or jaw joint (TMJ) ... a long-term commitment for you and your family, and will try to realistically estimate the time ...

  17. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... to correct a wide range of minor and major skeletal and dental irregularities, including the misalignment of ... including snoring) Your dentist, orthodontist and OMS will work together to determine whether you are a candidate ...

  18. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... find out more. Wisdom Teeth Management Wisdom Teeth Management An impacted ... and maxillofacial surgeon (OMS) to correct a wide range of minor and major skeletal and dental irregularities, ...

  19. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... Corrective Jaw Surgery Dental and Soft Tissue Surgery Dental Implant Surgery Facial Cosmetic Surgery Head, Neck and Oral Pathology Obstructive Sleep Apnea TMJ and Facial Pain Treatment of Facial Injury Wisdom Teeth Management Procedures ...

  1. Quantitative ADF STEM: acquisition, analysis and interpretation

    Science.gov (United States)

    Jones, L.

    2016-01-01

Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, and its interpretation and limitations.

  2. ICT: isotope correction toolbox.

    Science.gov (United States)

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

Isotope tracer experiments are an invaluable technique for analyzing and studying the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present isotope correction toolbox (ICT), a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. ICT is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from https://github.com/jungreuthmayer/isotope_correction_toolbox/ (contact: christian.jungreuthmayer@boku.ac.at, juergen.zanghellini@boku.ac.at). Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
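The correction described in this abstract can be illustrated for the simple non-tandem (MS1) case: natural-abundance interference makes the measured mass isotopomer distribution a lower-triangular linear transform of the tracer-only distribution, which can be inverted by substitution. This is a minimal sketch of that idea, not ICT's Perl implementation; the fragment size, abundance value and measured intensities below are illustrative assumptions.

```python
import math

P13C = 0.0107  # natural abundance of carbon-13 (assumed value)

def binom(n, k, p):
    """Binomial probability of k successes in n trials."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def correction_matrix(n_carbons):
    """C[i][j]: probability that a fragment with j tracer-labeled carbons
    is measured at mass shift i because the remaining n-j carbons carry
    natural 13C. Lower-triangular, so the system is solvable by substitution."""
    n = n_carbons
    return [[binom(n - j, i - j, P13C) if i >= j else 0.0
             for j in range(n + 1)] for i in range(n + 1)]

def correct(measured):
    """Solve C x = measured by forward substitution to recover the
    tracer-only mass isotopomer distribution."""
    n = len(measured) - 1
    C = correction_matrix(n)
    x = []
    for i in range(n + 1):
        x.append((measured[i] - sum(C[i][j] * x[j] for j in range(i))) / C[i][i])
    return x

# a hypothetical measured M+0..M+2 distribution for a 2-carbon fragment
corrected = correct([0.60, 0.25, 0.15])
```

Because each column of the correction matrix sums to one, the correction redistributes but conserves total signal.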

  3. Evaluation of Satellite-Based Precipitation Products from IMERG V04A and V03D, CMORPH and TMPA with Gauged Rainfall in Three Climatologic Zones in China

    Directory of Open Access Journals (Sweden)

    Guanghua Wei

    2017-12-01

Full Text Available A critical evaluation of a newly released precipitation data set is very important for both end users and data developers, and may provide a benchmark for the product's continued development and future improvement. To these ends, four precipitation estimates, IMERG (the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) V04A, IMERG V03D, CMORPH (the Climate Prediction Center Morphing technique-CRT) and TRMM (the Tropical Rainfall Measuring Mission) 3B42, are systematically evaluated against gauge precipitation estimates at multiple spatiotemporal scales from 1 June 2014 to 30 November 2015 over three topographically and climatically different watersheds in China. Statistical methods are used to quantify the performance of the four satellite-based precipitation estimates. The results show that: (1) over the Tibetan Plateau cold region, IMERG V04A underestimates precipitation with the largest RB (−46.98%) of all products during the study period, with similar results at the seasonal scale, whereas IMERG V03D demonstrates the best performance according to RB (7.46%), RMSE (0.44 mm/day) and RRMSE (28.37%); except in summer, TRMM 3B42 performs better than CMORPH according to RMSEs, RRMSEs and Rs; (2) within the semi-humid Huaihe River Basin, IMERG V04A has a slight advantage over the other three satellite-based precipitation products with the lowest RMSE (0.32 mm/day) during the evaluation period, followed in order by IMERG V03D, TRMM 3B42 and CMORPH; (3) over the arid/semi-arid Weihe River Basin, TRMM 3B42 demonstrates the best performance of the four products, with the lowest RMSE (0.1 mm/day) and RRMSE (8.44%) and the highest R (0.92) during the study period, while IMERG V03D performs better than IMERG V04A according to all the statistical indicators; (4) in winter, IMERG V04A and IMERG V03D tend to underestimate the total precipitation
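The statistics quoted in this abstract (RB, RMSE, RRMSE and R) can be computed as below. The definitions are the conventional ones and are assumed here rather than taken from the paper itself, and the daily series are invented for illustration.

```python
import math

def evaluation_metrics(sat, gauge):
    """Conventional verification statistics for satellite vs. gauge
    precipitation (definitions assumed, not taken from the paper)."""
    n = len(sat)
    # relative bias in percent: total over- or underestimation
    rb = 100.0 * (sum(sat) - sum(gauge)) / sum(gauge)
    # root-mean-square error (same units as the data, e.g. mm/day)
    rmse = math.sqrt(sum((s - g) ** 2 for s, g in zip(sat, gauge)) / n)
    # RMSE relative to the gauge mean, in percent
    rrmse = 100.0 * rmse / (sum(gauge) / n)
    # Pearson linear correlation coefficient
    ms, mg = sum(sat) / n, sum(gauge) / n
    cov = sum((s - ms) * (g - mg) for s, g in zip(sat, gauge))
    r = cov / math.sqrt(sum((s - ms) ** 2 for s in sat)
                        * sum((g - mg) ** 2 for g in gauge))
    return rb, rmse, rrmse, r

# toy daily series (mm/day): satellite estimate vs. rain gauge
gauge = [2.0, 0.0, 5.5, 1.2, 0.3]
sat   = [1.5, 0.1, 4.8, 1.0, 0.2]
rb, rmse, rrmse, r = evaluation_metrics(sat, gauge)
```

A negative RB indicates underestimation by the satellite product, as reported for IMERG V04A over the Tibetan Plateau.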

  4. A quantitative reconstruction software suite for SPECT imaging

    Science.gov (United States)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
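The OSEM algorithm mentioned above reduces to the classical MLEM update when a single subset is used. Below is a toy sketch of that multiplicative update, with an invented 3-bin system matrix standing in for the real attenuation-, scatter- and collimator-corrected projector; it is not the authors' software suite.

```python
# Toy system matrix A[i][j]: probability that activity in voxel j
# is detected in projection bin i (values are illustrative only).
A = [[0.8, 0.2, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.2, 0.8]]
measured = [10.0, 20.0, 30.0]   # projection data (counts)

def mlem(A, y, iterations=200):
    """MLEM reconstruction (OSEM with one subset): iteratively scale the
    estimate by the back-projected ratio of measured to estimated data."""
    nb, nv = len(A), len(A[0])
    x = [1.0] * nv                                   # uniform initial estimate
    sens = [sum(A[i][j] for i in range(nb)) for j in range(nv)]  # sensitivity
    for _ in range(iterations):
        # forward-project the current estimate
        fp = [sum(A[i][j] * x[j] for j in range(nv)) for i in range(nb)]
        # back-project the measured/estimated ratio and apply the update
        x = [x[j] / sens[j] * sum(A[i][j] * y[i] / fp[i] for i in range(nb))
             for j in range(nv)]
    return x

reconstruction = mlem(A, measured)
```

OSEM accelerates this by cycling the same update over subsets of the projection bins; physics corrections enter through the system matrix.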

  5. Assessing the potential of satellite-based precipitation estimates for flood frequency analysis in ungauged or poorly gauged tributaries of China's Yangtze River basin

    Science.gov (United States)

    Gao, Zhen; Long, Di; Tang, Guoqiang; Zeng, Chao; Huang, Jiesheng; Hong, Yang

    2017-07-01

Flood frequency analysis (FFA) is critical for water resources engineering projects, particularly the design of hydraulic structures such as dams and reservoirs. However, it is often difficult to implement FFA in ungauged or poorly gauged basins because of the lack of consistent and long-term records of streamflow observations. The objective of this study was to evaluate the utility of satellite-based precipitation estimates for performing FFA in two presumably ungauged tributaries, the Jialing and Tuojiang Rivers, of the upper Yangtze River. Annual peak flow series were simulated using the Coupled Routing and Excess STorage (CREST) hydrologic model. Flood frequency was estimated by fitting the Pearson type III distribution of both observed and modeled streamflow with historic floods. Comparison of satellite-based precipitation products with a ground-based daily precipitation dataset for the period 2002-2014 reveals that 3B42V7 outperformed 3B42RT. The 3B42V7 product also shows consistent reliability in streamflow simulation and FFA (e.g., relative errors -20%-5% in the Jialing River). The results also indicate that complex terrain, drainage area, and reservoir construction are important factors that impact hydrologic model performance. The larger basin (156,736 km2) is more likely to produce satisfactory results than the small basin (19,613 km2) under similar circumstances (e.g., Jialing/Tuojiang calibrated by 3B42V7 for the calibration period: NSCE = 0.71/0.56). Using the same calibrated parameter sets from the entire Jialing River basin, the 3B42V7/3B42RT-driven hydrologic model performs better for two tributaries of the Jialing River (e.g., for the calibration period, NSCE = 0.71/0.60 in the Qujiang River basin and 0.54/0.38 in the Fujiang River basin) than for the upper mainstem of the Jialing River (NSCE = 0.34/0.32), which has more cascaded reservoirs, with all these tributaries treated as ungauged basins for model validation. Overall, this study underscores
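The Pearson type III fit used for the flood frequency analysis is commonly done by the method of moments with a frequency factor. The sketch below follows that standard hydrological practice (Kite's frequency-factor expansion and an Abramowitz-Stegun normal quantile), not necessarily the authors' exact fitting procedure, and the annual peak-flow series is invented.

```python
import math

def norm_ppf_upper(p):
    """Approximate standard-normal quantile with upper-tail probability p
    (Abramowitz & Stegun 26.2.23; absolute error < 4.5e-4)."""
    t = math.sqrt(math.log(1.0 / (p * p)))
    return t - ((0.010328 * t + 0.802853) * t + 2.515517) / \
               (((0.001308 * t + 0.189269) * t + 1.432788) * t + 1.0)

def pearson3_quantile(peaks, return_period):
    """Method-of-moments Pearson type III flood quantile via the
    frequency factor K_T (Kite's expansion): Q_T = mean + K_T * std."""
    n = len(peaks)
    mean = sum(peaks) / n
    std = math.sqrt(sum((q - mean) ** 2 for q in peaks) / (n - 1))
    skew = (n * sum((q - mean) ** 3 for q in peaks)) / \
           ((n - 1) * (n - 2) * std ** 3)
    z = norm_ppf_upper(1.0 / return_period)       # exceedance probability 1/T
    k = skew / 6.0
    kt = (z + (z * z - 1) * k + (z ** 3 - 6 * z) * k * k / 3
          - (z * z - 1) * k ** 3 + z * k ** 4 + k ** 5 / 3)
    return mean + kt * std

# hypothetical annual peak flows (m3/s)
peaks = [1200, 950, 1800, 1400, 1100, 2100, 1600, 1300, 900, 1700,
         1250, 1500, 1000, 1900, 1350]
q100 = pearson3_quantile(peaks, 100)   # 100-year design flood
```

With zero skew the frequency factor collapses to the normal quantile, so the fit degrades gracefully toward a normal distribution.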

  6. A Satellite-Based Assessment of the Distribution and Biomass of Submerged Aquatic Vegetation in the Optically Shallow Basin of Lake Biwa

    Directory of Open Access Journals (Sweden)

    Shweta Yadav

    2017-09-01

Full Text Available Assessing the abundance of submerged aquatic vegetation (SAV), particularly in shallow lakes, is essential for effective lake management activities. In the present study we applied satellite remote sensing (a Landsat-8 image) in order to evaluate the SAV coverage area and its biomass for the peak growth period, which is mainly in September or October (2013 to 2016), in the eutrophic and shallow south basin of Lake Biwa. We developed and validated a satellite-based water transparency retrieval algorithm based on the linear regression approach (R2 = 0.77) to determine the water clarity (2013–2016), which was later used for SAV classification and biomass estimation. For SAV classification, we used Spectral Mixture Analysis (SMA), a Spectral Angle Mapper (SAM), and a binary decision tree, giving an overall classification accuracy of 86.5% and SAV classification accuracy of 76.5% (SAV kappa coefficient 0.74), based on in situ measurements. For biomass estimation, a new Spectral Decomposition Algorithm was developed. The satellite-derived biomass (R2 = 0.79) for the SAV classified area gives an overall root-mean-square error (RMSE) of 0.26 kg dry weight (DW) m-2. The mapped SAV coverage area was 20% and 40% in 2013 and 2016, respectively. Estimated SAV biomass for the mapped area shows an increase in recent years, with values of 3390 t (tons, dry weight) in 2013 as compared to 4550 t in 2016. The maximum biomass density (4.89 kg DW m-2) was obtained for a year with high water transparency (September 2014). With the change in water clarity, a slow change in SAV growth was noted from 2013 to 2016. The study shows that water clarity is important for SAV detection and biomass estimation using satellite remote sensing in shallow eutrophic lakes. The present study also demonstrates the successful application of the developed satellite-based approach for SAV biomass estimation in a shallow eutrophic lake, which can be tested in other lakes.
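The Spectral Angle Mapper (SAM) step of the classification chain compares each pixel spectrum to reference spectra by the angle between them, which makes the match insensitive to overall brightness. A minimal sketch, with invented four-band reflectance spectra and an illustrative angle threshold (neither is from the paper):

```python
import math

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum
    and a reference spectrum; smaller angle means a better match."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # clamp against floating-point drift before acos
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

def classify(pixel, references, threshold=0.1):
    """Assign the class whose reference spectrum subtends the smallest
    angle, or None if no angle falls below the (illustrative) threshold."""
    best = min(references, key=lambda name: spectral_angle(pixel, references[name]))
    return best if spectral_angle(pixel, references[best]) <= threshold else None

# hypothetical 4-band reflectance spectra (e.g. four Landsat-8 bands)
refs = {"SAV":   [0.02, 0.04, 0.05, 0.03],
        "water": [0.05, 0.04, 0.02, 0.01]}
label = classify([0.021, 0.042, 0.049, 0.031], refs)
```

In the paper's workflow SAM is combined with spectral unmixing and a decision tree; this shows only the angular-matching core.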

  7. THE EFFECT OF CLOUD FRACTION ON THE RADIATIVE ENERGY BUDGET: The Satellite-Based GEWEX-SRB Data vs. the Ground-Based BSRN Measurements

    Science.gov (United States)

    Zhang, T.; Stackhouse, P. W.; Gupta, S. K.; Cox, S. J.; Mikovitz, J. C.; Nasa Gewex Srb

    2011-12-01

The NASA GEWEX-SRB (Global Energy and Water cycle Experiment - Surface Radiation Budget) project produces and archives shortwave and longwave atmospheric radiation data at the top of the atmosphere (TOA) and the Earth's surface. The archive holds uninterrupted records of shortwave/longwave downward/upward radiative fluxes at 1 degree by 1 degree resolution for the entire globe. The latest version in the archive, Release 3.0, is available as 3-hourly, daily and monthly means, spanning 24.5 years from July 1983 to December 2007. Primary inputs to the models used to produce the data include: shortwave and longwave radiances from International Satellite Cloud Climatology Project (ISCCP) pixel-level (DX) data, cloud and surface properties derived therefrom, temperature and moisture profiles from the GEOS-4 reanalysis product obtained from the NASA Global Modeling and Assimilation Office (GMAO), and column ozone amounts constituted from Total Ozone Mapping Spectrometer (TOMS) and TIROS Operational Vertical Sounder (TOVS) archives, and Stratospheric Monitoring-group's Ozone Blended Analysis (SMOBA), an assimilation product from NOAA's Climate Prediction Center. The data in the archive have been validated systematically against ground-based measurements, which include the Baseline Surface Radiation Network (BSRN) data, the World Radiation Data Centre (WRDC) data, and the Global Energy Balance Archive (GEBA) data, and generally good agreement has been achieved. In addition to all-sky radiative fluxes, the output data include clear-sky fluxes, cloud optical depth, cloud fraction and so on. The BSRN archive also includes observations that can be used to derive the cloud fraction, which provides a means for analyzing and explaining the SRB-BSRN flux differences. In this paper, we focus on the effect of cloud fraction on the surface shortwave flux and the level of agreement between the satellite-based SRB data and the ground-based BSRN data. The satellite and BSRN employ different

  8. Influence of refractive correction on ocular dominance

    Science.gov (United States)

    Nakayama, Nanami; Kawamorita, Takushi; Uozato, Hiroshi

    2010-07-01

We investigated the effects of refractive correction and refractive defocus on the assessment of sensory ocular dominance. In 25 healthy subjects (4 males and 21 females) aged between 20 and 31 years, a quantitative measurement of sensory ocular dominance was performed with refractive correction and the addition of a positive lens on the dominant eye. Sensory ocular dominance was measured with a chart using binocular rivalry targets. The reversal point changed after the addition of a +1.00 D lens on the dominant eye in all subjects. However, sighting ocular dominance and stereopsis did not change after the addition of a positive lens on the dominant eye (P > 0.05, Wilcoxon test). These results suggest that refractive correction affects sensory ocular dominance, indicating the possible development of a new type of occlusion for amblyopia in the future.

  9. Toward a Satellite-Based System of Sugarcane Yield Estimation and Forecasting in Smallholder Farming Conditions: A Case Study on Reunion Island

    Directory of Open Access Journals (Sweden)

    Julien Morel

    2014-07-01

Full Text Available Estimating sugarcane biomass is difficult to achieve when working with highly variable spatial distributions of growing conditions, like on Reunion Island. We used a dataset of in-farm fields with contrasted climatic conditions and farming practices to compare three methods of yield estimation based on remote sensing: (1) an empirical relationship method with a growing season-integrated Normalized Difference Vegetation Index (NDVI), (2) the Kumar-Monteith efficiency model, and (3) a forced-coupling method with a sugarcane crop model (MOSICAS) and satellite-derived fraction of absorbed photosynthetically active radiation. These models were compared with the crop model alone and discussed to provide recommendations for a satellite-based system for the estimation of yield at the field scale. Results showed that the linear empirical model produced the best results (RMSE = 10.4 t∙ha−1). Because this method is also the simplest to set up and requires less input data, it appears that it is the most suitable for performing operational estimations and forecasts of sugarcane yield at the field scale. The main limitation is the acquisition of a minimum of five satellite images. The upcoming open-access Sentinel-2 Earth observation system should overcome this limitation because it will provide 10-m resolution satellite images with a 5-day frequency.
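The empirical-relationship method that performed best here is essentially an ordinary least-squares fit of yield against the season-integrated NDVI. A sketch with invented field data; the sample values and coefficients carry no agronomic meaning and are not the paper's calibration:

```python
def fit_linear(x, y):
    """Ordinary least squares for yield = a * (seasonal NDVI integral) + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx   # slope, intercept

# hypothetical calibration fields: season-integrated NDVI vs. cane yield (t/ha)
ndvi_int = [95.0, 110.0, 120.0, 135.0, 150.0]
yield_t  = [60.0, 72.0, 78.0, 88.0, 98.0]
a, b = fit_linear(ndvi_int, yield_t)

# forecast for a new field from its NDVI integral
predicted = a * 125.0 + b
```

Once calibrated, the only per-field input is the NDVI time series, which is why the method needs a minimum number of cloud-free satellite images per season.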

  10. Role of physical forcings and nutrient availability on the control of satellite-based chlorophyll a concentration in the coastal upwelling area of the Sicilian Channel

    Directory of Open Access Journals (Sweden)

    Bernardo Patti

    2010-08-01

    Full Text Available The northern sector of the Sicilian Channel is an area of favourable upwelling winds, which ought to support primary production. However, the values for primary production are low when compared with other Mediterranean areas and very low compared with the most biologically productive regions of the world’s oceans: California, the Canary Islands, Humboldt and Benguela. The aim of this study was to identify the main factors that limit phytoplankton biomass in the Sicilian Channel and modulate its monthly changes. We compared satellite-based estimates of chlorophyll a concentration in the Strait of Sicily with those observed in the four Eastern Boundary Upwelling Systems mentioned above and in other Mediterranean wind-induced coastal upwelling systems (the Alboran Sea, the Gulf of Lions and the Aegean Sea. Our results show that this low level of chlorophyll is mainly due to the low nutrient level in surface and sub-surface waters, independently of wind-induced upwelling intensity. Further, monthly changes in chlorophyll are mainly driven by the mixing of water column and wind-induced and/or circulation-related upwelling processes. Finally, primary production limitation due to the enhanced stratification processes resulting from the general warming trend of Mediterranean waters is not active over most of the coastal upwelling area off the southern Sicilian coast.

  11. A Machine Learning and Cross-Validation Approach for the Discrimination of Vegetation Physiognomic Types Using Satellite Based Multispectral and Multitemporal Data.

    Science.gov (United States)

    Sharma, Ram C; Hara, Keitarou; Hirayama, Hidetake

    2017-01-01

This paper presents the performance and evaluation of a number of machine learning classifiers for discriminating between vegetation physiognomic classes using satellite-based time series of surface reflectance data. Discrimination of six vegetation physiognomic classes, Evergreen Coniferous Forest, Evergreen Broadleaf Forest, Deciduous Coniferous Forest, Deciduous Broadleaf Forest, Shrubs, and Herbs, was dealt with in the research. Rich feature data were prepared from time series of the satellite data for the discrimination and cross-validation of the vegetation physiognomic types using a machine learning approach. A set of machine learning experiments, comprising a number of supervised classifiers with different model parameters, was conducted to assess how the discrimination of vegetation physiognomic classes varies with classifiers, input features, and ground truth data size. The performance of each experiment was evaluated using the 10-fold cross-validation method. The experiment using the Random Forests classifier provided the highest overall accuracy (0.81) and kappa coefficient (0.78); however, accuracy metrics did not vary much across experiments. Accuracy metrics were found to be very sensitive to input features and the size of the ground truth data. The results obtained in the research are expected to be useful for improving vegetation physiognomic mapping in Japan.
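The 10-fold cross-validation protocol used for the evaluation can be sketched independently of the classifier. Below, a toy nearest-centroid classifier stands in for Random Forests, and the two-class synthetic data are invented; only the fold-splitting and accuracy-averaging logic reflects the protocol described in the abstract.

```python
import random

def k_fold_indices(n, k=10, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def nearest_centroid_predict(train_X, train_y, x):
    """Toy stand-in classifier: assign the class whose mean feature
    vector is closest (squared Euclidean distance) to x."""
    def centroid(c):
        rows = [xi for xi, yi in zip(train_X, train_y) if yi == c]
        return [sum(col) / len(rows) for col in zip(*rows)]
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(set(train_y), key=lambda c: dist(centroid(c), x))

def cross_validate(X, y, k=10):
    """Mean accuracy over k held-out folds."""
    folds, accs = k_fold_indices(len(X), k), []
    for test_idx in folds:
        train_idx = [i for i in range(len(X)) if i not in set(test_idx)]
        tX, ty = [X[i] for i in train_idx], [y[i] for i in train_idx]
        hits = sum(nearest_centroid_predict(tX, ty, X[i]) == y[i]
                   for i in test_idx)
        accs.append(hits / len(test_idx))
    return sum(accs) / len(accs)

# two well-separated synthetic classes (40 samples, 2 features)
X = [(0.1 * (i % 4), 0.1 * (i // 4)) for i in range(20)] + \
    [(5 + 0.1 * (i % 4), 5 + 0.1 * (i // 4)) for i in range(20)]
y = [0] * 20 + [1] * 20
accuracy = cross_validate(X, y, k=10)
```

Because every sample is held out exactly once, the averaged accuracy is a less optimistic estimate than training-set accuracy, which is why the paper reports all metrics through this protocol.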

  12. HCG blood test - quantitative

    Science.gov (United States)

    ... blood test - quantitative; Beta-HCG blood test - quantitative; Pregnancy test - blood - quantitative ... of a screening test for Down syndrome. This test is also done to diagnose abnormal conditions not related to pregnancy that can raise HCG level.

  13. Model Correction Factor Method

    DEFF Research Database (Denmark)

    Christensen, Claus; Randrup-Thomsen, Søren; Morsing Johannesen, Johannes

    1997-01-01

The model correction factor method is proposed as an alternative to traditional polynomial-based response surface techniques in structural reliability, considering a computationally time-consuming limit state procedure as a 'black box'. The class of polynomial functions is replaced by a limit stat

  14. Correction to ATel 10681

    Science.gov (United States)

    Wang, Xiaofeng

    2017-08-01

We report a correction to the spectroscopic classification of two optical transients announced in ATel #10681. In the main text of the telegram, SN 2017giq and MASTER OT J033744.97+723159.0 should be classified as type Ic and type IIb supernovae, respectively; the two were reversed in the original report. We apologize for any confusion caused by this typographical error.

  15. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... can also invite bacteria that lead to gum disease. Click here to find out more. Who We Are Find a Surgeon News Videos Contact Administration of Anesthesia Cleft Lip/Palate and Craniofacial Surgery Corrective Jaw Surgery Dental and Soft Tissue Surgery Dental Implant Surgery Facial ...

  16. Error Correcting Codes

    Indian Academy of Sciences (India)

Error Correcting Codes: How Numbers Protect Themselves. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 1, Issue 10. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March 1997, pp 33-47. Fulltext: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047

  18. Text Induced Spelling Correction

    NARCIS (Netherlands)

    Reynaert, M.W.C.

    2004-01-01

    We present TISC, a language-independent and context-sensitive spelling checking and correction system designed to facilitate the automatic removal of non-word spelling errors in large corpora. Its lexicon is derived from a very large corpus of raw text, without supervision, and contains word

  19. Correctness of concurrent processes

    NARCIS (Netherlands)

    E.R. Olderog (Ernst-Rüdiger)

    1989-01-01

A new notion of correctness for concurrent processes is introduced and investigated. It is a relationship P sat S between process terms P built up from operators of CCS [Mi 80], CSP [Ho 85] and COSY [LTS 79] and logical formulas S specifying sets of finite communication sequences as in

  20. 10. Correctness of Programs

    Indian Academy of Sciences (India)

Algorithms - Correctness of Programs. R K Shyamasundar. Series Article, Resonance – Journal of Science Education, Volume 3, Issue 4. Computer Science Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India.

  1. Refraction corrections for surveying

    Science.gov (United States)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.
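The report's own equations are not reproduced in this summary. As an illustration of the kind of elevation correction involved, Bennett's standard-conditions refraction formula (a widely used approximation, not the report's derivation) is easily programmed, much as the report suggests for a pocket calculator:

```python
import math

def refraction_correction_arcmin(apparent_elev_deg):
    """Atmospheric refraction at the apparent elevation angle, in arcminutes
    (Bennett's 1982 formula for standard temperature and pressure; shown
    here only to illustrate the correction the report derives exactly)."""
    h = apparent_elev_deg
    return 1.0 / math.tan(math.radians(h + 7.31 / (h + 4.4)))

def true_elevation_deg(apparent_elev_deg):
    """Refraction lifts the apparent position, so subtract the
    refraction angle to recover the true elevation."""
    return apparent_elev_deg - refraction_correction_arcmin(apparent_elev_deg) / 60.0

elev_true = true_elevation_deg(10.0)   # correct a 10-degree apparent elevation
```

Near the horizon the correction exceeds half a degree, while above 45 degrees it shrinks to about an arcminute, which is why low-elevation survey sightings need the correction most.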

  2. Error Correcting Codes

    Indian Academy of Sciences (India)

Error Correcting Codes: The Hamming Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 1, January. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  3. Issues in Correctional Training and Casework. Correctional Monograph.

    Science.gov (United States)

    Wolford, Bruce I., Ed.; Lawrenz, Pam, Ed.

    The eight papers contained in this monograph were drawn from two national meetings on correctional training and casework. Titles and authors are: "The Challenge of Professionalism in Correctional Training" (Michael J. Gilbert); "A New Perspective in Correctional Training" (Jack Lewis); "Reasonable Expectations in Correctional Officer Training:…

  4. Quantitative Electron Nanodiffraction.

    Energy Technology Data Exchange (ETDEWEB)

    Spence, John [Arizona State Univ., Mesa, AZ (United States)

    2015-01-30

This final report summarizes progress under this award for the final reporting period 2002-2013 in our development of quantitative electron nanodiffraction for materials problems, especially devoted to atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid-state lighting, and to understand the effects of stacking sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables for Crystallography (10), the observation of energy-loss spectra with milli-electronvolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the centenary of X-ray diffraction (17), and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  5. The use of land- and satellite-based precipitation radar to forecast debris flows and high water discharge: case study from June 2nd, 2016 in southern Norway.

    Science.gov (United States)

    Devoli, Graziella; Mengistu, Zelalem T.; Elo, Christoffer A.; Boje, Søren; Rønning, Snorre S.; Engeland, Kolbjørn; Lussana, Cristian

    2017-04-01

The Norwegian flood and landslide forecasting service at the Norwegian Water Resources and Energy Directorate (NVE) (www.varsom.no) has issued flood forecasts since 1989, and since 2013 the occurrence of many landslide events at the regional level, due either to severe storms or to intense snow melting, has been predicted. High-intensity, short-duration (less than 1 hour) rainfall may cause sudden and abundant runoff that can entrain large quantities of loose sediment and generate debris flows. Intense convective rainstorms often develop quickly, especially during summer, and they are difficult to forecast and even to observe with a standard (synoptic) network of precipitation gauges. In those cases, the forecaster on duty may send warning messages for a very large area (encompassing many counties and municipalities) because of the large spatial uncertainty in the prognoses and rainfall amounts. A standard sentence is always included in the warning message, recommending that the population monitor the evolution of the rainstorm with the weather radar products available on institutional websites. In other cases, especially when the convective rainstorm is spatially confined to a small area and highly uncertain, the forecaster may choose not to issue any warning. The first situation yields false alarms for some areas, while the second could result in a missed event if a landslide actually occurs. The Norwegian Meteorological Institute (MET) and NVE are working on a project to further promote the use of radar-derived products in landslide and flood forecasting. In this study, we focus on a case study to present the potential of the MET-NVE collaboration on this topic. As a case study, we have chosen a short-lived rainstorm that occurred on June 2nd, 2016 in Motland (Rogaland county, southern Norway), which triggered two debris flows that were not forecast. Land- and satellite-based weather radar and lightning data were used to

  6. Assimilation of GOES satellite-based convective initiation and cloud growth observations into the Rapid Refresh and HRRR systems to improve aviation forecast guidance

    Science.gov (United States)

    Mecikalski, John; Smith, Tracy; Weygandt, Stephen

    2014-05-01

Latent heating profiles derived from GOES satellite-based cloud-top cooling rates are being assimilated into a retrospective version of the Rapid Refresh system (RAP) being run at the Global Systems Division. Assimilation of these data may help reduce the time lag for convection initiation (CI) in both the RAP model forecasts and in 3-km High Resolution Rapid Refresh (HRRR) model runs that are initialized from the RAP model grids. These data may also improve both the location and organization of developing convective storm clusters, especially in the nested HRRR runs. These types of improvements are critical for providing better convective storm guidance around busy hub airports and aviation corridor routes, especially in the highly congested Ohio Valley - Northeast - Mid-Atlantic region. Additional work is focusing on assimilating GOES-R CI algorithm cloud-top cooling-based latent heating profiles directly into the HRRR model. Because of the small-scale nature of the convective phenomena depicted in the cloud-top cooling rate data (on the order of 1-4 km scale), direct assimilation of these data in the HRRR may be more effective than assimilation in the RAP. The RAP is an hourly assimilation system developed at NOAA/ESRL and was implemented at NCEP as a NOAA operational model in May 2012. The 3-km HRRR runs hourly out to 15 hours as a nest within the ESRL real-time experimental RAP. The RAP and HRRR both use the WRF ARW model core, and the Gridpoint Statistical Interpolation (GSI) is used within an hourly cycle to assimilate a wide variety of observations (including radar data) to initialize the RAP. Within this modeling framework, the cloud-top cooling rate-based latent heating profiles are applied as prescribed heating during the diabatic forward model integration part of the RAP digital filter initialization (DFI). No digital filtering is applied on the 3-km HRRR grid, but similar forward model integration with prescribed heating is used to assimilate

  7. Spatial and decadal variations in satellite-based terrestrial evapotranspiration and drought over Inner Mongolia Autonomous Region of China during 1982-2009

    Science.gov (United States)

    Zhang, Zhaolu; Kang, Hui; Yao, Yunjun; Fadhil, Ayad M.; Zhang, Yuhu; Jia, Kun

    2018-02-01

Evapotranspiration (ET) plays an important role in the water budget and carbon cycle of the Inner Mongolia Autonomous Region of China (IMARC). However, the spatial and decadal variations in terrestrial ET and drought over the IMARC were previously calculated using only sparse meteorological point-based data and remain quite uncertain. In this study, combining satellite and meteorological datasets, a satellite-based semi-empirical Penman ET (SEMI-PM) algorithm is used to estimate regional ET and an evaporative wet index (EWI), calculated as the ratio of ET to potential ET (PET), over the IMARC. Validation shows that the square of the correlation coefficient (R2) for the four sites varies from 0.45 to 0.84 and the root-mean-square error (RMSE) is 0.78 mm. We found that ET decreased by an average of 4.8 mm per decade (p=0.10) over the entire IMARC during 1982-2009, and that EWI decreased by an average of 1.1% per decade (p=0.08) during the study period. Importantly, the patterns of monthly EWI anomalies show good spatial and temporal correlation with Palmer Drought Severity Index (PDSI) anomalies from 1982 to 2009, indicating that EWI can be used to monitor regional surface drought at high spatial resolution. In the high-latitude ecosystems of the northeastern IMARC, air temperature (Ta) and incident solar radiation (Rs) are the most important parameters determining ET. However, in the semiarid and arid areas of the central and southwestern IMARC, relative humidity (RH) and the normalized difference vegetation index (NDVI) are the most important factors controlling the annual variation of ET.
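The EWI used here is simply the ratio of actual to potential ET, with drought expressed through monthly anomalies against a long-term mean. A minimal sketch with invented numbers (the SEMI-PM ET algorithm itself is not reproduced):

```python
# Sketch of the evaporative wet index (EWI): EWI = ET / PET, with drought
# flagged by negative monthly anomalies against a long-term mean.
# All numbers are invented for illustration.

def ewi(et_mm, pet_mm):
    """Evaporative wet index: ratio of actual to potential evapotranspiration."""
    return et_mm / pet_mm if pet_mm > 0 else float("nan")

def monthly_anomaly(series):
    """Anomaly of each value against the series mean (e.g. a 1982-2009 climatology)."""
    mean = sum(series) / len(series)
    return [x - mean for x in series]

# Example: three Julys; the third (low ET relative to PET) shows up as a
# negative EWI anomaly, i.e. drier-than-normal conditions.
july_ewi = [ewi(et, pet) for et, pet in [(60.0, 100.0), (55.0, 110.0), (30.0, 120.0)]]
anoms = monthly_anomaly(july_ewi)
```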

  8. Estimating daily PM2.5 and PM10 across the complex geo-climate region of Israel using MAIAC satellite-based AOD data

    Science.gov (United States)

    Kloog, Itai; Sorek-Hamer, Meytar; Lyapustin, Alexei; Coull, Brent; Wang, Yujie; Just, Allan C.; Schwartz, Joel; Broday, David M.

    2017-01-01

Estimates of exposure to PM2.5 are often derived from geographic characteristics based on land-use regression or from a limited number of fixed ground monitors. Remote sensing advances have integrated these approaches with satellite-based measures of aerosol optical depth (AOD), which is spatially and temporally resolved, allowing greater coverage for PM2.5 estimations. Israel is situated in a complex geo-climatic region with contrasting geographic and weather patterns, including both dark and bright surfaces within a relatively small area. Our goal was to examine the use of MODIS-based MAIAC data in Israel, and to explore the reliability of predicted PM2.5 and PM10 at a high spatiotemporal resolution. We applied a three-stage process, including a daily calibration method based on a mixed effects model, to predict ground PM2.5 and PM10 over Israel. We later constructed daily predictions across Israel for 2003–2013 using spatial and temporal smoothing, to estimate AOD when satellite data were missing. Good model performance was achieved, with out-of-sample cross-validation R2 values of 0.79 and 0.72 for PM10 and PM2.5, respectively. Model predictions had little bias, with cross-validated slopes (predicted vs. observed) of 0.99 for both the PM2.5 and PM10 models. To our knowledge, this is the first study that utilizes high-resolution 1 km MAIAC AOD retrievals for PM prediction while accounting for geo-climate complexities such as those experienced in Israel. This novel model allowed the reconstruction of long- and short-term spatially resolved exposure to PM2.5 and PM10 in Israel, which could be used in the future for epidemiological studies. PMID:28966551
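Stage one of the three-stage process calibrates AOD against ground PM day by day with a mixed-effects model. As a rough stand-in, the sketch below fits an ordinary least-squares line per day to hypothetical AOD-PM collocations; the actual study uses day-specific random intercepts and slopes rather than fully independent daily fits:

```python
# Minimal stand-in for stage-1 daily calibration of PM2.5 against AOD.
# The study fits a mixed-effects model with day-specific random terms; here,
# for illustration only, each day gets its own least-squares line fitted to
# made-up AOD/PM2.5 pairs at monitoring sites.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for one day's collocations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# day -> (AOD at monitors, observed PM2.5 in ug/m3); values are hypothetical
days = {
    "2010-06-01": ([0.1, 0.3, 0.5], [12.0, 20.0, 28.0]),
    "2010-06-02": ([0.2, 0.4, 0.6], [15.0, 25.0, 35.0]),
}
calib = {d: fit_line(x, y) for d, (x, y) in days.items()}

def predict_pm(day, aod):
    """Predict ground-level PM from AOD using that day's calibration."""
    slope, intercept = calib[day]
    return slope * aod + intercept
```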

  11. Correction of refractive errors

    Directory of Open Access Journals (Sweden)

    Vladimir Pfeifer

    2005-10-01

Background: Spectacles and contact lenses are the most frequently used, the safest and the cheapest means of correcting refractive errors. The development of keratorefractive surgery has brought new opportunities for the correction of refractive errors in patients who wish to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser into refractive surgery has opened new opportunities for remodelling the cornea. The laser energy can be delivered on the stromal surface, as in PRK, or deeper into the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.

  12. Evaluation of QNI corrections in porous media applications

    Science.gov (United States)

    Radebe, M. J.; de Beer, F. C.; Nshimirimana, R.

    2011-09-01

Qualitative measurements using digital neutron imaging have been explored more thoroughly than accurate quantitative measurements. The reason for this bias is that quantitative measurements require correction for background and material scatter, and for neutron spectral effects. The Quantitative Neutron Imaging (QNI) software package has resulted from efforts at the Paul Scherrer Institute, Helmholtz Zentrum Berlin (HZB) and Necsa to correct for these effects, while the sample-detector distance (SDD) principle has previously been demonstrated as a measure to eliminate the material scatter effect. This work evaluates the capability of the QNI software package to produce accurate quantitative results on specific characteristics of porous media, and its role in the nondestructive quantification of materials with and without calibration. The work further complements QNI's abilities by the use of different SDDs. Studies of the effective porosity (%) of mortar and the attenuation coefficient of water using QNI and the SDD principle are reported.

  13. Calculating correct compilers

    OpenAIRE

    Bahr, Patrick; Hutton, Graham

    2015-01-01

In this article we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language f...

  14. [Correct contact lens hygiene].

    Science.gov (United States)

    Blümle, S; Kaercher, T; Khaireddin, R

    2013-06-01

Although contact lenses have long been established in ophthalmology, the practical aspects of handling them are becoming less and less important in clinical training for specialists in ophthalmology. At the same time, injuries due to the wearing of contact lenses are increasing for many reasons. To correct this discrepancy, information about contact lenses and practical experience with them must be substantially increased from a medical perspective. This review article deals with the most important aspect of preventing complications, i.e. contact lens hygiene.

  15. Experimental Flat-Field for Correction of XRT Contamination Spots

    Science.gov (United States)

    McKenzie, D. E.; Fox, J. L.; Kankelborg, C.

    2012-08-01

Beginning in mid-2007, XRT images have been marred by dark spots caused by beads of congealed contaminant. While programs are available for improving the cosmetic appearance of the images, no method has yet been demonstrated for a quantitative correction. We have employed a flat-fielding method developed for MSU's MOSES sounding rocket payload in an attempt to restore capabilities for quantitative photometry in the affected pixels. Initial results are encouraging; characterization of the uncertainties in the photometric correction is ongoing. We report on the degree to which this flat-fielding attempt has been successful.

  16. Motion correction in thoracic positron emission tomography

    CERN Document Server

    Gigengack, Fabian; Dawood, Mohammad; Schäfers, Klaus P

    2015-01-01

    Respiratory and cardiac motion leads to image degradation in Positron Emission Tomography (PET), which impairs quantification. In this book, the authors present approaches to motion estimation and motion correction in thoracic PET. The approaches for motion estimation are based on dual gating and mass-preserving image registration (VAMPIRE) and mass-preserving optical flow (MPOF). With mass-preservation, image intensity modulations caused by highly non-rigid cardiac motion are accounted for. Within the image registration framework different data terms, different variants of regularization and parametric and non-parametric motion models are examined. Within the optical flow framework, different data terms and further non-quadratic penalization are also discussed. The approaches for motion correction particularly focus on pipelines in dual gated PET. A quantitative evaluation of the proposed approaches is performed on software phantom data with accompanied ground-truth motion information. Further, clinical appl...

  17. Brain Image Motion Correction

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Benjaminsen, Claus; Larsen, Rasmus

    2015-01-01

    The application of motion tracking is wide, including: industrial production lines, motion interaction in gaming, computer-aided surgery and motion correction in medical brain imaging. Several devices for motion tracking exist using a variety of different methodologies. In order to use such devices...... offset and tracking noise in medical brain imaging. The data are generated from a phantom mounted on a rotary stage and have been collected using a Siemens High Resolution Research Tomograph for positron emission tomography. During acquisition the phantom was tracked with our latest tracking prototype...

  18. Investigation into diagnostic accuracy of common strategies for automated perfusion motion correction.

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D; Greenwood, John P; Plein, Sven; Boyle, Roger D; Radjenovic, Aleksandra; Magee, Derek R

    2016-04-01

    Respiratory motion is a significant obstacle to the use of quantitative perfusion in clinical practice. Increasingly complex motion correction algorithms are being developed to correct for respiratory motion. However, the impact of these improvements on the final diagnosis of ischemic heart disease has not been evaluated. The aim of this study was to compare the performance of four automated correction methods in terms of their impact on diagnostic accuracy. Three strategies for motion correction were used: (1) independent translation correction for all slices, (2) translation correction for the basal slice with transform propagation to the remaining two slices assuming identical motion in the remaining slices, and (3) rigid correction (translation and rotation) for the basal slice. There were no significant differences in diagnostic accuracy between the manual and automatic motion-corrected datasets ([Formula: see text]). The area under the curve values for manual motion correction and automatic motion correction were 0.93 and 0.92, respectively. All of the automated motion correction methods achieved a comparable diagnostic accuracy to manual correction. This suggests that the simplest automated motion correction method (method 2 with translation transform for basal location and transform propagation to the remaining slices) is a sufficiently complex motion correction method for use in quantitative myocardial perfusion.

  19. Improving long-term, retrospective precipitation datasets using satellite-based surface soil moisture retrievals and the Soil Moisture Analysis Rainfall Tool

    Science.gov (United States)

    Chen, Fan; Crow, Wade T.; Holmes, Thomas R. H.

    2012-01-01

Using historical satellite surface soil moisture products, the Soil Moisture Analysis Rainfall Tool (SMART) is applied to improve the submonthly-scale accuracy of a multi-decadal global daily rainfall product that has been bias-corrected to match the monthly totals of available rain gauge observations. In order to adapt to the irregular retrieval frequency of heritage soil moisture products, a new variable correction window method is developed that allows for better efficiency in leveraging temporally sparse satellite soil moisture retrievals. Results confirm the advantage of using this variable window method relative to an existing fixed-window version of SMART over a range of one- to 30-day accumulation periods. Using this modified version of SMART and heritage satellite surface soil moisture products, a 1.0-deg, 20-year (1979 to 1998) global rainfall dataset over land is corrected and validated. Relative to the original precipitation product, the corrected dataset demonstrates improved correlation with a global gauge-based daily rainfall product, lower root-mean-square error (-13%) on a 10-day scale, and a higher probability of detection (+5%) and lower false alarm rate (-3.4%) for five-day rainfall accumulation estimates. This corrected rainfall dataset is expected to provide improved rainfall forcing data for the land surface modeling community.
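The core SMART idea, nudging a background rainfall product with satellite soil-moisture information, can be caricatured as follows. The gain, the antecedent-precipitation-index water balance, and all numbers are illustrative assumptions, not the published algorithm:

```python
# Toy version of the SMART concept: soil-moisture analysis increments flag
# rainfall errors, and a fraction (gain) of each increment is fed back as a
# rainfall correction. Gain, loss coefficient, and data are invented; this is
# not the published SMART algorithm or its variable-window scheme.

def api_forecast(prev_api, rain, loss=0.9):
    """Antecedent precipitation index: decayed storage plus new rainfall."""
    return loss * prev_api + rain

def smart_correct(rain, sm_obs, gain=0.5, loss=0.9):
    """Add gain * (observed soil moisture - forecast API) to each day's rainfall."""
    corrected, api = [], sm_obs[0]
    for r, sm in zip(rain, sm_obs):
        fc = api_forecast(api, r, loss)
        incr = sm - fc                  # analysis increment
        corrected.append(max(0.0, r + gain * incr))
        api = fc + gain * incr          # updated storage
    return corrected

rain = [0.0, 10.0, 0.0, 0.0]            # background product misses day-3 rain
sm = [5.0, 14.5, 20.0, 18.0]            # soil moisture jumps on day 3
adj = smart_correct(rain, sm)           # day-3 rainfall is partly recovered
```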

  20. SATELLITE BASED LIVE AND INTERACTIVE DISTANCE LEARNING PROGRAM IN THE FIELD OF GEOINFORMATICS – A PERSPECTIVE OF INDIAN INSTITUTE OF REMOTE SENSING, INDIA

    Directory of Open Access Journals (Sweden)

    P. L. N. Raju

    2012-09-01

Geoinformatics is a highly specialized discipline that deals with Remote Sensing, Geographical Information Systems (GIS), the Global Positioning System (GPS) and field surveys for the assessment, quantification, development and management of resources, planning and infrastructure development, utility services etc. The Indian Institute of Remote Sensing (IIRS), a premier institute and one of its kind, has played a key role in capacity building in this specialized area since its inception in 1966. Realizing the large demand, IIRS has started an outreach program in the basics of Remote Sensing, GIS and GPS for universities and institutions. EDUSAT (Educational Satellite) is the communication satellite built and launched by ISRO in 2004 exclusively to serve the educational sector and meet the demand for an interactive satellite-based distance education system for the country. IIRS has used EDUSAT (shifted recently to INSAT 4 CR due to termination of services from EDUSAT) for its distance learning program to impart basic training in Remote Sensing, GIS and GPS, catering to universities spread across India. The EDUSAT-based training is similar to e-learning but has the advantage of live interactive sessions between teacher and students when the lecture is delivered via EDUSAT satellite communication. Because of its good-quality reception, the interactions are not constrained by the bandwidth problems of the Internet. The National Natural Resource Management System, Department of Space, Government of India, under its Standing Committee on Training and Technology, funded this unique program to conduct the basic training in Geoinformatics. IIRS has conducted a 6-week basic training course on "Remote Sensing, GIS and GPS" regularly since 2007. The course is spread over a period of 3 months beginning with the start of the academic year (1st semester), i.e., July to December every year, for university students. IIRS has utilized EDUSAT satellite for

  1. Satellite Based Live and Interactive Distance Learning Program in the Field of Geoinformatics - a Perspective of Indian Institute of Remote Sensing, India

    Science.gov (United States)

    Raju, P. L. N.; Gupta, P. K.; Roy, P. S.

    2011-09-01

Geoinformatics is a highly specialized discipline that deals with Remote Sensing, Geographical Information Systems (GIS), the Global Positioning System (GPS) and field surveys for the assessment, quantification, development and management of resources, planning and infrastructure development, utility services etc. The Indian Institute of Remote Sensing (IIRS), a premier institute and one of its kind, has played a key role in capacity building in this specialized area since its inception in 1966. Realizing the large demand, IIRS has started an outreach program in the basics of Remote Sensing, GIS and GPS for universities and institutions. EDUSAT (Educational Satellite) is the communication satellite built and launched by ISRO in 2004 exclusively to serve the educational sector and meet the demand for an interactive satellite-based distance education system for the country. IIRS has used EDUSAT (shifted recently to INSAT 4 CR due to termination of services from EDUSAT) for its distance learning program to impart basic training in Remote Sensing, GIS and GPS, catering to universities spread across India. The EDUSAT-based training is similar to e-learning but has the advantage of live interactive sessions between teacher and students when the lecture is delivered via EDUSAT satellite communication. Because of its good-quality reception, the interactions are not constrained by the bandwidth problems of the Internet. The National Natural Resource Management System, Department of Space, Government of India, under its Standing Committee on Training and Technology, funded this unique program to conduct the basic training in Geoinformatics. IIRS has conducted a 6-week basic training course on "Remote Sensing, GIS and GPS" regularly since 2007. The course is spread over a period of 3 months beginning with the start of the academic year (1st semester), i.e., July to December every year, for university students. IIRS has utilized EDUSAT satellite for conducting 4 six weeks

  2. Understanding Droughts and their Agricultural Impact in North America at the Basin Scale through the Development of Satellite Based Drought Indicators

    Science.gov (United States)

    Munoz Hernandez, A.; Lawford, R. G.

    2012-12-01

Drought is a major constraint severely affecting numerous agricultural regions in North America. Decision makers need timely information on the existence of a drought, as well as its intensity, frequency, likely duration, and economic and social effects, in order to implement adaptation strategies and minimize its impacts. Countries like Mexico and Canada face a challenge associated with the lack of consistent and reliable in-situ data that would allow the computation of drought indicators at resolutions that effectively support decision makers at the watershed scale. This study focuses on (1) the development of near-real-time, high-resolution drought indicators from various satellite data, for use in improving adaptation plans and mitigation actions at the basin level; (2) the quantification of the relationships between current and historical droughts and their agricultural impacts, by evaluating thresholds for drought impacts; and (3) the assessment of the effects of existing water policies, economic subsidies, and infrastructure on the vulnerability of a particular region to the economic impacts of a drought. A pilot study area located in Northwest Mexico and known as the Rio Yaqui Basin was selected in order to compare satellite-based indicators derived from currently available satellite products and to assess the quality of the products generated. The Rio Yaqui Basin, also referred to as the "bread basket" of Mexico, is situated in an arid to semi-arid region where highly sophisticated irrigation systems have been implemented to support extensive agriculture. Although for many years the irrigation systems acted as a safety net for the farmers, recent droughts have significantly reduced agricultural output, affected thousands of people, and increased dependence on groundwater.
The drought indices generated are used in conjunction with a decision-support model to provide information on drought impacts

  3. Land-use regression with long-term satellite-based greenness index and culture-specific sources to model PM2.5 spatial-temporal variability.

    Science.gov (United States)

    Wu, Chih-Da; Chen, Yu-Cheng; Pan, Wen-Chi; Zeng, Yu-Ting; Chen, Mu-Jean; Guo, Yue Leon; Lung, Shih-Chun Candice

    2017-05-01

    This study utilized a long-term satellite-based vegetation index, and considered culture-specific emission sources (temples and Chinese restaurants) with Land-use Regression (LUR) modelling to estimate the spatial-temporal variability of PM2.5 using data from Taipei metropolis, which exhibits typical Asian city characteristics. Annual average PM2.5 concentrations from 2006 to 2012 of 17 air quality monitoring stations established by Environmental Protection Administration of Taiwan were used for model development. PM2.5 measurements from 2013 were used for external data verification. Monthly Normalized Difference Vegetation Index (NDVI) images coupled with buffer analysis were used to assess the spatial-temporal variations of greenness surrounding the monitoring sites. The distribution of temples and Chinese restaurants were included to represent the emission contributions from incense and joss money burning, and gas cooking, respectively. Spearman correlation coefficient and stepwise regression were used for LUR model development, and 10-fold cross-validation and external data verification were applied to verify the model reliability. The results showed a strongly negative correlation (r: -0.71 to -0.77) between NDVI and PM2.5 while temples (r: 0.52 to 0.66) and Chinese restaurants (r: 0.31 to 0.44) were positively correlated to PM2.5 concentrations. With the adjusted model R2 of 0.89, a cross-validated adj-R2 of 0.90, and external validated R2 of 0.83, the high explanatory power of the resultant model was confirmed. Moreover, the averaged NDVI within a 1750 m circular buffer (p < 0.01), the number of Chinese restaurants within a 1750 m buffer (p < 0.01), and the number of temples within a 750 m buffer (p = 0.06) were selected as important predictors during the stepwise selection procedures. According to the partial R2, NDVI explained 66% of PM2.5 variation and was the dominant variable in the developed model. We suggest future studies consider these
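The dominant relationship in the resulting model, greener buffers predicting lower PM2.5, can be illustrated with a one-predictor least-squares fit on invented data (the actual model is a multi-variable stepwise LUR with cross-validation):

```python
# Minimal one-predictor land-use-regression fit: annual PM2.5 regressed on a
# buffer-averaged NDVI, reproducing the qualitative result that greener sites
# are cleaner (negative slope). Site data are invented for illustration.

def simple_lur(ndvi, pm):
    """OLS slope, intercept, and R2 for PM regressed on NDVI."""
    n = len(ndvi)
    mx, my = sum(ndvi) / n, sum(pm) / n
    sxx = sum((x - mx) ** 2 for x in ndvi)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ndvi, pm))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(ndvi, pm))
    ss_tot = sum((y - my) ** 2 for y in pm)
    return slope, intercept, 1.0 - ss_res / ss_tot

ndvi = [0.2, 0.3, 0.5, 0.6, 0.8]        # greener site -> higher NDVI
pm25 = [34.0, 30.0, 24.0, 21.0, 15.0]   # ug/m3 at the same sites
slope, intercept, r2 = simple_lur(ndvi, pm25)
```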

  4. The Feasibility of Tropospheric and Total Ozone Determination Using a Fabry-perot Interferometer as a Satellite-based Nadir-viewing Atmospheric Sensor. Ph.D. Thesis

    Science.gov (United States)

    Larar, Allen Maurice

    1993-01-01

    Monitoring of the global distribution of tropospheric ozone (O3) is desirable for enhanced scientific understanding as well as to potentially lessen the ill-health impacts associated with exposure to elevated concentrations in the lower atmosphere. Such a capability can be achieved using a satellite-based device making high spectral resolution measurements with high signal-to-noise ratios; this would enable observation in the pressure-broadened wings of strong O3 lines while minimizing the impact of undesirable signal contributions associated with, for example, the terrestrial surface, interfering species, and clouds. The Fabry-Perot Interferometer (FPI) provides high spectral resolution and high throughput capabilities that are essential for this measurement task. Through proper selection of channel spectral regions, the FPI optimized for tropospheric O3 measurements can simultaneously observe a stratospheric component and thus the total O3 column abundance. Decreasing stratospheric O3 concentrations may lead to an increase in biologically harmful solar ultraviolet radiation reaching the earth's surface, which is detrimental to health. In this research, a conceptual instrument design to achieve the desired measurement has been formulated. This involves a double-etalon fixed-gap series configuration FPI along with an ultra-narrow bandpass filter to achieve single-order operation with an overall spectral resolution of approximately .068 cm(exp -1). A spectral region of about 1 cm(exp -1) wide centered at 1054.73 cm(exp -1) within the strong 9.6 micron ozone infrared band is sampled with 24 spectral channels. Other design characteristics include operation from a nadir-viewing satellite configuration utilizing a 9 inch (diameter) telescope and achieving horizontal spatial resolution with a 50 km nadir footprint. A retrieval technique has been implemented and is demonstrated for a tropical atmosphere possessing enhanced tropospheric ozone amounts. An error analysis
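The spectral behaviour underlying such a design follows the textbook Airy transmission function of an ideal lossless etalon; the reflectance value below is illustrative, not the instrument's actual design parameter:

```python
import math

# Airy transmission of an idealized lossless Fabry-Perot etalon, the textbook
# relation behind high-spectral-resolution etalon designs. The reflectance
# used here is an illustrative assumption.

def airy_transmission(delta, reflectance):
    """Transmission vs. round-trip phase delta for mirror reflectance R."""
    f = 4.0 * reflectance / (1.0 - reflectance) ** 2  # coefficient of finesse
    return 1.0 / (1.0 + f * math.sin(delta / 2.0) ** 2)

def reflective_finesse(reflectance):
    """Finesse = free spectral range / FWHM for an ideal etalon."""
    return math.pi * math.sqrt(reflectance) / (1.0 - reflectance)

peak = airy_transmission(0.0, 0.9)      # on resonance: unity transmission
off = airy_transmission(math.pi, 0.9)   # between orders: strongly rejected
```

A double-etalon series configuration, as described above, multiplies two such transmission curves so that only a single order survives within the bandpass filter.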

  5. Turbulence compressibility corrections

    Science.gov (United States)

    Coakley, T. J.; Horstman, C. C.; Marvin, J. G.; Viegas, J. R.; Bardina, J. E.; Huang, P. G.; Kussoy, M. I.

    1994-01-01

    The basic objective of this research was to identify, develop and recommend turbulence models which could be incorporated into CFD codes used in the design of the National AeroSpace Plane vehicles. To accomplish this goal, a combined effort consisting of experimental and theoretical phases was undertaken. The experimental phase consisted of a literature survey to collect and assess a database of well documented experimental flows, with emphasis on high speed or hypersonic flows, which could be used to validate turbulence models. Since it was anticipated that this database would be incomplete and would need supplementing, additional experiments in the NASA Ames 3.5-Foot Hypersonic Wind Tunnel (HWT) were also undertaken. The theoretical phase consisted of identifying promising turbulence models through applications to simple flows, and then investigating more promising models in applications to complex flows. The complex flows were selected from the database developed in the first phase of the study. For these flows it was anticipated that model performance would not be entirely satisfactory, so that model improvements or corrections would be required. The primary goals of the investigation were essentially achieved. A large database of flows was collected and assessed, a number of additional hypersonic experiments were conducted in the Ames HWT, and two turbulence models (kappa-epsilon and kappa-omega models with corrections) were determined which gave superior performance for most of the flows studied and are now recommended for NASP applications.

  6. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  7. 75 FR 16516 - Dates Correction

    Science.gov (United States)

    2010-04-01

    ... From the Federal Register Online via the Government Publishing Office ] NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Office of the Federal Register Dates Correction Correction In the Notices section... through 15499, the date at the top of each page is corrected to read ``Monday, March 29, 2010''. This...

  8. 77 FR 39899 - Technical Corrections

    Science.gov (United States)

    2012-07-06

    ..., correcting authority citations and typographical and spelling errors, and making other edits and conforming... I office; correcting typographical and spelling errors; and making other edits and conforming...).'' Correct Typographical Error. In Sec. 73.55(i)(4)(ii)(G), the word ``the'' was omitted due to a clerical...

  9. Job Satisfaction in Correctional Officers.

    Science.gov (United States)

    Diehl, Ron J.

    For more than a decade, correctional leaders throughout the country have attempted to come to grips with the basic issues involved in ascertaining and meeting the needs of correctional institutions. This study investigated job satisfaction in 122 correctional officers employed in both rural and urban prison locations for the State of Kansas…

  10. 78 FR 34245 - Miscellaneous Corrections

    Science.gov (United States)

    2013-06-07

    ... office, correcting and adding missing cross-references, correcting grammatical errors, revising language... grammatical errors, revising language for clarity and consistency, and specifying metric units. This document... Correct Grammatical Error. In Sec. 73.6, paragraph (a) is revised to replace the colon at the end of the...

  11. Anomaly corrected heterotic horizons

    Energy Technology Data Exchange (ETDEWEB)

    Fontanella, A.; Gutowski, J.B. [Department of Mathematics, University of Surrey, Guildford, GU2 7XH (United Kingdom); Papadopoulos, G. [Department of Mathematics, King’s College London, Strand, London WC2R 2LS (United Kingdom)

    2016-10-21

    We consider supersymmetric near-horizon geometries in heterotic supergravity up to two-loop order in sigma model perturbation theory. We identify the conditions for the horizons to admit enhancement of supersymmetry. We show that solutions which undergo supersymmetry enhancement exhibit an sl(2,ℝ) symmetry, and we describe the geometry of their horizon sections. We also prove a modified Lichnerowicz-type theorem, incorporating α′ corrections, which relates Killing spinors to zero modes of near-horizon Dirac operators. Furthermore, we demonstrate that there are no AdS_2 solutions in heterotic supergravity up to second order in α′ for which the fields are smooth and the internal space is smooth and compact without boundary. We investigate a class of nearly supersymmetric horizons, for which the gravitino Killing spinor equation is satisfied on the spatial cross sections but not the dilatino one, and present a description of their geometry.

  12. EDITORIAL: Politically correct physics?

    Science.gov (United States)

    Pople Deputy Editor, Stephen

    1997-03-01

    If you were a caring, thinking, liberally minded person in the 1960s, you marched against the bomb, against the Vietnam war, and for civil rights. By the 1980s, your voice was raised about the destruction of the rainforests and the threat to our whole planetary environment. At the same time, you opposed discrimination against any group because of race, sex or sexual orientation. You reasoned that people who spoke or acted in a discriminatory manner should be discriminated against. In other words, you became politically correct. Despite its oft-quoted excesses, the political correctness movement sprang from well-founded concerns about injustices in our society. So, on balance, I am all for it. Or, at least, I was until it started to invade science. Biologists were the first to feel the impact. No longer could they refer to 'higher' and 'lower' orders, or 'primitive' forms of life. To the list of undesirable 'isms' - sexism, racism, ageism - had been added a new one: speciesism. Chemists remained immune to the PC invasion, but what else could you expect from a group of people so steeped in tradition that their principal unit, the mole, requires the use of the thoroughly unreconstructed gram? Now it is the turn of the physicists. This time, the offenders are not those who talk disparagingly about other people or animals, but those who refer to 'forms of energy' and 'heat'. Political correctness has evolved into physical correctness. I was always rather fond of the various forms of energy: potential, kinetic, chemical, electrical, sound and so on. My students might merge heat and internal energy into a single, fuzzy concept loosely associated with moving molecules. They might be a little confused at a whole new crop of energies - hydroelectric, solar, wind, geothermal and tidal - but they could tell me what devices turned chemical energy into electrical energy, even if they couldn't quite appreciate that turning tidal energy into geothermal energy wasn't part of the

  13. Quantitative magnetic measurements with transmission electron microscope

    Energy Technology Data Exchange (ETDEWEB)

    Rusz, Jan, E-mail: jan.rusz@fysik.uu.s [Department of Physics and Materials Science, Uppsala University, Box 530, S-751 21 (Sweden); Lidbaum, Hans [Department of Engineering Sciences, Uppsala University, Box 534, S-751 21 (Sweden); Liebig, Andreas; Hjoervarsson, Bjoergvin; Oppeneer, Peter M. [Department of Physics and Materials Science, Uppsala University, Box 530, S-751 21 (Sweden); Rubino, Stefano [Department of Engineering Sciences, Uppsala University, Box 534, S-751 21 (Sweden); Eriksson, Olle [Department of Physics and Materials Science, Uppsala University, Box 530, S-751 21 (Sweden); Leifer, Klaus [Department of Engineering Sciences, Uppsala University, Box 534, S-751 21 (Sweden)

    2010-05-15

    We briefly review the state-of-the-art electron magnetic chiral dichroism experiments and theory, with focus on quantitative measurements of the atom-specific orbital to spin moment ratio m_l/m_s. Our quantitative approach, based on reciprocal-space mapping of the magnetic signal, is described. We discuss additional symmetry considerations for m_l/m_s measurements, which are present due to dynamical diffraction effects. These lead to a preference for the 3-beam orientation of the sample. Further on, we describe a method of correcting asymmetries present due to imperfect 3-beam orientation, the so-called double-difference correction.

  14. A recovery coefficient method for partial volume correction of PET images

    National Research Council Canada - National Science Library

    Srinivas, Shyam M; Dhurairaj, Thiruvenkatasamy; Basu, Sandip; Bural, Gonca; Surti, Suleman; Alavi, Abass

    2009-01-01

    Correction of the "partial volume effect" has been an area of great interest in the recent times in quantitative PET imaging and has been mainly studied with count recovery models based upon phantoms...
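    The recovery-coefficient approach described above can be sketched in a few lines: a phantom calibration gives the ratio of measured to true activity as a function of object size, and measured uptake is divided by that ratio. The coefficients below are illustrative placeholders, not values from the study:

```python
import numpy as np

# Hypothetical phantom-derived recovery coefficients: measured/true activity
# ratio as a function of sphere diameter (mm). Values are illustrative only.
PHANTOM_DIAMETERS_MM = np.array([10.0, 13.0, 17.0, 22.0, 28.0, 37.0])
RECOVERY_COEFFS      = np.array([0.30, 0.45, 0.60, 0.75, 0.85, 0.95])

def recovery_coefficient(lesion_diameter_mm):
    """Linearly interpolate the recovery coefficient for a lesion of given size."""
    return float(np.interp(lesion_diameter_mm, PHANTOM_DIAMETERS_MM, RECOVERY_COEFFS))

def corrected_suv(measured_suv, lesion_diameter_mm):
    """Partial-volume-corrected uptake: measured value divided by the RC."""
    return measured_suv / recovery_coefficient(lesion_diameter_mm)
```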

  15. Global Daily High-Resolution Satellite-Based Foundation Sea Surface Temperature Dataset: Development and Validation against Two Definitions of Foundation SST

    OpenAIRE

    Kohtaro Hosoda; Futoki Sakaida

    2016-01-01

    This paper describes a global, daily sea surface temperature (SST) analysis based on satellite microwave and infrared measurements. The SST analysis includes a diurnal correction method to estimate foundation SST (SST free from diurnal variability) using satellite sea surface wind and solar radiation data, frequency splitting to reproduce intra-seasonal variability and a quality control procedure repeated twice to avoid operation errors. An optimal interpolation method designed for foundation...

  16. NON LINEAR OPTIMIZATION APPLIED TO ANGLE-OF-ARRIVAL SATELLITE BASED GEO-LOCALIZATION FOR BIASED AND TIME-DRIFTING SENSORS

    Directory of Open Access Journals (Sweden)

    D. Levy

    2016-06-01

    Multiple sensors are used in a variety of geolocation systems. Many use Time Difference of Arrival (TDOA) or Received Signal Strength (RSS) measurements to estimate the most likely location of a signal. When an object does not emit an RF signal, Angle of Arrival (AOA) measurements using optical or infrared frequencies become more feasible than TDOA or RSS measurements. AOA measurements can be made from any sensor platform with an optical sensor and knowledge of its location and attitude, in order to track passive objects. Previous work created a non-linear optimization (NLO) method for calculating the most likely estimate from AOA measurements. Two new modifications to the NLO algorithm are created and shown to correct AOA measurement errors by estimating the inherent bias and time-drift in the Inertial Measurement Unit (IMU) of the AOA sensing platform. One method corrects the sensor bias in post-processing while treating the NLO method as a module. The other method directly corrects the sensor bias within the NLO algorithm by incorporating the bias parameters as a state vector in the estimation process. These two methods are analyzed using various Monte Carlo simulations to check their general performance in comparison to the original NLO algorithm.
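    As a rough illustration of the second modification (bias parameters folded into the estimation state), the sketch below solves a 2-D bearings-only problem with a constant bearing bias as an extra unknown; the geometry and function names are hypothetical, not from the paper:

```python
import numpy as np
from scipy.optimize import least_squares

def aoa_residuals(state, platform_xy, bearings):
    """Residuals for 2-D bearings-only localization with a constant bias.
    state = [x, y, b]: target position plus one bearing bias for the platform."""
    x, y, b = state
    predicted = np.arctan2(y - platform_xy[:, 1], x - platform_xy[:, 0]) + b
    r = bearings - predicted
    return np.arctan2(np.sin(r), np.cos(r))  # wrap angular residuals to (-pi, pi]

def locate_with_bias(platform_xy, bearings, x0_xy):
    """Jointly estimate target position and sensor bias by non-linear least squares."""
    init = np.array([x0_xy[0], x0_xy[1], 0.0])
    sol = least_squares(aoa_residuals, init, args=(platform_xy, bearings))
    return sol.x[:2], sol.x[2]
```

With noiseless bearings taken from several non-collinear platform positions, both the target position and the bias are recovered; the real algorithm additionally models time-drift and noise.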

  17. Quantitative dispersion microscopy

    OpenAIRE

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Ramachandra R Dasari; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live...

  18. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  19. Processor register error correction management

    Energy Technology Data Exchange (ETDEWEB)

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.

  20. Program Derivation by Correctness Enhancements

    Directory of Open Access Journals (Sweden)

    Nafi Diallo

    2016-06-01

    Relative correctness is the property of a program to be more-correct than another program with respect to a given specification. Among the many properties of relative correctness, the one we find most intriguing is that program P' refines program P if and only if P' is more-correct than P with respect to any specification. This inspires us to reconsider program derivation by successive refinements: each step of this process mandates that we transform a program P into a program P' that refines P, i.e. P' is more-correct than P with respect to any specification. This raises the question: why should we want to make P' more-correct than P with respect to any specification, when we only have to satisfy specification R? In this paper, we discuss a process of program derivation that replaces the traditional sequence of refinement-based correctness-preserving transformations starting from specification R with a sequence of relative-correctness-based correctness-enhancing transformations starting from abort.
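    The notion of relative correctness can be made concrete over a finite input domain: P' is at least as correct as P with respect to R when its competence set (the inputs on which it satisfies R) contains that of P. A minimal sketch, assuming deterministic programs as functions and a specification given as a set of admissible input/output pairs:

```python
def competence_set(program, spec_pairs, domain):
    """Inputs on which `program` satisfies the specification.
    spec_pairs: set of admissible (input, output) pairs (the relation R)."""
    return {s for s in domain if (s, program(s)) in spec_pairs}

def more_correct(p_prime, p, spec_pairs, domain):
    """P' is at least as correct as P wrt R iff its competence set contains P's."""
    return competence_set(p, spec_pairs, domain) <= competence_set(p_prime, spec_pairs, domain)
```

For the specification "output equals |input|", the identity function is correct only on non-negative inputs, so abs is more-correct than it, but not vice versa.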

  1. Peptide Selection for Targeted Protein Quantitation.

    Science.gov (United States)

    Chiva, Cristina; Sabidó, Eduard

    2017-03-03

    Targeted proteomics methods in their different flavors rely on the use of a few peptides as proxies for protein quantitation, which need to be specified either prior to or after data acquisition. However, in contrast with discovery methods that use all identified peptides for a given protein to estimate its abundance, targeted proteomics methods are limited in the number of peptides that are used for protein quantitation. Because only a few peptides per protein are acquired or extracted in targeted experiments, the selection of peptides that are used for targeted protein quantitation becomes crucial. Several rules have been proposed to guide peptide selection for targeted proteomics studies, generally based on the amino acid composition of the peptide sequences. However, compliance with these rules does not imply that non-conforming peptides are not reproducibly generated, nor does it guarantee that the selected peptides correctly represent the behavior of the protein abundance under different conditions.
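    Composition-based selection rules of the kind mentioned above are easy to sketch. The simplified digestion rule and filter criteria below (length 7-25, no Met or Cys) are common illustrative choices, not the specific rules discussed in the article:

```python
def tryptic_digest(protein_seq):
    """Cleave after K or R, but not before P: a simplified trypsin rule."""
    peptides, start = [], 0
    for i, aa in enumerate(protein_seq):
        if aa in "KR" and (i + 1 == len(protein_seq) or protein_seq[i + 1] != "P"):
            peptides.append(protein_seq[start:i + 1])
            start = i + 1
    if start < len(protein_seq):
        peptides.append(protein_seq[start:])  # C-terminal remainder
    return peptides

def select_quantotypic(peptides, min_len=7, max_len=25, avoid="MC"):
    """Keep peptides in a practical length range without easily modified residues
    (Met is oxidation-prone, Cys is alkylation-prone)."""
    return [p for p in peptides
            if min_len <= len(p) <= max_len and not any(a in p for a in avoid)]
```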

  2. Diamagnetic Corrections and Pascal's Constants

    Science.gov (United States)

    Bain, Gordon A.; Berry, John F.

    2008-01-01

    Measured magnetic susceptibilities of paramagnetic substances must typically be corrected for their underlying diamagnetism. This correction is often accomplished by using tabulated values for the diamagnetism of atoms, ions, or whole molecules. These tabulated values can be problematic since many sources contain incomplete and conflicting data.…

  3. Unpacking Corrections in Mobile Instruction

    DEFF Research Database (Denmark)

    Levin, Lena; Cromdal, Jakob; Broth, Mathias

    2017-01-01

    This article deals with the organisation of correction in mobile instructional settings. Five sets of video data (>250 h) documenting how learners were instructed to fly aeroplanes, drive cars and ride bicycles in real life traffic were examined to reveal some common features of correction exchan...

  4. Space mapping and defect correction

    NARCIS (Netherlands)

    Echeverría, D.; Hemker, P.W.

    2005-01-01

    In this paper we show that space-mapping optimization can be understood in the framework of defect correction. Then, space-mapping algorithms can be seen as special cases of defect correction iteration. In order to analyze the properties of space mapping and the space-mapping function, we introduce

  5. Correctional Officers With Case Loads

    Science.gov (United States)

    Ward, Richard J.; Vandergoot, David

    1977-01-01

    The Maryland Division of Correction has implemented an innovative program that permits correctional officers to combine counseling case-management functions with their custody functions. Basically, the program relies on the quality of individual programming that results from the close personal, working relationship developed between officer and…

  6. Food systems in correctional settings

    DEFF Research Database (Denmark)

    Smoyer, Amy; Kjær Minke, Linda

    Food is a central component of life in correctional institutions and plays a critical role in the physical and mental health of incarcerated people and the construction of prisoners' identities and relationships. An understanding of the role of food in correctional settings and the effective management of food systems may improve outcomes for incarcerated people and help correctional administrators to maximize their health and safety. This report summarizes existing research on food systems in correctional settings and provides examples of food programmes in prison and remand facilities, including a case study of food-related innovation in the Danish correctional system. It offers specific conclusions for policy-makers, administrators of correctional institutions and prison-food-service professionals, and makes proposals for future research.

  7. Author's correction

    Indian Academy of Sciences (India)

    Indian Academy of Sciences. Author's correction. Cullin-5 and cullin-2 play a role in the development of neuromuscular junction and the female germ line of Drosophila. Champakali Ayyub. J. Genet. 90, 239–249. The correct figure 5A is as follows: Journal of Genetics, Vol. 90, No. 3, December 2011.

  8. Jet Energy Corrections at CMS

    CERN Document Server

    Santocchia, Attilio

    2009-01-01

    Many physics measurements in CMS will rely on the precise reconstruction of jets. Correction of the raw jet energy measured by the CMS detector will be a fundamental step for most analyses in which hadron activity is investigated. Jet correction plans in CMS have been widely studied for different conditions: at start-up, simulation tuned on test-beam data will be used; data-driven methods will then become available; and finally, simulation tuned on collision data will give us the ultimate procedure for calculating jet corrections. Jet transverse energy is corrected first for pile-up and noise offset; correction for the response of the calorimeter as a function of jet pseudorapidity relative to the barrel comes afterwards; and correction for the absolute response as a function of transverse momentum in the barrel is the final standard sub-correction applied. Other effects like flavour and parton corrections will be optionally applied to the jet $E_T$ depending on the measurement requests. In this paper w...

  9. Retrieval of High-Resolution Atmospheric Particulate Matter Concentrations from Satellite-Based Aerosol Optical Thickness over the Pearl River Delta Area, China

    Directory of Open Access Journals (Sweden)

    Lili Li

    2015-06-01

    Satellite remote sensing offers an effective approach to estimating indicators of air quality on a large scale. This is critically significant for air quality monitoring in areas experiencing rapid urbanization and consequently severe air pollution, like the Pearl River Delta (PRD) in China. This paper starts by examining ground observations of particulate matter (PM) and the relationship between PM10 (particles smaller than 10 μm) and aerosol optical thickness (AOT), analyzing observations at the sampling sites in the PRD. A linear regression (R2 = 0.51) is carried out using MODIS-derived 500 m-resolution AOT and PM10 concentrations from monitoring stations. Data on atmospheric boundary layer (ABL) height and relative humidity are used to apply vertical and humidity corrections to the AOT. Results after correction show a higher correlation (R2 = 0.55) between the extinction coefficient and PM10. However, the coarse spatial resolution of the meteorological data affects the smoothness of the retrieved maps, which suggests that high-resolution, accurate meteorological data are critical for increasing the retrieval accuracy of PM. Finally, the model provides spatial distribution maps of instantaneous and yearly average PM10 over the PRD. Observed PM10 correlates better with yearly mean AOT than with instantaneous values.
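    The vertical and humidity corrections described above can be sketched as follows; the hygroscopic growth-factor form and exponent are assumptions for illustration, not the paper's fitted values:

```python
import numpy as np

def dry_extinction(aot, abl_height_m, rh_percent):
    """Ground-level dry extinction coefficient (km^-1) from column AOT.
    Vertical correction: divide AOT by the boundary-layer height (well-mixed ABL).
    Humidity correction: divide by an assumed growth factor f(RH) = (1 - RH/100)^-g."""
    g = 0.6  # illustrative hygroscopic growth exponent
    extinction = aot / (abl_height_m / 1000.0)          # km^-1
    return extinction * (1.0 - rh_percent / 100.0) ** g

def fit_pm10(dry_ext, pm10):
    """Least-squares slope and intercept of PM10 against corrected extinction."""
    slope, intercept = np.polyfit(dry_ext, pm10, 1)
    return slope, intercept
```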

  10. Global Daily High-Resolution Satellite-Based Foundation Sea Surface Temperature Dataset: Development and Validation against Two Definitions of Foundation SST

    Directory of Open Access Journals (Sweden)

    Kohtaro Hosoda

    2016-11-01

    This paper describes a global, daily sea surface temperature (SST) analysis based on satellite microwave and infrared measurements. The SST analysis includes a diurnal correction method to estimate foundation SST (SST free from diurnal variability) using satellite sea surface wind and solar radiation data, frequency splitting to reproduce intra-seasonal variability, and a quality control procedure repeated twice to avoid operation errors. An optimal interpolation method designed for foundation SST is applied to blend the microwave and infrared satellite measurements. Although in situ SST measurements are not used for bias correction adjustments in the analysis, the output product, with a spatial grid size of 0.1°, has an accuracy of 0.48 °C and 0.46 °C compared to the in situ foundation SST measurements derived from drifting buoys and Argo floats, respectively. The comparable accuracy against the two types of in situ foundation SST (drifters and Argo) suggests that the two definitions of foundation SST proposed by past studies provide equivalent information about the sea surface state underlying the diurnal thermocline.
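    An optimal-interpolation update of the kind used here blends a background field with observations weighted by error covariances. A minimal sketch with an assumed Gaussian correlation model and illustrative error parameters (not those of the paper):

```python
import numpy as np

def oi_analysis(grid_xy, bg_grid, obs_xy, obs_vals, bg_obs,
                L=100.0, obs_err=0.5, bg_err=1.0):
    """One optimal-interpolation (OI) update of a background SST field.
    Gaussian background-error correlation with e-folding scale L (km);
    all error parameters are illustrative placeholders."""
    def corr(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return np.exp(-(d / L) ** 2)
    B_go = bg_err ** 2 * corr(grid_xy, obs_xy)   # grid-to-obs background covariance
    B_oo = bg_err ** 2 * corr(obs_xy, obs_xy)    # obs-to-obs background covariance
    R = obs_err ** 2 * np.eye(len(obs_vals))     # observation-error covariance
    K = B_go @ np.linalg.inv(B_oo + R)           # optimal gain
    return bg_grid + K @ (obs_vals - bg_obs)     # background + weighted innovation
```

A nearby observation pulls the analysis strongly toward the observed value, while one far beyond the correlation scale leaves the background almost unchanged.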

  11. Regional Attenuation Correction of Weather Radar Using a Distributed Microwave-Links Network

    Directory of Open Access Journals (Sweden)

    Yang Xue

    2017-01-01

    The complex temporal-spatial variation of raindrop size distribution affects the precision of quantitative precipitation estimation (QPE) produced from radar data, making it difficult to correct echo attenuation. Given that microwave links can measure the total path attenuation accurately, we introduce the concept of regional attenuation correction using a multiple-microwave-links network, based on the tomographic reconstruction of attenuation coefficients. Derived from the radar equation, the effect of rainfall distribution on the propagation of radar and microwave link signals was analyzed. This article focuses on modeling the tomographic reconstruction of attenuation coefficients and on regional attenuation correction algorithms. Finally, a numerical simulation of regional attenuation correction was performed to verify the algorithms employed here. The results demonstrate a correlation coefficient of 0.9175 between the corrected and initial fields of the radar reflectivity factor (root mean square error, 2.3476 dBZ; average deviation, 0.0113 dBZ). Compared with uncorrected data, the accuracy of the corrected radar reflectivity factor was improved by 26.12%, and the corrected rainfall intensity distribution was improved by 51.85%, validating the regional attenuation correction algorithm. This method can correct the regional attenuation of weather radar echo effectively and efficiently; it can be widely used for radar attenuation correction and the improvement of quantitative precipitation estimation by weather radar.
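    The tomographic step can be sketched as a linear inverse problem: each link contributes one equation relating its measured path-integrated attenuation to the per-cell attenuation coefficients, weighted by the path length in each cell. A toy least-squares reconstruction (grid, sampling scheme, and units are illustrative, not the paper's algorithm):

```python
import numpy as np

def path_matrix(links, grid_n, cell_km):
    """Path-length matrix A (links x cells): A[i, j] = length of link i in cell j.
    Links are ((x0, y0), (x1, y1)) in km on a grid_n x grid_n grid; lengths are
    approximated by dense sampling along each link."""
    A = np.zeros((len(links), grid_n * grid_n))
    for i, ((x0, y0), (x1, y1)) in enumerate(links):
        length = np.hypot(x1 - x0, y1 - y0)
        n = max(int(length / cell_km * 50), 2)
        t = (np.arange(n) + 0.5) / n
        xs, ys = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        cols = np.clip((ys / cell_km).astype(int), 0, grid_n - 1) * grid_n + \
               np.clip((xs / cell_km).astype(int), 0, grid_n - 1)
        np.add.at(A[i], cols, length / n)  # accumulate sample lengths per cell
    return A

def reconstruct(links, pia_db, grid_n, cell_km):
    """Least-squares attenuation-coefficient field (dB/km) from link totals (dB)."""
    A = path_matrix(links, grid_n, cell_km)
    k, *_ = np.linalg.lstsq(A, pia_db, rcond=None)
    return k.reshape(grid_n, grid_n)
```

With enough crossing links the system is well determined; the paper's correction then uses the reconstructed field to de-attenuate the radar reflectivity along each radar ray.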

  12. Cool Cluster Correctly Correlated

    Energy Technology Data Exchange (ETDEWEB)

    Varganov, Sergey Aleksandrovich [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    Atomic clusters are unique objects, which occupy an intermediate position between atoms and condensed matter systems. For a long time it was thought that physical and chemical properties of atomic clusters monotonically change with increasing size of the cluster from a single atom to a condensed matter system. However, recently it has become clear that many properties of atomic clusters can change drastically with the size of the clusters. Because physical and chemical properties of clusters can be adjusted simply by changing the cluster's size, different applications of atomic clusters were proposed. One example is the catalytic activity of clusters of specific sizes in different chemical reactions. Another example is a potential application of atomic clusters in microelectronics, where their band gaps can be adjusted by simply changing cluster sizes. In recent years significant advances in experimental techniques allow one to synthesize and study atomic clusters of specified sizes. However, the interpretation of the results is often difficult. Theoretical methods are frequently used to help in the interpretation of complex experimental data. Most of the theoretical approaches have been based on empirical or semiempirical methods. These methods allow one to study large and small clusters using the same approximations. However, since empirical and semiempirical methods rely on simple models with many parameters, it is often difficult to estimate the quantitative and even qualitative accuracy of the results. On the other hand, because of significant advances in quantum chemical methods and computer capabilities, it is now possible to do high-quality ab initio calculations not only on systems of a few atoms but on clusters of practical interest as well. In addition to accurate results for specific clusters, such methods can be used for benchmarking of different empirical and semiempirical approaches. The atomic clusters studied in this work contain from a few atoms

  13. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction, representing messages by subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an

  14. Quantitative cardiac ultrasound

    NARCIS (Netherlands)

    H. Rijsterborgh (Hans)

    1990-01-01

    This thesis is about the various aspects of quantitative cardiac ultrasound. The first four chapters are mainly devoted to the reproducibility of echocardiographic measurements. These are focused on the variation of echocardiographic measurements within patients. An important

  15. On Quantitative Rorschach Scales.

    Science.gov (United States)

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  16. Quantitative physics tasks

    OpenAIRE

    Snětinová, Marie

    2015-01-01

    Title: Quantitative Physics Tasks Author: Mgr. Marie Snětinová Department: Department of Physics Education Supervisor of the doctoral thesis: doc. RNDr. Leoš Dvořák, CSc., Department of Physics Education Abstract: The doctoral thesis concerns problem solving in physics, especially students' attitudes to solving quantitative physics tasks, and various methods of developing students' problem-solving skills in physics. It contains a brief overview of the theoretical framework of proble...

  17. Libertarian Anarchism Is Apodictically Correct

    OpenAIRE

    Redford, James

    2011-01-01

    James Redford, "Libertarian Anarchism Is Apodictically Correct", Social Science Research Network (SSRN), Dec. 15, 2011, 9 pp., doi:10.2139/ssrn.1972733. ABSTRACT: It is shown that libertarian anarchism (i.e., consistent liberalism) is unavoidably true.

  18. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product-type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.
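    The parameter arithmetic behind such a choice is simple: for a product of two identical primitive binary BCH codes, block length, dimension, and minimum distance all multiply. A sketch with illustrative parameters (m = 10, t = 3, i.e. a (1023, 993) component code), not tied to a specific OTN framing:

```python
def bch_params(m, t):
    """Primitive binary BCH code: n = 2^m - 1, k >= n - m*t, d >= 2t + 1.
    (For these small t the dimension bound is met with equality.)"""
    n = 2 ** m - 1
    return n, n - m * t, 2 * t + 1

def product_code(m, t):
    """Product of two identical BCH codes: n, k, and d_min all multiply.
    Guaranteed correction radius is floor((d_min - 1) / 2)."""
    n, k, d = bch_params(m, t)
    rate = (k / n) ** 2
    return {"n": n * n, "k": k * k, "d_min": d * d,
            "rate": rate, "overhead_percent": 100 * (1 / rate - 1)}
```

A three-error-correcting component (d = 7) gives a product minimum distance of 49, so at least 24 arbitrary errors per product block are correctable at roughly 6% overhead, before counting the further gains of iterative row/column decoding.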

  19. Beam Trajectory Correction for SNS

    CERN Document Server

    Chu, Chungming

    2005-01-01

    Automated beam trajectory correction with dipole correctors is developed and tested during the Spallation Neutron Source warm linac commissioning periods. The application is based on the XAL Java framework with newly developed optimization tools. Also, dipole corrector polarities and strengths, and beam position monitor (BPM) polarities were checked by an orbit difference program. The on-line model is used in both the trajectory correction and the orbit difference applications. Experimental data for both applications will be presented.
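    Automated trajectory correction of this kind is commonly posed as a response-matrix least-squares problem: find corrector kicks that best cancel the measured BPM readings. A generic sketch (the response matrix and readings below are made up for illustration, not the SNS on-line model):

```python
import numpy as np

def corrector_strengths(R, bpm_readings, rcond=None):
    """Least-squares corrector kicks that best cancel the measured orbit:
    minimize || bpm_readings + R @ theta ||."""
    theta, *_ = np.linalg.lstsq(R, -np.asarray(bpm_readings), rcond=rcond)
    return theta

# Illustrative: 4 BPMs, 2 correctors; response in mm of orbit per mrad of kick.
R = np.array([[2.0, 0.0],
              [1.5, 1.0],
              [0.5, 2.0],
              [0.2, 1.2]])
x = np.array([1.0, 1.3, 1.1, 0.7])   # measured orbit (mm)
theta = corrector_strengths(R, x)     # corrector settings (mrad)
residual = x + R @ theta              # predicted orbit after correction
```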

  20. Comparative evaluation of scatter correction techniques in 3D positron emission tomography

    CERN Document Server

    Zaidi, H

    2000-01-01

    Much research and development has been concentrated on the scatter compensation required for quantitative 3D PET. Increasingly sophisticated scatter correction procedures are under investigation, particularly those based on accurate scatter models, and iterative reconstruction-based scatter compensation approaches. The main difference among the correction methods is the way in which the scatter component in the selected energy window is estimated. Monte Carlo methods give further insight and might in themselves offer a possible correction procedure. Methods: Five scatter correction methods are compared in this paper, where applicable: the dual-energy window (DEW) technique, the convolution-subtraction (CVS) method, two variants of the Monte Carlo-based scatter correction technique (MCBSC1 and MCBSC2), and our newly developed statistical reconstruction-based scatter correction (SRBSC) method. These scatter correction techniques are evaluated using Monte Carlo simulation studies, experimental phantom measurements...
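    Of the methods compared, the dual-energy window technique is the simplest to sketch: scatter in the photopeak window is estimated as a scaled version of the lower-window counts. The scatter ratio below is an assumed placeholder that would normally come from phantom calibration:

```python
import numpy as np

def dew_correct(photopeak, lower_window, scatter_ratio=0.5):
    """Dual-energy-window scatter correction (sketch).
    Scatter in the photopeak window is estimated as scatter_ratio times the
    lower-window counts, then subtracted; counts are clipped at zero."""
    scatter_est = scatter_ratio * np.asarray(lower_window, dtype=float)
    corrected = np.asarray(photopeak, dtype=float) - scatter_est
    return np.clip(corrected, 0.0, None)
```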

  1. Quantitative mass spectrometry methods for pharmaceutical analysis.

    Science.gov (United States)

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally pursued to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential due to their suitability for in-field usage. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  2. Quantum error correction for beginners

    Science.gov (United States)

    Devitt, Simon J.; Munro, William J.; Nemoto, Kae

    2013-07-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation now constitute a much larger field, and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future.
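
    The syndrome-measurement logic at the heart of the simplest QEC code, the three-qubit bit-flip code, can be illustrated classically. This toy ignores quantum amplitudes entirely and only shows how the code's two parity checks uniquely locate a single flipped bit:

```python
def encode(bit):
    # Repetition encoding, the classical analogue of |0> -> |000>, |1> -> |111>.
    return [bit, bit, bit]

def syndrome(q):
    # Parities corresponding to the Z1Z2 and Z2Z3 stabilizer measurements.
    return (q[0] ^ q[1], q[1] ^ q[2])

def correct(q):
    # Each single-bit flip yields a distinct syndrome; (0, 0) means no error.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flip is not None:
        q[flip] ^= 1
    return q

# Every single bit-flip error on either codeword is corrected:
ok = all(
    correct([b ^ (i == e) for i, b in enumerate(encode(bit))]) == encode(bit)
    for bit in (0, 1)
    for e in range(3)
)
```

The key point carried over to the quantum case is that the syndrome identifies the error location without revealing (and hence destroying) the encoded value itself.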

  3. Mitigating Satellite-Based Fire Sampling Limitations in Deriving Biomass Burning Emission Rates: Application to WRF-Chem Model Over the Northern sub-Saharan African Region

    Science.gov (United States)

    Wang, Jun; Yue, Yun; Wang, Yi; Ichoku, Charles; Ellison, Luke; Zeng, Jing

    2018-01-01

    Largely used in several independent estimates of fire emissions, fire products based on MODIS sensors aboard the Terra and Aqua polar-orbiting satellites have a number of inherent limitations, including (a) inability to detect fires below clouds, (b) significant decrease of detection sensitivity at the edge of scan, where pixel sizes are much larger than at nadir, and (c) gaps between adjacent swaths in tropical regions. To remedy these limitations, an empirical method is developed here and applied to correct fire emission estimates based on MODIS pixel-level fire radiative power measurements and emission coefficients from the Fire Energetics and Emissions Research (FEER) biomass burning emission inventory. The analysis was performed for January 2010 over the northern sub-Saharan African region. Simulations from the WRF-Chem model using original and adjusted emissions are compared with the aerosol optical depth (AOD) products from MODIS and AERONET as well as the aerosol vertical profile from CALIOP data. The comparison confirmed a 30-50% improvement in model simulation performance (in terms of correlation, bias, and spatial pattern of AOD with respect to observations) by the adjusted emissions, which not only increase the original emission amount by a factor of two but also yield spatially continuous estimates of instantaneous fire emissions at daily time scales. Such improvement cannot be achieved by simply scaling the original emission across the study domain. Even with this improvement, a factor-of-two underestimation still exists in the modeled AOD, which is within the current global fire emissions uncertainty envelope.
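
    A crude sketch of the gap-filling idea behind such an adjustment (not the paper's actual method, which works from FRP measurements and FEER emission coefficients): grid cells that went unobserved because of cloud or swath gaps are filled with the mean of observed cells before totalling, instead of being implicitly treated as fire-free. All values are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
true_emissions = rng.gamma(2.0, 5.0, size=1000)   # "actual" daily cell emissions
observed_mask = rng.random(1000) > 0.35           # ~65% of cells observed

# Naive total: unobserved cells contribute zero, biasing the sum low.
raw_total = true_emissions[observed_mask].sum()

# Adjusted total: fill gaps with the mean of observed cells, then sum.
fill_value = true_emissions[observed_mask].mean()
adjusted_total = raw_total + fill_value * (~observed_mask).sum()
```

The adjusted total lands much closer to the true domain total than the naive sum, which systematically misses the unsampled cells; the study's actual correction conditions the fill on overpass geometry and time of day rather than on a single domain mean.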

  4. Semi-Physical Estimates of National-Scale PM10 Concentrations in China Using a Satellite-Based Geographically Weighted Regression Model

    Directory of Open Access Journals (Sweden)

    Tianhao Zhang

    2016-06-01

    Full Text Available The estimation of ambient particulate matter with diameter less than 10 µm (PM10) at high spatial resolution is currently quite limited in China. In order to make the distribution of PM10 more accessible to relevant departments and scientific research institutions, a semi-physical geographically weighted regression (GWR) model was established in this study to estimate nationwide mass concentrations of PM10 using easily available MODIS AOD and NCEP Reanalysis meteorological parameters. The results demonstrated that applying physics-based corrections could remarkably improve the quality of the dataset for better model performance, with the adjusted R2 between PM10 and AOD increasing from 0.08 to 0.43; the fitted results explained approximately 81% of the variability in the corresponding PM10 mass concentrations. Annual average PM10 concentrations estimated by the semi-physical GWR model indicated that many residential regions suffer from severe particle pollution. Moreover, the deviation in estimation, which primarily results from frequent changes in elevation, the spatially heterogeneous distribution of monitoring sites, and the limitations of the AOD retrieval algorithm, was acceptable. Therefore, the semi-physical GWR model provides an effective and efficient method to estimate PM10 at large scale. The results could offer reasonable estimations of health impacts and provide guidance on emission control strategies in China.
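
    A geographically weighted regression fits a separate weighted least-squares model at each location, with weights decaying with distance, so the PM10-AOD relationship can vary over space. A minimal sketch on synthetic data; the Gaussian kernel, bandwidth, and coefficient values are illustrative, not those of the study:

```python
import numpy as np

def gwr_fit(x0, coords, aod, pm10, bandwidth):
    """Local GWR fit at point x0: weighted least squares with a Gaussian
    spatial kernel. Returns [intercept, slope] for the PM10 ~ AOD model."""
    d = np.linalg.norm(coords - x0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian distance decay
    X = np.column_stack([np.ones(len(aod)), aod])  # intercept + AOD column
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ pm10)

# Synthetic example: the PM10-AOD slope differs between two regions.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(200, 2))
aod = rng.uniform(0.1, 1.0, 200)
slope = np.where(coords[:, 0] < 5, 50.0, 120.0)    # spatially varying slope
pm10 = 10.0 + slope * aod + rng.normal(0, 1.0, 200)

beta_west = gwr_fit(np.array([2.0, 5.0]), coords, aod, pm10, bandwidth=1.5)
beta_east = gwr_fit(np.array([8.0, 5.0]), coords, aod, pm10, bandwidth=1.5)
```

The local fits recover a markedly steeper slope in the eastern region, which is exactly the spatial non-stationarity a single global regression would average away.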

  5. Quantitative wood anatomy - practical guidelines

    Directory of Open Access Journals (Sweden)

    Georg von Arx

    2016-06-01

    Full Text Available Quantitative wood anatomy analyzes the variability of xylem anatomical features in trees, shrubs and herbaceous species to address research questions related to plant functioning, growth and environment. Among the more frequently considered anatomical features are lumen dimensions and wall thickness of conducting cells, fibers and several ray properties. The structural properties of each xylem anatomical feature are mostly fixed once they are formed, and define to a large extent its functionality, including transport and storage of water, nutrients, sugars and hormones, and the provision of mechanical support. The anatomical features can often be localized within an annual growth ring, which allows one to establish intra-annual past and present structure-function relationships and their sensitivity to environmental variability. However, there are many methodological obstacles to overcome when aiming to produce (large) data sets of xylem anatomical data. Here we describe the different steps from wood sample collection to xylem anatomical data, provide guidance and identify pitfalls, and present different image-analysis tools for the quantification of anatomical features, in particular conducting cells. We show that each data production step, from sample collection in the field, through microslide preparation in the lab and image capturing through an optical microscope, to image analysis with specific tools, can readily introduce measurement errors of 5-30% or more, with the magnitude usually increasing as the anatomical features get smaller. Such measurement errors, if not avoided or corrected, may make it impossible to extract meaningful xylem anatomical data in light of the rather small range of variability in many anatomical features as observed, for example, within time series of individual plants. Following a rigid protocol and quality control as proposed in this paper is thus mandatory to use quantitative data of xylem anatomical features as a powerful...

  6. Mitral Valve Repair: The French Correction Versus the American Correction.

    Science.gov (United States)

    Schubert, Sarah A; Mehaffey, James H; Charles, Eric J; Kron, Irving L

    2017-08-01

    Degenerative mitral valve disease causing mitral regurgitation is the most common organic valve pathology and is classified based on leaflet motion. The "French correction" mitral valve repair method restores normal valvular anatomy with extensive leaflet resection, chordal manipulation, and rigid annuloplasty. The American correction attempts to restore normal valve function through minimal leaflet resection, flexible annuloplasty, and use of artificial chordae. These differing methods of mitral valve repair reflect an evolution in principles, but both require understanding of the valve pathology and correction of leaflet prolapse and annular dilatation. Adhering to those unifying principles and ensuring that no patient leaves the operating room with significant persistent mitral regurgitation produces durable results and satisfactory patient outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Fully 3D refraction correction dosimetry system.

    Science.gov (United States)

    Manjappa, Rakesh; Makki, S Sharath; Kumar, Rajesh; Vasu, Ram Mohan; Kanhirodan, Rajan

    2016-02-21

    ...medium is 71.8%, an increase of 6.4% compared to that achieved using the conventional ART algorithm. Smaller-diameter dosimeters are scanned with dry air scanning by using a wide-angle lens that collects refracted light. The images reconstructed using cone beam geometry are seen to deteriorate in some planes, as those regions are not scanned. Refraction correction is important and needs to be taken into consideration to achieve quantitatively accurate dose reconstructions. Refraction modeling is crucial in array-based scanners, as it is not possible to identify refracted rays in the sinogram space.

  8. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    Duplex Vector Flow Imaging (VFI) is introduced as a replacement for spectral Doppler, as it can automatically yield fully quantitative flow estimates without angle correction. Continuous VFI data over 9 s for 10 pulse cycles were acquired by a 3 MHz convex probe connected to the SARUS scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave a volume flow of ... mL/stroke (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently...

  9. Evaluation and bias correction of satellite rainfall data for drought monitoring in Indonesia

    Directory of Open Access Journals (Sweden)

    R. R. E. Vernimmen

    2012-01-01

    Full Text Available The accuracy of three satellite rainfall products (TMPA 3B42RT, CMORPH and PERSIANN) was investigated through comparison with grid cell average ground station rainfall data in Indonesia, with a focus on their ability to detect patterns of low rainfall that may lead to drought conditions. Each of the three products underestimated rainfall in dry season months. The CMORPH and PERSIANN data differed most from ground station data and were also very different from the TMPA 3B42RT data. It proved possible to improve TMPA 3B42RT estimates by applying a single empirical bias correction equation that was uniform in space and time. For the six regions investigated, this reduced the root mean square error for estimates of dry season rainfall totals by a mean 9% (from 44 to 40 mm) and for annual totals by 14% (from 77 to 66 mm). The resulting errors represent 10% and 3% of mean dry season and annual rainfall, respectively. The accuracy of these bias-corrected TMPA 3B42RT data is considered adequate for use in real-time drought monitoring in Indonesia. Compared to drought monitoring with only ground stations, this use of satellite-based rainfall estimates offers important advantages in terms of accuracy, spatial coverage, timeliness and cost efficiency.
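
    A single empirical bias correction equation, uniform in space and time, amounts to fitting one regression of gauge rainfall on satellite rainfall and applying it everywhere. A sketch on synthetic monthly totals (not the actual TMPA 3B42RT or gauge data, and a linear form is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
gauge = rng.gamma(2.0, 60.0, size=240)               # monthly gauge totals, mm
satellite = 0.75 * gauge + rng.normal(0, 15.0, 240)  # systematic underestimate

# Fit one linear bias-correction equation from all collocated pairs,
# then apply it uniformly to the satellite estimates.
a, b = np.polyfit(satellite, gauge, 1)
corrected = a * satellite + b

def rmse(est, ref):
    return float(np.sqrt(np.mean((est - ref) ** 2)))

rmse_raw = rmse(satellite, gauge)
rmse_corr = rmse(corrected, gauge)
```

Because ordinary least squares minimizes the mean squared error over all linear maps, and the uncorrected product is itself one such map (the identity), the corrected RMSE can never exceed the raw RMSE on the fitting data; the gain in practice depends on how stable the bias is in space and time.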

  10. Benthic Habitat Mapping Using Multispectral High-Resolution Imagery: Evaluation of Shallow Water Atmospheric Correction Techniques

    Directory of Open Access Journals (Sweden)

    Francisco Eugenio

    2017-11-01

    Full Text Available Remote multispectral data can provide valuable information for monitoring coastal water ecosystems. Specifically, high-resolution satellite-based imaging systems, such as WorldView-2 (WV-2), can generate information at the spatial scales needed to implement conservation actions for protected littoral zones. However, the coastal water-leaving radiance arriving at the space-based sensor is often small compared to the reflected radiance. In this work, complex approaches that use an accurate radiative transfer code to correct the atmospheric effects, such as FLAASH, ATCOR and 6S, have been implemented for high-resolution imagery. They have been assessed in real scenarios using field spectroradiometer data. In this context, the three approaches have achieved excellent results, with a slightly superior performance observed for the 6S model-based algorithm. Finally, for the mapping of benthic habitats in shallow-water marine protected environments, a relevant application of the proposed atmospheric correction combined with an automatic deglinting procedure is presented. This approach is based on the integration of a linear mixing model of benthic classes within the radiative transfer model of the water. The complete methodology has been applied to selected ecosystems in the Canary Islands (Spain), and the obtained results allow the robust mapping of the spatial distribution and density of seagrass in coastal waters and the analysis of multitemporal variations related to human activity and climate change in littoral zones.
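
    The linear mixing model referred to above treats each (water-corrected) pixel spectrum as a combination of benthic end-member spectra, with the per-class fractions recovered by least squares. A minimal noise-free sketch with invented end-member reflectances for two classes; the real method embeds this within the water radiative transfer model:

```python
import numpy as np

# Rows: spectral bands; columns: end members [seagrass, sand] (invented values).
endmembers = np.array([
    [0.05, 0.30],
    [0.08, 0.35],
    [0.15, 0.40],
    [0.04, 0.45],
])

# Simulate a pixel that is 70% seagrass, 30% sand.
true_fractions = np.array([0.7, 0.3])
pixel = endmembers @ true_fractions

# Recover the fractions by least squares, then enforce the physical
# constraints (non-negative, summing to one).
fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
fractions = np.clip(fractions, 0, None)
fractions /= fractions.sum()
```

With noise-free data and linearly independent end members the inversion is exact; with real imagery the residual of this fit is one way to flag pixels whose spectra are not explained by the chosen benthic classes.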

  12. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today's models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to "future-proof" their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today's climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communicating climate model output, language which accurately states that models are "better", have "improved" and now "include" and "simulate" relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output is outlined.

  13. Correction

    Directory of Open Access Journals (Sweden)

    Laergaard Simon

    1996-11-01

    Full Text Available In the paper Muhlenbergia cleefii sp. nov., a new grass from the high Andes of Colombia, Caldasia 17(82-85): 409-412. 1995, an orthographic error was unfortunately introduced and the epithet was spelled "cleefi" (except at one place in the general text).

  14. Correction.

    Science.gov (United States)

    2015-03-01

    In the January 2015 issue of Cyberpsychology, Behavior, and Social Networking (vol. 18, no. 1, pp. 3–7), the article "Individual Differences in Cyber Security Behaviors: An Examination of Who Is Sharing Passwords." by Prof. Monica Whitty et al., has an error in wording in the abstract. The sentence in question was originally printed as: Contrary to our hypotheses, we found older people and individuals who score high on self-monitoring were more likely to share passwords. It should read: Contrary to our hypotheses, we found younger people and individuals who score high on self-monitoring were more likely to share passwords. The authors wish to apologize for the error.

  15. Correction

    Science.gov (United States)

    1999-11-01

    Synsedimentary deformation in the Jurassic of southeastern Utah—A case of impact shaking? COMMENT. Geology, v. 27, p. 661 (July 1999). The sentence on p. 661, first column, second paragraph, line one, should read: The 1600 m of Pennsylvanian Paradox Formation is 75-90% salt in Arches National Park. The sentence on p. 661, second column, third paragraph, line seven, should read: This high-pressure hydrothermal solution created the clastic dikes, chert nodules from reprecipitated siliceous cement that have been called "siliceous impactites" (Kriens et al., 1997), and much of the present structure at Upheaval Dome by further faulting.

  16. Correction

    CERN Document Server

    2007-01-01

    From left to right: Luis, Carmen, Mario, Christian and José listening to speeches by theorists Alvaro De Rújula and Luis Alvarez-Gaumé (right) at their farewell gathering on 15 May. We unfortunately cut out a part of the "Word of thanks" from the team retiring from Restaurant No. 1. The complete message is published below: Dear friends, You are the true "nucleus" of CERN. Every member of this extraordinary human mosaic will always remain in our affections and in our thoughts. We have all been very touched by your spontaneous generosity. Arrivederci, Mario. Au revoir, Christian. Hasta siempre, Carmen, José and Luis. PS: Lots of love to the theory team and to the hidden organisers. So long!

  17. String-Corrected Black Holes

    Energy Technology Data Exchange (ETDEWEB)

    Hubeny, V.

    2005-01-12

    We investigate the geometry of four-dimensional black hole solutions in the presence of stringy higher-curvature corrections to the low-energy effective action. For certain supersymmetric two-charge black holes these corrections drastically alter the causal structure of the solution, converting seemingly pathological null singularities into timelike singularities hidden behind a finite-area horizon. We establish, analytically and numerically, that the string-corrected two-charge black hole metric has the same Penrose diagram as the extremal four-charge black hole. The higher-derivative terms lead to another dramatic effect: the gravitational force exerted by a black hole on an inertial observer is no longer purely attractive. The magnitude of this effect is related to the size of the compactification manifold.

  18. Should diastasis recti be corrected?

    Science.gov (United States)

    Nahas, F X; Augusto, S M; Ghelfond, C

    1997-01-01

    The plication of the anterior rectus sheath is a procedure performed by most surgeons during abdominoplasty. A main concern is whether the correction of recti diastasis is really effective and whether it is stable. In order to verify the position of the rectus muscles, a CT scan was used in 14 patients who underwent abdominoplasty with rectus plication, to compare the preoperative position of these muscles with their position 3 weeks and 6 months postoperatively. None of these patients had had previous abdominal surgery. The recti diastasis was corrected with a two-layer 2-0 nylon suture. A dynamometer was used to measure the resistance force of the anterior aponeurosis of the rectus. In all cases the CT data show that correction of the diastasis was achieved completely after 6 months.

  19. Delegation in Correctional Nursing Practice.

    Science.gov (United States)

    Tompkins, Frances

    2016-07-01

    Correctional nurses face daily challenges as a result of their work environment. Common challenges include availability of resources for appropriate care delivery, negotiating with custody staff for access to patients, adherence to scope of practice standards, and working with a varied staffing mix. Professional correctional nurses must consider the educational backgrounds and competency of other nurses and assistive personnel in planning for care delivery. Budgetary constraints and varied staff preparation can be a challenge for the professional nurse. Adequate care planning requires understanding the educational level and competency of licensed and unlicensed staff. Delegation is the process of assessing patient needs and transferring responsibility for care to appropriately educated and competent staff. Correctional nurses can benefit from increased knowledge about delegation. © The Author(s) 2016.

  20. "LEPTOP" electroweak corrections at LEP1

    CERN Document Server

    Novikov, V; Vysotsky, M I

    1995-01-01

    This work discusses parameters of the electroweak Lagrangian, coupling constants, the alpha Born approximation, W mass, Z mass, Z decays, one-loop electroweak corrections, electroweak corrections for the "gluon-free" observables, gluonic corrections to electroweak loops. (12 refs).

  1. Corrections in Montana: A Consultation on Corrections in Montana.

    Science.gov (United States)

    Montana State Advisory Committee to the U.S. Commission on Civil Rights, Helena.

    The findings and recommendations of a two-day conference on the civil and human rights of inmates of Montana's correctional institutions are contained in this report. The views of private citizens and experts from local, state, and federal organizations are presented in edited form under seven subject headings: existing prison reform legislation,…

  2. Contact lens correction of presbyopia.

    Science.gov (United States)

    Morgan, Philip B; Efron, Nathan

    2009-08-01

    The ageing population highlights the need to provide effective optical solutions for presbyopic contact lens wearers. However, data gathered from annual contact lens fitting surveys demonstrate that fewer than 40% of contact lens wearers over 45 years of age (virtually all of whom can be presumed to suffer a partial or complete loss of accommodation) are prescribed a presbyopic correction. Furthermore, monovision is prescribed as frequently as multifocal lenses. These observations suggest that an optimal solution to the contact lens correction of presbyopia remains elusive.

  3. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes were requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue.

  4. Quantitative Decision Making.

    Science.gov (United States)

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  5. Quantitative Management in Libraries

    Science.gov (United States)

    Heinritz, Fred J.

    1970-01-01

    Based on a position paper originally presented at the Institute on Quantitative Methods in Librarianship at Ohio State University Libraries in August 1969, this discusses some of the elements of management: motion, time and cost studies, operations research and other mathematical techniques, and data processing equipment. (Author)

  6. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  7. Corrective measures evaluation report for technical area-v groundwater.

    Energy Technology Data Exchange (ETDEWEB)

    Witt, Johnathan L (North Wind, Inc., Idaho Falls, ID); Orr, Brennon R. (North Wind, Inc., Idaho Falls, ID); Dettmers, Dana L. (North Wind, Inc., Idaho Falls, ID); Hall, Kevin A. (North Wind, Inc., Idaho Falls, ID); Howard, Hope (North Wind, Inc., Idaho Falls, ID)

    2005-07-01

    This Corrective Measures Evaluation Report was prepared as directed by the Compliance Order on Consent issued by the New Mexico Environment Department to document the process of selecting the preferred remedial alternative for contaminated groundwater at Technical Area V. Supporting information includes background information about the site conditions and potential receptors and an overview of work performed during the Corrective Measures Evaluation. Evaluation of remedial alternatives included identification and description of four remedial alternatives, an overview of the evaluation criteria and approach, qualitative and quantitative evaluation of remedial alternatives, and selection of the preferred remedial alternative. As a result of the Corrective Measures Evaluation, it was determined that monitored natural attenuation of all contaminants of concern (trichloroethene, tetrachloroethene, and nitrate) was the preferred remedial alternative for implementation as the corrective measure to remediate contaminated groundwater at Technical Area V of Sandia National Laboratories/New Mexico. Finally, design criteria to meet cleanup goals and objectives and the corrective measures implementation schedule for the preferred remedial alternative are presented.

  8. Pixels and patterns: A satellite-based investigation of changes to urban features in the Sanya Region, Hainan Special Economic Zone, China

    Science.gov (United States)

    Millward, Andrew Allan

    Throughout most of China, and particularly in the coastal areas of its south, ecological resources and traditional culture are viewed by many to be negatively impacted by accelerating urbanization. As a result, achieving an appropriate balance between development and environmental protection has become a significant problem facing policy-makers in these urbanizing areas. The establishment of a Special Economic Zone in the Chinese Province of Hainan has made its coastal areas attractive locations for business and commerce. Development activities that support a burgeoning tourism industry, but which are damaging the environment, are now prominent components of the landscape in the Sanya Region of Hainan. In this study, patterns of urban growth in the Sanya Region of Hainan Province are investigated. Specifically, using several forms of satellite imagery, statistical tools and ancillary data, urban morphology and changes to the extent and spatial arrangement of urban features are researched and documented. A twelve-year chronology of data was collected, consisting of four dates of satellite imagery (1987, 1991, 1997, 1999) acquired by three different satellite sensors (SPOT 2 HRV, Landsat 5 TM, Landsat 7 ETM+). A method of assessing inter-temporal variance in unchanged features is developed as a surrogate for traditional evaluations of change detection that require spatially accurate and time-specific data. Results reveal that selective PCA using visible bands, with the exclusion of an ocean mask, yields the most interpretable components representative of landscape urbanization in the Sanya Region. The geostatistical approach of variography is employed to measure spatial dependence and to test for the presence of directional change in urban morphology across a time series of satellite images. Interpreted time-series geostatistics identify and quantify landscape structure, and changes to structure, and provide a valuable quantitative description of landscape change.
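
    Variography, the geostatistical approach mentioned above, measures spatial dependence through the empirical semivariogram: half the mean squared difference between values separated by a given lag. A minimal 1D sketch on a synthetic autocorrelated transect (real image variography works in 2D and by direction, which this toy omits):

```python
import numpy as np

def empirical_variogram(z, max_lag):
    """Empirical semivariogram of a 1D transect:
    gamma(h) = 0.5 * mean((z[i+h] - z[i])^2) for each lag h."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = z[h:] - z[:-h]
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

# Spatially autocorrelated transect: moving average of white noise,
# giving a correlation range equal to the window length (10 samples).
rng = np.random.default_rng(3)
noise = rng.normal(size=500)
transect = np.convolve(noise, np.ones(10) / 10, mode="valid")
gamma = empirical_variogram(transect, max_lag=20)
```

The semivariance rises with lag until the correlation range is reached and then levels off near the transect variance; changes in that range and sill across image dates are what the study uses to track changing urban morphology.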

  9. Space Mapping and Defect Correction

    NARCIS (Netherlands)

    D. Echeverria (David); D.J.P. Lahaye (Domenico); P.W. Hemker (Piet); W.H.A. Schilders (Wil); H.A. van der Vorst (Henk); J. Rommes

    2008-01-01

    textabstractIn this chapter we present the principles of the space-mapping iteration techniques for the efficient solution of optimization problems. We also show how space-mapping optimization can be understood in the framework of defect correction. We observe the difference between the solution

  11. Political Correctness and American Academe.

    Science.gov (United States)

    Drucker, Peter F.

    1994-01-01

    Argues that today's political correctness atmosphere is a throwback to attempts made by the Nazis and Stalinists to force society into conformity. Academia, it is claimed, is being forced to conform to gain control of the institution of higher education. It is predicted that this effort will fail. (GR)

  12. author's correction 1..1

    Indian Academy of Sciences (India)

    Maria Angelica Ehara Watanabe, Sandra Odebrechet Vargas Nunes, Marla Karine Amarante, Roberta Losi Guembarovski, Julie Massayo Maeda Oda, Kalil William Alves De Lima and Maria Helena Pelegrinelli Fungaro. J. Genet. 89, 179–185. The correct spelling for coauthor 'Sandra Odebrechet Vargas Nunes' is 'Sandra ...

  13. Practical Reasoning in Corrections Education.

    Science.gov (United States)

    LaBar, C.; And Others

    1983-01-01

    The article explains the six tasks involved in practical reasoning and describes a research project that centered around teaching a six-week course in critical thinking to inmates at a medium security prison in an attempt to determine the feasibility of implementing a moral education program in correctional institutions. (SB)

  14. Quantum Correction of Fluctuation Theorem

    OpenAIRE

    Monnai, T.; Tasaki, S.

    2003-01-01

    Quantum analogues of the transient fluctuation theorem (TFT) and steady-state fluctuation theorem (SSFT) are investigated for a harmonic oscillator linearly coupled with a harmonic reservoir. The probability distribution for the work done externally is derived and quantum corrections for TFT and SSFT are calculated.

  15. New Dimensions in Correctional Education.

    Science.gov (United States)

    Saam, Robert

    The Penitentiary of New Mexico (PNM) is currently offering the course, Report Writing for Officers, to teach correctional officers how to write better reports. This course focuses primarily on misconduct reports and supporting memos and touches upon the interdependence the course creates between the areas of treatment and security at PNM, helping…

  16. 75 FR 70951 - Notice, Correction

    Science.gov (United States)

    2010-11-19

    ... DISABILITY (NCD) Sunshine Act Meetings Notice, Correction Type: Quarterly Meeting. Summary: NCD published a.... FOR FURTHER INFORMATION CONTACT: Mark Quigley, Director of Communications, NCD, 1331 F Street, NW... Commission 9:30 a.m.-11:30 a.m. Continuation of NCD Open Meeting 11:30 a.m. Adjournment Dated: November 17...

  17. "Free Speech" and "Political Correctness"

    Science.gov (United States)

    Scott, Peter

    2016-01-01

    "Free speech" and "political correctness" are best seen not as opposing principles, but as part of a spectrum. Rather than attempting to establish some absolute principles, this essay identifies four trends that impact on this debate: (1) there are, and always have been, legitimate debates about the--absolute--beneficence of…

  18. Design correctness of digital systems

    NARCIS (Netherlands)

    Huijs, C.

    1998-01-01

    Transformational design is a formal technique directed at design correctness. It integrates design and verification by the use of pre-proven behaviour preserving transformations as design steps. A formal framework is necessary but hidden for the designer. Five formal aspects are integrated in the

  19. CORRECTIVE ACTION IN CAR MANUFACTURING

    Directory of Open Access Journals (Sweden)

    H. Rohne

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: In this paper the important issues involved in successfully implementing corrective action systems in quality management are discussed. The work is based on experience in implementing and operating such a system in an automotive manufacturing enterprise in South Africa. The core of a corrective action system is good documentation, supported by a computerised information system. Secondly, a systematic problem solving methodology is essential to resolve the quality related problems identified by the system. In the following paragraphs the general corrective action process is discussed and the elements of a corrective action system are identified, followed by a more detailed discussion of each element. Finally specific results from the application are discussed.

    AFRIKAANSE OPSOMMING: Belangrike oorwegings by die suksesvolle implementering van korrektiewe aksie stelsels in gehaltebestuur word in hierdie artikel bespreek. Die werk is gebaseer op ondervinding in die implementering en bedryf van so 'n stelsel by 'n motorvervaardiger in Suid Afrika. Die kern van 'n korrektiewe aksie stelsel is goeie dokumentering, gesteun deur 'n gerekenariseerde inligtingstelsel. Tweedens is 'n sistematiese probleemoplossings metodologie nodig om die gehalte verwante probleme wat die stelsel identifiseer aan te spreek. In die volgende paragrawe word die algemene korrektiewe aksie proses bespreek en die elemente van die korrektiewe aksie stelsel geidentifiseer. Elke element word dan in meer besonderhede bespreek. Ten slotte word spesifieke resultate van die toepassing kortliks behandel.

  20. [Analysis of tongue color under natural daylight based on chromatic aberration correction].

    Science.gov (United States)

    Xu, Jia-tuo; Zhang, Zhi-feng; Yan, Zhu-juan; Tu, Li-ping; Lu, Lu-ming; Shi, Mei-yu; Zhu, Feng-lan

    2009-05-01

    To establish an analytical method for tongue image acquisition under natural daylight based on L*a*b* error correction, and to observe the classification rules of tongue color using color error correction. The tongue images in 413 cases were collected under natural indoor daylight by using a Nikon D70 digital SLR camera, and then the color error was adjusted by using Nikon Capture NX software correction according to a Kodak Q-13 grey card. The classification and quantitative analysis of the tongue color after software correction was carried out depending on L*a*b* color space. The software correction method had good effects in adjusting the tongue color image error. The L* values of light red, deep red and cyanosis tongues decreased as compared with that of light white tongue (P ... L*a*b* error correction is accurate in color restoration and feasible to operate.

  1. Energy & Climate: Getting Quantitative

    Science.gov (United States)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.

  2. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  3. Absolute quantitation of protein posttranslational modification isoform.

    Science.gov (United States)

    Yang, Zhu; Li, Ning

    2015-01-01

    Mass spectrometry has been widely applied in characterization and quantification of proteins from complex biological samples. Because the numbers of absolute amounts of proteins are needed in construction of mathematical models for molecular systems of various biological phenotypes and phenomena, a number of quantitative proteomic methods have been adopted to measure absolute quantities of proteins using mass spectrometry. The liquid chromatography-tandem mass spectrometry (LC-MS/MS) coupled with internal peptide standards, i.e., the stable isotope-coded peptide dilution series, which was originated from the field of analytical chemistry, becomes a widely applied method in absolute quantitative proteomics research. This approach provides more and more absolute protein quantitation results of high confidence. As quantitative study of posttranslational modification (PTM) that modulates the biological activity of proteins is crucial for biological science and each isoform may contribute a unique biological function, degradation, and/or subcellular location, the absolute quantitation of protein PTM isoforms has become more relevant to its biological significance. In order to obtain the absolute cellular amount of a PTM isoform of a protein accurately, impacts of protein fractionation, protein enrichment, and proteolytic digestion yield should be taken into consideration and those effects before differentially stable isotope-coded PTM peptide standards are spiked into sample peptides have to be corrected. Assisted with stable isotope-labeled peptide standards, the absolute quantitation of isoforms of posttranslationally modified protein (AQUIP) method takes all these factors into account and determines the absolute amount of a protein PTM isoform from the absolute amount of the protein of interest and the PTM occupancy at the site of the protein. The absolute amount of the protein of interest is inferred by quantifying both the absolute amounts of a few PTM
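    The quantitative relation at the core of the approach described above is simple: the absolute amount of a PTM isoform is the product of the protein's absolute amount and the site occupancy, once measured signals have been corrected for losses such as proteolytic digestion yield. A toy illustration with invented numbers (not the AQUIP implementation itself):

    ```python
    # minimal sketch (variable names and values are invented): the absolute
    # amount of a PTM isoform follows from the protein's absolute amount and
    # the PTM occupancy at the site, after correcting the measured peptide
    # signal for digestion yield.
    protein_amount_fmol = 500.0    # absolute amount of the protein of interest
    ptm_occupancy = 0.20           # fraction of the protein modified at the site
    digestion_yield = 0.85         # proteolytic digestion efficiency

    # the observed peptide signal is attenuated by the digestion yield ...
    observed = protein_amount_fmol * ptm_occupancy * digestion_yield
    # ... so it must be corrected back before computing the isoform amount
    isoform_amount = observed / digestion_yield   # = protein amount x occupancy
    ```

    With these numbers the isoform amount comes out to 100 fmol; the abstract's point is that skipping the yield (or enrichment, or fractionation) correction would bias this figure directly.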

  4. Peningkatan Kemampuan Petugas Pemasyarakatan Dalam Menangulangi Peredaran Narkoba Di Lembaga Pemasyarakatan Dan Rumah Tahanan Negara (Development of Correctional Officer Competencies in Overcoming Drugs Trafficking at Correctional Institution and Detention Center)

    OpenAIRE

    Apriansyah, Nizar

    2016-01-01

    Problems in correctional institutions sometimes become a negative highlight in the media, such as being portrayed as drug haunts. This research examines factual data related to drug trafficking in correctional institutions and detention centers. It attempts to identify a pattern of education and training that can be implemented to educate correctional officers, so that in the future steps can be taken to anticipate it. It uses quantitative and qualitative approaches. Based on the discussion, it can be concluded that ...

  5. Correcting ligands, metabolites, and pathways

    Directory of Open Access Journals (Sweden)

    Vriend Gert

    2006-11-01

    Full Text Available Abstract Background A wide range of research areas in bioinformatics, molecular biology and medicinal chemistry require precise chemical structure information about molecules and reactions, e.g. drug design, ligand docking, metabolic network reconstruction, and systems biology. Most available databases, however, treat chemical structures more as illustrations than as a datafield in its own right. Lack of chemical accuracy impedes progress in the areas mentioned above. We present a database of metabolites called BioMeta that augments the existing pathway databases by explicitly assessing the validity, correctness, and completeness of chemical structure and reaction information. Description The main bulk of the data in BioMeta were obtained from the KEGG Ligand database. We developed a tool for chemical structure validation which assesses the chemical validity and stereochemical completeness of a molecule description. The validation tool was used to examine the compounds in BioMeta, showing that a relatively small number of compounds had an incorrect constitution (connectivity only, not considering stereochemistry and that a considerable number (about one third had incomplete or even incorrect stereochemistry. We made a large effort to correct the errors and to complete the structural descriptions. A total of 1468 structures were corrected and/or completed. We also established the reaction balance of the reactions in BioMeta and corrected 55% of the unbalanced (stoichiometrically incorrect reactions in an automatic procedure. The BioMeta database was implemented in PostgreSQL and provided with a web-based interface. Conclusion We demonstrate that the validation of metabolite structures and reactions is a feasible and worthwhile undertaking, and that the validation results can be used to trigger corrections and improvements to BioMeta, our metabolite database. BioMeta provides some tools for rational drug design, reaction searches, and

  6. Correction of MS data for naturally occurring isotopes in isotope labelling experiments.

    Science.gov (United States)

    Millard, Pierre; Letisse, Fabien; Sokol, Serguei; Portais, Jean-Charles

    2014-01-01

    Mass spectrometry (MS) in combination with isotope labelling experiments is widely used for investigations of metabolism and other biological processes. Quantitative applications-e.g., (13)C metabolic flux analysis-require correction of raw MS data (isotopic clusters) for the contribution of all naturally abundant isotopes. This chapter describes how to perform such correction using the software IsoCor. This flexible, user-friendly software can be used to exploit any isotopic tracer, from well-known ((13)C, (15)N, (18)O, etc.) to unusual ((57)Fe, (77)Se, etc.) isotopes. It also provides options-e.g., correction for the isotopic purity of the tracer-to improve the accuracy of quantitative isotopic studies, and allows automated correction of large datasets that can be collected with modern MS methods.
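    The natural-abundance correction that IsoCor performs can be sketched as a linear inversion: the measured isotopic cluster is modelled as a correction matrix (built from binomial natural-abundance distributions) applied to the true tracer-labelling fractions, and the correction solves that system. A deliberately simplified 13C-only sketch, ignoring the other elements, tracer purity options, and resolution effects that the actual software handles:

    ```python
    import numpy as np
    from math import comb

    P13C = 0.0107  # natural abundance of 13C

    def correction_matrix(n_carbons):
        """Column i: isotopic cluster produced by a molecule carrying i tracer
        13C atoms, with the remaining carbons at natural abundance (binomial)."""
        n = n_carbons
        C = np.zeros((n + 1, n + 1))
        for i in range(n + 1):
            for k in range(n - i + 1):   # k extra 13C from natural abundance
                C[i + k, i] = comb(n - i, k) * P13C**k * (1 - P13C)**(n - i - k)
        return C

    def correct_mid(measured, n_carbons):
        """Least-squares correction of a measured isotopic cluster, clipped
        to non-negative values and renormalized to fractions."""
        C = correction_matrix(n_carbons)
        x, *_ = np.linalg.lstsq(C, np.asarray(measured, float), rcond=None)
        x = np.clip(x, 0, None)
        return x / x.sum()

    # unlabeled 3-carbon metabolite: after correction, all signal is in M0
    raw = correction_matrix(3)[:, 0]   # pure natural-abundance cluster
    mid = correct_mid(raw, 3)
    ```

    The same matrix idea generalizes to any tracer isotope by swapping in that element's natural-abundance distribution, which is essentially what makes IsoCor tracer-agnostic.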

  7. Correctional Staff Training Institutes. Final Report.

    Science.gov (United States)

    Southern Illinois Univ., East St. Louis, Center for the Study of Crime, Delinquency and Corrections.

    Three national institutes for correctional staff trainers incorporated new techniques in an attempt to upgrade corrections programs through improved staff development. There were 78 trainer and 200 middle management staff and correctional officers involved in the program, representing more than 100 correctional institutions in the United States.…

  8. Attenuation correction for the HRRT PET-scanner using transmission scatter correction and total variation regularization.

    Science.gov (United States)

    Keller, Sune H; Svarer, Claus; Sibomana, Merence

    2013-09-01

    In the standard software for the Siemens high-resolution research tomograph (HRRT) positron emission tomography (PET) scanner, the most commonly used segmentation in the μ-map reconstruction for human brain scans is maximum a posteriori for transmission (MAP-TR). Bias in the lower cerebellum and pons in HRRT brain images has been reported. The two main sources of the problem with MAP-TR are poor bone/soft tissue segmentation below the brain and overestimation of bone mass in the skull. We developed the new transmission processing with total variation (TXTV) method that introduces scatter correction in the μ-map reconstruction and total variation filtering to the transmission processing. Comparing MAP-TR and the new TXTV with gold-standard CT-based attenuation correction, we found that TXTV has less bias as compared to MAP-TR. We also compared images acquired at the HRRT scanner using TXTV to the GE Advance scanner images and found high quantitative correspondence. TXTV has been used to reconstruct more than 4000 HRRT scans at seven different sites with no reports of biases. TXTV-based reconstruction is recommended for human brain scans on the HRRT.

  9. Interaction and Self-Correction

    Directory of Open Access Journals (Sweden)

    Glenda Lucila Satne

    2014-07-01

    Full Text Available In this paper I address the question of how to account for the normative dimension involved in conceptual competence in a naturalistic framework. First, I present what I call the Naturalist Challenge (NC), referring to both the phylogenetic and ontogenetic dimensions of conceptual possession and acquisition. I then criticize two models that have been dominant in thinking about conceptual competence, the interpretationist and the causalist models. Both fail to meet the NC by failing to account for the abilities involved in conceptual self-correction. I then offer an alternative account of self-correction that I develop with the help of the interactionist theory of mutual understanding arising from recent developments in Phenomenology and Developmental Psychology.

  10. Regional Attenuation Correction of Weather Radar Using a Distributed Microwave-Links Network

    OpenAIRE

    Yang Xue; Xi-chuan Liu; Tai-chang Gao; Chang-ye Yang; Kun Song

    2017-01-01

    The complex temporal-spatial variation of raindrop size distribution will affect the precision of precipitation quantitative estimates (QPE) produced from radar data, making it difficult to correct echo attenuation. Given the fact that microwave links can obtain the total path attenuation accurately, we introduce the concept of regional attenuation correction using a multiple-microwave-links network based on the tomographic reconstruction of attenuation coefficients. Derived from the radar-ba...

  11. EPS Young Physicist Prize - CORRECTION

    CERN Multimedia

    2009-01-01

    The original text for the article 'Prizes aplenty in Krakow' in Bulletin 30-31 assigned the award of the EPS HEPP Young Physicist Prize to Maurizio Pierini. In fact he shared the prize with Niki Saoulidou of Fermilab, who was rewarded for her contribution to neutrino physics, as the article now correctly indicates. We apologise for not having named Niki Saoulidou in the original article.

  12. Corrective camouflage in pediatric dermatology.

    Science.gov (United States)

    Tedeschi, Aurora; Dall'Oglio, Federica; Micali, Giuseppe; Schwartz, Robert A; Janniger, Camila K

    2007-02-01

    Many dermatologic diseases, including vitiligo and other pigmentary disorders, vascular malformations, acne, and disfiguring scars from surgery or trauma, can be distressing to pediatric patients and can cause psychological alterations such as depression, loss of self-esteem, deterioration of quality of life, emotional distress, and, in some cases, body dysmorphic disorder. Corrective camouflage can help cover cutaneous unaesthetic disorders using a variety of water-resistant and light to very opaque products that provide effective and natural coverage. These products also can serve as concealers during medical treatment or after surgical procedures before healing is complete. Between May 2001 and July 2003, corrective camouflage was used on 15 children and adolescents (age range, 7-16 years; mean age, 14 years). The majority of patients were girls. Six patients had acne vulgaris; 4 had vitiligo; 2 had Becker nevus; and 1 each had striae distensae, allergic contact dermatitis, and postsurgical scarring. Parents of all patients were satisfied with the cosmetic cover results. We consider corrective makeup to be a well-received and valid adjunctive therapy for use during traditional long-term treatment and as a therapeutic alternative in patients in whom conventional therapy is ineffective.

  13. An overview of correctional psychiatry.

    Science.gov (United States)

    Metzner, Jeffrey; Dvoskin, Joel

    2006-09-01

    Supermax facilities may be an unfortunate and unpleasant necessity in modern corrections. Because of the serious dangers posed by prison gangs, they are unlikely to disappear completely from the correctional landscape any time soon. But such units should be carefully reserved for those inmates who pose the most serious danger to the prison environment. Further, the constitutional duty to provide medical and mental health care does not end at the supermax door. There is a great deal of common ground between the opponents of such environments and those who view them as a necessity. No one should want these expensive beds to be used for people who could be more therapeutically and safely managed in mental health treatment environments. No one should want people with serious mental illnesses to be punished for their symptoms. Finally, no one wants these units to make people more, instead of less, dangerous. It is in everyone's interests to learn as much as possible about the potential of these units for good and for harm. Corrections is a profession, and professions base their practices on data. If we are to avoid the most egregious and harmful effects of supermax confinement, we need to understand them far better than we currently do. Though there is a role for advocacy from those supporting or opposed to such environments, there is also a need for objective, scientifically rigorous study of these units and the people who live there.

  14. Adaptive correction of ensemble forecasts

    Science.gov (United States)

    Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane

    2017-04-01

    Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS), and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members, however the parameters of the regression equations are retrieved by exploiting the second order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO
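    For the deterministic case mentioned above, an adaptive correction can be sketched as a scalar Kalman filter that tracks a slowly varying forecast bias and updates it sequentially as each new ground observation arrives. The noise variances and data below are invented for illustration; the ensemble variant in the abstract additionally exploits the ensemble's second-order statistics:

    ```python
    import numpy as np

    def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
        """Track a random-walk forecast bias b with a scalar Kalman filter;
        each forecast is corrected with the bias estimated so far.
        q: process noise variance, r: observation noise variance (assumed)."""
        b, p = 0.0, 1.0                 # bias estimate and its variance
        corrected = []
        for f, y in zip(forecasts, observations):
            corrected.append(f - b)     # correct using the prior estimate
            p += q                      # predict step (random-walk bias)
            k = p / (p + r)             # Kalman gain
            b += k * ((f - y) - b)      # update with the observed error
            p *= (1 - k)
        return np.array(corrected)

    # synthetic forecasts carrying a constant +2.0 systematic error
    rng = np.random.default_rng(1)
    truth = 15 + rng.standard_normal(200)
    fcst = truth + 2.0
    corr = kalman_bias_correction(fcst, truth)
    ```

    Because the filter updates continuously, it needs no long training archive, which is the practical advantage the abstract attributes to adaptive methods.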

  15. Sensitivity of quantitative groundwater recharge estimates to volumetric and distribution uncertainty in rainfall forcing products

    Science.gov (United States)

    Werner, Micha; Westerhoff, Rogier; Moore, Catherine

    2017-04-01

    constructed using the same base data and forced with the VCSN precipitation dataset. Results of the comparison of the rainfall products show that there are significant differences in precipitation volume between the forcing products; in the order of 20% at most points. Even more significant differences can be seen, however, in the distribution of precipitation. For the VCSN data, wet days (defined as >0.1 mm precipitation) occur on some 20-30% of days (depending on location). This is reasonably reflected in the TRMM and CHIRPS data, while for the re-analysis based products some 60% to 80% of days are wet, albeit at lower intensities. These differences are amplified in the recharge estimates. At most points, volumetric differences are in the order of 40-60%, though differences may range over several orders of magnitude. The frequency distributions of recharge also differ significantly, with recharge over 0.1 mm occurring on 4-6% of days for the VCSN, CHIRPS, and TRMM datasets, but up to the order of 12% of days for the re-analysis data. Comparison against the lysimeter data shows estimates to be reasonable, in particular for the reference datasets. Surprisingly, some estimates of the lower-resolution re-analysis datasets are reasonable, though this does seem to be due to lower recharge being compensated by recharge occurring more frequently. These results underline the importance of correct representation of rainfall volumes, as well as of distribution, particularly when evaluating possible changes to, for example, precipitation intensity and volume. This holds for precipitation data derived from satellite-based and re-analysis products, but also for interpolated data from gauges, where the distribution of intensities is strongly influenced by the interpolation process.

  16. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  17. SELF-CORRECTION OF TEACHERS’ PROFESSIONAL BURNOUT

    Directory of Open Access Journals (Sweden)

    T. F. Orekhova

    2017-01-01

    Full Text Available Introduction. A teacher, as a representative of many professions whose occupation belongs to the "person-person" professional system, inevitably undergoes psychological changes which can have a negative effect on professional activity due to long performance of labour duties. In this regard, it is necessary to find out the effective ways and technologies designed to provide optimum preservation of teachers' own personality and health. Prophylaxis and correction of professional deformation of teachers is one of the most important directions of modern pedagogical science and practice; this means such a direction contributes to the development of recreational methods and ways of resistance to professional deformation. Aim. The article aims to systematize theoretical material and practical experience on the problem of professional deformation self-correction among present-day teachers in educational institutions; to show the possibilities of self-correction of the mental, psychological and physical state of teachers. Methodology and research methods. The methodology of the research is based on system, personal-oriented and activity approaches. The content analysis has become the first and leading method at the stage of collecting statistical material. The content analysis is presented as a complex of formalized observations and statistical procedures that enable transferring massive text information to quantitative indicators; on the basis of those indicators it is possible to draw conclusions about the high-quality and latent content of various hand-written or printed documents received during surveys, discussions and interviewing of teachers. The significant data selection is represented by the results of 5-year-long investigative work among 220 participants of professional development courses of the Institute of CPE "Horizon" at Nosov Magnitogorsk State Technical University (Russia) and 54 participants training in the Academy of Besançon (France). The processing

  18. Bias correction for magnetic resonance images via joint entropy regularization.

    Science.gov (United States)

    Wang, Shanshan; Xia, Yong; Dong, Pei; Luo, Jianhua; Huang, Qiu; Feng, Dagan; Li, Yuanxiang

    2014-01-01

    Due to the imperfections of the radio frequency (RF) coil or object-dependent electrodynamic interactions, magnetic resonance (MR) images often suffer from a smooth and biologically meaningless bias field, which causes severe troubles for subsequent processing and quantitative analysis. To effectively restore the original signal, this paper simultaneously exploits the spatial and gradient features of the corrupted MR images for bias correction via the joint entropy regularization. With both isotropic and anisotropic total variation (TV) considered, two nonparametric bias correction algorithms have been proposed, namely IsoTVBiasC and AniTVBiasC. These two methods have been applied to simulated images under various noise levels and bias field corruption and also tested on real MR data. The test results show that the proposed two methods can effectively remove the bias field and also present comparable performance compared to the state-of-the-art methods.

  19. Multi-material linearization beam hardening correction for computed tomography.

    Science.gov (United States)

    Lifton, J J

    2017-03-03

    Since beam hardening causes cupping and streaking artifacts in computed tomographic images, the presence of such artifacts can impair both qualitative and quantitative analysis of the reconstructed data. When the scanned object is composed of a single material, it is possible to correct beam hardening artifacts using the linearization method. However, for multi-material objects, an iterative segmentation-based correction algorithm is needed, which is not only computationally expensive, but may also fail if the initial segmentation result is poor. In this study, a new multi-material linearization beam hardening correction method was proposed and evaluated. The new method is fast and implemented in the same manner as a mono-material linearization. The correction takes approximately 0.02 seconds per projection. Although facing a potential disadvantage of requiring attenuation measurements of one of the object's constituent materials, applying the new method has demonstrated its capability for a multi-material workpiece with substantial reduction in both cupping and streaking artifacts. For example, the study showed that the absolute cupping artifacts in steel, titanium and aluminum spheres were reduced from 22%, 20% and 20% to 5%, 1% and 0%, respectively.
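    The mono-material linearization that this method extends can be sketched in a few lines: measure (or simulate) the polychromatic attenuation of known material thicknesses, fit a polynomial mapping the measured values back onto the ideal linear response, and apply that polynomial to every projection value before reconstruction. All numbers below are invented, with the sub-linear beam-hardening response modelled by a toy saturation curve:

    ```python
    import numpy as np

    # Simulated calibration: polychromatic attenuation grows sub-linearly
    # with material thickness (the effect that causes cupping artifacts).
    thickness = np.linspace(0, 5, 50)                       # cm (assumed)
    mu = 0.5                                                # 1/cm, ideal slope
    poly_atten = 1.8 * (1 - np.exp(-mu * thickness / 1.8))  # measured signal

    # Linearization: fit a polynomial that maps measured attenuation back
    # onto the ideal linear response, then apply it to projection data.
    coeffs = np.polyfit(poly_atten, mu * thickness, deg=5)
    linearize = np.poly1d(coeffs)
    corrected = linearize(poly_atten)
    ```

    The multi-material method in the abstract keeps this one-pass polynomial form, which is why it avoids the cost of iterative segmentation-based correction.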

  20. Aberration correction for time-domain ultrasound diffraction tomography

    Science.gov (United States)

    Mast, T. Douglas

    2002-07-01

    Extensions of a time-domain diffraction tomography method, which reconstructs spatially dependent sound speed variations from far-field time-domain acoustic scattering measurements, are presented and analyzed. The resulting reconstructions are quantitative images with applications including ultrasonic mammography, and can also be considered candidate solutions to the time-domain inverse scattering problem. Here, the linearized time-domain inverse scattering problem is shown to have no general solution for finite signal bandwidth. However, an approximate solution to the linearized problem is constructed using a simple delay-and-sum method analogous to "gold standard" ultrasonic beamforming. The form of this solution suggests that the full nonlinear inverse scattering problem can be approximated by applying appropriate angle- and space-dependent time shifts to the time-domain scattering data; this analogy leads to a general approach to aberration correction. Two related methods for aberration correction are presented: one in which delays are computed from estimates of the medium using an efficient straight-ray approximation, and one in which delays are applied directly to a time-dependent linearized reconstruction. Numerical results indicate that these correction methods achieve substantial quality improvements for imaging of large scatterers. The parametric range of applicability for the time-domain diffraction tomography method is increased by about a factor of 2 by aberration correction. copyright 2002 Acoustical Society of America.
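    The "gold standard" delay-and-sum idea the reconstruction is analogous to can be sketched directly: for a chosen focus point, each array element's trace is sampled at that element's travel time from the point, and the samples are summed, so signals originating at the true source location add coherently. The geometry, sampling rate, and sound speed below are invented for illustration:

    ```python
    import numpy as np

    def delay_and_sum(traces, elem_x, point, fs, c):
        """Sum each element's trace at the sample index matching the travel
        time from the focus point to that element (one-way focusing)."""
        px, pz = point
        total = 0.0
        for trace, ex in zip(traces, elem_x):
            t = np.hypot(px - ex, pz) / c       # propagation time to element
            idx = int(round(t * fs))
            if 0 <= idx < len(trace):
                total += trace[idx]
        return total

    # synthetic point source at (0 m, 0.03 m): each trace holds an impulse
    # at the true travel time, so focusing on the source sums all 16 pulses
    fs, c = 10e6, 1500.0
    elem_x = np.linspace(-0.02, 0.02, 16)
    traces = []
    for ex in elem_x:
        tr = np.zeros(1000)
        tr[int(round(np.hypot(-ex, 0.03) / c * fs))] = 1.0
        traces.append(tr)
    focus = delay_and_sum(traces, elem_x, (0.0, 0.03), fs, c)
    off = delay_and_sum(traces, elem_x, (0.005, 0.02), fs, c)
    ```

    The aberration-correction step in the abstract amounts to perturbing these delays with angle- and space-dependent time shifts estimated from the medium, rather than using the homogeneous travel times computed here.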

  1. Dead pixel correction techniques for dual-band infrared imagery

    Science.gov (United States)

    Nguyen, Chuong T.; Mould, Nick; Regens, James L.

    2015-07-01

    We present two new dead pixel correction algorithms for dual-band infrared imagery. Specifically, we address the problem of repairing unresponsive elements in the sensor array using signal processing techniques to overcome deficiencies in image quality that are present following the nonuniformity correction process. Traditionally, dead pixel correction has been performed almost exclusively using variations of the nearest neighbor technique, where the value of the dead pixel is estimated based on pixel values associated with the neighboring image structure. Our approach differs from existing techniques: for the first time, we estimate the values of dead pixels using information from both thermal bands collaboratively. The proposed dual-band statistical lookup (DSL) and dual-band inpainting (DIP) algorithms use intensity and local gradient information to estimate the values of dead pixels based on the values of unaffected pixels in the supplementary infrared band. The DSL algorithm is a regression technique that uses the image intensities from the reference band to estimate the dead pixel values in the band undergoing correction. The DIP algorithm is an energy minimization technique that uses the local image gradient from the reference band and the boundary values from the affected band to estimate the dead pixel values. We evaluate the effectiveness of the proposed algorithms with 50 dual-band videos. Simulation results indicate that the proposed techniques achieve perceptually and quantitatively superior results compared to existing methods.
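    The regression idea behind the DSL algorithm can be illustrated with a toy sketch: fit an intensity relationship between the two bands on healthy pixels, then predict dead pixels from the reference band. The synthetic scene and the purely linear band relationship are simplifying assumptions; the published algorithm is more elaborate:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two co-registered infrared bands of the same scene: they differ in
    # gain/offset but share spatial structure (a toy assumption).
    scene = rng.random((64, 64))
    band_ref = 0.8 * scene + 0.1          # reference band (fully working)
    band_bad = 1.2 * scene - 0.05         # band undergoing correction
    dead = [(10, 12), (30, 31), (50, 7)]  # unresponsive pixel coordinates
    truth = [band_bad[r, c] for r, c in dead]
    for r, c in dead:
        band_bad[r, c] = 0.0              # dead pixels read zero

    # Regression: fit band_bad ~ a * band_ref + b on the healthy pixels,
    # then predict each dead pixel from the reference band.
    mask = np.ones(band_bad.shape, dtype=bool)
    for r, c in dead:
        mask[r, c] = False
    a, b = np.polyfit(band_ref[mask].ravel(), band_bad[mask].ravel(), 1)
    for r, c in dead:
        band_bad[r, c] = a * band_ref[r, c] + b

    err = max(abs(band_bad[r, c] - v) for (r, c), v in zip(dead, truth))
    ```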

  2. Attenuation Correction for MR Coils in Combined PET/MR Imaging: A Review

    Science.gov (United States)

    Eldib, Mootaz; Bini, Jason; Faul, David D.; Oesingmann, Niels; Tsoumpas, Charalampos; Fayad, Zahi A.

    2015-01-01

    Synopsis With the introduction of clinical PET/MR systems, novel attenuation correction methods are needed, as there are no direct or indirect MR methods to measure the attenuation of the objects in the FOV. A unique challenge for PET/MR attenuation correction is that coils for MR data acquisition are located in the FOV of the PET detector and could induce significant quantitative errors. In this review, we summarize and evaluate current methods and techniques to correct for the attenuation of a variety of coils. PMID:26952728

  3. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    Science.gov (United States)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and experimental environment, and this background significantly influences the analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting and model-free methods; however, few of these have been applied in the field of LIBS, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum according to its smoothness characteristic. Experiments on a background correction simulation indicated that the spline interpolation method achieved the largest signal-to-background ratio (SBR) compared with polynomial fitting, Lorentz fitting and the model-free method after background correction. All of these methods achieve larger SBR values than before background correction (the SBR value before background correction is 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and model-free methods are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method maintains a large SBR value, whereas polynomial fitting and the model-free method obtain low SBR values. All of the background correction methods exhibit improved quantitative results for Cu compared with those acquired before background correction (the linear correlation coefficient value before background correction is 0.9776; the linear correlation coefficient values after background correction using spline interpolation, polynomial fitting, Lorentz
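    The spline-interpolation background estimate can be sketched as follows: anchor points are placed in line-free regions of the spectrum and a cubic spline through them approximates the continuous background. The synthetic spectrum, the hand-picked anchor positions, and the use of SciPy's `CubicSpline` are illustrative assumptions, not the paper's implementation:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Synthetic LIBS-like spectrum: narrow emission lines on a smooth
    # continuous background (all values invented for illustration).
    wl = np.linspace(400.0, 500.0, 1000)
    background = 0.5 + 0.004 * (wl - 400) - 2e-5 * (wl - 400) ** 2
    lines = sum(a * np.exp(-((wl - c) / 0.3) ** 2)
                for a, c in [(3.0, 420.0), (5.0, 450.0), (2.0, 480.0)])
    spectrum = background + lines

    # Spline interpolation of the background: sample the spectrum at
    # anchor wavelengths chosen away from the emission lines.
    anchors = np.array([400.0, 410.0, 430.0, 440.0, 465.0, 490.0, 500.0])
    anchor_vals = np.interp(anchors, wl, spectrum)
    spline = CubicSpline(anchors, anchor_vals)
    corrected = spectrum - spline(wl)          # background-subtracted spectrum

    residual = float(np.max(np.abs(spline(wl) - background)))
    ```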

  4. Comparative tests of isospin-symmetry-breaking corrections to superallowed 0+→0+ nuclear β decay

    Science.gov (United States)

    Towner, I. S.; Hardy, J. C.

    2010-12-01

    We present a test with which to evaluate the calculated isospin-symmetry-breaking corrections to superallowed 0+→0+ nuclear β decay. The test is based on the corrected experimental Ft values being required to satisfy conservation of the vector current (CVC). When applied to six sets of published calculations, the test demonstrates quantitatively that only one set, the one based on the shell model with Saxon-Woods radial wave functions, provides satisfactory agreement with CVC. This test can easily be applied to any sets of calculated correction terms that are produced in future.

  5. Correction of gene expression data

    DEFF Research Database (Denmark)

    Darbani Shirvanehdeh, Behrooz; Stewart, C. Neal, Jr.; Noeparvar, Shahin

    2014-01-01

    This report investigates for the first time the potential inter-treatment bias source of cell number for gene expression studies. Cell-number bias can affect gene expression analysis when comparing samples with unequal total cellular RNA content or with different RNA extraction efficiencies....... For maximal reliability of analysis, therefore, comparisons should be performed at the cellular level. This could be accomplished using an appropriate correction method that can detect and remove the inter-treatment bias for cell-number. Based on inter-treatment variations of reference genes, we introduce...

  6. Correct Linearization of Einstein's Equations

    Directory of Open Access Journals (Sweden)

    Rabounski D.

    2006-06-01

    Full Text Available Regularly, Einstein's equations can be reduced to a wave form (linearly dependent on the second derivatives of the space metric) in the absence of gravitation, the space rotation and Christoffel's symbols. As shown here, the origin of the problem is the use of the general covariant theory of measurement. Here the wave form of Einstein's equations is obtained in terms of Zelmanov's chronometric invariants (physically observable projections on the observer's time line and spatial section). The obtained equations depend solely on the second derivatives, even in the presence of gravitation, the space rotation and Christoffel's symbols. The correct linearization proves that the Einstein equations are completely compatible with weak waves of the metric.

  7. Correction of Motion Artifacts for Real-Time Structured Light

    DEFF Research Database (Denmark)

    Wilm, Jakob; Olesen, Oline Vinter; Paulsen, Rasmus Reinhold

    2015-01-01

    While the problem of motion is often mentioned in conjunction with structured light imaging, few solutions have thus far been proposed. A method is demonstrated to correct for object or camera motion during structured light 3D scene acquisition. The method is based on the combination of a suitable...... pattern strategy with fast phase correlation image registration. The effectiveness of this approach is demonstrated on motion corrupted data of a real-time structured light system, and it is shown that it improves the quality of surface reconstructions visually and quantitatively....

  8. ROLANDIC EPILEPSY OF CHILDHOOD: CORRECTION OF COGNITIVE DYSFUNCTIONS

    Directory of Open Access Journals (Sweden)

    S.V. Balkanskaya

    2008-01-01

    Full Text Available The paper is dedicated to the problem of cognitive dysfunctions in pediatric patients with rolandic epilepsy (benign partial epilepsy of childhood). The Russian nootropic drug Pantogam (hopantenic acid) was used for correction of cognitive deficit. Quantitative data obtained via the Psychomat computer testing system were used to assess the principal cognitive functions before and after treatment with hopantenic acid in school-age patients undergoing basic antiepileptic therapy. A positive effect of the drug on the studied cognitive function indices was demonstrated. Key words: children, epilepsy, cognitive functions, hopantenic acid.

  9. PET motion correction using MR-derived motion parameters

    Energy Technology Data Exchange (ETDEWEB)

    Bickell, Matthew [Department of Nuclear Medicine, Medical Imaging Research Center, KU Leuven (Belgium); Koesters, Thomas; Boada, Fernando [Center for Advanced Imaging Innovation and Research, New York University (United States); Department of Radiology, NYU School of Medicine, New York, Bernard & Irene Schwartz Center for Biomedical Imaging, New York (United States); Nuyts, Johan [Department of Nuclear Medicine, Medical Imaging Research Center, KU Leuven (Belgium)

    2014-07-29

    With the improving resolution of modern PET scanners, any slight motion during the scan can cause significant blurring and loss of resolution. MRI scanners have the capacity to perform quick successive scans and thus provide a means to track motion during a scan. Hence, with the advent of simultaneous PET-MR scanners, it has become possible to use the MR scanner to track the motion and thereby provide the necessary motion parameters to correct the PET data. Using a suitable segmentation approach a separate MR scan can provide the attenuation map to produce quantitative PET images.

  10. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lasaygues, Philippe [Laboratoire de Mecanique et d' Acoustique UPR CNRS 7051, 31 chemin Joseph Aiguier, 13402 Marseille Cedex 20 (France); Ouedraogo, Edgard [Laboratoire d' Imagerie Parametrique UMR CNRS 7623, 15, rue de l' Ecole de medecine, 75006 Paris (France); Lefebvre, Jean-Pierre [Laboratoire de Mecanique et d' Acoustique UPR CNRS 7051, 31 chemin Joseph Aiguier, 13402 Marseille Cedex 20 (France); Gindre, Marcel [Laboratoire d' Imagerie Parametrique UMR CNRS 7623, 15, rue de l' Ecole de medecine, 75006 Paris (France); Talmant, Marilyne [Laboratoire d' Imagerie Parametrique UMR CNRS 7623, 15, rue de l' Ecole de medecine, 75006 Paris (France); Laugier, Pascal [Laboratoire d' Imagerie Parametrique UMR CNRS 7623, 15, rue de l' Ecole de medecine, 75006 Paris (France)

    2005-06-07

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography that we developed, using correction algorithms of the wavepaths and compensation procedures, is presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.
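    Time-of-flight detection for transmission tomography can be illustrated with a cross-correlation estimator, a simpler stand-in for the wavelet-based detector the authors describe (sampling rate, pulse model, and delays below are invented):

    ```python
    import numpy as np

    fs = 20e6                            # sampling rate, Hz (assumed)
    t = np.arange(4096) / fs

    def pulse(t0):
        """Gaussian-modulated tone burst arriving at time t0."""
        return np.exp(-((t - t0) * 2e6) ** 2) * np.cos(2 * np.pi * 1e6 * (t - t0))

    reference = pulse(20e-6)             # propagation through water only
    transmitted = 0.4 * pulse(14e-6)     # through the sample: faster, attenuated

    # Time-of-flight difference from the cross-correlation peak.
    xcorr = np.correlate(transmitted, reference, mode="full")
    lag = int(np.argmax(xcorr)) - (len(reference) - 1)
    dt = lag / fs                        # negative: transmitted arrives earlier
    ```

    The sound-speed image is then reconstructed from many such time-of-flight differences along different ray paths.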

  11. [The application of vector analysis for evaluation of astigmatism correction in the corneal refractive surgery].

    Science.gov (United States)

    Zhang, Jiamei; Wang, Yan

    2016-01-01

    Since about sixty percent of ametropes have astigmatism, which affects visual quality, correcting astigmatism is always a focus of concern during visual correction procedures, especially corneal refractive surgery. The postoperative spherical equivalent or residual cylindrical diopters were previously used as quantitative indices to evaluate the correction of astigmatism; however, such results neglect the effect of astigmatic axis shift on the treatment. Taking astigmatism as a vector parameter can describe both the magnitude and the direction of astigmatism accurately, so vector analysis has been increasingly applied in the evaluation of astigmatism correction. This paper reviews the present vector analysis methods, evaluation indexes and their application for the correction of astigmatism in corneal refractive surgery.
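    One common vector-analysis formulation is the Thibos power-vector decomposition, which represents a sphero-cylindrical refraction as components (M, J0, J45) that add and subtract like ordinary vectors; this is a sketch of one method among several covered by such reviews, with invented example prescriptions:

    ```python
    import numpy as np

    def power_vector(sphere, cyl, axis_deg):
        """Convert a sphero-cylindrical prescription (diopters, axis in
        degrees) to Thibos power-vector components (M, J0, J45)."""
        ax = np.deg2rad(axis_deg)
        M = sphere + cyl / 2.0                 # spherical equivalent
        J0 = -(cyl / 2.0) * np.cos(2 * ax)     # with/against-the-rule component
        J45 = -(cyl / 2.0) * np.sin(2 * ax)    # oblique component
        return np.array([M, J0, J45])

    # Surgically induced astigmatism = postoperative minus preoperative,
    # computed component-wise in vector form (example numbers are invented).
    pre = power_vector(-3.0, -1.5, 10.0)
    post = power_vector(0.0, -0.5, 95.0)
    induced = post - pre
    ```

    Working in (M, J0, J45) space keeps axis shifts visible: a cylinder corrected in magnitude but rotated in axis yields a nonzero induced vector, which a scalar residual-cylinder comparison would miss.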

  12. Determining and correcting "moment bias" in gradient polymer elution chromatography.

    Science.gov (United States)

    Striegel, André M

    2003-05-09

    Gradient polymer elution chromatography (GPEC) is rapidly becoming the analytical method of choice for determining the chemical composition distribution (CCD) of synthetic polymers. GPEC can be performed in traditional (strict precipitation-redissolution mechanism) or interactive (normal- and reversed-phase) modes, and results may be qualitative, semi-quantitative, or fully quantitative. Quantitative approaches have thus far relied on colligative or end group techniques for determining the values of standards used in constructing the GPEC calibration curve. While the values obtained from said methods are number-averages, they are assigned to the peak apexes of the standards (i.e. assigned as peak averages). This creates a determinate error in the quantitation, referred to herein as "moment bias". In this paper we determine moment bias for a series of styrene-acrylonitrile (SAN) copolymers, where the distribution and averages of the AN% have been measured using normal-phase (NP) GPEC. We also correct for the effect via statistical treatment of the chromatographic data.

  13. Self-Correction and the Monitor: Percent of Errors Corrected of Those Attempted vs. Percent Corrected of All Errors Made.

    Science.gov (United States)

    Krashen, Stephen D.

    1994-01-01

    Green and Hecht's (1992, 1993) data are consistent with the Monitor hypothesis, and their findings match Krashen's (1982) report: self-correction (SC) has only a modest overall effect. Their subjects' high accuracy of attempted corrections could be the result of subjects' limiting SC to easily correctable items. (Contains 10 references.) (Author)

  14. Corrections to holographic entanglement plateau

    Science.gov (United States)

    Chen, Bin; Li, Zhibin; Zhang, Jia-ju

    2017-09-01

    We investigate the robustness of the Araki-Lieb inequality in a two-dimensional (2D) conformal field theory (CFT) on a torus. The inequality requires that ΔS = S(L) - |S(L - ℓ) - S(ℓ)| is nonnegative, where S(L) is the thermal entropy and S(L - ℓ), S(ℓ) are the entanglement entropies. Holographically there is an entanglement plateau in the BTZ black hole background, which means that there exists a critical length such that for ℓ ≤ ℓ_c the inequality is saturated, ΔS = 0. In the thermal AdS background, the holographic entanglement entropy leads to ΔS = 0 for arbitrary ℓ. We compute the next-to-leading order contributions to ΔS in the large central charge CFT at both high and low temperatures. In both cases we show that ΔS is strictly positive except for ℓ = 0 or ℓ = L. This turns out to be true for any 2D CFT. In calculating the single-interval entanglement entropy in a thermal state, we develop new techniques to simplify the computation. At high temperature, we ignore the finite size correction such that the problem is related to the entanglement entropy of double intervals on a complex plane. As a result, we show that the leading contribution from a primary module takes a universal form. At low temperature, we show that the leading thermal correction to the entanglement entropy from a primary module does not take a universal form, depending on the details of the theory.

  15. A satellite-based global landslide model

    Directory of Open Access Journals (Sweden)

    A. Farahmand

    2013-05-01

    Full Text Available Landslides are devastating phenomena that cause huge damage around the world. This paper presents a quasi-global landslide model derived using satellite precipitation data, land-use land cover maps, and 250 m topography information. This suggested landslide model is based on the Support Vector Machines (SVM, a machine learning algorithm. The National Aeronautics and Space Administration (NASA Goddard Space Flight Center (GSFC landslide inventory data is used as observations and reference data. In all, 70% of the data are used for model development and training, whereas 30% are used for validation and verification. The results of 100 random subsamples of available landslide observations revealed that the suggested landslide model can predict historical landslides reliably. The average error of 100 iterations of landslide prediction is estimated to be approximately 7%, while approximately 2% false landslide events are observed.
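    The SVM-based workflow described above, with a 70/30 train/validation split, can be sketched with scikit-learn on synthetic stand-in features; the features, labels, and hyperparameters below are invented for illustration and do not reproduce the paper's model:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(42)

    # Toy stand-in features per grid cell: [precipitation, slope, land cover];
    # synthetic labels make landslides more likely for wet, steep cells.
    n = 1000
    X = np.column_stack([rng.random(n) * 200,      # precipitation, mm
                         rng.random(n) * 45,       # slope, degrees
                         rng.integers(0, 5, n)])   # land-cover class code
    risk = 0.004 * X[:, 0] + 0.02 * X[:, 1]
    y = (risk + 0.1 * rng.standard_normal(n) > 0.8).astype(int)

    # 70% of the data for training, 30% held out for validation.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    model = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
    accuracy = model.score(X_te, y_te)
    ```

    Repeating such a split over many random subsamples, as the paper does with 100 iterations, gives a distribution of validation errors rather than a single number.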

  16. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  17. Practical estimate of gradient nonlinearity for implementation of apparent diffusion coefficient bias correction.

    Science.gov (United States)

    Malyarenko, Dariya I; Chenevert, Thomas L

    2014-12-01

    To describe an efficient procedure to empirically characterize gradient nonlinearity and correct for the corresponding apparent diffusion coefficient (ADC) bias on a clinical magnetic resonance imaging (MRI) scanner. Spatial nonlinearity scalars for individual gradient coils along the superior and right directions were estimated via diffusion measurements of an isotropic ice-water phantom. A digital nonlinearity model from an independent scanner, described in the literature, was rescaled by system-specific scalars to approximate 3D bias correction maps. Correction efficacy was assessed by comparison to unbiased ADC values measured at isocenter. Empirically estimated nonlinearity scalars were confirmed by geometric distortion measurements of a regular grid phantom. The applied nonlinearity correction for arbitrarily oriented diffusion gradients reduced ADC bias from 20% down to 2% at clinically relevant offsets, both for isotropic and anisotropic media. Identical performance was achieved using either corrected diffusion-weighted imaging (DWI) intensities or corrected b-values for each direction in brain and ice-water. Direction-average trace image correction was adequate only for the isotropic medium. Empiric scalar adjustment of an independent gradient nonlinearity model adequately described DWI bias for a clinical scanner. The observed efficiency of the implemented ADC bias correction quantitatively agreed with previous theoretical predictions and numerical simulations. The described procedure provides an independent benchmark for nonlinearity bias correction of clinical MRI scanners.
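    The b-value correction idea can be illustrated with a single-voxel sketch: gradient nonlinearity scales the true b-value by a spatially varying factor, and dividing by that factor when computing ADC removes the bias. The numbers below are illustrative; the actual method uses 3D correction maps derived from the rescaled nonlinearity model:

    ```python
    import numpy as np

    # Two-point ADC estimate: ADC = ln(S0 / Sb) / b.
    true_adc = 1.1e-3       # mm^2/s; ice-water at 0 C is ~1.1e-3
    b_nominal = 1000.0      # reported b-value, s/mm^2
    c_spatial = 1.15        # assumed 15% gradient-nonlinearity scaling off-center

    # The scanner applies b_nominal * c_spatial but reports only b_nominal.
    S0 = 1000.0
    Sb = S0 * np.exp(-b_nominal * c_spatial * true_adc)

    adc_biased = np.log(S0 / Sb) / b_nominal                   # uses reported b
    adc_corrected = np.log(S0 / Sb) / (b_nominal * c_spatial)  # corrected b
    ```

    Equivalently, the DWI intensities can be corrected instead of the b-values; the abstract reports identical performance for the two variants.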

  18. Quantitative Hyperspectral Reflectance Imaging

    Directory of Open Access Journals (Sweden)

    Ted A.G. Steemers

    2008-09-01

    Full Text Available Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared. By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms.
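    The basic ROI analysis described above, extracting a mean spectral reflectance curve and its statistics from a user-defined region, can be sketched as follows (the cube contents are synthetic; only the 70 bands over 365-1100 nm follow the instrument description):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy hyperspectral cube: rows x columns x wavelength bands.
    wavelengths = np.linspace(365, 1100, 70)     # nm, as in the instrument
    cube = rng.random((50, 50, 70))              # calibrated reflectance in [0, 1]

    # Mean spectral reflectance curve and its spread over a user-defined
    # region of interest (here rows 10-19, columns 30-39).
    roi = (slice(10, 20), slice(30, 40))
    roi_pixels = cube[roi].reshape(-1, cube.shape[2])
    mean_curve = roi_pixels.mean(axis=0)
    std_curve = roi_pixels.std(axis=0)
    ```

    Comparing such curves between regions (e.g. inked versus blank parchment) is the starting point for the classification techniques the abstract mentions.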

  19. 78 FR 76193 - Special Notice; Correction

    Science.gov (United States)

    2013-12-16

    ... AFFAIRS Special Notice; Correction AGENCY: National Cemetery Administration, Department of Veterans... Collection Activity: (The Presidential Memorial Certificate) Proposed Collection; Comment Request On page 69176, under Title paragraph, please correct to read: Presidential Memorial Certificate. On page 69176...

  20. Personality Patterns Among Correctional Officer Applicants

    Science.gov (United States)

    Holland, Terrill R.; And Others

    1976-01-01

    The MMPI profiles of 359 correctional officer applicants were cluster analyzed, which resulted in the identification of five relatively homogeneous subgroups. The implications of the findings for occupationally adaptive and maladaptive correctional officer behavior were discussed. (Editor)

  1. Effective Correctional Treatment: Bibliotherapy for Cynics.

    Science.gov (United States)

    Gendreau, Paul; Ross, Bob

    1979-01-01

    Presents recent evidence, obtained from a review of the literature on correctional treatment published since 1973, appealing the verdict that correctional rehabilitation is ineffective. There are several types of intervention programs that have proved successful with offender populations. (Author)

  2. Survey of Radar Refraction Error Corrections

    Science.gov (United States)

    2016-11-01

    RCC Document 266-16: Survey of Radar Refraction Error Corrections. Prepared by the Electronic Trajectory Measurements Group, November 2016. Distribution A.

  3. Parsing Schemata and Correctness of Parsing Algorithms

    NARCIS (Netherlands)

    Sikkel, Nicolaas

    1998-01-01

    Parsing schemata give a high-level formal description of parsers. These can be used, among others, as an intermediate level of abstraction for deriving the formal correctness of a parser. A parser is correct if it duly implements a parsing schema that is known to be correct. We discuss how the

  4. Motion correction strategies for integrated PET/MR.

    Science.gov (United States)

    Fürst, Sebastian; Grimm, Robert; Hong, Inki; Souvatzoglou, Michael; Casey, Michael E; Schwaiger, Markus; Nekolla, Stephan G; Ziegler, Sibylle I

    2015-02-01

    Integrated whole-body PET/MR facilitates the implementation of a broad variety of respiratory motion correction strategies, taking advantage of the strengths of both modalities. The goal of this study was the quantitative evaluation with clinical data of different MR- and PET-data-based motion correction strategies for integrated PET/MR. The PET and MR data of 20 patients were simultaneously acquired for 10 min on an integrated PET/MR system after administration of (18)F-FDG or (68)Ga-DOTANOC. Respiratory traces recorded with a bellows were compared against MR self-gating signals and signals extracted from PET raw data with the sensitivity method, by applying principal component analysis (PCA) or Laplacian eigenmaps and by using a novel variation combining the former and either of the latter two. Gated sinograms and MR images were generated accordingly, followed by image registration to derive MR motion models. Corrected PET images were reconstructed by incorporating this information into the reconstruction. An optical flow algorithm was applied for PET-based motion correction. Gating and motion correction were evaluated by quantitative analysis of apparent tracer uptake, lesion volume, displacement, contrast, and signal-to-noise ratio. The correlation between bellows- and MR-based signals was 0.63 ± 0.19, and that between MR and the sensitivity method was 0.52 ± 0.26. Depending on the PET raw-data compression, the average correlation between MR and PCA ranged from 0.25 ± 0.30 to 0.58 ± 0.33, and the range was 0.25 ± 0.30 to 0.42 ± 0.34 if Laplacian eigenmaps were applied. By combining the sensitivity method and PCA or Laplacian eigenmaps, the maximum average correlation to MR could be increased to 0.74 ± 0.21 and 0.70 ± 0.19, respectively. The selection of the best PET-based signal for each patient yielded an average correlation of 0.80 ± 0.13 with MR. Using the best PET-based respiratory signal for gating, mean tracer uptake increased by 17 ± 19% for
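    The PCA-based extraction of a respiratory surrogate from dynamic data can be sketched on synthetic frames: the first principal component's time course tracks the dominant periodic intensity variation. The data model below is a toy assumption, far simpler than compressed PET raw data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy dynamic acquisition: 200 frames of a 100-voxel "image" whose
    # intensity is modulated by a 0.25 Hz respiratory cycle plus noise.
    t = np.arange(200) * 0.5                       # s, one frame per 0.5 s
    resp = np.sin(2 * np.pi * 0.25 * t)            # hidden respiratory signal
    spatial_pattern = rng.standard_normal(100)
    frames = np.outer(resp, spatial_pattern) + 0.3 * rng.standard_normal((200, 100))

    # PCA via SVD on the mean-centred data: the first principal component's
    # time course serves as the respiratory surrogate.
    centred = frames - frames.mean(axis=0)
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    surrogate = U[:, 0] * s[0]

    # A principal component's sign is arbitrary, so compare |correlation|,
    # analogous to the correlation scores reported in the abstract.
    corr = abs(np.corrcoef(surrogate, resp)[0, 1])
    ```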

  5. Shading correction and calibration in bacterial fluorescence measurement by image processing system

    NARCIS (Netherlands)

    Wilkinson, M.H.F.

    An image processing system with applications in bacterial (immuno-)fluorescence measurement has been developed. To reach quantitative results, correction for non-uniformities in system sensitivity, both as a function of time (calibration for drifts) and as a function of image coordinates (shading
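    Shading correction as a function of image coordinates is classically done by flat-field division with dark-frame subtraction; the following is a minimal sketch under the assumption of a multiplicative shading model, not necessarily the system described in this (truncated) abstract:

    ```python
    import numpy as np

    # Toy fluorescence image: a uniform specimen seen through a system whose
    # sensitivity falls off toward the edges (vignetting-like shading).
    yy, xx = np.mgrid[0:64, 0:64]
    shading = 1.0 - 0.3 * (((xx - 32) / 32.0) ** 2 + ((yy - 32) / 32.0) ** 2)
    dark = 5.0                                   # constant dark-current offset
    raw = 100.0 * shading + dark                 # measured specimen image

    # Flat-field correction using a blank uniform reference image:
    flat = 80.0 * shading + dark
    corrected = (raw - dark) / (flat - dark)     # shading cancels out
    corrected *= np.mean(flat - dark)            # restore the intensity scale

    flatness = float(corrected.std() / corrected.mean())
    ```

    The time-dependent drift calibration mentioned in the abstract would additionally rescale each frame by a reference measurement taken at acquisition time.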

  6. Radiative corrections in bumblebee electrodynamics

    Directory of Open Access Journals (Sweden)

    R.V. Maluf

    2015-10-01

    Full Text Available We investigate some quantum features of the bumblebee electrodynamics in flat spacetimes. The bumblebee field is a vector field that leads to a spontaneous Lorentz symmetry breaking. For a smooth quadratic potential, the massless excitation (Nambu–Goldstone boson) can be identified as the photon, transversal to the vacuum expectation value of the bumblebee field. Besides, there is a massive excitation associated with the longitudinal mode, whose presence leads to instability in the spectrum of the theory. By using the principal-value prescription, we show that no one-loop radiative corrections to the mass term are generated. Moreover, the bumblebee self-energy is not transverse, showing that the propagation of the longitudinal mode cannot be excluded from the effective theory.

  7. Correct Linearization of Einstein's Equations

    Directory of Open Access Journals (Sweden)

    Rabounski D.

    2006-04-01

    Full Text Available Routinely, Einstein's equations are reduced to a wave form (linearly independent of the second derivatives of the space metric) in the absence of gravitation, the space rotation and Christoffel's symbols. As shown herein, the origin of the problem is the use of the general covariant theory of measurement. Herein the wave form of Einstein's equations is obtained in terms of Zelmanov's chronometric invariants (physically observable projections on the observer's time line and spatial section). The equations so obtained depend solely upon the second derivatives, even in the presence of gravitation, the space rotation and Christoffel's symbols. The correct linearization proves that the Einstein equations are completely compatible with weak waves of the metric.

  8. Inflation from nilpotent Kaehler corrections

    Energy Technology Data Exchange (ETDEWEB)

    McDonough, Evan [McGill Univ. Montreal, QC (Canada); Scalisi, Marco [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-09-15

    We develop a new class of supergravity cosmological models where inflation is induced by terms in the Kaehler potential which mix a nilpotent superfield S with a chiral sector Φ. As the new terms are non-(anti)holomorphic, and hence cannot be removed by a Kaehler transformation, these models are intrinsically Kaehler potential driven. Such terms could arise for example due to a backreaction of an anti-D3 brane on the string theory bulk geometry. We show that this mechanism is very general and allows for a unified description of inflation and dark energy, with controllable SUSY breaking at the vacuum. When the internal geometry of the bulk field is hyperbolic, we prove that small perturbative Kaehler corrections naturally lead to α-attractor behaviour, with inflationary predictions in excellent agreement with the latest Planck data.

  9. A quantum correction to chaos

    Energy Technology Data Exchange (ETDEWEB)

    Fitzpatrick, A. Liam [Department of Physics, Boston University,590 Commonwealth Avenue, Boston, MA 02215 (United States); Kaplan, Jared [Department of Physics and Astronomy, Johns Hopkins University,3400 N. Charles St, Baltimore, MD 21218 (United States)

    2016-05-12

    We use results on Virasoro conformal blocks to study chaotic dynamics in CFT_2 at large central charge c. The Lyapunov exponent λ_L, which is a diagnostic for the early onset of chaos, receives 1/c corrections that may be interpreted as λ_L = (2π/β)(1 + 12/c). However, out-of-time-order correlators receive other equally important 1/c-suppressed contributions that do not have such a simple interpretation. We revisit the proof of a bound on λ_L that emerges at large c, focusing on CFT_2 and explaining why our results do not conflict with the analysis leading to the bound. We also comment on relationships between chaos, scattering, causality, and bulk locality.

  10. Radiative corrections in bumblebee electrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Maluf, R.V., E-mail: r.v.maluf@fisica.ufc.br [Universidade Federal do Ceará (UFC), Departamento de Física, Campus do Pici, Fortaleza CE, CP 6030, 60455-760 (Brazil); Silva, J.E.G., E-mail: jgsilva@indiana.edu [Indiana University Center for Spacetime Symmetries, Bloomington, IN 47405 (United States); Almeida, C.A.S., E-mail: carlos@fisica.ufc.br [Universidade Federal do Ceará (UFC), Departamento de Física, Campus do Pici, Fortaleza CE, CP 6030, 60455-760 (Brazil)

    2015-10-07

    We investigate some quantum features of bumblebee electrodynamics in flat spacetimes. The bumblebee field is a vector field that leads to spontaneous Lorentz symmetry breaking. For a smooth quadratic potential, the massless excitation (Nambu–Goldstone boson) can be identified as the photon, transverse to the vacuum expectation value of the bumblebee field. In addition, there is a massive excitation associated with the longitudinal mode, whose presence leads to instability in the spectrum of the theory. Using the principal-value prescription, we show that no one-loop radiative corrections to the mass term are generated. Moreover, the bumblebee self-energy is not transverse, showing that the propagation of the longitudinal mode cannot be excluded from the effective theory.

  11. Static Correctness of Hierarchical Procedures

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1990-01-01

    A system of hierarchical, fully recursive types in a truly imperative language allows program fragments written for small types to be reused for all larger types. To exploit this property to enable type-safe hierarchical procedures, it is necessary to impose a static requirement on procedure calls. We introduce an example language and prove the existence of a sound requirement which preserves static correctness while allowing hierarchical procedures. This requirement is further shown to be optimal, in the sense that it imposes as few restrictions as possible. This establishes the theoretical basis for a general type hierarchy with static type checking, which enables first-order polymorphism combined with multiple inheritance and specialization in a language with assignments. We extend the results to include opaque types. An opaque version of a type is different from the original but has...

  12. Pileup correction of microdosimetric spectra

    CERN Document Server

    Langen, K M; Lennox, A J; Kroc, T K; De Luca, P M

    2002-01-01

    Microdosimetric spectra were measured at the Fermilab neutron therapy facility using low pressure proportional counters operated in pulse mode. The neutron beam has a very low duty cycle (<0.1%) and consequently a high instantaneous dose rate which causes distortions of the microdosimetric spectra due to pulse pileup. The determination of undistorted spectra at this facility necessitated (i) the modified operation of the proton accelerator to reduce the instantaneous dose rate and (ii) the establishment of a computational procedure to correct the measured spectra for remaining pileup distortions. In support of the latter effort, two different pileup simulation algorithms using analytical and Monte-Carlo-based approaches were developed. While the analytical algorithm allows a detailed analysis of pileup processes it only treats two-pulse and three-pulse pileup and its validity is hence restricted. A Monte-Carlo-based pileup algorithm was developed that inherently treats all degrees of pileup. This algorithm...
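
    The Monte-Carlo approach described above can be sketched in a few lines. This is an illustrative toy, not the authors' code: the Poisson burst model, function names, and parameters are our assumptions. Each recorded event sums the amplitudes of however many true pulses fall within one resolving time, so all degrees of pileup are treated inherently.

```python
import math
import random

def poisson_knuth(lam, rng):
    """Knuth's Poisson sampler (adequate for small lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_pileup(sample_amplitude, mean_pulses, n_events, seed=0):
    """Toy pileup spectrum: per recorded event, draw a Poisson number of
    true pulses (conditioned on >= 1 arriving) and sum their amplitudes."""
    rng = random.Random(seed)
    measured = []
    for _ in range(n_events):
        k = 0
        while k == 0:  # an event is only recorded if at least one pulse arrives
            k = poisson_knuth(mean_pulses, rng)
        measured.append(sum(sample_amplitude(rng) for _ in range(k)))
    return measured
```

At a low instantaneous rate (mean_pulses near zero) nearly every event is a single pulse and the measured spectrum approaches the true one; raising mean_pulses shifts events toward summed, distorted amplitudes.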

  13. Metrology Standards for Quantitative Imaging Biomarkers.

    Science.gov (United States)

    Sullivan, Daniel C; Obuchowski, Nancy A; Kessler, Larry G; Raunig, David L; Gatsonis, Constantine; Huang, Erich P; Kondratovich, Marina; McShane, Lisa M; Reeves, Anthony P; Barboriak, Daniel P; Guimaraes, Alexander R; Wahl, Richard L

    2015-12-01

    Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies.

  14. Quantitative MR Image Analysis for Brain Tumor.

    Science.gov (United States)

    Shboul, Zeina A; Reza, Sayed M S; Iftekharuddin, Khan M

    2018-01-01

    This paper presents an integrated quantitative MR image analysis framework that includes all necessary steps: MRI inhomogeneity correction, feature extraction, multiclass feature selection, and multimodality abnormal brain tissue segmentation. We first derive a mathematical algorithm to compute a novel generalized multifractional Brownian motion (GmBm) texture feature. We then demonstrate the efficacy of multiple multiresolution texture features, including regular fractal dimension (FD) texture and stochastic textures such as multifractional Brownian motion (mBm) and GmBm, for robust segmentation of tumor and other abnormal tissue in brain MRI. We evaluate these texture and associated intensity features to effectively delineate multiple abnormal tissues within and around the tumor core, as well as stroke lesions, using large-scale public and private datasets.

  15. Quantitative Fluorescence Measurements with Multicolor Flow Cytometry.

    Science.gov (United States)

    Wang, Lili; Gaigalas, Adolfas K; Wood, James

    2018-01-01

    Multicolor flow cytometer assays are routinely used in clinical laboratories for immunophenotyping, monitoring disease and treatment, and determining prognostic factors. However, existing methods for quantitative measurements have not yet produced satisfactory results independent of flow cytometers used. This chapter details a procedure for quantifying surface and intracellular protein biomarkers by calibrating the output of a multicolor flow cytometer in units of antibodies bound per cell (ABC). The procedure includes the following critical steps: (a) quality control (QC) and performance characterization of the multicolor flow cytometer, (b) fluorescence calibration using hard dyed microspheres assigned with fluorescence intensity values in equivalent number of reference fluorophores (ERF), (c) compensation for correction of fluorescence spillover, and (d) application of a biological reference standard for translating the ERF scale to the ABC scale. The chapter also points out current efforts for implementing quantification of biomarkers in a manner which is independent of instrument platforms and reagent differences.
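
    Steps (b) and (d) amount to two chained linear calibrations: bead MFI to ERF, then ERF to ABC via the biological reference standard. A minimal sketch with hypothetical numbers and helper names (real assays typically fit on log scales):

```python
def fit_line(xs, ys):
    """Ordinary least-squares line: ys ~ slope*xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def mfi_to_abc(mfi, erf_calibration, erf_per_abc):
    """Median fluorescence -> ERF (bead calibration) -> ABC
    (biological reference standard)."""
    slope, intercept = erf_calibration
    erf = slope * mfi + intercept
    return erf / erf_per_abc

# hypothetical hard-dyed bead ladder: measured MFI vs assigned ERF
calibration = fit_line([10.0, 100.0, 1000.0], [20.0, 200.0, 2000.0])
# hypothetical reference cells: known 1000 ABC measured at MFI 500
erf_per_abc = mfi_to_abc(500.0, calibration, 1.0) / 1000.0
print(mfi_to_abc(250.0, calibration, erf_per_abc))
```

The same two-step scaling is what makes the ABC readout comparable across instruments, provided each cytometer carries its own bead calibration.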

  16. Early detection of complete vascular occlusion in a pedicle flap model using quantitative [corrected] spectral imaging.

    Science.gov (United States)

    Pharaon, Michael R; Scholz, Thomas; Bogdanoff, Scott; Cuccia, David; Durkin, Anthony J; Hoyt, David B; Evans, Gregory R D

    2010-12-01

    Vascular occlusion after tissue transfer is a devastating complication that can lead to complete flap loss. Spatial frequency domain imaging is a new, noncontact, noninvasive, wide-field imaging technology capable of quantifying oxygenated and deoxygenated hemoglobin levels, total hemoglobin, and tissue saturation. Pedicled fasciocutaneous flaps on Wistar rats (400 to 500 g) were created and underwent continuous imaging using spatial frequency domain imaging before and after selective vascular occlusion. Three flap groups (control, selective arterial occlusion, and selective venous occlusion) and a fourth group composed of native skin between the flaps were measured. There were no statistically significant differences between the control flap group and the experimental flap groups before selective vascular occlusion: oxyhemoglobin (p=0.2017), deoxyhemoglobin (p=0.3145), total hemoglobin (p=0.2718), and tissue saturation (p=0.0777). In the selective arterial occlusion flap group, percentage change in total hemoglobin was statistically different from that of the control flap group (p=0.0218). The remaining parameters were not statistically different from those of the control flap: percentage change in oxyhemoglobin (p=0.0888), percentage change in deoxyhemoglobin (p=0.5198), and percentage change in tissue saturation (p=0.4220). The selective venous occlusion flap group demonstrated changes statistically different from those of the control flap group in percentage change in oxyhemoglobin (p=0.0029), deoxyhemoglobin, total hemoglobin, and tissue saturation. Results presented here indicate that spatial frequency domain imaging can be used to quantify and detect physiologic changes that occur after arterial and venous occlusion in a rodent tissue transfer flap model. This portable, noncontact, noninvasive device may have high clinical applicability in monitoring postoperative patients.

  17. Method for beam hardening correction in quantitative computed X-ray tomography

    Science.gov (United States)

    Yan, Chye Hwang (Inventor); Whalen, Robert T. (Inventor); Napel, Sandy (Inventor)

    2001-01-01

    Each voxel is assumed to contain exactly two distinct materials, with the volume fraction of each material calculated iteratively. The method requires that the spectrum of the X-ray beam and the attenuation spectra of the materials in the object be known, and that the attenuation spectra be monotonically decreasing with increasing X-ray photon energy. A volume fraction is then estimated for the voxel, and the spectrum is iteratively calculated.
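
    Because the attenuation curves are assumed monotonic, the two-material volume fraction can be recovered by a simple bracketing iteration. A toy sketch under made-up spectra and energy bins (an illustration of the idea, not the patented implementation):

```python
import math

def poly_log_attenuation(f, path_len, spectrum, mu_a, mu_b):
    """Polychromatic -ln(I/I0) for a voxel that is fraction f of
    material A and (1 - f) of material B (toy energy bins)."""
    i0 = sum(spectrum)
    i = sum(w * math.exp(-(f * ma + (1 - f) * mb) * path_len)
            for w, ma, mb in zip(spectrum, mu_a, mu_b))
    return -math.log(i / i0)

def solve_fraction(measured, path_len, spectrum, mu_a, mu_b, iters=60):
    """Bisection on f in [0, 1]; monotone when mu_a > mu_b in every bin."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if poly_log_attenuation(mid, path_len, spectrum, mu_a, mu_b) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# made-up flat beam spectrum; both mu curves decrease with energy as required
spectrum = [1.0, 1.0, 1.0]
mu_bone, mu_soft = [3.0, 2.0, 1.0], [0.5, 0.4, 0.3]
measured = poly_log_attenuation(0.3, 1.0, spectrum, mu_bone, mu_soft)
print(solve_fraction(measured, 1.0, spectrum, mu_bone, mu_soft))
```

Solving for f against the full polychromatic forward model is what removes the beam-hardening bias a single effective attenuation coefficient would introduce.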

  18. Correct quantitative determination of ethanol and volatile compounds in alcohol products

    CERN Document Server

    Charapitsa, Siarhei; Sytova, Svetlana; Yakuba, Yurii

    2014-01-01

    Determination of the volume content of ethanol in alcohol products is usually performed by pycnometry, electronic densimetry, or densimetry using a hydrostatic balance, in accordance with Commission Regulation No 2870/2000. However, these methods directly determine only the density of the tested liquid sample and do not take into account the effects of other volatile components such as aldehydes, esters, and higher alcohols. They are therefore appropriate only for binary water-ethanol solutions, in accordance with the international table adopted by the International Organization of Legal Metrology in its Recommendation No 22. The presence of notable concentrations of higher alcohols and esters in different alcohol-based products, e.g. in whisky, cognac, brandy, and wine, as well as in waste from alcohol and alcoholic beverage production, leads to a significant contribution of these compounds to the density of the tested alcohol-containing sample. As a result, determination of the volume of ethanol content for ...

  19. Quantitative analysis of X-band weather radar attenuation correction accuracy

    NARCIS (Netherlands)

    Berne, A.D.; Uijlenhoet, R.

    2006-01-01

    At short wavelengths, especially C-, X-, and K-band, weather radar signals are attenuated by the precipitation along their paths. This constitutes a major source of error for radar rainfall estimation, in particular for intense precipitation. A recently developed stochastic simulator of range

  20. Surface plasmon resonance microscopy: Achieving a quantitative optical response

    Science.gov (United States)

    Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.

    2016-09-01

    Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on the index of refraction and changes in the index of refraction at an interface. Optical parameter analysis is achieved by applying the Fresnel model to SPR data typically taken by an instrument in a prism-based configuration. We carry out SPR imaging on a microscope by launching light into the sample and collecting reflected light through a high-numerical-aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved with the Fresnel model because of polarization-dependent attenuation and optical aberration that occur in the high-numerical-aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy.
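
    The Fresnel model referred to above, for a three-layer Kretschmann geometry (prism/metal/sample), fits in a short function. The material values below (gold at 633 nm, a BK7-like prism, water) are textbook approximations we supply for illustration, not data from the paper:

```python
import cmath
import math

def spr_reflectance(theta_deg, wavelength_nm, eps_prism, eps_metal,
                    eps_sample, d_metal_nm):
    """p-polarized three-layer Fresnel reflectance R(theta)."""
    k0 = 2 * math.pi / wavelength_nm
    kx = k0 * math.sqrt(eps_prism) * math.sin(math.radians(theta_deg))

    def kz(eps):
        # principal branch gives decaying evanescent waves in the metal
        return cmath.sqrt(eps * k0 * k0 - kx * kx)

    def r_p(e1, e2):
        k1, k2 = kz(e1), kz(e2)
        return (e2 * k1 - e1 * k2) / (e2 * k1 + e1 * k2)

    phase = cmath.exp(2j * kz(eps_metal) * d_metal_nm)
    r12, r23 = r_p(eps_prism, eps_metal), r_p(eps_metal, eps_sample)
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return abs(r) ** 2

# scan the resonance dip for a 50 nm gold film in water on a BK7-like prism
eps_au = -12.0 + 1.26j  # approximate gold permittivity at 633 nm
curve = [spr_reflectance(t / 10, 633.0, 1.515 ** 2, eps_au, 1.33 ** 2, 50.0)
         for t in range(600, 851)]
print(min(curve), max(curve))
```

Fitting measured reflectivity to such a curve is how an index of refraction is extracted once the instrument-specific diattenuation and aberrations have been corrected.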

  1. Surface electromyography analysis of blepharoptosis correction by transconjunctival incisions.

    Science.gov (United States)

    Tu, Lung-Chen; Wu, Ming-Chya; Chu, Hsueh-Liang; Chiang, Yi-Pin; Kuo, Chih-Lin; Li, Hsing-Yuan; Chang, Chia-Ching

    2016-06-01

    Upper eyelid movement depends on the antagonistic actions of the orbicularis oculi muscle and the levator aponeurosis. Blepharoptosis is an abnormal drooping of the upper eyelid margin with the eye in the primary position of gaze. Transconjunctival incision for upper eyelid ptosis correction is a well-developed technique; conventional prognosis, however, depends on clinical observation and lacks quantitative analysis of eyelid muscle control. This study examines the possibility of using assessments of temporal correlation in surface electromyography (SEMG) as a quantitative description of the change in muscle control after the operation. Eyelid SEMG was measured from patients with blepharoptosis preoperatively and postoperatively, as well as, for comparison, from young and aged normal subjects. The data were analyzed using the detrended fluctuation analysis method. The results show that the temporal correlation of the SEMG signals can be characterized by two indices associated with the correlation properties at short and long time scales, demarcated at 3 ms, corresponding to the time scale of neural response. Aging degrades the correlation properties at both time scales, and the patient group showed more serious correlation degradation in the long-time regime, which was moderately improved by the ptosis correction. We propose that the temporal correlation in SEMG signals may be regarded as an indicator for evaluating eyelid muscle control in postoperative recovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
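
    Detrended fluctuation analysis itself is compact. A plain-Python sketch of first-order DFA (variable names are ours; the study's windowing details may differ):

```python
import math
import random

def dfa_fluctuation(signal, window):
    """First-order DFA fluctuation F(window): integrate the mean-removed
    signal, linearly detrend each window, return the RMS residual."""
    mean = sum(signal) / len(signal)
    profile, s = [], 0.0
    for x in signal:
        s += x - mean
        profile.append(s)
    n = len(profile) // window * window
    t = list(range(window))
    mt = sum(t) / window
    stt = sum((u - mt) ** 2 for u in t)
    sq = 0.0
    for start in range(0, n, window):
        seg = profile[start:start + window]
        ms = sum(seg) / window
        b = sum((u - mt) * (v - ms) for u, v in zip(t, seg)) / stt
        a = ms - b * mt
        sq += sum((v - (a + b * u)) ** 2 for u, v in zip(t, seg))
    return (sq / n) ** 0.5

# white noise should scale roughly as F(w) ~ w^0.5 (exponent alpha ~ 0.5)
rng = random.Random(1)
sig = [rng.gauss(0.0, 1.0) for _ in range(4096)]
alpha = math.log(dfa_fluctuation(sig, 64) / dfa_fluctuation(sig, 8)) / math.log(8)
print(alpha)
```

The scaling exponent estimated from the slope of log F versus log window size is the kind of index the study uses to characterize short- and long-time correlation.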

  2. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
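
    The distance-to-kilobase conversion quoted above is a single scale factor. A trivial sketch (helper name is ours):

```python
KB_PER_MICRON = 2.3  # homogeneous stretching reported for QDFM fibers

def separation_kb(distance_um):
    """Convert a measured probe separation on a stretched DNA fiber
    from micrometers to kilobase pairs."""
    return distance_um * KB_PER_MICRON

print(separation_kb(10.0))  # a 10 um separation corresponds to 23 kb
```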

  3. Quantitative Computertomographie (QCT)

    Directory of Open Access Journals (Sweden)

    Krestan C

    2013-01-01

    Full Text Available Central quantitative computed tomography is an established method for measuring bone mineral density. QCT can be performed at central and peripheral measurement sites; the most important central site is the lumbar spine. QCT differs from DXA in being a 3-dimensional measurement, in contrast to the 2-dimensional DXA examination. The T-score definition of osteoporosis should not be applied to QCT examinations, since a threshold of -2.5 would lead to a considerably higher prevalence of osteoporotic individuals; instead, absolute bone mineral density values have been proposed for QCT. Determining bone mineral density from routine CT examinations is a new trend in osteoporosis diagnostics. Beyond bone mineral density alone, peripheral QCT, and in particular high-resolution (HR-)pQCT, can determine parameters of trabecular and cortical bone quality. Measurement precision is higher for peripheral QCT methods than for central measurement sites, which is relevant for follow-up examinations.

  4. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in the design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include the evolution of error correction techniques, industrial user needs, and architectures and design approaches for the most advanced error correcting codes (polar codes, non-binary LDPC, product codes, etc.). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architectures for current and next-generation standards; • Provides coverage of industrial user needs and advanced error correcting techniques.

  5. Comparison and correction of three satellite precipitation estimates products to improve flood prevention in French Guiana

    Science.gov (United States)

    Beaufort, Aurélien; Gibier, Florian; Palany, Philippe

    2017-04-01

    French Guiana (80 000 km2) is highly vulnerable to flooding during the rainy season, but hydrological forecasting there is limited: the region cannot be covered by a dense network of rain gauges because of the difficulty of installing and maintaining monitoring stations in the primary forest. Meteorological satellites can therefore be very useful. Their large spatial coverage offers the opportunity to estimate rainfall at a regional scale, with a temporal resolution of 30 minutes. Daily satellite precipitation estimates are not yet widely used in hydrological modelling, but they could reduce the spatiotemporal uncertainty of rainfall and improve the simulations of hydrological models. In this study, we tested three satellite-based rainfall estimation algorithms: TRMM-TMPA 3B42 (Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis) V7 (spatial resolution: 0.25°), IMERG (Integrated Multi-satellitE Retrievals) for GPM (Global Precipitation Measurement) (spatial resolution: 0.1°), and the STAR Hydro-Estimator satellite rainfall estimates (spatial resolution: 0.045°). We then applied several bias-correction methods to improve the daily rainfall estimates in comparison with measurements from available rain gauges. Performance was evaluated at a daily time scale for the period from 01/04/2015 to 30/03/2016, with validation data from 32 rain gauges managed by Meteo France and 59 rain gauges managed by Suriname. Before bias correction, GPM IMERG obtained the best probability of detection (POD), 70%, with a false alarm ratio (FAR) of only 10%, compared with TRMM (POD: 60%; FAR: 30%) and the Hydro-Estimator (POD: 30%; FAR: 30%). Biases (Psat-Pgau) were negative for all three satellite products, meaning that the satellite rainfall estimates were underestimated. Better daily performance was obtained with TRMM (mean absolute bias: 7.1 mm; RMSE = 13.4 mm) and GPM (mean absolute
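
    The POD/FAR skill scores and a simple additive bias correction can be sketched as follows (the rain/no-rain threshold and helper names are ours; the study's actual correction methods are more elaborate):

```python
def contingency_scores(sat, gauge, threshold=0.1):
    """Probability of detection and false alarm ratio for paired daily
    series; a day counts as 'rainy' when rainfall >= threshold (mm)."""
    hits = misses = false_alarms = 0
    for s, g in zip(sat, gauge):
        sat_rain, gauge_rain = s >= threshold, g >= threshold
        if sat_rain and gauge_rain:
            hits += 1
        elif gauge_rain:
            misses += 1
        elif sat_rain:
            false_alarms += 1
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

def additive_bias_correction(sat, gauge):
    """Shift the satellite series by the mean gauge-satellite bias."""
    bias = sum(g - s for s, g in zip(sat, gauge)) / len(sat)
    return [s + bias for s in sat]

sat = [0.0, 5.0, 0.0, 2.0]      # toy satellite estimates (mm/day)
gauge = [1.0, 4.0, 0.0, 0.0]    # toy gauge observations (mm/day)
print(contingency_scores(sat, gauge))
```

After the additive shift, the corrected series reproduces the gauge total by construction, which is exactly what a mean-bias correction is meant to guarantee.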

  6. A multiresolution image based approach for correction of partial volume effects in emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Boussion, N; Hatt, M; Lamare, F; Bizais, Y; Turzo, A; Rest, C Cheze-Le; Visvikis, D [INSERM U650, Laboratoire du Traitement de l' Information Medicale (LaTIM), CHU Morvan, Brest (France)

    2006-04-07

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography. They lead to a loss of signal in tissues of size similar to the point spread function and induce activity spillover between regions. Although PVE can be corrected for by using algorithms that provide the correct radioactivity concentration in a series of regions of interest (ROIs), so far little attention has been given to the possibility of creating improved images as a result of PVE correction. Potential advantages of PVE-corrected images include the ability to accurately delineate functional volumes as well as improving the tumour-to-background ratio, resulting in an associated improvement in the analysis of response-to-therapy studies and diagnostic examinations, respectively. The objective of our study was therefore to develop a methodology for PVE correction not only to enable the accurate recovery of activity concentrations, but also to generate PVE-corrected images. In the multiresolution analysis that we define here, details of a high-resolution image H (MRI or CT) are extracted, transformed and integrated in a low-resolution image L (PET or SPECT). A discrete wavelet transform of both H and L images is performed by using the 'a trous' algorithm, which allows the spatial frequencies (details, edges, textures) to be obtained easily at a level of resolution common to H and L. A model is then inferred to build the lacking details of L from the high-frequency details in H. The process was successfully tested on synthetic and simulated data, proving the ability to obtain accurately corrected images. Quantitative PVE correction was found to be comparable with a method considered as a reference but limited to ROI analyses. Visual improvement and quantitative correction were also obtained in two examples of clinical images, the first using a combined PET/CT scanner with a lymphoma patient and the second using a FDG brain PET and corresponding T1
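
    The 'a trous' step and the detail transfer can be sketched in 1-D. This toy is our simplification (a single scale and a scalar injection gain instead of the paper's inferred model); it shows how fine details extracted from H restore edges missing in L:

```python
def atrous_smooth(signal, step=1):
    """One 'a trous' smoothing pass with the B3-spline kernel
    (1, 4, 6, 4, 1)/16, holes of size `step`, mirrored borders."""
    kernel = (1 / 16, 4 / 16, 6 / 16, 4 / 16, 1 / 16)
    n = len(signal)

    def reflect(j):
        while j < 0 or j >= n:
            if j < 0:
                j = -j
            if j >= n:
                j = 2 * (n - 1) - j
        return j

    return [sum(w * signal[reflect(i + k * step)]
                for k, w in zip((-2, -1, 0, 1, 2), kernel))
            for i in range(n)]

def pve_correct(low_res, high_res, step=1, gain=1.0):
    """Toy detail injection: add the finest-scale details of H,
    scaled by `gain`, to L."""
    details = [h - s for h, s in zip(high_res, atrous_smooth(high_res, step))]
    return [l + gain * d for l, d in zip(low_res, details)]

high = [0.0] * 8 + [1.0] * 8   # sharp anatomical edge (e.g. from MRI)
low = atrous_smooth(high)      # blurred functional image (e.g. PET)
print(pve_correct(low, high))
```

In this toy the blurred edge of L is sharpened back toward the edge present in H; the paper's contribution is inferring the injection model from the data rather than fixing a global gain.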

  7. Pulse compressor with aberration correction

    Energy Technology Data Exchange (ETDEWEB)

    Mankos, Marian [Electron Optica, Inc., Palo Alto, CA (United States)

    2015-11-30

    In this SBIR project, Electron Optica, Inc. (EOI) is developing an electron mirror-based pulse compressor attachment to new and retrofitted dynamic transmission electron microscopes (DTEMs) and ultrafast electron diffraction (UED) cameras for improving the temporal resolution of these instruments from the characteristic range of a few picoseconds to a few nanoseconds and beyond, into the sub-100 femtosecond range. The improvement will enable electron microscopes and diffraction cameras to better resolve the dynamics of reactions in the areas of solid state physics, chemistry, and biology. EOI’s pulse compressor technology utilizes the combination of electron mirror optics and a magnetic beam separator to compress the electron pulse. The design exploits the symmetry inherent in reversing the electron trajectory in the mirror in order to compress the temporally broadened beam. This system also simultaneously corrects the chromatic and spherical aberration of the objective lens for improved spatial resolution. This correction will be found valuable as the source size is reduced with laser-triggered point source emitters. With such emitters, it might be possible to significantly reduce the illuminated area and carry out ultrafast diffraction experiments from small regions of the sample, e.g. from individual grains or nanoparticles. During phase I, EOI drafted a set of candidate pulse compressor architectures and evaluated the trade-offs between temporal resolution and electron bunch size to achieve the optimum design for two particular applications with market potential: increasing the temporal and spatial resolution of UEDs, and increasing the temporal and spatial resolution of DTEMs. Specialized software packages that have been developed by MEBS, Ltd. were used to calculate the electron optical properties of the key pulse compressor components: namely, the magnetic prism, the electron mirror, and the electron lenses. In the final step, these results were folded

  8. Investigation of Attenuation Correction for Small-Animal Single Photon Emission Computed Tomography

    Directory of Open Access Journals (Sweden)

    Hsin-Hui Lee

    2013-01-01

    Full Text Available The quantitative accuracy of SPECT is limited by photon attenuation and scatter that occur as photons interact with matter. In this study, we developed a new attenuation correction (AC) method, CT-based mean attenuation correction (CTMAC), and compared it with several methods in current use, using small-animal SPECT/CT data acquired from physical phantoms and a rat. The physical phantoms and an SD rat, injected with 99mTc, were scanned by a parallel-hole small-animal SPECT system and then imaged by 80 kVp micro-CT. Scatter was estimated and corrected by the triple-energy-window (TEW) method. Absolute quantification was derived from a scan of a point source of known activity. In the physical-phantom studies, we compared the original images, the scatter-corrected (SC-only) images, and the scatter-corrected images with AC performed using Chang’s method, CT-based attenuation correction (CTAC), CT-based iterative attenuation compensation during reconstruction (CTIACR), and CTMAC. The correction results show that the errors of these six configurations are mostly similar; CTMAC requires the shortest correction time while still obtaining good AC results.
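
    Chang's first-order method, one of the comparison baselines above, computes for each point the reciprocal of the attenuation factor averaged over projection angles. A sketch for a uniform circular attenuator (the geometry and numbers are our illustration):

```python
import math

def chang_correction_factor(x, y, radius, mu, n_angles=64):
    """First-order Chang AC factor at point (x, y) inside a uniform
    disk with attenuation coefficient mu (per unit length)."""
    total = 0.0
    for k in range(n_angles):
        phi = 2 * math.pi * k / n_angles
        dx, dy = math.cos(phi), math.sin(phi)
        # distance from (x, y) to the disk boundary along direction phi
        b = x * dx + y * dy
        t = -b + math.sqrt(b * b - (x * x + y * y - radius * radius))
        total += math.exp(-mu * t)
    return n_angles / total  # reciprocal of the mean attenuation factor

# the center needs the largest correction; points near the edge need less
print(chang_correction_factor(0.0, 0.0, 10.0, 0.15),
      chang_correction_factor(9.0, 0.0, 10.0, 0.15))
```

Multiplying each reconstructed voxel by its factor gives the first-order correction; the CT-based methods in the study replace the uniform mu with measured attenuation maps.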

  9. Quantitative Luminescence Imaging System

    Energy Technology Data Exchange (ETDEWEB)

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  10. Quantitative computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Judith E. [Royal Infirmary and University, Manchester (United Kingdom)], E-mail: judith.adams@manchester.ac.uk

    2009-09-15

    Quantitative computed tomography (QCT) was introduced in the mid 1970s. The technique is most commonly applied to 2D slices in the lumbar spine to measure trabecular bone mineral density (BMD; mg/cm³). Although not as widely utilized as dual-energy X-ray absorptiometry (DXA), QCT has some advantages when studying the skeleton (separate measures of cortical and trabecular BMD; measurement of volumetric BMD, as opposed to 'areal' DXA-BMDa, so not size dependent; geometric and structural parameters obtained which contribute to bone strength). A limitation is that the World Health Organisation (WHO) definition of osteoporosis in terms of bone densitometry (T score of -2.5 or below using DXA) is not applicable. QCT can be performed on conventional body CT scanners, or at peripheral sites (radius, tibia) using smaller, less expensive dedicated peripheral CT scanners (pQCT). Although the ionising radiation dose of spinal QCT is higher than that of DXA, the dose compares favorably with those of other radiographic procedures (e.g. spinal radiographs) performed in patients suspected of having osteoporosis. The radiation dose from peripheral QCT scanners is negligible. Technical developments in CT (spiral multi-detector CT; improved spatial resolution) allow rapid acquisition of 3D volume images, which enable QCT to be applied to the clinically important site of the proximal femur, more sophisticated analysis of cortical and trabecular bone, imaging of trabecular structure, and the application of finite element analysis (FEA). Such research studies contribute importantly to the understanding of bone growth and development, the effect of disease and treatment on the skeleton, and the biomechanics of bone strength and fracture.
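
    In practice, QCT converts CT numbers to BMD via a calibration phantom of known equivalent densities scanned with the patient. A two-insert sketch (the insert values below are hypothetical):

```python
def hu_to_bmd_calibration(hu1, rho1, hu2, rho2):
    """Linear calibration from two phantom inserts of known equivalent
    density (mg/cm^3) to a HU -> BMD mapping."""
    slope = (rho2 - rho1) / (hu2 - hu1)
    intercept = rho1 - slope * hu1
    return lambda hu: slope * hu + intercept

# hypothetical inserts: 0 mg/cm^3 at 0 HU, 200 mg/cm^3 at 330 HU
bmd = hu_to_bmd_calibration(0.0, 0.0, 330.0, 200.0)
print(bmd(165.0))  # trabecular ROI at 165 HU -> 100 mg/cm^3
```

Because the result is a volumetric density rather than an areal one, it is this absolute mg/cm³ scale, not a DXA T score, that QCT thresholds are defined on.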

  11. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  12. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach, or a mixture of QRA and semi-QRA, to suit the operator's data availability, data quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
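    The calibration idea described above, mapping semi-quantitative likelihood scores onto failure frequencies anchored to peer-system data and combining them with consequences, can be sketched as follows. The score-to-frequency mapping and the consequence value are hypothetical placeholders, not figures from the paper:

    ```python
    # Hypothetical calibration: semi-quantitative likelihood scores (1-5)
    # mapped to expected failure frequencies (failures per km-year),
    # anchored to peer pipeline-system failure-rate data.
    SCORE_TO_FREQUENCY = {1: 1e-6, 2: 1e-5, 3: 1e-4, 4: 1e-3, 5: 1e-2}

    def quantitative_risk(likelihood_score: int, consequence_cost: float) -> float:
        """Deterministic QRA: expected loss = failure frequency x consequence."""
        return SCORE_TO_FREQUENCY[likelihood_score] * consequence_cost

    # Segment scored 4, with an assumed $2M consequence:
    # expected loss per km-year
    print(quantitative_risk(4, 2_000_000))  # 2000.0
    ```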

  13. Hypernatremia: Correction Rate and Hemodialysis

    Directory of Open Access Journals (Sweden)

    Saima Nur

    2014-01-01

    Severe hypernatremia is defined as serum sodium above 152 mEq/L and carries a mortality rate of 60% or more. An 85-year-old man was brought to the emergency room with an altered level of consciousness after refusing to eat for a week at a skilled nursing facility. On admission the patient was nonverbal, with stable vital signs, and was responsive only to painful stimuli. Laboratory evaluation was significant for a serum sodium of 188 mmol/L and a free-water deficit of 12.0 L. The patient was admitted to the medical intensive care unit and, after an inadequate response to suboptimal fluid repletion, hemodialysis was used to correct the hypernatremia. Within the first fourteen hours, the sodium concentration changed by only 1 mEq/L with fluid repletion; during hemodialysis, however, it dropped by more than 20 mEq/L within two hours. Despite this drastic drop in sodium concentration, the patient did not develop any neurological sequelae and was at baseline mental status at the time of discharge.
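    The reported deficit is consistent with the conventional free-water-deficit estimate, deficit = TBW fraction × weight × (serum Na / 140 − 1). The 70 kg body weight and 0.5 TBW fraction below are assumptions chosen for illustration; the case report does not state the patient's weight:

    ```python
    def free_water_deficit(weight_kg: float, serum_na: float,
                           tbw_fraction: float = 0.5,
                           target_na: float = 140.0) -> float:
        """Free-water deficit in liters (conventional clinical estimate)."""
        return tbw_fraction * weight_kg * (serum_na / target_na - 1.0)

    # Assumed 70 kg elderly patient (TBW fraction ~0.5), serum Na 188 mmol/L
    print(round(free_water_deficit(70, 188), 1))  # 12.0
    ```

    With these assumed inputs the formula reproduces the 12.0 L deficit reported in the abstract.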

  14. Phase and birefringence aberration correction

    Science.gov (United States)

    Bowers, M.; Hankla, A.

    1996-07-09

    A Brillouin enhanced four-wave mixing phase conjugate mirror corrects phase aberrations of a coherent electromagnetic beam and birefringence induced upon that beam. The stimulated Brillouin scattering (SBS) phase conjugation technique is augmented to include Brillouin enhanced four-wave mixing (BEFWM). A seed beam is generated by a main oscillator and arrives at the phase conjugate cell before the signal beams in order to initiate the Brillouin effect. The signal beam being amplified through the amplifier chain is split into two perpendicularly polarized beams. One of the two beams is chosen to have the same polarization as some component of the seed beam, the other orthogonal to the first. The polarization of the orthogonal beam is then rotated 90° such that it is parallel to the other signal beam. The three beams are then focused into a cell containing a medium capable of Brillouin excitation. The two signal beams are focused such that they cross the seed beam path before their respective beam waists, in order to achieve BEFWM; alternatively, the two signal beams are focused to a point or points contained within the focused cone angle of the seed beam to achieve seeded SBS. Either configuration negates the effects of all birefringent and material aberrations in the system. 5 figs.

  15. Rulison Site corrective action report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    Project Rulison was a joint US Atomic Energy Commission (AEC) and Austral Oil Company (Austral) experiment, conducted under the AEC's Plowshare Program, to evaluate the feasibility of using a nuclear device to stimulate natural gas production in low-permeability gas-producing geologic formations. The experiment was conducted on September 10, 1969, and consisted of detonating a 40-kiloton nuclear device at a depth of 2,568 m below ground surface (BGS). This Corrective Action Report describes the cleanup of petroleum hydrocarbon- and heavy-metal-contaminated sediments from an old drilling effluent pond and characterization of the mud pits used during drilling of the R-EX well at the Rulison Site. The Rulison Site is located approximately 65 kilometers (40 miles) northeast of Grand Junction, Colorado. The effluent pond was used for the storage of drilling mud during drilling of the emplacement hole for the 1969 gas stimulation test conducted by the AEC. This report also describes the activities performed to determine whether contamination is present in mud pits used during the drilling of well R-EX, the gas production well drilled at the site to evaluate the effectiveness of the detonation in stimulating gas production. The investigation activities described in this report were conducted during the autumn of 1995, concurrent with the cleanup of the drilling effluent pond. This report describes the activities performed during the soil investigation and provides the analytical results for the samples collected during that investigation.

  16. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.
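    The kind of intensity correction the authors describe (removing nonuniform illumination and sensor offset before quantifying brightness) can be illustrated with a standard flat-field correction. This is a generic sketch on synthetic arrays, not the authors' actual pipeline:

    ```python
    import numpy as np

    def flat_field_correct(raw, flat, dark):
        """Standard flat-field correction: subtract the dark-frame offset,
        then divide out the illumination/optics nonuniformity measured
        from a blank (flat) image."""
        gain = flat - dark
        return (raw - dark) * gain.mean() / gain

    # Synthetic example: a uniform sample imaged under vignetted illumination
    dark = np.full((4, 4), 10.0)
    vignette = np.linspace(0.5, 1.0, 16).reshape(4, 4)
    flat = dark + 100.0 * vignette
    raw = dark + 80.0 * vignette      # true signal is uniform (80% of flat)
    corrected = flat_field_correct(raw, flat, dark)
    print(np.allclose(corrected, corrected[0, 0]))  # True: uniform after correction
    ```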

  17. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  18. Quantitative imaging with a mobile phone microscope.

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D; Switz, Neil A; Fletcher, Daniel A

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  19. Workshop on quantitative dynamic stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  20. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. To follow the chapters comfortably, you should be at an intermediate level in quantitative finance and have a reasonable working knowledge of R.

  1. Corrective Action Decision Document for Corrective Action Unit 340: Pesticide Release sites, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    DOE/NV

    1998-12-08

    This Corrective Action Decision Document has been prepared for Corrective Action Unit 340, the NTS Pesticide Release Sites, in accordance with the Federal Facility Agreement and Consent Order of 1996 (FFACO, 1996). Corrective Action Unit 340 is located at the Nevada Test Site, Nevada, and is comprised of the following Corrective Action Sites: 23-21-01, Area 23 Quonset Hut 800 Pesticide Release Ditch; 23-18-03, Area 23 Skid Huts Pesticide Storage; and 15-18-02, Area 15 Quonset Hut 15-11 Pesticide Storage. The purpose of this Corrective Action Decision Document is to identify and provide a rationale for the selection of a recommended corrective action alternative for each Corrective Action Site. The scope of this Corrective Action Decision Document consists of the following tasks: Develop corrective action objectives; Identify corrective action alternative screening criteria; Develop corrective action alternatives; Perform detailed and comparative evaluations of the corrective action alternatives in relation to the corrective action objectives and screening criteria; and Recommend and justify a preferred corrective action alternative for each Corrective Action Site.

  2. Quantitative analysis of 'calanchi'

    Science.gov (United States)

    Agnesi, Valerio; Cappadonia, Chiara; Conoscenti, Christian; Costanzo, Dario; Rotigliano, Edoardo

    2010-05-01

    Three years (2006-2009) of monitoring data from two calanchi sites located in the western Sicilian Appennines are analyzed and discussed; the data come from two networks of erosion pins and a rainfall gauge station. The aim of the present research is to quantitatively analyze the effects of water erosion and to investigate their relationships with rainfall trends and with specific properties of the two calanchi fronts. Each site was equipped with a grid of randomly distributed erosion pins, comprising 41 nodes for the "Catalfimo" site and 13 nodes for the "Ottosalme" site (in light of the general homogeneity of its geomorphologic conditions); the erosion pins consist of iron stakes 100 cm long, graduated in 2 cm increments, with a circular section 1.6 cm in diameter. Repeated readings at the erosion pins allowed estimation of point topographic height variations; a total of 21 surveys were made remotely by acquiring high-resolution photographs from a fixed viewpoint. Since the two calanchi sites are only a few hundred meters apart, a single rainfall gauge station was installed, assuming strict climatic homogeneity across the investigated area. Rainfall data have been processed to derive the rain erosivity index signal, detecting a total of 27 erosive events. Despite the short distance between the two sites, because of different geologic settings the calanchi fronts expose different levels of the same formation (Terravecchia fm., Middle-Late Miocene); as a consequence, the mineralogical, textural and geotechnical (index) properties, as well as the topographic and geomorphologic characteristics, change. Therefore, in order to define the framework in which the two erosion pin grids were installed, 40 rock samples were analyzed and a detailed geomorphologic survey was carried out; in particular, plasticity index, liquid limit, carbonate, pH, granulometric fractions and their mineralogic
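    The arithmetic behind an erosion-pin survey is simple: the change in exposed pin length between readings gives the local ground lowering, and averaging over the grid and the monitoring period gives a mean erosion rate. A minimal sketch with invented readings (the actual survey data are not reproduced in the abstract):

    ```python
    # Hypothetical exposed-pin lengths (mm) at four survey dates, three pins
    readings_mm = [
        [20.0, 23.0, 27.0, 31.0],   # pin 1
        [15.0, 15.5, 18.0, 22.0],   # pin 2
        [30.0, 34.0, 35.0, 42.0],   # pin 3
    ]
    years = 3.0  # monitoring period (2006-2009)

    # Ground lowering per pin = last exposed length - first exposed length
    lowering = [r[-1] - r[0] for r in readings_mm]
    mean_rate = sum(lowering) / len(lowering) / years  # mm per year
    print(round(mean_rate, 2))  # 3.33
    ```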

  3. Motion correction in MRI of the brain

    Science.gov (United States)

    Godenschweger, F; Kägebein, U; Stucht, D; Yarach, U; Sciarra, A; Yakupov, R; Lüsebrink, F; Schulze, P; Speck, O

    2016-01-01

    Subject motion in MRI is a relevant problem in the daily clinical routine as well as in scientific studies. Since the beginning of clinical use of MRI, many research groups have developed methods to suppress or correct motion artefacts. This review focuses on rigid body motion correction of head and brain MRI and its application in diagnosis and research. It explains the sources and types of motion and related artefacts, classifies and describes existing techniques for motion detection, compensation and correction and lists established and experimental approaches. Retrospective motion correction modifies the MR image data during the reconstruction, while prospective motion correction performs an adaptive update of the data acquisition. Differences, benefits and drawbacks of different motion correction methods are discussed. PMID:26864183

  4. Quantum corrections to the string Bethe ansatz

    CERN Document Server

    Hernández, R; Hernandez, Rafael; Lopez, Esperanza

    2006-01-01

    One-loop corrections to the energy of semiclassical rotating strings contain both analytic and non-analytic terms in the 't Hooft coupling. Analytic contributions agree with the prediction from the string Bethe ansatz based on the classical S-matrix, but in order to include non-analytic contributions quantum corrections are required. We find a general expression for the first quantum correction to the string Bethe ansatz.

  5. [Atmospheric correction method for HJ-1 CCD imagery over waters based on radiative transfer model].

    Science.gov (United States)

    Xu, Hua; Gu, Xing-Fa; Li, Zheng-Qiang; Li, Li; Chen, Xing-Feng

    2011-10-01

    Atmospheric correction is a bottleneck in the quantitative application of data from the Chinese HJ-1 satellites to remote sensing of water color. According to the characteristics of the CCD sensors, the present paper used an air-water coupled radiative transfer model to build a look-up table (LUT) of atmospheric correction parameters, and thereafter developed a pixel-by-pixel atmospheric correction method over waters that yields the water-leaving remote sensing reflectance with ancillary meteorological input. The paper validates the HJ-1 CCD retrievals against MODIS and in-situ results. The accuracy in the blue and green bands was found to be good; however, the accuracy in the red and NIR bands is much worse than in the blue and green. It was also demonstrated that the aerosol model is a sensitive factor in the atmospheric correction accuracy.
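    A LUT-based correction of this kind precomputes atmospheric quantities with the radiative transfer model and interpolates them per pixel. The following is a schematic sketch only; the LUT values and the simple path-reflectance/transmittance correction form below are illustrative assumptions, not the paper's actual parameterization:

    ```python
    import numpy as np

    # Hypothetical LUT, one band and geometry: atmospheric path reflectance
    # and diffuse transmittance tabulated against aerosol optical depth
    aod_grid = np.array([0.0, 0.2, 0.4, 0.8])
    path_refl_grid = np.array([0.01, 0.04, 0.07, 0.12])
    transmittance_grid = np.array([0.98, 0.90, 0.82, 0.70])

    def correct_pixel(toa_reflectance: float, aod: float) -> float:
        """Subtract interpolated path reflectance, divide by transmittance
        to approximate the water-leaving reflectance."""
        rho_path = np.interp(aod, aod_grid, path_refl_grid)
        t = np.interp(aod, aod_grid, transmittance_grid)
        return (toa_reflectance - rho_path) / t

    # Top-of-atmosphere reflectance 0.10 at an assumed AOD of 0.3
    print(round(correct_pixel(0.10, 0.3), 4))  # 0.0523
    ```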

  6. Automatic procedure for the correction of thermoelastic stress analysis data acquired in nonadiabatic conditions

    Science.gov (United States)

    Gallotti, A.; Salerno, A.

    2005-12-01

    Thermoelastic stress analysis (TSA), performed on metallic components with a high diffusivity coefficient, seldom reaches adiabatic conditions. As a consequence, TSA results are affected by an attenuation whose magnitude varies locally, preventing the use of TSA as a reliable quantitative investigation tool. The recovery of the adiabatic temperature, and of the correct value of the first stress invariant directly linked to it, can only be performed by making an assumption about the local stress distribution. This article presents a method for automatically choosing, among a number of stress distribution functions, the one that performs the best correction of the raw TSA data. The implementation of this automatic correction procedure in a computer program allowed the point-by-point correction of whole TSA images.
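    The automatic choice among candidate stress-distribution functions amounts to fitting each candidate to the raw signal and keeping the one with the smallest residual. A generic model-selection sketch under that assumption, with made-up candidate models and data:

    ```python
    import numpy as np

    def best_correction_model(x, signal, candidates):
        """Fit each candidate model (name, design-matrix builder) by least
        squares; return the name and fitted values of the lowest-residual one."""
        best = None
        for name, build in candidates:
            A = build(x)
            coeffs, *_ = np.linalg.lstsq(A, signal, rcond=None)
            fit = A @ coeffs
            rss = float(np.sum((signal - fit) ** 2))
            if best is None or rss < best[0]:
                best = (rss, name, fit)
        return best[1], best[2]

    # Synthetic raw signal with quadratic structure plus a small ripple
    x = np.linspace(0.0, 1.0, 50)
    signal = 3.0 * x**2 + 0.5 + 0.01 * np.sin(40 * x)
    candidates = [
        ("linear", lambda x: np.column_stack([x, np.ones_like(x)])),
        ("quadratic", lambda x: np.column_stack([x**2, x, np.ones_like(x)])),
    ]
    name, fit = best_correction_model(x, signal, candidates)
    print(name)  # quadratic
    ```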

  7. Quantum corrections to Schwarzschild black hole

    Energy Technology Data Exchange (ETDEWEB)

    Calmet, Xavier; El-Menoufi, Basem Kamal [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom)

    2017-04-15

    Using effective field theory techniques, we compute quantum corrections to spherically symmetric solutions of Einstein's gravity and focus in particular on the Schwarzschild black hole. Quantum modifications are covariantly encoded in a non-local effective action. We work to quadratic order in curvatures simultaneously taking local and non-local corrections into account. Looking for solutions perturbatively close to that of classical general relativity, we find that an eternal Schwarzschild black hole remains a solution and receives no quantum corrections up to this order in the curvature expansion. In contrast, the field of a massive star receives corrections which are fully determined by the effective field theory. (orig.)

  8. Revisiting corrective saccades: role of visual feedback

    Science.gov (United States)

    Tian, Jing; Ying, Howard S.; Zee, David S.

    2013-01-01

    To clarify the role of visual feedback in the generation of corrective movements after inaccurate primary saccades, we used a visually triggered saccade task in which we varied how long the target was visible. The target was on for only 100 ms (OFF100ms), on until the start of the primary saccade (OFFonset), or on for 2 s (ON). We found that the tolerance for post-saccadic error was small (−2%) with a visual signal (ON) but greater (−6%) without visual feedback (OFF100ms). Saccades with an error of −10%, however, were likely to be followed by corrective saccades regardless of whether visual feedback was present. Corrective saccades were generally generated earlier when visual error information was available, and their latency was related to the size of the error. The LATER (Linear Approach to Threshold with Ergodic Rate) model analysis also showed a comparable small population of short-latency corrective saccades irrespective of target visibility. Finally, we found that, in the absence of visual feedback, the accuracy of corrective saccades across subjects was related to the latency of the primary saccade. Our findings provide new insights into the mechanisms underlying the programming of corrective saccades: 1) the preparation of corrective saccades begins along with the preparation of the primary saccades, 2) the accuracy of corrective saccades depends on the reaction time of the primary saccades, and 3) if visual feedback is available after the initiation of the primary saccade, the prepared correction can be updated. PMID:23891705
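    The LATER model invoked above treats each saccade as a decision signal rising linearly to a threshold at a rate drawn from a normal distribution, so latency = threshold / rate. A minimal simulation under assumed parameter values (threshold, mean rate, and SD below are arbitrary illustrations, not fitted values from the study):

    ```python
    import random

    def later_latencies(n, theta=1.0, mu=6.0, sigma=1.0, seed=0):
        """Simulate LATER: rise rate r ~ N(mu, sigma); latency = theta / r.
        Non-positive rates are rejected (no response on that trial)."""
        rng = random.Random(seed)
        out = []
        while len(out) < n:
            r = rng.gauss(mu, sigma)
            if r > 0:
                out.append(theta / r)
        return out

    lat = later_latencies(10_000)
    median = sorted(lat)[len(lat) // 2]
    print(0.1 < median < 0.25)  # True: median near theta/mu ~ 0.167 s
    ```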

  9. Wavelength and End Correction in a Recorder

    Directory of Open Access Journals (Sweden)

    Shuaihang (Susan) Wang

    2009-01-01

    The wavelength and end correction were investigated as functions of the tube length of a recorder over a range of frequencies. It was found that the period of the sound produced varies linearly with the recorder's tube length, as expected. It was also found that the end correction does not vary with frequency. However, the end correction at the hole was found to be much greater than that at the end of the resonating tube.
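    For an open-open pipe the fundamental satisfies λ = v/f = 2(L + ΔL), so the total end correction falls directly out of a frequency measurement. A minimal sketch with an invented measurement (the speed of sound and readings below are assumptions, not data from the article):

    ```python
    SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C (assumed)

    def end_correction(tube_length_m: float, frequency_hz: float) -> float:
        """Total end correction for an open-open pipe fundamental:
        lambda = 2 * (L + dL)  =>  dL = v / (2 f) - L."""
        return SPEED_OF_SOUND / (2.0 * frequency_hz) - tube_length_m

    # Hypothetical measurement: a 0.30 m tube sounding at 550 Hz
    print(round(end_correction(0.30, 550.0), 4))  # 0.0118
    ```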

  10. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    Science.gov (United States)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and for contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images that provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicular to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.
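    A thickness-based PVE correction of this general kind divides the measured concentration by a recovery coefficient that depends on wall thickness relative to scanner resolution. A sketch under that assumption; the recovery-coefficient table below is hypothetical phantom-style data, not values from the paper:

    ```python
    import numpy as np

    # Hypothetical phantom calibration: recovery coefficient vs wall thickness
    thickness_mm = np.array([5.0, 8.0, 11.0, 14.0, 20.0])
    recovery = np.array([0.45, 0.65, 0.80, 0.90, 0.98])

    def pve_corrected(measured_conc: float, wall_thickness_mm: float) -> float:
        """Divide the measured tracer concentration by the recovery
        coefficient interpolated at the measured wall thickness."""
        rc = float(np.interp(wall_thickness_mm, thickness_mm, recovery))
        return measured_conc / rc

    # Measured 8.0 kBq/mL in an 11 mm wall -> estimated true concentration
    print(round(pve_corrected(8.0, 11.0), 2))  # 10.0
    ```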

  11. The satellite-based remote sensing of particulate matter (PM) in support to urban air quality: PM variability and hot spots within the Cordoba city (Argentina) as revealed by the high-resolution MAIAC-algorithm retrievals applied to a ten-years dataset (2

    Science.gov (United States)

    Della Ceca, Lara Sofia; Carreras, Hebe A.; Lyapustin, Alexei I.; Barnaba, Francesca

    2016-04-01

    Particulate matter (PM) is one of the pollutants most harmful to public health and the environment [1]. In developed countries, specific air-quality legislation establishes limit values for PM metrics (e.g., PM10, PM2.5) to protect citizens' health (e.g., European Commission Directive 2008/50, US Clean Air Act), and extensive PM measuring networks exist in these countries to comply with the legislation. In less developed countries air quality monitoring networks are still lacking, and satellite-based datasets could represent a valid alternative to fill observational gaps. The main PM (or aerosol) parameter retrieved from satellite is the aerosol optical depth (AOD), an optical parameter quantifying the aerosol load in the whole atmospheric column. Datasets from the MODIS sensors on board the NASA TERRA and AQUA spacecraft are among the longest records of AOD from space. However, although extremely useful in regional and global studies, the standard 10 km-resolution MODIS AOD product is not suitable for use at the urban scale. Recently, a new algorithm called Multi-Angle Implementation of Atmospheric Correction (MAIAC) was developed for MODIS, providing AOD at 1 km resolution [2]. In this work, the MAIAC AOD retrievals over the decade 2003-2013 were employed to investigate the spatiotemporal variation of atmospheric aerosols over the Argentinean city of Cordoba and its surroundings, an area where in situ PM data are very scarce. The MAIAC retrievals over the city were first validated against a 'ground truth' AOD dataset from the Cordoba sunphotometer operating within the global AERONET network [3]; this validation showed the good performance of the MAIAC algorithm in the area. The satellite MAIAC AOD dataset was then employed to investigate the 10-year trend as well as the seasonal and monthly patterns of particulate matter in the city of Cordoba. The former showed a marked increase of AOD over time, particularly evident in
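    Validation against a sunphotometer typically reduces to collecting collocated satellite/ground AOD pairs and computing agreement statistics. A generic sketch with synthetic pairs (the values below are invented, not MAIAC or AERONET data):

    ```python
    import math

    def validation_stats(satellite, ground):
        """Correlation, mean bias and RMSE between collocated AOD pairs."""
        n = len(satellite)
        ms = sum(satellite) / n
        mg = sum(ground) / n
        cov = sum((s - ms) * (g - mg) for s, g in zip(satellite, ground))
        vs = sum((s - ms) ** 2 for s in satellite)
        vg = sum((g - mg) ** 2 for g in ground)
        r = cov / math.sqrt(vs * vg)
        bias = ms - mg
        rmse = math.sqrt(sum((s - g) ** 2 for s, g in zip(satellite, ground)) / n)
        return r, bias, rmse

    # Synthetic collocated pairs: satellite AOD vs sunphotometer AOD
    sat = [0.10, 0.15, 0.22, 0.30, 0.41]
    aer = [0.09, 0.16, 0.20, 0.32, 0.40]
    r, bias, rmse = validation_stats(sat, aer)
    print(r > 0.98, abs(bias) < 0.01, rmse < 0.03)  # True True True
    ```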

  12. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  13. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  14. Mixing quantitative with qualitative methods:

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field within Australia, compared with international work. We anticipate sharing submissions and workshop outcomes...

  15. A two-dimensional matrix correction for off-axis portal dose prediction errors

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Daniel W. [Department of Physics, State University of New York at Buffalo, Buffalo, New York 14260 (United States); Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States); Kumaraswamy, Lalith; Bakhtiari, Mohammad [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States); Podgorsak, Matthew B. [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, New York 14263 and Department of Physiology and Biophysics, State University of New York at Buffalo, Buffalo, New York 14214 (United States)

    2013-05-15

    Purpose: This study presents a follow-up to a modified calibration procedure for portal dosimetry published by Bailey et al. ['An effective correction algorithm for off-axis portal dosimetry errors,' Med. Phys. 36, 4089-4094 (2009)]. A commercial portal dose prediction system exhibits disagreement of up to 15% (calibrated units) between measured and predicted images as off-axis distance increases. The previous modified calibration procedure accounts for these off-axis effects in most regions of the detecting surface, but is limited by the simplistic assumption of radial symmetry. Methods: We find that a two-dimensional (2D) matrix correction, applied to each calibrated image, accounts for off-axis prediction errors in all regions of the detecting surface, including those still problematic after the radial correction is performed. The correction matrix is calculated by quantitative comparison of predicted and measured images that span the entire detecting surface. The correction matrix was verified for dose linearity, and its effectiveness was verified on a number of test fields. The 2D correction was employed to retrospectively examine 22 off-axis, asymmetric electronic-compensation breast fields, five intensity-modulated brain fields (moderate-high modulation) manipulated for far off-axis delivery, and 29 intensity-modulated clinical fields of varying complexity in the central portion of the detecting surface. Results: Applying the matrix correction to the off-axis test fields and clinical fields improves predicted vs measured portal dose agreement by up to 15%, producing up to 10% better agreement than the radial correction in some areas of the detecting surface. Gamma evaluation analyses (3 mm, 3% global, 10% dose threshold) of predicted vs measured portal dose images demonstrate pass rate improvement of up to 75% with the matrix correction, producing pass rates that are up to 30% higher than those resulting from the radial correction technique alone.
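A 2D matrix correction of this kind reduces, in its simplest form, to a per-pixel ratio of measured to predicted calibrated dose, later applied elementwise to each calibrated image. A minimal sketch under that assumption (the function names and the ratio form are ours, not the authors' published algorithm):

```python
import numpy as np

def correction_matrix(predicted, measured, eps=1e-6):
    """Per-pixel correction factors from a pair of calibrated images
    spanning the detecting surface: measured / predicted."""
    return measured / np.maximum(predicted, eps)

def apply_correction(image, cmat):
    """Apply the 2D correction matrix elementwise to a calibrated image."""
    return image * cmat
```

Applying the matrix derived from the calibration pair back to the predicted image reproduces the measured image, which is the dose-linearity property the authors verify.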

  16. Publisher Correction: Invisible Trojan-horse attack

    DEFF Research Database (Denmark)

    Sajeed, Shihan; Minshull, Carter; Jain, Nitin

    2017-01-01

    A correction to this article has been published and is linked from the HTML version of this paper. The error has been fixed in the paper.

  17. 21 CFR 120.10 - Corrective actions.

    Science.gov (United States)

    2010-04-01

    ... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.10 Corrective... develop written corrective action plans, which become part of their HACCP plans in accordance with § 120.8... have been trained in accordance with § 120.13, to determine whether modification of the HACCP plan is...

  18. 21 CFR 123.7 - Corrective actions.

    Science.gov (United States)

    2010-04-01

    ... of their HACCP plans in accordance with § 123.6(c)(5), by which they predetermine the corrective... in accordance with § 123.10, to determine whether the HACCP plan needs to be modified to reduce the risk of recurrence of the deviation, and modify the HACCP plan as necessary. (d) All corrective actions...

  19. 9 CFR 417.3 - Corrective actions.

    Science.gov (United States)

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.3 Corrective actions. (a) The written HACCP plan.... The HACCP plan shall describe the corrective action to be taken, and assign responsibility for taking... identified deviation or other unforeseen hazard should be incorporated into the HACCP plan. (c) All...

  20. FISICO: Fast Image SegmentatIon COrrection.

    Directory of Open Access Journals (Sweden)

    Waldo Valenzuela

    Full Text Available In clinical diagnosis, medical image segmentation plays a key role in the analysis of pathological regions. Despite advances in automatic and semi-automatic segmentation techniques, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide faster corrections with fewer interactions, and a user-independent solution to reduce the time frame between image acquisition and diagnosis. We present a new interactive method for correcting image segmentations. Our method provides 3D shape corrections through 2D interactions. This approach enables intuitive and natural correction of 3D segmentation results. The developed method has been implemented in a software tool and has been evaluated for the task of lumbar muscle and knee joint segmentation from MR images. Experimental results show that full segmentation corrections could be performed within an average correction time of 5.5±3.3 minutes and an average of 56.5±33.1 user interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.02 for both anatomies. In addition, for users with different levels of expertise, our method reduces the correction time from 38±19.2 to 6.4±4.3 minutes and the number of interactions from 339±157.1 to 67.7±39.6.
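The Dice coefficient used above to quantify segmentation quality has a standard definition for two binary masks (this is the textbook formula, not code from the FISICO tool):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient of two binary masks:
    2 * |A intersect B| / (|A| + |B|), in [0, 1]."""
    a = np.asarray(a).astype(bool)
    b = np.asarray(b).astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * np.logical_and(a, b).sum() / denom
```

Identical masks score 1.0, disjoint masks 0.0; a value of 0.92, as reported above, indicates close agreement between the corrected and reference segmentations.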

  1. 34 CFR 200.42 - Corrective action.

    Science.gov (United States)

    2010-07-01

    ... Programs Operated by Local Educational Agencies Lea and School Improvement § 200.42 Corrective action. (a... school on— (A) Revising the school improvement plan developed under § 200.41 to address the specific... corrective action; and (B) Implementing the revised improvement plan. (v) Extend for that school the length...

  2. A Hybrid Approach for Correcting Grammatical Errors

    Science.gov (United States)

    Lee, Kiyoung; Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun

    2015-01-01

    This paper presents a hybrid approach for correcting grammatical errors in the sentences uttered by Korean learners of English. The error correction system plays an important role in GenieTutor, which is a dialogue-based English learning system designed to teach English to Korean students. During the talk with GenieTutor, grammatical error…

  3. Correcting Poor Posture without Awareness or Willpower

    Science.gov (United States)

    Wernik, Uri

    2012-01-01

    In this article, a new technique for correcting poor posture is presented. Rather than intentionally increasing awareness or mobilizing willpower to correct posture, this approach offers a game using randomly drawn cards with easy daily assignments. A case using the technique is presented to emphasize the subjective experience of living with poor…

  4. Correction of errors in power measurements

    DEFF Research Database (Denmark)

    Pedersen, Knud Ole Helgesen

    1998-01-01

    Small errors in voltage and current measuring transformers cause inaccuracies in power measurements. In this report correction factors are derived to compensate for such errors.
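The kind of correction factor derived in such a report can be illustrated for a single-phase wattmeter fed through instrument transformers with ratio errors and phase displacements. The sign conventions and function below are our illustrative assumptions, not the report's actual factors:

```python
import math

def corrected_power(p_measured, ratio_err_v, ratio_err_i,
                    phase_err_v, phase_err_i, phi):
    """Compensate a wattmeter reading for instrument-transformer errors:
    fractional ratio errors on the VT and CT, phase displacements (rad)
    of the VT and CT, at load power-factor angle phi (rad). The measured
    power is scaled by both ratio errors and sees a shifted angle."""
    ratio = (1.0 + ratio_err_v) * (1.0 + ratio_err_i)
    angle = math.cos(phi + phase_err_i - phase_err_v) / math.cos(phi)
    return p_measured / (ratio * angle)
```

With zero transformer errors the reading passes through unchanged; a 1% ratio error on each transformer at unity phase error inflates the reading by about 2%, which the correction removes.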

  5. 77 FR 43111 - Indian Gaming; Correction

    Science.gov (United States)

    2012-07-23

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming; Correction AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact; Correction. SUMMARY: The Bureau of Indian...

  6. Radiative corrections to vector boson masses

    NARCIS (Netherlands)

    Veltman, M.J.G.

    1980-01-01

    Weak and e.m. radiative corrections to vector boson masses are computed. Including corrections due to the presently known leptons and quarks, mass shifts of +3080 and +3310 MeV are obtained for the masses of the charged and neutral vector bosons.

  7. Correctional Officers' Attitudes toward Selected Treatment Programs.

    Science.gov (United States)

    Teske, Raymond H. C.; Williamson, Harold E.

    1979-01-01

    Examined the attitudes of a sample of correctional officers toward selected treatment programs. Besides a number of factors which correlated with positive attitudes toward treatment, several factors correlated negatively, including number of years of service and a belief that the primary function of corrections is punishment. (Author)

  8. A refined tip correction based on decambering

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær; Dag, Kaya Onur; Ramos García, Néstor

    2016-01-01

    A new tip correction for use in performance codes based on the blade element momentum (BEM) or the lifting-line technique is presented. The correction modifies the circulation by taking into account the additional influence of the induction of the vortices in the wake, using the so-called decambering...
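The refined correction itself is truncated in this record, but the classical Prandtl tip-loss factor that BEM codes apply, and which corrections of this kind refine, can be written down directly (standard formula; the function name is ours):

```python
import math

def prandtl_tip_loss(r, R, B, phi):
    """Classical Prandtl tip-loss factor F used in BEM codes:
    B blades, local radius r, tip radius R, local inflow angle phi (rad).
    F -> 1 inboard and F -> 0 at the tip, reducing the local loading."""
    f = B * (R - r) / (2.0 * r * math.sin(phi))
    return (2.0 / math.pi) * math.acos(math.exp(-f))
```

Near the tip the factor collapses toward zero, while well inboard it is essentially unity, which is the qualitative behavior any refined tip correction must reproduce.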

  9. Evaluation of Sinus/Edge-Corrected Zero-Echo-Time-Based Attenuation Correction in Brain PET/MRI.

    Science.gov (United States)

    Yang, Jaewon; Wiesinger, Florian; Kaushik, Sandeep; Shanbhag, Dattesh; Hope, Thomas A; Larson, Peder E Z; Seo, Youngho

    2017-11-01

    In brain PET/MRI, the major challenge of zero-echo-time (ZTE)-based attenuation correction (ZTAC) is the misclassification of air/tissue/bone mixtures or their boundaries. Our study aimed to evaluate a sinus/edge-corrected (SEC) ZTAC (ZTACSEC), relative to an uncorrected (UC) ZTAC (ZTACUC) and a CT atlas-based attenuation correction (ATAC). Methods: Whole-body 18F-FDG PET/MRI scans were obtained for 12 patients after PET/CT scans. Only data acquired at a bed station that included the head were used for this study. Using PET data from PET/MRI, we applied ZTACUC, ZTACSEC, ATAC, and reference CT-based attenuation correction (CTAC) to PET attenuation correction. For ZTACUC, the bias-corrected and normalized ZTE was converted to pseudo-CT with air (-1,000 HU for ZTE < 0.2), soft tissue (42 HU for ZTE > 0.75), and bone (-2,000 × [ZTE - 1] + 42 HU for 0.2 ≤ ZTE ≤ 0.75). Afterward, in the pseudo-CT, sinus/edges were automatically estimated as a binary mask through morphologic processing and edge detection. In the binary mask, the overestimated values were rescaled below 42 HU for ZTACSEC. For ATAC, the atlas deformed to MR in-phase was segmented to air, inner air, soft tissue, and continuous bone. For the quantitative evaluation, PET mean uptake values were measured in twenty 1-mL volumes of interest distributed throughout brain tissues. The PET uptake was compared using a paired t test. An error histogram was used to show the distribution of voxel-based PET uptake differences. Results: Compared with CTAC, ZTACSEC achieved overall PET quantification accuracy (0.2% ± 2.4%, P = 0.23) similar to CTAC, in comparison with ZTACUC (5.6% ± 3.5%). ZTACSEC provided PET quantification in brain PET/MRI comparable to the accuracy achieved by CTAC, particularly in the cerebellum. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
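The piecewise ZTE-to-HU mapping quoted in the abstract can be transcribed directly. The placement of the air and soft-tissue thresholds is as we read the (partly garbled) abstract text and should be checked against the paper:

```python
import numpy as np

def zte_to_pseudo_ct(zte):
    """Piecewise mapping of bias-corrected, normalized ZTE intensity to
    pseudo-CT Hounsfield units: air, soft tissue, and a linear bone ramp."""
    zte = np.asarray(zte, dtype=float)
    hu = np.empty_like(zte)
    hu[zte < 0.2] = -1000.0                       # air
    hu[zte > 0.75] = 42.0                         # soft tissue
    bone = (zte >= 0.2) & (zte <= 0.75)
    hu[bone] = -2000.0 * (zte[bone] - 1.0) + 42.0  # bone ramp
    return hu
```

Note that the bone ramp assigns higher HU to lower ZTE within its range (e.g. ZTE = 0.5 maps to 1,042 HU), which is why overestimated sinus/edge voxels need the SEC rescaling step described above.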

  10. Quantitative Peptidomics with Five-plex Reductive Methylation labels

    Science.gov (United States)

    Tashima, Alexandre K.; Fricker, Lloyd D.

    2017-12-01

    Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing for quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of formaldehyde and cyanoborohydride reagents, allowing for five-plex quantitation. However, the mass difference between each of these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.
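The study's correction equations are not reproduced in the abstract, but the underlying idea can be sketched: predict each label's natural 13C envelope from a binomial distribution, then strip its tail out of the heavier 1-Da-spaced channels by forward substitution. Function names and the sequential form below are our illustration, not the authors' equations:

```python
from math import comb

def carbon13_envelope(n_carbons, p13=0.0107, n_peaks=5):
    """Relative intensities of M, M+1, ... from natural 13C abundance
    (binomial in the number of carbons), normalized to the M peak."""
    probs = [comb(n_carbons, k) * p13**k * (1 - p13)**(n_carbons - k)
             for k in range(n_peaks)]
    return [p / probs[0] for p in probs]

def correct_overlap(observed, envelope):
    """Forward substitution: observed[j] = sum_i true[i]*envelope[j-i],
    so subtract each lighter label's isotope tail from heavier channels
    (channel i holds the monoisotopic peak of five-plex label i)."""
    corrected = list(observed)
    for i in range(len(corrected)):
        for j in range(i + 1, len(corrected)):
            corrected[j] -= corrected[i] * envelope[j - i]
    return corrected
```

For a peptide with ~50 carbons the M+1 peak is roughly half the M peak, so without this correction a heavier label's channel can be badly inflated by its lighter neighbor.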

  11. NLO Corrections to the Photon Impact Factor: Combining Real and Virtual Corrections

    OpenAIRE

    Bartels, J; Colferai, D.; Gieseke, Stefan; Kyrieleis, A.

    2002-01-01

    In this third part of our calculation of the QCD NLO corrections to the photon impact factor we combine our previous results for the real corrections with the singular pieces of the virtual corrections and present finite analytic expressions for the quark-antiquark-gluon intermediate state inside the photon impact factor. We begin with a list of the infrared singular pieces of the virtual correction, obtained in the first step of our program. We then list the complete result...

  12. Energy shadowing correction of ultrasonic pulse-echo records by digital signal processing

    Science.gov (United States)

    Kishoni, D.; Heyman, J. S.

    1986-01-01

    Attention is given to a numerical algorithm that, via signal processing, enables dynamic correction of the shadowing effect of reflections on ultrasonic displays. The algorithm was applied to experimental data from graphite-epoxy composite material immersed in a water bath. It is concluded that images of material defects with shadowing corrections applied allow a more quantitative interpretation of the material state. It is noted that the proposed algorithm is fast and simple enough to be adopted for real-time applications in industry.
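A toy version of such a shadowing correction (our own sketch, not the authors' algorithm): treat each reflection as removing the square of its corrected amplitude from the propagating energy, and rescale later echoes by the square root of the energy fraction still transmitted:

```python
def shadowing_correct(echoes):
    """Dynamic shadowing correction sketch for a pulse-echo A-scan:
    a reflector of corrected amplitude r removes a fraction r^2 of the
    incident energy, so each later echo is divided by the square root of
    the accumulated transmitted-energy fraction."""
    corrected = []
    transmitted = 1.0  # fraction of the incident energy still propagating
    for a in echoes:
        r = a / transmitted ** 0.5  # amplitude scales with sqrt(energy)
        corrected.append(r)
        transmitted *= max(1.0 - r * r, 1e-6)  # energy removed by this echo
    return corrected
```

Two reflectors of equal true reflectivity produce a weaker second echo in the raw record; the correction restores them to equal amplitude, as required for quantitative interpretation.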

  13. Non rigid respiratory motion correction in whole body PET/MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Fayad, Hadi [INSERM UMR1101, LaTIM, Brest (France); Schmidt, Holger [Université de Bretagne Occidentale, Brest (France); Wuerslin, Christian [University Hospital of Tübingen (Germany); Visvikis, Dimitris [INSERM UMR1101, LaTIM, Brest (France)

    2014-07-29

    Respiratory motion in PET/MR imaging leads to reduced quantitative and qualitative image accuracy. Correction methodologies include the use of respiratory-synchronized gated frames, which lead to a low signal-to-noise ratio (SNR) given that each frame contains only part of the counts available throughout an average PET acquisition. In this work, 4D-MRI-extracted elastic transformations were applied to list-mode data either inside the image reconstruction or to the reconstructed respiratory-synchronized images to obtain respiration-corrected PET images.
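The gist of the approach can be illustrated with integer shifts standing in for the 4D-MRI elastic transforms: warp each respiratory gate back to a reference position and sum, so the corrected image keeps the counts of all gates instead of one. This is a deliberate simplification; the actual method applies non-rigid warps to list-mode data or gated reconstructions:

```python
import numpy as np

def motion_corrected_sum(gated_frames, shifts):
    """Toy stand-in for MR-derived motion correction: realign each
    respiratory gate to the reference position (integer shifts instead
    of elastic transforms) and sum, recovering all acquired counts."""
    ref = np.zeros_like(gated_frames[0], dtype=float)
    for frame, s in zip(gated_frames, shifts):
        ref += np.roll(frame, -s, axis=0)
    return ref
```

A point source blurred across four gates is restored to a single voxel carrying the counts of all four, which is exactly the SNR argument made above.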

  14. On the correctness of load loss factor

    African Journals Online (AJOL)

    eobe

    Author Tel: +44-753-526-7242. TECHNICAL NOTE: ON THE CORRECTNESS OF LOAD LOSS FACTOR. A. O. Ekwue*. JACOBS ENGINEERING INC/BRUNEL UNIVERSITY LONDON, UNITED KINGDOM.

  15. Corrective Action Plan for Corrective Action Unit 424: Area 3 Landfill Complex, Tonopah Test Range, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Bechtel Nevada

    1998-08-31

    This corrective action plan provides the closure implementation methods for the Area 3 Landfill Complex, Corrective Action Unit (CAU) 424, located at the Tonopah Test Range. The Area 3 Landfill Complex consists of 8 landfill sites, each designated as a separate corrective action site.

  16. Satellite-Based Surface Heat Budgets and Sea Surface Temperature Tendency in the Tropical Eastern Indian and Western Pacific Oceans for the 1997/98 El Nino and 1998/99 La Nina

    Science.gov (United States)

    Chou, Shu-Hsien; Chou, Ming-Dah; Chan, Pui-King; Lin, Po-Hsiung

    2002-01-01

    The 1997/98 event was a strong El Nino warm event, while 1998/99 was a moderate La Nina cold event. We have investigated surface heat budgets and sea surface temperature (SST) tendency for these two events in the tropical western Pacific and eastern Indian Oceans using satellite-retrieved surface radiative and turbulent fluxes. The radiative fluxes are taken from the Goddard Satellite-retrieved Surface Radiation Budget (GSSRB), derived from radiance measurements of the Japanese Geostationary Meteorological Satellite 5. The GSSRB covers the domain 40 deg S - 4 deg N, 90 deg E-17 deg W and a period from October 1997 to December 2000. The spatial resolution is 0.5 deg x 0.5 deg lat-long and the temporal resolution is 1 day. The turbulent fluxes are taken from Version 2 of the Goddard Satellite-based Surface Turbulent Fluxes (GSSTF-2). The GSSTF-2 has a spatial resolution of 1 deg x 1 deg lat-long over the global oceans and a temporal resolution of 1 day covering the period July 1987-December 2000. Daily turbulent fluxes are derived from the SSM/I (Special Sensor Microwave/Imager) surface wind and surface air humidity, and the SST and 2-m air temperature of the NCEP/NCAR reanalysis, using a stability-dependent bulk flux algorithm. The changes of surface heat budgets, SST and its tendency, cloudiness, wind speed, and zonal wind stress of the 1997/98 El Nino relative to the 1998/99 La Nina for the northern winter and spring seasons are analyzed. The relative changes of surface heat budgets and SST tendency of the two events are quite different between the tropical eastern Indian and western Pacific Oceans. For the tropical western Pacific, reduced solar heating (more clouds) is generally associated with decreased evaporative cooling (weaker winds), and vice versa. The changes in evaporative cooling over-compensate those in solar heating and dominate the spatial variability of the changes in net surface heating. Both solar heating and evaporative cooling offset each other to reduce
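The link between a net surface heat budget and SST tendency can be illustrated with a slab mixed-layer model (our illustrative sketch, not the study's analysis, which accounts for ocean dynamics as well):

```python
def sst_tendency(sw, lw, lh, sh, mixed_layer_depth,
                 rho=1025.0, cp=3990.0):
    """Mixed-layer SST tendency (K/s) from surface heat budget terms
    (W/m^2): net heating is solar input minus longwave, latent, and
    sensible losses, distributed over a slab of given depth (m) with
    seawater density rho (kg/m^3) and specific heat cp (J/kg/K)."""
    q_net = sw - lw - lh - sh
    return q_net / (rho * cp * mixed_layer_depth)
```

A net heating of 100 W/m^2 over a 50-m mixed layer warms the SST by roughly 0.04 K per day, which sets the scale for how the competing solar-heating and evaporative-cooling changes discussed above translate into SST tendency.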

  17. Immediate error correction process following sleep deprivation.

    Science.gov (United States)

    Hsieh, Shulan; Cheng, I-Chen; Tsai, Ling-Ling

    2007-06-01

    Previous studies have suggested that one night of sleep deprivation decreases frontal lobe metabolic activity, particularly in the anterior cingulate cortex (ACC), resulting in decreased performance in various executive function tasks. This study thus attempted to address whether sleep deprivation impaired the executive function of error detection and error correction. Sixteen young healthy college students (seven women, nine men, with ages ranging from 18 to 23 years) participated in this study. Participants performed a modified letter flanker task and were instructed to make immediate error corrections on detecting performance errors. Event-related potentials (ERPs) during the flanker task were obtained using a within-subject, repeated-measure design. The error negativity or error-related negativity (Ne/ERN) and the error positivity (Pe) seen immediately after errors were analyzed. The results show that the amplitude of the Ne/ERN was reduced significantly following sleep deprivation. Reduction also occurred for error trials with subsequent correction, indicating that sleep deprivation influenced error correction ability. This study further demonstrated that the impairment in immediate error correction following sleep deprivation was confined to specific stimulus types, with both Ne/ERN and behavioral correction rates being reduced only for trials in which flanker stimuli were incongruent with the target stimulus, while the response to the target was compatible with that of the flanker stimuli following sleep deprivation. The results thus warrant future systematic investigation of the interaction between stimulus type and error correction following sleep deprivation.

  18. Nonperturbative QCD corrections to electroweak observables

    Energy Technology Data Exchange (ETDEWEB)

    Renner, Dru B. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Feng, Xu [High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki (Japan); Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Petschlies, Marcus [The Cyprus Institute, Nicosia (Cyprus)

    2012-06-15

    Nonperturbative QCD corrections are important to many low-energy electroweak observables, for example the muon magnetic moment. However, hadronic corrections also play a significant role at much higher energies due to their impact on the running of standard model parameters, such as the electromagnetic coupling. Currently, these hadronic contributions are accounted for by a combination of experimental measurements, effective field theory techniques and phenomenological modeling but ideally should be calculated from first principles. Recent developments indicate that many of the most important hadronic corrections may be feasibly calculated using lattice QCD methods. To illustrate this, we examine the lattice computation of the leading-order QCD corrections to the muon magnetic moment, paying particular attention to a recently developed method but also reviewing the results from other calculations. We then continue with several examples that demonstrate the potential impact of the new approach: the leading-order corrections to the electron and tau magnetic moments, the running of the electromagnetic coupling, and a class of the next-to-leading-order corrections for the muon magnetic moment. Along the way, we mention applications to the Adler function, which can be used to determine the strong coupling constant, and QCD corrections to muonic-hydrogen.

  19. English Learners Perception on Lecturers’ Corrective Feedback

    Directory of Open Access Journals (Sweden)

    Titien Fatmawaty Mohammad

    2016-04-01

    Full Text Available The importance of written corrective feedback (CF) has been an issue of substantial debate in the literature, and this controversy has led recent studies to draw on foreign language acquisition (FLA) research to better understand the complexities of the issue, particularly how students and teachers perceive the effectiveness of written corrective feedback. This research focused on students' perceptions of lecturers' corrective feedback, the usefulness they attribute to different types of corrective feedback, and the reasons for their preferences. Qualitative data were collected from 40 EFL students in the 6th semester by means of written questionnaires, interviews, and observation. Four feedback strategies were employed in this research, and students ranked each statement on a five-point Likert scale. Findings showed that almost all students (81.43%) want correction or feedback from lecturers on the mistakes in their writing. Regarding the type of written corrective feedback, students prefer lecturers to mark their mistakes and comment on their work: 93% of students found that giving clues or comments about how to fix errors can improve their writing ability, 76.69% found error identification the most useful type of feedback, and 57.50% had a positive opinion of correction accompanied by comment. These preferences are supported by students' explanations in an open-ended question of the questionnaire. Pedagogical implications of the study are also discussed.

  20. Nonperturbative QCD corrections to electroweak observables

    Energy Technology Data Exchange (ETDEWEB)

    Dru B Renner, Xu Feng, Karl Jansen, Marcus Petschlies

    2011-12-01

    Nonperturbative QCD corrections are important to many low-energy electroweak observables, for example the muon magnetic moment. However, hadronic corrections also play a significant role at much higher energies due to their impact on the running of standard model parameters, such as the electromagnetic coupling. Currently, these hadronic contributions are accounted for by a combination of experimental measurements and phenomenological modeling but ideally should be calculated from first principles. Recent developments indicate that many of the most important hadronic corrections may be feasibly calculated using lattice QCD methods. To illustrate this, we will examine the lattice computation of the leading-order QCD corrections to the muon magnetic moment, paying particular attention to a recently developed method but also reviewing the results from other calculations. We will then continue with several examples that demonstrate the potential impact of the new approach: the leading-order corrections to the electron and tau magnetic moments, the running of the electromagnetic coupling, and a class of the next-to-leading-order corrections for the muon magnetic moment. Along the way, we will mention applications to the Adler function, the determination of the strong coupling constant and QCD corrections to muonic-hydrogen.