WorldWideScience

Sample records for high-resolution geostatistical estimate

  1. Industrial experience feedback of a geostatistical estimation of contaminated soil volumes - 59181

    International Nuclear Information System (INIS)

    Faucheux, Claire; Jeannee, Nicolas

    2012-01-01

    Geostatistics is attracting growing interest for remediation forecasting at potentially contaminated sites, by providing methods adapted to chemical and radiological pollution mapping, to the estimation of contaminated volumes (potentially integrating auxiliary information), and to the design of adaptive sampling strategies. As part of demonstration studies carried out for GeoSiPol (Geostatistics for Polluted Sites), geostatistics has been applied to the detailed diagnosis of a former oil depot in France. The ability of the geostatistical framework to generate pessimistic / probable / optimistic scenarios for the contaminated volumes allows quantification of the risks associated with the remediation process: e.g. the financial risk of excavating clean soil and the sanitary risk of leaving contaminated soil in place. After a first mapping, an iterative approach leads to the collection of additional samples in areas previously identified as highly uncertain. Estimated volumes are then updated and compared with the volumes actually excavated. This benchmarking provides practical feedback on the performance of the geostatistical methodology. (authors)
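
    The pessimistic / probable / optimistic volume scenarios described above are typically read off the distribution of contaminated volumes across conditional simulations. A minimal sketch of that step, with purely hypothetical grid dimensions, threshold and cell volume (none of which come from the GeoSiPol study):

```python
import numpy as np

# Hypothetical inputs: 200 conditional simulations of contaminant
# concentration (mg/kg) on a 50 x 40 grid of remediation cells,
# each cell representing e.g. 2 m x 2 m x 0.3 m of soil.
rng = np.random.default_rng(0)
simulations = rng.lognormal(mean=3.0, sigma=1.0, size=(200, 50, 40))

threshold = 50.0           # remediation threshold (mg/kg), assumed
cell_volume = 2 * 2 * 0.3  # m3 per cell, assumed

# Contaminated volume for each realization: count of cells above threshold.
volumes = (simulations > threshold).sum(axis=(1, 2)) * cell_volume

# Optimistic / probable / pessimistic scenarios as low / median / high quantiles.
optimistic, probable, pessimistic = np.quantile(volumes, [0.1, 0.5, 0.9])
print(f"volume to excavate: {optimistic:.0f} / {probable:.0f} / {pessimistic:.0f} m3")
```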

  2. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    Science.gov (United States)

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression kriging (RK) and ordinary kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models tended to fit better more often and to fit worse less often. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their

  3. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

    International Nuclear Information System (INIS)

    Rosen, L.; Gustafson, G.

    1996-01-01

    A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric, which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400-500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden.
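
    The Bayesian updating step described above can be illustrated with a toy calculation: a prior probability for each lithology class at a location is revised by the likelihood of a new observation. The classes, prior values and likelihoods below are hypothetical and are not taken from the Aespoe study:

```python
import numpy as np

# Hypothetical prior probabilities for three lithology classes at a location.
classes = ["granite", "diorite", "fracture zone"]
prior = np.array([0.6, 0.3, 0.1])

# Hypothetical likelihood of the new borehole observation under each class.
likelihood = np.array([0.2, 0.5, 0.9])

# Bayes' rule: posterior proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

for c, p in zip(classes, posterior):
    print(f"P({c} | data) = {p:.3f}")
```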

  4. Introduction to Geostatistics

    Science.gov (United States)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.

  5. Comparative study of the geostatistical ore reserve estimation method over the conventional methods

    International Nuclear Information System (INIS)

    Kim, Y.C.; Knudsen, H.P.

    1975-01-01

    Part I contains a comprehensive treatment of the comparative study of the geostatistical ore reserve estimation method over the conventional methods. The conventional methods chosen for comparison were: (a) the polygon method, (b) the inverse of the distance squared method, and (c) a method similar to (b) but allowing different weights in different directions. Briefly, the overall result from this comparative study is in favor of the use of geostatistics in most cases because the method has lived up to its theoretical claims. A good exposition on the theory of geostatistics, the adopted study procedures, conclusions and recommended future research are given in Part I. Part II of this report contains the results of the second and the third study objectives, which are to assess the potential benefits that can be derived by the introduction of the geostatistical method to the current state-of-the-art in uranium reserve estimation method and to be instrumental in generating the acceptance of the new method by practitioners through illustrative examples, assuming its superiority and practicality. These are given in the form of illustrative examples on the use of geostatistics and the accompanying computer program user's guide
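
    One of the conventional methods compared above, the inverse of the distance squared method, reduces to a short weighted average. A minimal sketch with hypothetical drill-hole coordinates and grades (power=2 reproduces the inverse-distance-squared case):

```python
import numpy as np

def idw_estimate(xy_samples, grades, xy_target, power=2.0, eps=1e-12):
    """Inverse distance weighted estimate of a grade at a target point."""
    d = np.linalg.norm(xy_samples - xy_target, axis=1)
    if np.any(d < eps):                 # target coincides with a sample
        return grades[np.argmin(d)]
    w = 1.0 / d**power                  # power=2: inverse of the distance squared
    return np.sum(w * grades) / np.sum(w)

# Hypothetical drill-hole collars (x, y) and uranium grades (% U3O8).
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
grade = np.array([0.05, 0.08, 0.04, 0.10])
print(idw_estimate(xy, grade, np.array([4.0, 6.0])))
```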

  6. Estimating Rainfall in Rodrigues by Geostatistics: (A) Theory | Proag ...

    African Journals Online (AJOL)

    This paper introduces the geostatistical method. Originally devised to treat problems that arise when conventional statistical theory is used in estimating changes in ore grade within a mine, it is, however, an abstract theory of statistical behaviour that is applicable to many circumstances in different areas of geology and other ...

  7. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    Science.gov (United States)

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.

  8. A geostatistical estimation of zinc grade in bore-core samples

    International Nuclear Information System (INIS)

    Starzec, A.

    1987-01-01

    Possibilities and preliminary results of geostatistical interpretation of the XRF determination of zinc in bore-core samples are considered. For the spherical variogram model, the estimation variance of the grade in a disk-shaped sample (estimated from the grade measured along its circumference) is calculated. Variograms of zinc grade in core samples are presented and examples of the grade estimation are discussed. 4 refs., 7 figs., 1 tab. (author)

  9. Geostatistical characterization of the Callovo-Oxfordian clay variability: from conventional and high resolution log data

    International Nuclear Information System (INIS)

    Lefranc, Marie

    2007-01-01

    Andra (National Radioactive Waste Management Agency) has conducted studies in its Meuse/Haute-Marne Underground Research Laboratory, located at a depth of about 490 m in a 155-million-year-old argillaceous rock: the Callovo-Oxfordian argillite. The purpose of the present work is to obtain as much information as possible from high-resolution log data and to optimize their analysis in order to specify and characterize the space-time variations of the argillites at the Meuse/Haute-Marne site, and subsequently to predict the evolution of argillite properties over a 250 km² zone around the underground laboratory (the transposition zone). The aim is to outline a methodology to transform depth intervals into geological time intervals and thus to quantify precisely the variation of the sedimentation rate and to estimate durations, for example the duration of biostratigraphic units or of hiatuses. The latter point is particularly important because continuous time recording is often assumed in geological modelling. The spatial variations can be studied on various scales. First, well-to-well correlations are established between seven wells at different scales. Relative variations of the thickness are observed locally. Second, FMI (Full-bore Formation Micro-Imager, Schlumberger) data are studied in detail to extract as much information as possible. For example, the analysis of FMI images reveals a clear carbonate-clay interbedding which displays cycles. Third, geostatistical tools are used to study these cycles. The variographic analysis of conventional log data shows one-metre cycles. With FMI data, smaller periods can be detected. Variogram modelling and factorial kriging analysis suggest that three spatial periods exist. They vary vertically and laterally in the boreholes, but the cycle ratios are stable and similar to orbital-cycle ratios (Milankovitch cycles). The three periods correspond to eccentricity, obliquity and precession. Since the duration of these orbital cycles is known, depth intervals can

  10. MoisturEC: an R application for geostatistical estimation of moisture content from electrical conductivity data

    Science.gov (United States)

    Terry, N.; Day-Lewis, F. D.; Werkema, D. D.; Lane, J. W., Jr.

    2017-12-01

    Soil moisture is a critical parameter for agriculture, water supply, and management of landfills. Whereas direct data (as from TDR or soil moisture probes) provide localized point scale information, it is often more desirable to produce 2D and/or 3D estimates of soil moisture from noninvasive measurements. To this end, geophysical methods for indirectly assessing soil moisture have great potential, yet are limited in terms of quantitative interpretation due to uncertainty in petrophysical transformations and inherent limitations in resolution. Simple tools to produce soil moisture estimates from geophysical data are lacking. We present a new standalone program, MoisturEC, for estimating moisture content distributions from electrical conductivity data. The program uses an indicator kriging method within a geostatistical framework to incorporate hard data (as from moisture probes) and soft data (as from electrical resistivity imaging or electromagnetic induction) to produce estimates of moisture content and uncertainty. The program features data visualization and output options as well as a module for calibrating electrical conductivity with moisture content to improve estimates. The user-friendly program is written in R - a widely used, cross-platform, open source programming language that lends itself to further development and customization. We demonstrate use of the program with a numerical experiment as well as a controlled field irrigation experiment. Results produced from the combined geostatistical framework of MoisturEC show improved estimates of moisture content compared to those generated from individual datasets. This application provides a convenient and efficient means for integrating various data types and has broad utility to soil moisture monitoring in landfills, agriculture, and other problems.
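
    Indicator kriging, as used in MoisturEC, first recodes both hard and soft data as indicators relative to moisture-content thresholds. A minimal sketch of that coding step, with hypothetical thresholds and a Gaussian error model assumed for the soft data (not necessarily the transform used by MoisturEC):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical volumetric moisture-content thresholds defining the indicator classes.
thresholds = np.array([0.10, 0.20, 0.30])

def hard_indicator(theta, thresholds):
    """Hard data (e.g. a moisture probe reading): indicator is exactly 0 or 1."""
    return (theta <= thresholds).astype(float)

def soft_indicator(theta_est, theta_sd, thresholds):
    """Soft data (e.g. moisture inferred from electrical conductivity):
    probability of lying below each threshold, assuming a Gaussian error."""
    return norm.cdf((thresholds - theta_est) / theta_sd)

print(hard_indicator(0.18, thresholds))        # -> [0. 1. 1.]
print(soft_indicator(0.18, 0.05, thresholds))  # probabilities between 0 and 1
```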

  11. Application of geostatistics in Beach Placer

    International Nuclear Information System (INIS)

    Sundar, G.

    2016-01-01

    The goal of geostatistics is the prediction of the possible spatial distribution of a property. The application of geostatistics has gained significance in exploration, evaluation and mining. In the case of beach and inland placer sand exploration, geostatistics can be used to optimise drill-hole spacing, estimate total heavy mineral (THM) resources, produce estimates on different grid patterns and derive grade-tonnage curves. The steps involved in a geostatistical study are exploratory data analysis, creation of the experimental variogram, variogram model fitting, kriging and cross validation. The basic tools of geostatistics are the variogram and kriging. The characteristics of a variogram are its sill, range and nugget, and a variogram model must be fitted prior to kriging. Commonly used variogram models are the spherical, exponential and Gaussian models.
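
    The three variogram models named above have simple closed forms. A minimal sketch of the spherical, exponential and Gaussian models (GSLIB-style practical-range scaling assumed; parameter values are illustrative only):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical model: reaches the sill exactly at the range a."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    g = np.where(h < a, g, sill)
    return np.where(h > 0.0, g, 0.0)   # gamma(0) = 0 by definition

def exponential(h, nugget, sill, a):
    """Exponential model: approaches the sill asymptotically (practical range a)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / a))
    return np.where(h > 0.0, g, 0.0)

def gaussian(h, nugget, sill, a):
    """Gaussian model: parabolic behaviour near the origin (practical range a)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / a) ** 2))
    return np.where(h > 0.0, g, 0.0)

lags = np.linspace(0.0, 150.0, 7)
print(spherical(lags, nugget=0.1, sill=1.0, a=100.0))
```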

  12. Geostatistical risk estimation at waste disposal sites in the presence of hot spots

    International Nuclear Information System (INIS)

    Komnitsas, Kostas; Modis, Kostas

    2009-01-01

    The present paper aims to estimate risk by using geostatistics at the wider coal mining/waste disposal site of Belkovskaya, Tula region, in Russia. In this area the presence of hot spots causes a spatial trend in the mean value of the random field and a non-Gaussian data distribution. Prior to application of geostatistics, subtraction of trend and appropriate smoothing and transformation of the data into a Gaussian form were carried out; risk maps were then generated for the wider study area in order to assess the probability of exceeding risk thresholds. Finally, the present paper discusses the need for homogenization of soil risk thresholds regarding hazardous elements that will enhance reliability of risk estimation and enable application of appropriate rehabilitation actions in contaminated areas.

  13. A Reduced-Order Successive Linear Estimator for Geostatistical Inversion and its Application in Hydraulic Tomography

    Science.gov (United States)

    Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Zeng, Wenzhi; Zhang, Yonggen; Sun, Fangqiang; Shi, Liangsheng

    2018-03-01

    Hydraulic tomography (HT) is a recently developed technology for characterizing high-resolution, site-specific heterogeneity using hydraulic data (nd) from a series of cross-hole pumping tests. To properly account for the subsurface heterogeneity and to flexibly incorporate additional information, geostatistical inverse models, which permit a large number of spatially correlated unknowns (ny), are frequently used to interpret the collected data. However, the memory storage requirements for the covariance of the unknowns (ny × ny) in these models are prodigious for large-scale 3-D problems. Moreover, the sensitivity evaluation is often computationally intensive using the traditional difference method (ny forward runs). Although employment of the adjoint method can reduce the cost to nd forward runs, the adjoint model requires an intrusive coding effort. In order to resolve these issues, this paper presents a Reduced-Order Successive Linear Estimator (ROSLE) for analyzing HT data. This new estimator approximates the covariance of the unknowns using a Karhunen-Loeve Expansion (KLE) truncated to nkl order, and it calculates the directional sensitivities (in the directions of the nkl eigenvectors) to form the covariance and cross-covariance used in the Successive Linear Estimator (SLE). In addition, the covariance of the unknowns is updated at every iteration by updating the eigenvalues and eigenfunctions. The computational advantages of the proposed algorithm are demonstrated through numerical experiments and a 3-D transient HT analysis of data from a highly heterogeneous field site.
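
    The covariance reduction underlying the ROSLE can be illustrated with a truncated Karhunen-Loeve expansion of a covariance matrix: only the nkl leading eigenpairs are retained. The grid size, exponential covariance model and truncation level below are hypothetical, not those of the field site:

```python
import numpy as np

# Hypothetical 1-D grid of ny unknowns with an exponential covariance model.
ny, variance, corr_len = 500, 1.0, 20.0
x = np.arange(ny, dtype=float)
cov = variance * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Karhunen-Loeve expansion: keep the nkl leading eigenpairs.
eigvals, eigvecs = np.linalg.eigh(cov)          # returned in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

nkl = 50
frac = eigvals[:nkl].sum() / eigvals.sum()
print(f"{nkl} modes capture {frac:.1%} of the total variance")

# Low-rank approximation used in place of the full ny x ny covariance matrix.
cov_kle = eigvecs[:, :nkl] @ np.diag(eigvals[:nkl]) @ eigvecs[:, :nkl].T
print("max abs error:", np.abs(cov - cov_kle).max())
```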

  14. Use of geostatistics in high level radioactive waste repository site characterization

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G. [Pacific Northwest Laboratory, Richland, WA (USA)]

    1980-09-01

    In evaluating and characterizing sites that are candidates for use as repositories for high-level radioactive waste, there is an increasing need to estimate the uncertainty in hydrogeologic data and in the quantities calculated from them. This paper discusses the use of geostatistical techniques to estimate hydrogeologic surfaces, such as the top of a basalt formation, and to provide a measure of the uncertainty in that estimate. Maps of the uncertainty estimate, called the kriging error, can be used to evaluate where new data should be taken to effect the greatest reduction in uncertainty in the estimated surface. The methods are illustrated on a set of site-characterization data: the top-of-basalt elevations at the Hanford Site near Richland, Washington.

  15. Evaluation of geostatistical parameters based on well tests

    Energy Technology Data Exchange (ETDEWEB)

    Gauthier, Y.

    1997-10-20

    Geostatistical tools are increasingly used to model permeability fields in subsurface reservoirs, which are treated as realizations of a random function characterized by several geostatistical parameters such as the variance and the correlation length. The first part of the thesis is devoted to the relations between the transient well pressure (the well test) and the stochastic permeability field, using the apparent permeability concept. The well test performs a moving average of permeability over larger and larger volumes as time increases. In the second part, the geostatistical parameters are evaluated from well test data; a Bayesian framework is used and the parameters are estimated by the maximum likelihood principle, maximizing the probability density function of the well test data with respect to these parameters. This method, which relies on a fast well test evaluation, provides estimates of the correlation length and the variance over different realizations of a two-dimensional permeability field

  16. Deriving temporally continuous soil moisture estimations at fine resolution by downscaling remotely sensed product

    Science.gov (United States)

    Jin, Yan; Ge, Yong; Wang, Jianghao; Heuvelink, Gerard B. M.

    2018-06-01

    Land surface soil moisture (SSM) has important roles in the energy balance of the land surface and in the water cycle. Downscaling of coarse-resolution SSM remote sensing products is an efficient way for producing fine-resolution data. However, the downscaling methods used most widely require full-coverage visible/infrared satellite data as ancillary information. These methods are restricted to cloud-free days, making them unsuitable for continuous monitoring. The purpose of this study is to overcome this limitation to obtain temporally continuous fine-resolution SSM estimations. The local spatial heterogeneities of SSM and multiscale ancillary variables were considered in the downscaling process both to solve the problem of the strong variability of SSM and to benefit from the fusion of ancillary information. The generation of continuous downscaled remote sensing data was achieved via two principal steps. For cloud-free days, a stepwise hybrid geostatistical downscaling approach, based on geographically weighted area-to-area regression kriging (GWATARK), was employed by combining multiscale ancillary variables with passive microwave remote sensing data. Then, the GWATARK-estimated SSM and China Soil Moisture Dataset from Microwave Data Assimilation SSM data were combined to estimate fine-resolution data for cloudy days. The developed methodology was validated by application to the 25-km resolution daily AMSR-E SSM product to produce continuous SSM estimations at 1-km resolution over the Tibetan Plateau. In comparison with ground-based observations, the downscaled estimations showed correlation (R ≥ 0.7) for both ascending and descending overpasses. The analysis indicated the high potential of the proposed approach for producing a temporally continuous SSM product at fine spatial resolution.

  17. High Spatio-Temporal Resolution Bathymetry Estimation and Morphology

    Science.gov (United States)

    Bergsma, E. W. J.; Conley, D. C.; Davidson, M. A.; O'Hare, T. J.

    2015-12-01

    In recent years, bathymetry estimates using video images have become increasingly accurate. With the cBathy code (Holman et al., 2013) fully operational, bathymetry results with 0.5 m accuracy have been regularly obtained at Duck, USA. cBathy is based on observations of the dominant frequencies and wavelengths of surface wave motions and estimates the depth (and hence allows inference of bathymetry profiles) from linear wave theory. Despite the good performance at Duck, large discrepancies were found related to tidal elevation and camera height (Bergsma et al., 2014) and at the camera boundaries. A tide-dependent floating pixel and a camera-boundary solution have been proposed to overcome these issues (Bergsma et al., under review). The video-data collection is set up to estimate depths hourly on a grid with a resolution on the order of 10x25 metres. Here, the application of cBathy at Porthtowan in the South-West of England is presented. Hourly depth estimates are combined and analysed over a period of 1.5 years (2013-2014). In this work the focus is on the sub-tidal region, where the best cBathy results are achieved. The morphology of the sub-tidal bar is tracked with high spatio-temporal resolution on short and longer time scales. Furthermore, the impact of storms and the reset (sudden and large changes in bathymetry) of the sub-tidal area is clearly captured by the depth estimations. This application shows that the high spatio-temporal resolution of cBathy makes it a powerful tool for coastal research and coastal zone management.
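
    The depth inversion at the core of cBathy rests on the linear dispersion relation, omega^2 = g k tanh(k h), solved for the depth h given an observed wave frequency and wavenumber. A minimal sketch with a hypothetical observation (the actual cBathy implementation aggregates many frequency-wavenumber pairs per grid cell):

```python
import numpy as np
from scipy.optimize import brentq

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_dispersion(frequency_hz, wavenumber):
    """Invert the linear dispersion relation w^2 = g k tanh(k h) for depth h."""
    omega = 2.0 * np.pi * frequency_hz
    f = lambda h: G * wavenumber * np.tanh(wavenumber * h) - omega**2
    # Bracket between very shallow and very deep water and solve numerically.
    return brentq(f, 1e-3, 1e3)

# Hypothetical observation: a 10 s swell with a 70 m wavelength.
k = 2.0 * np.pi / 70.0
print(f"estimated depth: {depth_from_dispersion(0.1, k):.2f} m")
```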

  18. The application of geostatistics in erosion hazard mapping

    NARCIS (Netherlands)

    Beurden, S.A.H.A. van; Riezebos, H.Th.

    1988-01-01

    Geostatistical interpolation or kriging of soil and vegetation variables has become an important alternative to other mapping techniques. Although a reconnaissance sampling is necessary and basic requirements of geostatistics have to be met, kriging has the advantage of giving estimates with a

  19. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    Science.gov (United States)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single-attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada, an onshore oil field. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well-log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as the reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network combined with model-based inversion is the most efficient for predicting porosity in the inter-well region.

  20. Using river distance and existing hydrography data can improve the geostatistical estimation of fish tissue mercury at unsampled locations.

    Science.gov (United States)

    Money, Eric S; Sackett, Dana K; Aday, D Derek; Serre, Marc L

    2011-09-15

    Mercury in fish tissue is a major human health concern. Consumption of mercury-contaminated fish poses risks to the general population, including potentially serious developmental defects and neurological damage in young children. Therefore, it is important to accurately identify areas that have the potential for high levels of bioaccumulated mercury. However, due to time and resource constraints, it is difficult to adequately assess fish tissue mercury on a basin wide scale. We hypothesized that, given the nature of fish movement along streams, an analytical approach that takes into account distance traveled along these streams would improve the estimation accuracy for fish tissue mercury in unsampled streams. Therefore, we used a river-based Bayesian Maximum Entropy framework (river-BME) for modern space/time geostatistics to estimate fish tissue mercury at unsampled locations in the Cape Fear and Lumber Basins in eastern North Carolina. We also compared the space/time geostatistical estimation using river-BME to the more traditional Euclidean-based BME approach, with and without the inclusion of a secondary variable. Results showed that this river-based approach reduced the estimation error of fish tissue mercury by more than 13% and that the median estimate of fish tissue mercury exceeded the EPA action level of 0.3 ppm in more than 90% of river miles for the study domain.
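
    The contrast between Euclidean and river distance that motivates river-BME can be shown with a toy stream network: distances are computed along the network with a shortest-path algorithm rather than as straight lines. The node coordinates and reach lengths below are hypothetical:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

# Hypothetical stream network: node coordinates (km) and reach connections.
coords = np.array([[0, 0], [4, 0], [8, 0], [4, 3]])   # node 3 joins the main stem at node 1
edges = [(0, 1), (1, 2), (1, 3)]
reach_length = [np.linalg.norm(coords[i] - coords[j]) for i, j in edges]

# Symmetric adjacency matrix weighted by reach length.
n = len(coords)
adj = np.zeros((n, n))
for (i, j), d in zip(edges, reach_length):
    adj[i, j] = adj[j, i] = d
river_dist = dijkstra(csr_matrix(adj), directed=False)

euclid_dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# A tributary site (node 3) and a downstream site (node 2):
print("Euclidean distance:", euclid_dist[3, 2])   # straight line, 5 km
print("river distance:   ", river_dist[3, 2])     # along the network, 7 km
```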

  1. Geostatistical investigations of rock masses

    International Nuclear Information System (INIS)

    Matar, J.A.; Sarquis, M.A.; Girardi, J.P.; Tabbia, G.H.

    1987-01-01

    The geostatistical techniques applied to the selection of a minimum-fracturing volume in Sierra del Medio make it possible to quantify and qualify the variability of mechanical characteristics and fracture density, as well as the level of reliability of the estimations. The role of geostatistics in selecting minimally fractured blocks, a very important site-selection step, is discussed in this work. The only variable used is the jointing density, chosen to detect the principal fracture systems affecting the rock massif; semivariograms corresponding to this regionalized variable were computed. The fracturing results are compared with deep and shallow geological surveys to obtain two- and three-dimensional models. The ability of the geostatistical techniques to detect local geological phenomena such as faults is discussed. The variability model obtained from the borehole data is examined on the basis of the vertical Columnar Model of Discontinuity (fractures) hypothesis, derived from geological studies of the spatial behaviour of the joint systems and from the geostatistical interpretation. (Author)

  2. Multiobjective design of aquifer monitoring networks for optimal spatial prediction and geostatistical parameter estimation

    Science.gov (United States)

    Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.

    2013-06-01

    Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. Results indicate that

  3. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more widespread in agricultural activities. This paper describes a computer program with a Windows interface (Borland Delphi) which performs spatial analyses of datasets with geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system proved useful for the geostatistical analysis process, avoiding the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a high degree of interaction, functionality rarely available in similar programs. Given its capacity for rapid prototyping and the simplicity of incorporating related routines, the Delphi environment has the main advantage of permitting the evolution of this system.

  4. Geo-statistical model of Rainfall erosivity by using high temporal resolution precipitation data in Europe

    Science.gov (United States)

    Panagos, Panos; Ballabio, Cristiano; Borrelli, Pasquale; Meusburger, Katrin; Alewell, Christine

    2015-04-01

    Rainfall erosivity (R-factor) is among the six input factors in estimating soil erosion risk with the empirical Revised Universal Soil Loss Equation (RUSLE). The R-factor is a driving force for soil erosion modelling and can potentially be used in flood risk assessments, landslide susceptibility, post-fire damage assessment, application of agricultural management practices and climate change modelling. Rainfall erosivity is extremely difficult to model at large scale (national, European) due to the lack of high-temporal-resolution precipitation data covering long time series. In most cases, the R-factor is estimated from empirical equations which take into account precipitation volume. The Rainfall Erosivity Database on the European Scale (REDES) is the output of an extensive collection of high-resolution precipitation data in the 28 Member States of the European Union plus Switzerland, carried out during 2013-2014 in collaboration with national meteorological/environmental services. Due to the different temporal resolutions of the data (5, 10, 15, 30, 60 minutes), conversion equations have been applied in order to homogenise the database at a 30-minute interval. The 1,541 stations included in REDES have been interpolated using the Gaussian Process Regression (GPR) model, using as covariates climatic data (monthly precipitation, monthly temperature, wettest/driest month) from the WorldClim database, a Digital Elevation Model and latitude/longitude. GPR was selected among other candidate models (GAM, Regression Kriging) due to its best performance both in cross validation (R2=0.63) and in fitting the dataset (R2=0.72). The highest uncertainty was noticed in north-western Scotland, northern Sweden and Finland due to the limited number of stations in REDES. Also, in highlands such as the Alpine arch and the Pyrenees, the diversity of environmental features led to relatively high uncertainty. The rainfall erosivity map of Europe available at 500 m resolution plus the standard error
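
    As a rough illustration of the interpolation step, a Gaussian Process Regression can be fitted to station values with covariates, and predictions with standard errors obtained at unsampled covariate combinations. The sketch below uses scikit-learn with random placeholder covariates rather than the REDES/WorldClim data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(42)

# Hypothetical stand-ins for the covariates: monthly precipitation,
# monthly temperature, elevation, latitude, longitude (one row per station).
X = rng.normal(size=(200, 5))
r_factor = 800 + 300 * X[:, 0] - 100 * X[:, 2] + rng.normal(scale=50, size=200)

kernel = 1.0 * RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, r_factor)

# Predict rainfall erosivity and its standard error at unsampled covariate values.
X_new = rng.normal(size=(3, 5))
mean, std = gpr.predict(X_new, return_std=True)
print(mean, std)
```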

  5. Geostatistics for Mapping Leaf Area Index over a Cropland Landscape: Efficiency Sampling Assessment

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Haro

    2010-11-01

    This paper evaluates the performance of spatial methods to estimate leaf area index (LAI) fields from ground-based measurements at high spatial resolution over a cropland landscape. Three geostatistical variants of the kriging technique, ordinary kriging (OK), collocated cokriging (CKC) and kriging with an external drift (KED), are used. The study focused on the influence of the spatial sampling protocol, auxiliary information, and spatial resolution on the estimates. The main advantage of these models lies in the possibility of considering the spatial dependence of the data and, in the case of KED and CKC, the auxiliary information at each location used for prediction purposes. A high-resolution NDVI image computed from SPOT TOA reflectance data is used as an auxiliary variable in the LAI predictions. The CKC and KED predictions have proven the relevance of the auxiliary information for reproducing the spatial pattern at local scales, with the KED model the best estimator when a non-stationary trend is observed. Advantages and limitations of the methods in LAI field prediction for two systematic and two stratified spatial samplings are discussed for high (20 m), medium (300 m) and coarse (1 km) spatial scales. KED exhibited the best observed local accuracy for all the spatial samplings, while the OK model provides comparable results when a sampling scheme well stratified by land cover is considered.

  6. Geostatistics for radiological characterization: overview and application cases

    International Nuclear Information System (INIS)

    Desnoyers, Yvon

    2016-01-01

    The objective of radiological characterization is to find a suitable balance between gathering data (constrained by cost, deadlines, accessibility or radiation) and managing the issues (waste volumes, levels of activity or exposure). It is necessary to have enough information to have confidence in the results without multiplying useless data. Geostatistical processing of data considers all available pieces of information: historical data, non-destructive measurements and laboratory analyses of samples. The spatial structure modelling is then used to produce maps and to estimate the extent of radioactive contamination (surface and depth). Quantifications of local and global uncertainties are powerful decision-making tools for better management of remediation projects at contaminated sites, and for decontamination and dismantling projects at nuclear facilities. They can be used to identify hot spots, estimate contamination of surfaces and volumes, classify radioactive waste according to thresholds, estimate source terms, and so on. The spatial structure of radioactive contamination makes the optimization of sampling (number and position of data points) particularly important. Geostatistical methodology can help determine the initial mesh size and reduce estimation uncertainties. Several case studies are presented to illustrate why and how geostatistics can be applied to a range of radiological characterization problems where the investigated units can represent very small areas (a few m² or a few m³) or very large sites (at a country scale). The focus is then placed on experience gained over the years in the use of geostatistics and sampling optimization. (author)

  7. Application of Geostatistical Modelling to Study the Exploration Adequacy of Uniaxial Compressive Strength of Intact Rock along the Behesht-Abad Tunnel Route

    Directory of Open Access Journals (Sweden)

    Mohammad Doustmohammadi

    2014-12-01

    Uniaxial compressive strength (UCS) is one of the most significant factors in the stability of underground excavation projects. Most of the time, this factor is obtained from the evaluation of exploratory boreholes. Because of the large distances between exploratory boreholes in the majority of geotechnical projects, the application of geostatistical methods as estimators of rock mass properties has increased. The present paper examines the estimation of UCS values of intact rock as a function of the distance between boreholes of the Behesht-Abad tunnel in central Iran, using the SGEMS geostatistical program. Variography showed that UCS estimation of intact rock using geostatistical methods is reasonable. The model was established and then validated to assess whether it was trustworthy. Cross validation showed the high accuracy (98%) and reliability of the model for estimating uniaxial compressive strength. The UCS values were then estimated along the tunnel axis. Moreover, the geostatistical estimation led to better identification of the strengths and weaknesses of the geotechnical exploration at each location along the tunnel route.

  8. Geostatistical ore reserve estimation for a roll-front type uranium deposit (practitioner's guide)

    International Nuclear Information System (INIS)

    Kim, Y.C.; Knudsen, H.P.

    1977-01-01

    This report comprises two parts. Part I contains illustrative examples of each phase of a geostatistical study using a roll-front type uranium deposit. Part II contains five computer programs and comprehensive users' manuals for these programs which are necessary to make a practical geostatistical study

  9. Use of geostatistics for remediation planning to transcend urban political boundaries

    International Nuclear Information System (INIS)

    Milillo, Tammy M.; Sinha, Gaurav; Gardella, Joseph A.

    2012-01-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial–residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. Highlights: point samples and property boundaries do not appropriately determine the extent of soil contamination; kriging and co-kriging provide the best concentration estimates for mapping soil contamination and refining clean-up sites; maps provide a visual representation of geostatistical results to communities to aid decision making; incorporating community input into the assessment of neighborhoods is good public policy practice. Using geostatistical interpolation and mapping results to involve the affected community can substantially improve remediation planning and promote its long-term effectiveness.

  10. Geostatistical Model-Based Estimates of Schistosomiasis Prevalence among Individuals Aged ≤20 Years in West Africa

    Science.gov (United States)

    Schur, Nadine; Hürlimann, Eveline; Garba, Amadou; Traoré, Mamadou S.; Ndir, Omar; Ratard, Raoult C.; Tchuem Tchuenté, Louis-Albert; Kristensen, Thomas K.; Utzinger, Jürg; Vounatsou, Penelope

    2011-01-01

    We present the first empirical estimates of S. mansoni and S. haematobium prevalence at high spatial resolution throughout West Africa. Our prediction maps allow prioritizing of interventions in a spatially explicit manner, and will be useful for monitoring and evaluation of schistosomiasis control programs. PMID:21695107

  11. High-resolution gulf water skin temperature estimation using TIR/ASTER

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.; ManiMurali, R.; Mahender, K.

    to separate geomorphic features. It is demonstrated that high resolution water skin temperature of small water bodies can be determined correctly, economically and less laboriously using space-based TIR/ASTER and that estimated temperature can be effectively...

  12. Application of Artificial Neural Networks for Efficient High-Resolution 2D DOA Estimation

    Directory of Open Access Journals (Sweden)

    M. Agatonović

    2012-12-01

    A novel method to provide high-resolution Two-Dimensional Direction of Arrival (2D DOA) estimation employing Artificial Neural Networks (ANNs) is presented in this paper. The observed space is divided into azimuth and elevation sectors. Multilayer Perceptron (MLP) neural networks are employed to detect the presence of a source in a sector while Radial Basis Function (RBF) neural networks are utilized for DOA estimation. It is shown that a number of appropriately trained neural networks can be successfully used for the high-resolution DOA estimation of narrowband sources in both azimuth and elevation. The training time of each smaller network is significantly reduced as different training sets are used for the networks in the detection and estimation stages. By avoiding the spectral search, the proposed method is suitable for real-time applications as it provides DOA estimates in a matter of seconds. At the same time, it demonstrates accuracy comparable to that of the super-resolution 2D MUSIC algorithm.

  13. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    Science.gov (United States)

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distributions are usually severely skewed by the presence of hot spots at contaminated sites. This causes difficulties for accurate geostatistical data transformation. Three typical normal-distribution transformation methods, the normal score, Johnson, and Box-Cox transformations, were applied to compare the effects of spatial interpolation on normal-transformed benzo(b)fluoranthene data from a large-scale coking plant-contaminated site in north China. All three normal transformation methods decreased the skewness and kurtosis of the benzo(b)fluoranthene data, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging had a minimum root-mean-square error of 1.17 and a mean error of 0.19, which was more accurate than the other two models. The areas with fewer sampling points and those with high levels of contamination showed the largest prediction standard errors on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy of remediation boundary determination.
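
    Two of the transformations named above are easy to reproduce: the normal-score transform maps empirical ranks onto standard normal quantiles, and the Box-Cox transform applies a maximum-likelihood power transform. A minimal sketch on synthetic skewed data (the Johnson transformation is omitted):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
conc = rng.lognormal(mean=1.0, sigma=1.2, size=300)   # hypothetical skewed concentration data

# Normal-score transform: map empirical quantiles onto standard normal quantiles.
ranks = stats.rankdata(conc)                           # 1..n, ties averaged
nscore = stats.norm.ppf((ranks - 0.5) / len(conc))

# Box-Cox transform: power transform with lambda chosen by maximum likelihood.
boxcox_vals, lam = stats.boxcox(conc)

print("skewness raw / nscore / Box-Cox:",
      stats.skew(conc), stats.skew(nscore), stats.skew(boxcox_vals))
```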

  14. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    International Nuclear Information System (INIS)

    Quinn, J.J.

    1996-01-01

    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi² study area, including 57 monitoring wells within an area of concern of 1.5 mi². Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data

  15. Demonstration of a geostatistical approach to physically consistent downscaling of climate modeling simulations

    KAUST Repository

    Jha, Sanjeev Kumar; Mariethoz, Gregoire; Evans, Jason P.; McCabe, Matthew

    2013-01-01

    A downscaling approach based on multiple-point geostatistics (MPS) is presented. The key concept underlying MPS is to sample spatial patterns from within training images, which can then be used in characterizing the relationship between different variables across multiple scales. The approach is used here to downscale climate variables including skin surface temperature (TSK), soil moisture (SMOIS), and latent heat flux (LH). The performance of the approach is assessed by applying it to data derived from a regional climate model of the Murray-Darling basin in southeast Australia, using model outputs at two spatial resolutions of 50 and 10 km. The data used in this study cover the period from 1985 to 2006, with 1985 to 2005 used for generating the training images that define the relationships of the variables across the different spatial scales. Subsequently, the spatial distributions for the variables in the year 2006 are determined at 10 km resolution using the 50 km resolution data as input. The MPS geostatistical downscaling approach reproduces the spatial distribution of TSK, SMOIS, and LH at 10 km resolution with the correct spatial patterns over different seasons, while providing uncertainty estimates through the use of multiple realizations. The technique has the potential to not only bridge issues of spatial resolution in regional and global climate model simulations but also in feature sharpening in remote sensing applications through image fusion, filling gaps in spatial data, evaluating downscaled variables with available remote sensing images, and aggregating/disaggregating hydrological and groundwater variables for catchment studies.

  16. Estimating NOx emissions and surface concentrations at high spatial resolution using OMI

    Science.gov (United States)

    Goldberg, D. L.; Lamsal, L. N.; Loughner, C.; Swartz, W. H.; Saide, P. E.; Carmichael, G. R.; Henze, D. K.; Lu, Z.; Streets, D. G.

    2017-12-01

    In many instances, NOx emissions are not measured at the source. In these cases, remote sensing techniques are extremely useful in quantifying NOx emissions. Using an exponential modified Gaussian (EMG) fitting of oversampled Ozone Monitoring Instrument (OMI) NO2 data, we estimate NOx emissions and lifetimes in regions where these emissions are uncertain. This work also presents a new high-resolution OMI NO2 dataset derived from the NASA retrieval that can be used to estimate surface level concentrations in the eastern United States and South Korea. To better estimate vertical profile shape factors, we use high-resolution model simulations (Community Multi-scale Air Quality (CMAQ) and WRF-Chem) constrained by in situ aircraft observations to re-calculate tropospheric air mass factors and tropospheric NO2 vertical columns during summertime. The correlation between our satellite product and ground NO2 monitors in urban areas has improved dramatically: r2 = 0.60 in new product, r2 = 0.39 in operational product, signifying that this new product is a better indicator of surface concentrations than the operational product. Our work emphasizes the need to use both high-resolution and high-fidelity models in order to re-calculate vertical column data in areas with large spatial heterogeneities in NOx emissions. The methodologies developed in this work can be applied to other world regions and other satellite data sets to produce high-quality region-specific emissions estimates.

  17. A practical primer on geostatistics

    Science.gov (United States)

    Olea, Ricardo A.

    2009-01-01

    The Challenge—Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer. The Aim of Geostatistics—The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps. Geostatistical Methods—Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, or pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods. Differences with Time Series—On dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best interpolating. This simple difference

  18. Assessment and modeling of the groundwater hydrogeochemical quality parameters via geostatistical approaches

    Science.gov (United States)

    Karami, Shawgar; Madani, Hassan; Katibeh, Homayoon; Fatehi Marj, Ahmad

    2018-03-01

    Geostatistical methods are among the advanced techniques used for interpolation of groundwater quality data. The results obtained from geostatistics are useful for decision makers in adopting suitable remedial measures to protect the quality of groundwater sources. The data used in this study were collected from 78 wells in the Varamin plain aquifer, located southeast of Tehran, Iran, in 2013. The ordinary kriging method was used to evaluate groundwater quality parameters. Seven main quality parameters (total dissolved solids (TDS), sodium adsorption ratio (SAR), electrical conductivity (EC), sodium (Na⁺), total hardness (TH), chloride (Cl⁻) and sulfate (SO₄²⁻)) were analyzed and interpreted by statistical and geostatistical methods. After data normalization by the normal score (Nscore) method in the WinGslib software, variography was carried out as a geostatistical tool to define the spatial structure, and experimental variograms were plotted with the GS+ software. The best theoretical model was then fitted to each variogram based on the minimum residual sum of squares (RSS). Cross validation was used to determine the accuracy of the estimated data. Finally, estimation maps of groundwater quality were prepared in WinGslib, and an estimation variance map and an estimation error map were presented to evaluate the quality of the estimation at each estimated point. Results showed that the kriging method is more accurate than traditional interpolation methods.
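
    The ordinary kriging workflow summarized above (fit a variogram model, solve the kriging system, map estimates and estimation variance) can be condensed to a few lines. A minimal sketch for a single target point, with hypothetical well locations, EC values and an exponential variogram (not the models fitted in the Varamin study):

```python
import numpy as np

def exp_variogram(h, nugget=0.05, sill=0.4, a=2000.0):
    """Exponential variogram model (practical range a, in metres)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * np.asarray(h) / a))

def ordinary_kriging(xy, values, target, variogram):
    """Ordinary kriging estimate and kriging variance at one target location."""
    n = len(values)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    gamma = variogram(d)
    np.fill_diagonal(gamma, 0.0)               # gamma(0) = 0 by definition
    # Kriging system with a Lagrange multiplier enforcing unit-sum weights.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
    sol = np.linalg.solve(A, b)
    weights, mu = sol[:n], sol[n]
    return weights @ values, weights @ b[:n] + mu   # estimate, kriging variance

# Hypothetical wells (x, y in metres) and electrical conductivity values (dS/m).
xy = np.array([[0.0, 0.0], [1500.0, 300.0], [800.0, 2200.0], [2500.0, 1800.0]])
ec = np.array([1.8, 2.4, 1.2, 2.9])
print(ordinary_kriging(xy, ec, np.array([1200.0, 1000.0]), exp_variogram))
```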

  19. Estimating the burden of malaria in Senegal: Bayesian zero-inflated binomial geostatistical modeling of the MIS 2008 data.

    Directory of Open Access Journals (Sweden)

    Federica Giardina

    The Research Center for Human Development in Dakar (CRDH), with the technical assistance of ICF Macro and the National Malaria Control Programme (NMCP), conducted in 2008/2009 the Senegal Malaria Indicator Survey (SMIS), the first nationally representative household survey collecting parasitological data and malaria-related indicators. In this paper, we present spatially explicit parasitaemia risk estimates and the number of infected children below 5 years. Geostatistical Zero-Inflated Binomial (ZIB) models were developed to take into account the large number of zero-prevalence survey locations (70%) in the data. Bayesian variable selection methods were incorporated within a geostatistical framework in order to choose the best set of environmental and climatic covariates associated with the parasitaemia risk. Model validation confirmed that the ZIB model had a better predictive ability than the standard Binomial analogue. Markov chain Monte Carlo (MCMC) methods were used for inference. Several insecticide-treated net (ITN) coverage indicators were calculated to assess the effectiveness of interventions. After adjusting for climatic and socio-economic factors, the presence of at least one ITN per every two household members and living in urban areas reduced the odds of parasitaemia by 86% and 81%, respectively. Posterior estimates of the ORs related to the wealth index show a decreasing trend with the quintiles. Infection odds appear to increase with age. The population-adjusted prevalence ranges from 0.12% in Thillé-Boubacar to 13.1% in Dabo. Tambacounda has the highest population-adjusted predicted prevalence (8.08%), whereas the region with the highest estimated number of infected children under the age of 5 years is Kolda (13,940). The contemporary map and estimates of malaria burden identify the priority areas for future control interventions and provide baseline information for monitoring and evaluation. Zero-Inflated formulations are more appropriate
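
    The zero-inflated binomial likelihood at the heart of the ZIB model mixes a point mass at zero with a binomial count model. A minimal sketch of the log-likelihood for fixed parameters, with hypothetical survey counts; the spatial random effect and covariates of the full geostatistical model are omitted:

```python
import numpy as np
from scipy.special import comb

def zib_loglik(p_struct_zero, p_infect, positives, examined):
    """Log-likelihood of a zero-inflated binomial model.

    p_struct_zero: probability that a location is a 'structural zero'
    p_infect:      infection probability given the location is not a structural zero
    """
    ll = 0.0
    for y, n in zip(positives, examined):
        binom_pmf = comb(n, y) * p_infect**y * (1 - p_infect)**(n - y)
        if y == 0:
            ll += np.log(p_struct_zero + (1 - p_struct_zero) * binom_pmf)
        else:
            ll += np.log((1 - p_struct_zero) * binom_pmf)
    return ll

# Hypothetical survey: positives and children examined at six locations.
positives = np.array([0, 0, 0, 2, 5, 0])
examined = np.array([20, 15, 30, 25, 40, 18])
print(zib_loglik(0.4, 0.1, positives, examined))
```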

  20. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. We discuss advantages and limits of the method and its

  1. Geostatistical methodology for waste optimization of contaminated premises - 59344

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    The presented methodological study illustrates a geostatistical approach suitable for radiological evaluation in nuclear premises. The waste characterization is mainly focused on floor concrete surfaces. By modeling the spatial continuity of activities, geostatistics provides sound methods to estimate and map radiological activities, together with their uncertainty. The multivariate approach allows the integration of numerous surface radiation measurements in order to improve the estimation of activity levels from concrete samples. In this way, a sequential and iterative investigation strategy proves to be relevant to fulfill the different evaluation objectives. Waste characterization is performed on risk maps rather than on direct interpolation maps (due to the selection bias on kriging results). The use of several estimation supports (point, 1 m², room) allows a relevant radiological waste categorization thanks to cost-benefit analysis according to the risk of exceeding a given activity threshold. Global results, mainly total activity, are similarly quantified to guide waste management for the dismantling and decommissioning project at an early stage. This paper recalls the principles of geostatistics and demonstrates how this methodology provides innovative tools for the radiological evaluation of contaminated premises. The relevance of this approach relies on the presence of spatial continuity in the radiological contamination. In this case, geostatistics provides reliable activity estimates, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. Waste characterization is then performed taking all relevant information into account: historical knowledge, surface measurements and samples. Thanks to the multivariate processing, the different investigation stages can be rationalized as regards quantity and positioning. Waste characterization is finally

  2. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    Science.gov (United States)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13 year SeaWiFS (1998-2010) and 8 year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher resolution, coupled ecosystem-ocean general circulation models, and data assimilation.
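
    A simplified sketch of the structure-function (semivariogram) calculation on a gridded ocean-color scene is shown below; it computes east-west increments only and uses a random array in place of a real Level-3 chlorophyll field, so it illustrates the estimator rather than reproducing the study's full isotropic, binned analysis.

```python
import numpy as np

def structure_function(field, pixel_km=9.0, max_lag_px=30):
    """Second-order structure function (semivariogram) along rows.

    field : 2-D array of (log-)chlorophyll with NaNs for missing pixels.
    Returns lag distances (km) and gamma(h) = 0.5 * mean squared increment.
    """
    lags_km, gamma = [], []
    for lag in range(1, max_lag_px + 1):
        d = field[:, lag:] - field[:, :-lag]      # east-west increments
        d = d[np.isfinite(d)]
        if d.size:
            lags_km.append(lag * pixel_km)
            gamma.append(0.5 * np.mean(d ** 2))
    return np.array(lags_km), np.array(gamma)

# illustrative use on a random field standing in for a Level-3 chlorophyll scene
rng = np.random.default_rng(1)
chl = rng.normal(size=(180, 360))
h, g = structure_function(np.log10(np.abs(chl) + 0.01))
# the nugget (unresolved variability) is the extrapolation of gamma to h -> 0;
# the sill minus the nugget approximates the resolved mesoscale variability
```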

  3. Geostatistics applied to estimation of uranium bearing ore reserves

    International Nuclear Information System (INIS)

    Urbina Galan, L.I.

    1982-01-01

    A computer-assisted method for assessing uranium-bearing ore deposit reserves is analyzed. Determinations of quality-thickness, namely quality multiplied by thickness of mineralization, were obtained for each drill-hole layer by means of the mathematical method known as the theory of regionalized variables. Geostatistical results were derived using a Fortran computer program on a DEC 20/40 system. (author)

  4. High-resolution model for estimating the economic and policy implications of agricultural soil salinization in California

    Science.gov (United States)

    Welle, Paul D.; Mauter, Meagan S.

    2017-09-01

    This work introduces a generalizable approach for estimating the field-scale agricultural yield losses due to soil salinization. When integrated with regional data on crop yields and prices, this model provides high-resolution estimates of revenue losses over large agricultural regions. These methods account for the uncertainty inherent in model inputs derived from satellites, experimental field data, and interpreted model results. We apply this method to estimate the effect of soil salinity on agricultural outputs in California, performing the analysis with both high-resolution (i.e. field-scale) and low-resolution (i.e. county-scale) data sources to highlight the importance of spatial resolution in agricultural analysis. We estimate that soil salinity reduced agricultural revenues by US$3.7 billion (US$1.7-7.0 billion) in 2014, amounting to 8.0 million tons of lost production relative to soil salinities below the crop-specific thresholds. When using low-resolution data sources, we find that the costs of salinization are underestimated by a factor of three. These results highlight the need for high-resolution data in agro-environmental assessment as well as the challenges associated with their integration.

  5. Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.

    Science.gov (United States)

    Sukop, Michael C; Cunningham, Kevin J

    2016-03-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. © 2015, National Ground Water Association.

  6. Geostatistics - bloodhound of uranium exploration

    International Nuclear Information System (INIS)

    David, Michel

    1979-01-01

    Geostatistics makes possible the efficient use of the information contained in core samples obtained by diamond drilling. The probability that a core represents the true content of a deposit, and the likely content of an orebody between two core samples can both be estimated using geostatistical methods. A confidence interval can be given for the mean grade of a deposit. The use of a computer is essential in the calculation of the continuity function, the variogram, when as many as 800,000 core samples may be involved. The results may be used to determine where additional samples need to be taken, and to develop a picture of the probable grades throughout the deposit. The basic mathematical model is about 15 years old, but applications to different types of deposit require various adaptations. The Ecole Polytechnique is currently developing methods for uranium deposits. (LL)

  7. 4th International Geostatistics Congress

    CERN Document Server

    1993-01-01

    The contributions in this book were presented at the Fourth International Geostatistics Congress held in Tróia, Portugal, in September 1992. They provide a comprehensive account of the current state of the art of geostatistics, including recent theoretical developments and new applications. In particular, readers will find descriptions and applications of the more recent methods of stochastic simulation together with data integration techniques applied to the modelling of hydrocarbon reservoirs. In other fields there are stationary and non-stationary geostatistical applications to geology, climatology, pollution control, soil science, hydrology and human sciences. The papers also provide an insight into new trends in geostatistics, particularly the increasing interaction with many other scientific disciplines. This book is a significant reference work for practitioners of geostatistics both in academia and industry.

  8. Geostatistical estimation of forest biomass in interior Alaska combining Landsat-derived tree cover, sampled airborne lidar and field observations

    Science.gov (United States)

    Babcock, Chad; Finley, Andrew O.; Andersen, Hans-Erik; Pattison, Robert; Cook, Bruce D.; Morton, Douglas C.; Alonzo, Michael; Nelson, Ross; Gregoire, Timothy; Ene, Liviu; Gobakken, Terje; Næsset, Erik

    2018-06-01

    The goal of this research was to develop and examine the performance of a geostatistical coregionalization modeling approach for combining field inventory measurements, strip samples of airborne lidar and Landsat-based remote sensing data products to predict aboveground biomass (AGB) in interior Alaska's Tanana Valley. The proposed modeling strategy facilitates pixel-level mapping of AGB density predictions across the entire spatial domain. Additionally, the coregionalization framework allows for statistically sound estimation of total AGB for arbitrary areal units within the study area, a key advance to support diverse management objectives in interior Alaska. This research focuses on appropriate characterization of prediction uncertainty in the form of posterior predictive coverage intervals and standard deviations. Using the framework detailed here, it is possible to quantify estimation uncertainty for any spatial extent, ranging from pixel-level predictions of AGB density to estimates of AGB stocks for the full domain. The lidar-informed coregionalization models consistently outperformed their counterpart lidar-free models in terms of point-level predictive performance and total AGB precision. Additionally, the inclusion of Landsat-derived forest cover as a covariate further improved estimation precision in regions with lower lidar sampling intensity. Our findings also demonstrate that model-based approaches that do not explicitly account for residual spatial dependence can grossly underestimate uncertainty, resulting in falsely precise estimates of AGB. On the other hand, in a geostatistical setting, residual spatial structure can be modeled within a Bayesian hierarchical framework to obtain statistically defensible assessments of uncertainty for AGB estimates.

  9. 7th International Geostatistics Congress

    CERN Document Server

    Deutsch, Clayton

    2005-01-01

    The conference proceedings consist of approximately 120 technical papers presented at the Seventh International Geostatistics Congress held in Banff, Alberta, Canada in 2004. All the papers were reviewed by an international panel of leading geostatisticians. The five major sections are: theory, mining, petroleum, environmental and other applications. The first section showcases new and innovative ideas in the theoretical development of geostatistics as a whole; these ideas will have large impact on (1) the directions of future geostatistical research, and (2) the conventional approaches to heterogeneity modelling in a wide range of natural resource industries. The next four sections are focused on applications and innovations relating to the use of geostatistics in specific industries. Historically, mining, petroleum and environmental industries have embraced the use of geostatistics for uncertainty characterization, so these three industries are identified as major application areas. The last section is open...

  10. Breast density estimation from high spectral and spatial resolution MRI

    Science.gov (United States)

    Li, Hui; Weiss, William A.; Medved, Milica; Abe, Hiroyuki; Newstead, Gillian M.; Karczmar, Gregory S.; Giger, Maryellen L.

    2016-01-01

    Abstract. A three-dimensional breast density estimation method is presented for high spectral and spatial resolution (HiSS) MR imaging. Twenty-two patients were recruited (under an Institutional Review Board-approved, Health Insurance Portability and Accountability Act-compliant protocol) for high-risk breast cancer screening. Each patient received standard-of-care clinical digital x-ray mammograms and MR scans, as well as HiSS scans. The algorithm for breast density estimation includes breast mask generation, breast skin removal, and breast percentage density calculation. The inter- and intra-user variabilities of the HiSS-based density estimation were determined using correlation analysis and limits of agreement. Correlation analysis was also performed between the HiSS-based density estimation and radiologists' breast imaging-reporting and data system (BI-RADS) density ratings. A correlation coefficient of 0.91 and an interclass correlation coefficient of 0.99 were obtained for the inter- and intra-user density estimations, respectively. A moderate correlation coefficient of 0.55 (p=0.0076) was observed between HiSS-based breast density estimations and radiologists' BI-RADS ratings. In summary, an objective density estimation method using HiSS spectral data from breast MRI was developed. The high reproducibility with low inter- and intra-user variabilities shown in this preliminary study suggests that such a HiSS-based density metric may be beneficial in programs requiring breast density, such as breast cancer risk assessment and monitoring the effects of therapy. PMID:28042590

  11. Bayesian geostatistical modeling of leishmaniasis incidence in Brazil.

    Directory of Open Access Journals (Sweden)

    Dimitrios-Alexios Karagiannis-Voules

    Full Text Available BACKGROUND: Leishmaniasis is endemic in 98 countries, with an estimated 350 million people at risk and approximately 2 million cases annually. Brazil is one of the most severely affected countries. METHODOLOGY: We applied Bayesian geostatistical negative binomial models to analyze reported incidence data of cutaneous and visceral leishmaniasis in Brazil covering a 10-year period (2001-2010). Particular emphasis was placed on spatial and temporal patterns. The models were fitted using integrated nested Laplace approximations to perform fast approximate Bayesian inference. Bayesian variable selection was employed to determine the most important climatic, environmental, and socioeconomic predictors of cutaneous and visceral leishmaniasis. PRINCIPAL FINDINGS: For both types of leishmaniasis, precipitation and socioeconomic proxies were identified as important risk factors. The predicted number of cases in 2010 was 30,189 (standard deviation [SD]: 7,676) for cutaneous leishmaniasis and 4,889 (SD: 288) for visceral leishmaniasis. Our risk maps predicted the highest numbers of infected people in the states of Minas Gerais and Pará for visceral and cutaneous leishmaniasis, respectively. CONCLUSIONS/SIGNIFICANCE: Our spatially explicit, high-resolution incidence maps identified priority areas where leishmaniasis control efforts should be targeted with the ultimate goal of reducing disease incidence.

  12. Geostatistical models for air pollution

    International Nuclear Information System (INIS)

    Pereira, M.J.; Soares, A.; Almeida, J.; Branquinho, C.

    2000-01-01

    The objective of this paper is to present geostatistical models applied to the spatial characterisation of air pollution phenomena. A concise presentation of the geostatistical methodologies is illustrated with practical examples. The case study was conducted at an underground copper mine located in the south of Portugal, where a biomonitoring programme using lichens has been implemented. Given the characteristics of lichens as indicators of air pollution, it was possible to gather a large amount of data in space, which enabled the development and application of geostatistical methodologies. The advantages of using geostatistical models, compared with deterministic models, as environmental control tools are highlighted. (author)

  13. Estimating Vegetation Rainfall Interception Using Remote Sensing Observations at Very High Resolution

    Science.gov (United States)

    Cui, Y.; Zhao, P.; Hong, Y.; Fan, W.; Yan, B.; Xie, H.

    2017-12-01

    Abstract: As an important component of evapotranspiration, vegetation rainfall interception is the proportion of gross rainfall that is intercepted, stored and subsequently evaporated from all parts of vegetation during or following rainfall. Accurately quantifying vegetation rainfall interception at high resolution is critical for rainfall-runoff modeling and flood forecasting, and is also essential for understanding its further impact on local, regional, and even global water cycle dynamics. In this study, the Remote Sensing-based Gash model (RS-Gash model) is developed, based on a modified Gash model, for interception loss estimation using remote sensing observations at the regional scale, and has been applied and validated in the upper reach of the Heihe River Basin of China for different types of vegetation. To eliminate the scale error and the effect of mixed pixels, the RS-Gash model is applied at a fine scale of 30 m with the high-resolution vegetation area index retrieved using the unified model of the bidirectional reflectance distribution function (BRDF-U) for the vegetation canopy. Field validation shows that the RMSE and R2 of the interception ratio are 3.7% and 0.9, respectively, indicating the model's strong stability and reliability at fine scale. The temporal variation of vegetation rainfall interception loss and its relationship with precipitation are further investigated. In summary, the RS-Gash model has demonstrated its effectiveness and reliability in estimating vegetation rainfall interception. When compared to the coarse resolution results, the application of this model at 30-m fine resolution is necessary to resolve the scaling issues as shown in this study. Keywords: rainfall interception; remote sensing; RS-Gash analytical model; high resolution

  14. A geostatistical analysis of geostatistics

    NARCIS (Netherlands)

    Hengl, T.; Minasny, B.; Gould, M.

    2009-01-01

    The bibliometric indices of the scientific field of geostatistics were analyzed using statistical and spatial data analysis. The publications and their citation statistics were obtained from the Web of Science (4000 most relevant), Scopus (2000 most relevant) and Google Scholar (5389). The focus was

  15. 10th International Geostatistics Congress

    CERN Document Server

    Rodrigo-Ilarri, Javier; Rodrigo-Clavero, María; Cassiraga, Eduardo; Vargas-Guzmán, José

    2017-01-01

    This book contains selected contributions presented at the 10th International Geostatistics Congress held in Valencia from 5 to 9 September, 2016. This is a quadrennial congress that serves as the meeting point for any engineer, professional, practitioner or scientist working in geostatistics. The book contains carefully reviewed papers on geostatistical theory and applications in fields such as mining engineering, petroleum engineering, environmental science, hydrology, ecology, and other fields.

  16. Estimating the number of cases of podoconiosis in Ethiopia using geostatistical methods [version 2; referees: 3 approved, 1 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Kebede Deribe

    2017-12-01

    Full Text Available Background: In 2011, the World Health Organization recognized podoconiosis as one of the neglected tropical diseases. Nonetheless, the magnitude of podoconiosis and the geographical distribution of the disease are poorly understood. Based on a nationwide mapping survey and geostatistical modelling, we predict the prevalence of podoconiosis and estimate the number of cases across Ethiopia. Methods: We used nationwide data collected in Ethiopia between 2008 and 2013. Data were available for 141,238 individuals from 1,442 communities in 775 districts from all nine regional states and two city administrations. We developed a geostatistical model of podoconiosis prevalence among adults (individuals aged 15 years or above), by combining environmental factors. The number of people with podoconiosis was then estimated using a gridded map of adult population density for 2015. Results: Podoconiosis is endemic in 345 districts in Ethiopia: 144 in Oromia, 128 in Southern Nations, Nationalities and People's [SNNP], 64 in Amhara, 4 in Benishangul Gumuz, 4 in Tigray and 1 in Somali Regional State. Nationally, our estimates suggest that 1,537,963 adults (95% confidence interval: 290,923-4,577,031) were living with podoconiosis in 2015. Three regions (SNNP, Oromia and Amhara) contributed 99% of the cases. The highest proportion of individuals with podoconiosis resided in SNNP (39%), while 32% and 29% of people with podoconiosis resided in Oromia and Amhara Regional States, respectively. Tigray and Benishangul Gumuz Regional States bore lower burdens, and in the remaining regions, podoconiosis was almost non-existent. Conclusions: The estimates of podoconiosis cases presented here, based upon the combination of currently available epidemiological data and a robust modelling approach, clearly show that podoconiosis is highly endemic in Ethiopia. Given the presence of low-cost prevention, and morbidity management and disability prevention services, it is
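
    The final step, converting a predicted prevalence surface into case numbers using a gridded adult population map, reduces to a cell-wise product and a sum, as in the minimal sketch below. The two small grids are invented; in the study this step would be repeated per posterior sample to obtain the confidence interval.

```python
import numpy as np

# illustrative grids (same shape): predicted adult prevalence and adult
# population counts per grid cell for 2015
prevalence = np.array([[0.02, 0.05],
                       [0.00, 0.08]])
adult_pop = np.array([[12000, 30000],
                      [5000, 22000]])

cases_per_cell = prevalence * adult_pop   # expected cases in each cell
national_cases = cases_per_cell.sum()     # aggregate to the national total
print(int(national_cases))
```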

  17. A space and time scale-dependent nonlinear geostatistical approach for downscaling daily precipitation and temperature

    KAUST Repository

    Jha, Sanjeev Kumar

    2015-07-21

    A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here, the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 km and 10 km resolution for a twenty-year period ranging from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is a daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference dataset indicates that both the seasonal average climate response and the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without the inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction, because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool for statistical downscaling to obtain local scale estimates of precipitation and temperature from General Circulation Models. This article is protected by copyright. All rights reserved.

  18. High resolution estimates of the corrosion risk for cultural heritage in Italy.

    Science.gov (United States)

    De Marco, Alessandra; Screpanti, Augusto; Mircea, Mihaela; Piersanti, Antonio; Proietti, Chiara; Fornasier, M Francesca

    2017-07-01

    Air pollution plays a pivotal role in the deterioration of many materials used in buildings and cultural monuments, causing inestimable damage. This study estimates the impacts of air pollution (SO2, HNO3, O3, PM10) and meteorological conditions (temperature, precipitation, relative humidity) on limestone, copper and bronze, based on a high-resolution air quality database produced with the AMS-MINNI modelling system over the Italian territory for the period 2003-2010. A comparison between high-resolution data (AMS-MINNI grid, 4 × 4 km) and low-resolution data (EMEP grid, 50 × 50 km) has been performed. Our results show that corrosion levels for limestone, copper and bronze decreased in Italy from 2003 to 2010, in line with the decrease in pollutant concentrations. However, some problems related to air pollution persist, especially in Northern and Southern Italy. In particular, PM10 and HNO3 are considered mainly responsible for limestone corrosion. Moreover, the high-resolution data (AMS-MINNI) allowed the identification of risk areas that are not visible with the low-resolution data (EMEP modelling system) in all considered years and, especially, in the limestone case. Consequently, high-resolution air quality simulations can provide concrete benefits by informing effective national policy against corrosion risk for cultural heritage, also in the context of the climate changes that are strongly affecting the Mediterranean basin. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Laser radar cross-section estimation from high-resolution image data.

    Science.gov (United States)

    Osche, G R; Seeber, K N; Lok, Y F; Young, D S

    1992-05-10

    A methodology for the estimation of ladar cross sections from high-resolution image data of geometrically complex targets is presented. Coherent CO(2) laser radar was used to generate high-resolution amplitude imagery of a UC-8 Buffalo test aircraft at a range of 1.3 km at nine different aspect angles. The average target ladar cross section was synthesized from these data and calculated to be sigma(T) = 15.4 dBsm, which is similar to the expected microwave radar cross sections. The aspect angle dependence of the cross section shows pronounced peaks at nose on and broadside, which are also in agreement with radar results. Strong variations in both the mean amplitude and the statistical distributions of amplitude with the aspect angle have also been observed. The relative mix of diffuse and specular returns causes significant deviations from a simple Lambertian or Swerling II target, especially at broadside where large normal surfaces are present.

  20. Reference crop evapotranspiration estimate using high-resolution meteorological network's data

    Directory of Open Access Journals (Sweden)

    C. Lussana

    2009-10-01

    Full Text Available Water management authorities need detailed information about each component of the hydrological balance. This document presents a method to estimate the evapotranspiration rate, initialized in order to obtain the reference crop evapotranspiration rate (ET0). By using an Optimal Interpolation (OI) scheme, the hourly observations of several meteorological variables, measured by a high-resolution local meteorological network, are interpolated over a regular grid. The analysed meteorological fields, containing detailed meteorological information, enter a model for turbulent heat flux estimation based on Monin-Obukhov surface layer similarity theory. The obtained ET0 fields are then post-processed and disseminated to the users.

  1. High resolution estimates of the corrosion risk for cultural heritage in Italy

    International Nuclear Information System (INIS)

    De Marco, Alessandra; Screpanti, Augusto; Mircea, Mihaela; Piersanti, Antonio; Proietti, Chiara; Fornasier, M. Francesca

    2017-01-01

    Air pollution plays a pivotal role in the deterioration of many materials used in buildings and cultural monuments, causing inestimable damage. This study estimates the impacts of air pollution (SO2, HNO3, O3, PM10) and meteorological conditions (temperature, precipitation, relative humidity) on limestone, copper and bronze, based on a high-resolution air quality database produced with the AMS-MINNI modelling system over the Italian territory for the period 2003–2010. A comparison between high-resolution data (AMS-MINNI grid, 4 × 4 km) and low-resolution data (EMEP grid, 50 × 50 km) has been performed. Our results show that corrosion levels for limestone, copper and bronze decreased in Italy from 2003 to 2010, in line with the decrease in pollutant concentrations. However, some problems related to air pollution persist, especially in Northern and Southern Italy. In particular, PM10 and HNO3 are considered mainly responsible for limestone corrosion. Moreover, the high-resolution data (AMS-MINNI) allowed the identification of risk areas that are not visible with the low-resolution data (EMEP modelling system) in all considered years and, especially, in the limestone case. Consequently, high-resolution air quality simulations can provide concrete benefits by informing effective national policy against corrosion risk for cultural heritage, also in the context of the climate changes that are strongly affecting the Mediterranean basin. - Highlights: • Air pollution plays a pivotal role in the deterioration of cultural materials. • Limestone, copper and bronze corrosion levels decreased in Italy from 2003 to 2010. • PM10 is considered mainly responsible for limestone corrosion in Northern Italy. • HNO3 is considered mainly responsible for limestone corrosion in all analyzed years. • High-resolution data are particularly useful to identify areas at risk of corrosion. - Importance of the high-resolution

  2. Crop area estimation using high and medium resolution satellite imagery in areas with complex topography

    Science.gov (United States)

    Husak, G. J.; Marshall, M. T.; Michaelsen, J.; Pedreros, D.; Funk, C.; Galu, G.

    2008-07-01

    Reliable estimates of cropped area (CA) in developing countries with chronic food shortages are essential for emergency relief and the design of appropriate market-based food security programs. Satellite interpretation of CA is an effective alternative to extensive and costly field surveys, which fail to represent the spatial heterogeneity at the country-level. Bias-corrected, texture based classifications show little deviation from actual crop inventories, when estimates derived from aerial photographs or field measurements are used to remove systematic errors in medium resolution estimates. In this paper, we demonstrate a hybrid high-medium resolution technique for Central Ethiopia that combines spatially limited unbiased estimates from IKONOS images, with spatially extensive Landsat ETM+ interpretations, land-cover, and SRTM-based topography. Logistic regression is used to derive the probability of a location being crop. These individual points are then aggregated to produce regional estimates of CA. District-level analysis of Landsat based estimates showed CA totals which supported the estimates of the Bureau of Agriculture and Rural Development. Continued work will evaluate the technique in other parts of Africa, while segmentation algorithms will be evaluated, in order to automate classification of medium resolution imagery for routine CA estimation in the future.
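
    A minimal sketch of the core of such a hybrid approach, a logistic regression giving a per-pixel crop probability that is then aggregated to a cropped-area total, is given below. The predictors, training labels and pixel size are placeholders, not the Ethiopian data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# hypothetical training sample: high-resolution-labelled points with medium-resolution
# predictors (e.g. a texture metric, a land-cover dummy, SRTM-derived slope)
X_train = rng.normal(size=(500, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 2] + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)

# predictors for every pixel in a district, then aggregate to cropped area
X_district = rng.normal(size=(10000, 3))
p_crop = model.predict_proba(X_district)[:, 1]   # probability a pixel is crop
pixel_area_ha = 0.09                             # e.g. a 30 m pixel is 0.09 ha
cropped_area_ha = (p_crop * pixel_area_ha).sum()
print(round(cropped_area_ha, 1))
```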

  3. Dose rate estimates and spatial interpolation maps of outdoor gamma dose rate with geostatistical methods; A case study from Artvin, Turkey

    International Nuclear Information System (INIS)

    Yeşilkanat, Cafer Mert; Kobya, Yaşar; Taşkin, Halim; Çevik, Uğur

    2015-01-01

    In this study, the suitability of different geostatistical estimation methods is compared, with the aim of investigating and mapping the natural background radiation using a minimum number of measurements. Artvin province, which has quite hilly terrain and a wide variety of soils and is located in the north-east of Turkey, was selected as the study area. The outdoor gamma dose rate (OGDR), an important determinant of the environmental radioactivity level, was measured at 204 stations. The spatial structure of the OGDR was determined by anisotropic, isotropic and residual variograms. Ordinary kriging (OK) and universal kriging (UK) interpolation estimates were calculated with the help of model parameters obtained from these variograms. In OK, calculations are based only on the positions of the sampled points, whereas in the UK technique, general soil groups and altitude values directly affecting the OGDR are included in the calculations. When the two methods are evaluated based on their performance, the UK model (r = 0.88, p < 0.001) gives considerably better results than the OK model (r = 0.64, p < 0.001). In addition, the maps created at the end of the study show that local changes are better reflected by the UK method than by the OK method, and its error variance is lower. - Highlights: • The spatial dispersion of gamma dose rates in Artvin, which possesses one of the roughest terrains in Turkey, was studied. • The performance of different geostatistical methods (OK and UK) for the dispersion of gamma dose rates was compared. • Estimates were calculated for non-sampled points using the geostatistical model, and the results were mapped. • The general radiological structure was determined in much less time and at lower cost compared to experimental methods. • When the theoretical methods are evaluated, UK gives more descriptive results than OK.

  4. A comparison between geostatistical analyses and sedimentological studies at the Hartebeestfontein gold mine

    International Nuclear Information System (INIS)

    Magri, E.J.

    1978-01-01

    For life-of-mine planning, as well as for short- and medium-term planning of grades and mine layouts, it is extremely important to have a clear understanding of the patterns followed by the distribution of gold and uranium within the mining area. This study is an attempt to reconcile the geostatistical approach to the determination of ore-shoot directions, via an analysis of the spatial distribution of gold and uranium values, with the sedimentological approach, which is based on the direct measurement of geological features. For the routine geostatistical estimation of ore reserves, the Hartebeestfontein gold mine was divided into 11 sections. In each of these sections, the ore-shoot directions were calculated for gold and uranium from the anisotropies disclosed by geostatistical variogram analyses. This study presents a comparison of these results with those obtained from direct geological measurements of paleo-current directions. The results suggest that geological and geostatistical studies could be of significant mutual benefit [af]

  5. Estimating Discharge in Low-Order Rivers With High-Resolution Aerial Imagery

    OpenAIRE

    King, Tyler V.; Neilson, Bethany T.; Rasmussen, Mitchell T.

    2018-01-01

    Remote sensing of river discharge promises to augment in situ gauging stations, but the majority of research in this field focuses on large rivers (>50 m wide). We present a method for estimating volumetric river discharge in low-order rivers from remotely sensed data by coupling high-resolution imagery with one-dimensional hydraulic modeling at so-called virtual gauging stations. These locations were identified as locations where the river contracted under low flows, exposing a substa...

  6. The Use of Geostatistics in the Study of Floral Phenology of Vulpia geniculata (L.) Link

    Directory of Open Access Journals (Sweden)

    Eduardo J. León Ruiz

    2012-01-01

    Full Text Available Traditionally, phenology studies have focused on changes through time, but there are many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics is a family of statistics that describe correlations through space/time, and it can be used both for quantifying spatial correlation and for interpolating unsampled points. In the present work, estimates based upon geostatistics and GIS mapping have enabled the construction of spatial models that reflect the phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and the low mountains of the "Sierra de Córdoba", were chosen to carry out weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of kriging, which was used to produce weekly estimates of V. geniculata phenology in unsampled areas. Finally, the application of geostatistics and GIS to create phenological maps could be an essential complement to pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps.

  7. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    Science.gov (United States)

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.

    2012-09-01

    Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
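
    The essence of the Monte Carlo step, combining conditional realizations of concentration and flow across the control plane into a mass-discharge distribution, can be sketched as follows; the lognormal draws stand in for the geostatistically simulated fields and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, n_cells = 1000, 200        # realizations, cells in the control plane
cell_area = 0.5 * 0.5              # m^2 per multilevel sampling cell

# stand-ins for conditional realizations of concentration (ug/L) and
# Darcy flux (m/s) on the control plane
conc = rng.lognormal(mean=1.0, sigma=1.0, size=(n_real, n_cells))
flux = rng.lognormal(mean=-7.0, sigma=0.5, size=(n_real, n_cells))

# mass discharge per realization: sum over the plane of C * q * area,
# converting ug/L -> kg/m^3 (1e-6) and per second -> per year (3.15e7 s)
md = (conc * flux * cell_area).sum(axis=1) * 1e-6 * 3.15e7

print("median:", np.median(md), "kg/yr")
print("90% interval:", np.percentile(md, [5, 95]))
```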

  8. Assessment of effectiveness of geologic isolation systems: geostatistical modeling of pore velocity

    International Nuclear Information System (INIS)

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses
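
    A first-order (delta-method) sketch of how kriging variances for conductivity and gradient could be propagated into a pore-velocity uncertainty is shown below; the velocity relation v = K * i / n_e, the independence assumption and all numbers are illustrative simplifications of the approach described in the abstract.

```python
import numpy as np

def pore_velocity_with_error(K, var_K, grad, var_grad, porosity=0.25):
    """First-order propagation of kriging variances through v = K * i / n_e.

    K, grad         : kriged estimates of hydraulic conductivity and head gradient
    var_K, var_grad : corresponding kriging variances (assumed independent)
    """
    v = K * grad / porosity
    var_v = (grad / porosity) ** 2 * var_K + (K / porosity) ** 2 * var_grad
    return v, var_v

# illustrative numbers only (not Hanford values)
v, var_v = pore_velocity_with_error(K=1e-4, var_K=(2e-5) ** 2,
                                    grad=1e-3, var_grad=(2e-4) ** 2)
print(f"v = {v:.2e} m/s +/- {np.sqrt(var_v):.2e}")
```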

  9. Spatial interpolation of climate variables in Northern Germany—Influence of temporal resolution and network density

    Directory of Open Access Journals (Sweden)

    C. Berndt

    2018-02-01

    New hydrological insights: Geostatistical techniques provide better performance for all climate variables compared to simple methods. Radar data improve the estimation of rainfall at hourly temporal resolution, while topography is useful for weekly to yearly values and for temperature in general. No helpful information was found for cloudiness, sunshine duration, and wind speed, while interpolation of humidity benefitted from additional temperature data. The influences of temporal resolution, spatial variability, and additional information appear to be stronger than station density effects. High spatial variability of hourly precipitation causes the highest error, followed by wind speed, cloud coverage and sunshine duration. The lowest errors occur for temperature and humidity.

  10. Preliminary evaluation of uranium deposits. A geostatistical study of drilling density in Wyoming solution fronts

    International Nuclear Information System (INIS)

    Sandefur, R.L.; Grant, D.C.

    1976-01-01

    Studies of a roll-front uranium deposit in Shirley Basin Wyoming indicate that preliminary evaluation of the reserve potential of an ore body is possible with less drilling than currently practiced in industry. Estimating ore reserves from sparse drilling is difficult because most reserve calculation techniques do not give the accuracy of the estimate. A study of several deposits with a variety of drilling densities shows that geostatistics consistently provides a method of assessing the accuracy of an ore reserve estimate. Geostatistics provides the geologist with an additional descriptive technique - one which is valuable in the economic assessment of a uranium deposit. Closely spaced drilling on past properties provides both geological and geometric insight into the occurrence of uranium in roll-front type deposits. Just as the geological insight assists in locating new ore bodies and siting preferential drill locations, the geometric insight can be applied mathematically to evaluate the accuracy of a new ore reserve estimate. By expressing the geometry in numerical terms, geostatistics extracts important geological characteristics and uses this information to aid in describing the unknown characteristics of a property. (author)

  11. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    Directory of Open Access Journals (Sweden)

    Xujun Han

    Full Text Available Observations can be used to reduce model uncertainties through data assimilation. If the observations cannot cover the whole model area due to spatial availability or instrument ability, how should data assimilation be carried out at locations not covered by observations? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluated OL in soil moisture profile characterization, in which a geostatistical semivariogram was used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding the grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with one nearest observation assimilated, 5_Obs with no more than five nearest local observations assimilated, and 9_Obs with no more than nine nearest local observations assimilated. The scenarios with no more than 16, 25, and 36 local observations were also compared. From the results we can conclude that more local observations involved in assimilation will improve estimates, with an upper bound of 9 observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects.
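
    One plausible way to turn a fitted semivariogram into observation-localization weights is sketched below: the weight of each local observation decays as the modelled semivariance approaches the sill. The exponential model, its parameters and the use of the weights to inflate the observation error are assumptions for illustration; the paper's exact weighting scheme may differ.

```python
import numpy as np

def exponential_semivariogram(h, nugget, sill, rng_par):
    """Exponential semivariogram model gamma(h) with practical range rng_par."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng_par))

def localization_weights(dists, nugget, sill, rng_par):
    """Observation weights derived from the fitted semivariogram:
    w(h) = 1 - gamma(h) / sill, clipped to [0, 1]."""
    gamma = exponential_semivariogram(dists, nugget, sill, rng_par)
    return np.clip(1.0 - gamma / sill, 0.0, 1.0)

# distances (km) from the analyzed model grid cell to its 9 nearest
# brightness-temperature observations (illustrative values)
d = np.array([2.0, 5.0, 7.0, 9.0, 12.0, 15.0, 18.0, 22.0, 30.0])
w = localization_weights(d, nugget=0.05, sill=1.0, rng_par=25.0)
# in a local ensemble transform Kalman filter, such weights are typically
# applied by inflating the observation error variance: R_loc = R / w
print(w)
```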

  12. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    Science.gov (United States)

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

    Observations can be used to reduce model uncertainties through data assimilation. If the observations cannot cover the whole model area due to spatial availability or instrument ability, how should data assimilation be carried out at locations not covered by observations? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluated OL in soil moisture profile characterization, in which a geostatistical semivariogram was used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding the grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with one nearest observation assimilated, 5_Obs with no more than five nearest local observations assimilated, and 9_Obs with no more than nine nearest local observations assimilated. The scenarios with no more than 16, 25, and 36 local observations were also compared. From the results we can conclude that more local observations involved in assimilation will improve estimates, with an upper bound of 9 observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects.

  13. Spatial Downscaling of TRMM Precipitation Using Geostatistics and Fine Scale Environmental Variables

    Directory of Open Access Journals (Sweden)

    No-Wook Park

    2013-01-01

    Full Text Available A geostatistical downscaling scheme is presented that can generate fine scale precipitation information from coarse scale Tropical Rainfall Measuring Mission (TRMM) data by incorporating auxiliary fine scale environmental variables. Within the geostatistical framework, the TRMM precipitation data are first decomposed into trend and residual components. Quantitative relationships between coarse scale TRMM data and environmental variables are then estimated via regression analysis and used to derive trend components at a fine scale. Next, the residual components, which are the differences between the trend components and the original TRMM data, are downscaled at a target fine scale via area-to-point kriging. The trend and residual components are finally added to generate fine scale precipitation estimates. Stochastic simulation is also applied to the residual components in order to generate multiple alternative realizations and to compute uncertainty measures. In an experiment using a digital elevation model (DEM) and normalized difference vegetation index (NDVI), the geostatistical downscaling scheme generated downscaling results that reflected detailed characteristics with better predictive performance, when compared with downscaling without the environmental variables. Multiple realizations and uncertainty measures from simulation also provided useful information for interpretation and further environmental modeling.
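
    The trend-plus-residual decomposition at the heart of this scheme can be sketched in a few lines of Python; here a simple inverse-distance interpolation stands in for area-to-point kriging, and all covariates and coordinates are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# coarse cells: precipitation plus covariates (elevation, NDVI) at cell centers
n_coarse = 100
elev_c, ndvi_c = rng.normal(size=n_coarse), rng.normal(size=n_coarse)
precip_c = 2.0 + 0.8 * elev_c - 0.5 * ndvi_c + rng.normal(scale=0.3, size=n_coarse)

# 1) regression trend estimated at the coarse scale
reg = LinearRegression().fit(np.c_[elev_c, ndvi_c], precip_c)

# 2) residual component at the coarse cells
resid_c = precip_c - reg.predict(np.c_[elev_c, ndvi_c])

# 3) trend evaluated with fine-scale covariates; residuals interpolated to the
#    fine grid (inverse-distance weighting as a stand-in for area-to-point kriging)
elev_f, ndvi_f = rng.normal(size=1000), rng.normal(size=1000)
trend_f = reg.predict(np.c_[elev_f, ndvi_f])
xy_c, xy_f = rng.uniform(0, 50, (n_coarse, 2)), rng.uniform(0, 50, (1000, 2))
d = np.linalg.norm(xy_f[:, None, :] - xy_c[None, :, :], axis=-1) + 1e-6
w = 1.0 / d ** 2
resid_f = (w * resid_c).sum(axis=1) / w.sum(axis=1)

# 4) fine-scale precipitation estimate = trend + interpolated residual
precip_fine = trend_f + resid_f
print(precip_fine[:5])
```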

  14. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
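
    The node-by-node comparison itself is straightforward once the kriged and simulated water contents are on the same 16 x 16 x 36 grid, as in the toy sketch below; both arrays and the assumed measurement error are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (16, 16, 36)
meas_err = 0.02   # assumed water-content measurement error (illustrative)

# kriged (observation-based) and modelled water content on the same grid
kriged = rng.uniform(0.05, 0.35, size=shape)
modelled = kriged + rng.normal(scale=0.015, size=shape)

error = modelled - kriged
within = np.abs(error) <= meas_err   # nodes where the model is within measurement error
print("fraction of nodes within measurement error:", within.mean())
print("worst node error:", np.abs(error).max())
```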

  15. A geostatistical methodology to assess the accuracy of unsaturated flow models

    International Nuclear Information System (INIS)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.

  16. Estimation of the high-spatial-resolution variability in extreme wind speeds for forestry applications

    Directory of Open Access Journals (Sweden)

    A. Venäläinen

    2017-07-01

    Full Text Available The bioeconomy has an increasing role to play in climate change mitigation and the sustainable development of national economies. In Finland, a forested country, over 50 % of the current bioeconomy relies on the sustainable management and utilization of forest resources. Wind storms are a major risk to which forests are exposed, and high-spatial-resolution analysis of the most vulnerable locations can support risk assessment in forest management planning. In this paper, we examine the feasibility of the wind multiplier approach for downscaling maximum wind speed, using the 20 m spatial resolution CORINE land-use dataset and high-resolution digital elevation data. A coarse spatial resolution estimate of the 10-year return level of maximum wind speed was obtained from the ERA-Interim reanalysis data. Using a geospatial re-mapping technique, the data were downscaled to 26 meteorological station locations representing very diverse environments. A comparison shows that the downscaled 10-year return levels represent 66 % of the observed variation among the stations examined. In addition, the spatial variation in the wind-multiplier-downscaled 10-year return level wind was compared with the WAsP model-simulated wind. The heterogeneous test area was situated in northern Finland, and it was found that the major features of the spatial variation were similar, although in some locations there were relatively large differences. The results indicate that the wind multiplier method offers a pragmatic and computationally feasible tool for identifying, at high spatial resolution, those locations with the highest forest wind damage risks. It can also be used to provide the necessary wind climate information for wind damage risk model calculations, thus making it possible to estimate the probability of exceeding threshold wind speeds for wind damage and consequently the probability (and amount) of wind damage for certain forest stand configurations.
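
    The wind-multiplier downscaling step amounts to scaling the coarse return level by land-use and topography multipliers for each fine pixel; the sketch below shows the idea with invented multiplier values and land-use classes, not the ones used in the study.

```python
import numpy as np

# illustrative multipliers by land-use class (surface roughness effect)
LAND_USE_MULT = {"open": 1.00, "forest": 0.85, "urban": 0.80, "water": 1.05}

def downscale_return_level(coarse_return_level, land_use, topo_mult):
    """Downscaled 10-year maximum wind speed = coarse value x terrain multipliers."""
    m_lu = np.array([LAND_USE_MULT[c] for c in land_use])
    return coarse_return_level * m_lu * topo_mult

# one coarse-grid return level (m/s) applied to four fine-resolution pixels
v10 = downscale_return_level(18.0,
                             land_use=["open", "forest", "urban", "open"],
                             topo_mult=np.array([1.00, 0.95, 0.98, 1.10]))
print(v10)
```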

  17. Applicability of geostatistical procedures for the evaluation of hydrogeological parameters of a fractured aquifer in the Ronneburg mine district

    International Nuclear Information System (INIS)

    Grasshoff, C.; Schetelig, K.; Tomschi, H.

    1998-01-01

    The following paper demonstrates how a geostatistical approach can help to interpolate hydrogeological parameters over a given area. The basic elements developed by G. Matheron in the sixties are presented, together with the preconditions and assumptions under which the estimation yields the best results. The variogram, as the most important tool in geostatistics, offers the opportunity to describe the correlation behaviour of a regionalized variable. Some kriging procedures are briefly introduced, which, under varying circumstances, provide estimates of non-measured values from the theoretical variogram model. In the Ronneburg mine district, 108 screened drill-holes provided coefficients of hydraulic conductivity. These were interpolated with ordinary kriging over the whole investigation area. An error calculation was performed, which confirmed the accuracy of the estimation. A short outlook points out some difficulties in handling geostatistical procedures and makes suggestions for further investigations. (orig.) [de
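
    The variogram-plus-kriging workflow summarized above can be condensed into a few lines of linear algebra. The following sketch solves an ordinary kriging system for one target location using a spherical variogram with assumed sill and range; the sample coordinates and log-conductivities are synthetic stand-ins, not the Ronneburg data.

```python
import numpy as np

def spherical(h, sill=1.0, rng_=500.0, nugget=0.0):
    """Spherical semivariogram with assumed parameters."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, nugget + sill)

# sample locations (x, y) and log10 hydraulic conductivities (stand-ins)
xy = np.array([[0., 0.], [300., 50.], [120., 400.], [450., 380.], [250., 200.]])
z = np.array([-5.2, -4.8, -5.9, -4.5, -5.1])
target = np.array([200., 250.])

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
n = len(z)
A = np.ones((n + 1, n + 1)); A[:n, :n] = spherical(d); A[-1, -1] = 0.0
b = np.ones(n + 1); b[:n] = spherical(np.linalg.norm(xy - target, axis=1))
w = np.linalg.solve(A, b)          # kriging weights plus Lagrange multiplier
estimate = w[:n] @ z               # ordinary kriging estimate at the target
variance = w @ b                   # ordinary kriging (error) variance
print(estimate, variance)
```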

  18. Geostatistical methods for radiological evaluation and risk analysis of contaminated premises

    International Nuclear Information System (INIS)

    Desnoyers, Y.; Jeannee, N.; Chiles, J.P.; Dubot, D.

    2009-01-01

    Full text: At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires the radiological assessment of residual activity levels of building structures. As stated by the IAEA, 'Segregation and characterization of contaminated materials are the key elements of waste minimization'. From this point of view, the set-up of an appropriate evaluation methodology is of paramount importance. The radiological characterization of contaminated premises can be divided into three steps. First, a facility analysis that is as exhaustive as possible provides historical, functional and qualitative information. Then, a systematic (exhaustive or not) control of the emergent signal is performed by means of in situ measurement methods such as surface control devices combined with in situ gamma spectrometry. Besides, in order to assess the contamination depth, samples can be collected from boreholes at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data improve and reinforce the preliminary waste zoning. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. In this case, geostatistics provides reliable methods for activity estimation, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. Besides, the ability of this geostatistical framework to provide answers to several key issues that generally occur during the clean-up preparation phase is discussed: How to optimise the investigation costs? How to deal with data quality issues? How to consistently take into account auxiliary information such as historical

  19. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa.

    Directory of Open Access Journals (Sweden)

    Simon J O'Hanlon

    2016-01-01

    Full Text Available The initial endemicity (pre-control) prevalence of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at a pixel resolution of 5 km x 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson's correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2-90%) in 1975. This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where data are sparse, and may be used to help inform the

  20. Geostatistical inference using crosshole ground-penetrating radar

    DEFF Research Database (Denmark)

    Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou

    2010-01-01

    of the subsurface are used to evaluate the uncertainty of the inversion estimate. We have explored the full potential of the geostatistical inference method using several synthetic models of varying correlation structures and have tested the influence of different assumptions concerning the choice of covariance...... reflection profile. Furthermore, the inferred values of the subsurface global variance and the mean velocity have been corroborated with moisture-content measurements, obtained gravimetrically from samples collected at the field site....

  1. Geostatistical Analysis Methods for Estimation of Environmental Data Homogeneity

    Directory of Open Access Journals (Sweden)

    Aleksandr Danilov

    2018-01-01

    Full Text Available The methodology for assessing the spatial homogeneity of ecosystems, with the possibility of subsequent zoning of territories in terms of the degree of disturbance of the environment, is considered in this study. The degree of pollution of the water body was reconstructed on the basis of hydrochemical monitoring data and information on the level of technogenic load in a single year. As a result, the zones of greatest environmental stress were isolated, and the correctness of the zoning based on geostatistical analysis techniques was demonstrated. The computational algorithm was implemented in the object-oriented programming language C#. The resulting software application allows a quick assessment of the scale and spatial localization of pollution during the initial analysis of the environmental situation.

  2. Geostatistical methods for the integration of information; Metodos geoestadisticos para la integracion de informacion

    Energy Technology Data Exchange (ETDEWEB)

    Cassiraga, E F; Gomez-Hernandez, J J [Departamento de Ingenieria Hidraulica y Medio Ambiente, Universidad Politecnica de Valencia, Valencia (Spain)

    1996-10-01

    The main objective of this report is to describe the different geostatistical techniques available for integrating geophysical and hydrological parameters. We analyze the characteristics of the estimation methods used in other studies.

  3. Downscaling remotely sensed imagery using area-to-point cokriging and multiple-point geostatistical simulation

    Science.gov (United States)

    Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong

    2015-03-01

    A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the process of downscaling, a rich spatial correlation model in the form of a training image was incorporated to facilitate reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used as a locally varying mean (LVM) (i.e., soft data) to deal with the change of support problem (COSP) for cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration within different cross-scale case studies. The experiments show that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change of support issue.

  4. Geostatistical radar-raingauge combination with nonparametric correlograms: methodological considerations and application in Switzerland

    Science.gov (United States)

    Schiemann, R.; Erdin, R.; Willi, M.; Frei, C.; Berenguer, M.; Sempere-Torres, D.

    2011-05-01

    Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse real-time network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation
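
    The key identity exploited by the correlogram variant above is that a nonparametric correlogram rho(h), estimated from a spatially complete field, combined with the field's sample variance s^2 gives a semivariogram estimate gamma(h) approximately equal to s^2 * (1 - rho(h)). The toy sketch below demonstrates this on a synthetic field, not radar data.

```python
import numpy as np

rng = np.random.default_rng(1)
field = rng.gamma(shape=2.0, scale=1.0, size=(200, 200))   # stand-in "radar-like" field
field = 0.5 * (field + np.roll(field, 1, axis=1))          # crude smoothing so lag-1 correlation exists

s2 = field.var()                                            # sample variance of the field
for lag in (1, 2, 5, 10):
    a, b = field[:, :-lag].ravel(), field[:, lag:].ravel()
    rho = np.corrcoef(a, b)[0, 1]                           # nonparametric correlogram at this lag
    gamma = s2 * (1.0 - rho)                                # implied semivariogram value
    print(f"lag {lag:2d}: rho={rho:.3f}  gamma~{gamma:.3f}")
```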

  5. Estimation of Wheat Plant Density at Early Stages Using High Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Shouyang Liu

    2017-05-01

    Full Text Available Crop density is a key agronomical trait used to manage wheat crops and estimate yield. Visual counting of plants in the field is currently the most common method used. However, it is tedious and time-consuming. The main objective of this work is to develop a machine-vision-based method to automate the density survey of wheat at early stages. RGB images taken with a high resolution RGB camera are classified to identify the green pixels corresponding to the plants. Crop rows are extracted and the connected components (objects) are identified. A neural network is then trained to estimate the number of plants in the objects using the object features. The method was evaluated over three experiments showing contrasted conditions with sowing densities ranging from 100 to 600 seeds⋅m-2. Results demonstrate that the density is accurately estimated with an average relative error of 12%. The pipeline developed here provides an efficient and accurate estimate of wheat plant density at early stages.
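
    A hedged sketch of the first two steps of such a pipeline (green-pixel classification and object labelling) is given below. The excess-green index and its threshold are common choices assumed here, not necessarily those of the paper, and the image is random stand-in data.

```python
import numpy as np
from scipy import ndimage

rgb = np.random.default_rng(2).random((100, 100, 3))   # stand-in RGB image, values in [0, 1]
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

exg = 2 * g - r - b                                     # excess-green index (assumed classifier)
vegetation = exg > 0.2                                  # assumed threshold for "green" pixels

labels, n_objects = ndimage.label(vegetation)           # connected components (candidate plant objects)
sizes = ndimage.sum(vegetation, labels, index=range(1, n_objects + 1))  # object areas in pixels
print(n_objects, sizes[:5])                             # features like these would feed a counting model
```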

  6. Estimated Depth Maps of the Northwestern Hawaiian Islands Derived from High Resolution IKONOS Satellite Imagery (Draft)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Estimated shallow-water, depth maps were produced using rule-based, semi-automated image analysis of high-resolution satellite imagery for nine locations in the...

  7. Delineating Hydrofacies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Song, Xuehang [Florida State Univ., Tallahassee, FL (United States); Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Xingyuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ye, Ming [Florida State Univ., Tallahassee, FL (United States); Dai, Zhenxue [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hammond, Glenn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This study develops a new framework of facies-based data assimilation for characterizing the spatial distribution of hydrofacies and estimating their associated hydraulic properties. This framework couples ensemble data assimilation with a transition probability-based geostatistical model via a parameterization based on a level set function. The nature of ensemble data assimilation makes the framework efficient and flexible to be integrated with various types of observation data. The transition probability-based geostatistical model keeps the updated hydrofacies distributions under geological constraints. The framework is illustrated by using a two-dimensional synthetic study that estimates hydrofacies spatial distribution and permeability in each hydrofacies from transient head data. Our results show that the proposed framework can characterize hydrofacies distribution and associated permeability with adequate accuracy even with limited direct measurements of hydrofacies. Our study provides a promising starting point for hydrofacies delineation in complex real problems.

  8. Is high-resolution inverse characterization of heterogeneous river bed hydraulic conductivities needed and possible?

    Directory of Open Access Journals (Sweden)

    W. Kurtz

    2013-10-01

    Full Text Available River–aquifer exchange fluxes influence local and regional water balances and affect groundwater and river water quality and quantity. Unfortunately, river–aquifer exchange fluxes tend to be strongly spatially variable, and it is an open research question to which degree river bed heterogeneity has to be represented in a model in order to achieve reliable estimates of river–aquifer exchange fluxes. This research question is addressed in this paper with the help of synthetic simulation experiments, which mimic the Limmat aquifer in Zurich (Switzerland), where river–aquifer exchange fluxes and groundwater management activities play an important role. The solution of the unsaturated–saturated subsurface hydrological flow problem including river–aquifer interaction is calculated for ten different synthetic realities where the strongly heterogeneous river bed hydraulic conductivities (L) are perfectly known. Hydraulic head data (100 in the default scenario) are sampled from the synthetic realities. In subsequent data assimilation experiments, where L is now unknown, the hydraulic head data are used as conditioning information with the help of the ensemble Kalman filter (EnKF). For each of the ten synthetic realities, four different ensembles of L are tested in the experiments with EnKF; one ensemble estimates high-resolution L fields with different L values for each element, and the other three ensembles estimate effective L values for 5, 3 or 2 zones. The calibration of higher-resolution L fields (i.e. fully heterogeneous or 5 zones) gives better results than the calibration of L for only 3 or 2 zones in terms of reproduction of states, stream–aquifer exchange fluxes and parameters. Effective L for a limited number of zones cannot always reproduce the true states and fluxes well and results in biased estimates of net exchange fluxes between aquifer and stream. Also in case only 10 head data are used for conditioning, the high-resolution
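
    The conditioning step in such experiments is the ensemble Kalman filter analysis update. The sketch below shows a generic stochastic EnKF update with perturbed observations on synthetic arrays; the dimensions, observation operator and error levels are illustrative assumptions, not the Limmat set-up.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ens, n_state, n_obs = 50, 20, 5

X = rng.normal(size=(n_state, n_ens))            # ensemble of parameters/states (e.g. log conductivities)
H = np.zeros((n_obs, n_state))                   # observation operator: observe the first n_obs components
H[np.arange(n_obs), np.arange(n_obs)] = 1.0
R = 0.01 * np.eye(n_obs)                         # observation-error covariance (assumed)
y = rng.normal(size=n_obs)                       # observed heads (stand-in)

A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
C = A @ A.T / (n_ens - 1)                        # ensemble covariance
K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)     # Kalman gain
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T  # perturbed observations
X_updated = X + K @ (Y - H @ X)                  # analysis ensemble
print(X_updated.shape)
```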

  9. Spatial analysis of groundwater levels using Fuzzy Logic and geostatistical tools

    Science.gov (United States)

    Theodoridou, P. G.; Varouchakis, E. A.; Karatzas, G. P.

    2017-12-01

    The evaluation of the spatial variability of the water table of an aquifer provides useful information for water resources management plans. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram is very important for the method's performance. This work compares three different criteria to assess the theoretical variogram that fits the experimental one: the Least Squares Sum method, the Akaike Information Criterion and Cressie's Indicator. Moreover, variable distance metrics such as the Euclidean, Minkowski, Manhattan, Canberra and Bray-Curtis are applied to calculate the distance between the observation and the prediction points, which affects both the variogram calculation and the Kriging estimator. A Fuzzy Logic System is then applied to define the appropriate neighbors for each estimation point used in the Kriging algorithm. The two criteria used during the Fuzzy Logic process are the distance between observation and estimation points and the groundwater level value at each observation point. The proposed techniques are applied to a data set of 250 hydraulic head measurements distributed over an alluvial aquifer. The analysis showed that the Power-law variogram model and the Manhattan distance metric within ordinary kriging provide the best results when the comprehensive geostatistical analysis process is applied. On the other hand, the Fuzzy Logic approach leads to a Gaussian variogram model and significantly improves the estimation performance. The two different variogram models can be explained in terms of a fractional Brownian motion approach and of aquifer behavior at the local scale. Finally, maps of hydraulic head spatial variability and of prediction uncertainty are constructed for the area with the two different approaches, comparing their advantages and drawbacks.
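
    The effect of swapping distance metrics is easy to see numerically. The sketch below computes the distance between a few observation points and one estimation point under the metrics named above, using SciPy's cdist; the coordinates are arbitrary examples, not the study's data.

```python
import numpy as np
from scipy.spatial.distance import cdist

obs = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 1.0]])   # observation points (arbitrary)
est = np.array([[2.0, 2.0]])                           # one estimation point

for metric in ("euclidean", "cityblock", "canberra", "braycurtis"):
    d = cdist(obs, est, metric=metric)                 # "cityblock" is the Manhattan distance
    print(f"{metric:10s}: {d.ravel().round(3)}")
print("minkowski :", cdist(obs, est, metric="minkowski", p=3).ravel().round(3))
```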

  10. Retinal blood vessel segmentation in high resolution fundus photographs using automated feature parameter estimation

    Science.gov (United States)

    Orlando, José Ignacio; Fracchia, Marcos; del Río, Valeria; del Fresno, Mariana

    2017-11-01

    Several ophthalmological and systemic diseases are manifested through pathological changes in the properties and the distribution of the retinal blood vessels. The characterization of such alterations requires the segmentation of the vasculature, which is a tedious and time-consuming task that is infeasible to perform manually. Numerous attempts have been made to propose automated methods for segmenting the retinal vasculature from fundus photographs, although their application in real clinical scenarios is usually limited by their ability to deal with images taken at different resolutions. This is likely due to the large number of parameters that have to be properly calibrated according to each image scale. In this paper we propose to apply a novel strategy for automated feature parameter estimation, combined with a vessel segmentation method based on fully connected conditional random fields. The estimation model is learned by linear regression from structural properties of the images and known optimal configurations that were previously obtained for low-resolution data sets. Our experiments on high-resolution images show that this approach is able to estimate appropriate configurations that are suitable for performing the segmentation task without requiring parameters to be re-engineered. Furthermore, our combined approach reported state-of-the-art performance on the benchmark data set HRF, as measured in terms of the F1-score and the Matthews correlation coefficient.

  11. Wake-based ship route estimation in high-resolution SAR images

    Science.gov (United States)

    Graziano, M. Daniela; Rufino, Giancarlo; D'Errico, Marco

    2014-10-01

    This paper presents a novel algorithm for wake detection in Synthetic Aperture Radar images of the sea. The algorithm has been conceived as part of a ship traffic monitoring system, in charge of validating ship detections and estimating ship route features, such as heading and ground speed. In addition, it has been intended to be adequate for inclusion in an automatic procedure without human operator supervision. The algorithm exploits the Radon transform to identify ship wakes in the images on the basis of the well-known theoretical characteristics of wake geometry and components, that is the turbulent wake, the narrow-V wakes, and the Kelvin arms, as well as the typical appearance of such components in Synthetic Aperture Radar images of the sea as bright or dark linear features. Examples of application to high-resolution X-band Synthetic Aperture Radar products (COSMO-SkyMed and TerraSAR-X) are reported, both for wake detection and ship route estimation, showing the achieved quality and reliability of wake detection, adequacy for automatic procedures, as well as speed measurement accuracy.
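
    The core operation is the Radon transform, which maps straight bright or dark features to peaks in (offset, angle) space. A minimal sketch using scikit-image on a synthetic image with a single bright line is shown below; it is only an illustration of the transform, not the full detection algorithm of the paper.

```python
import numpy as np
from skimage.transform import radon

img = np.zeros((128, 128))
img[:, 60] = 1.0                                    # a bright vertical line as a stand-in "wake"

theta = np.linspace(0.0, 180.0, 181)                # projection angles in degrees
sinogram = radon(img, theta=theta, circle=False)    # rows: projection offset, columns: angle
peak = np.unravel_index(np.argmax(sinogram), sinogram.shape)
print("strongest linear feature at angle (deg):", theta[peak[1]])
```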

  12. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping.

    Science.gov (United States)

    Hampton, Kristen H; Serre, Marc L; Gesink, Dionne C; Pilcher, Christopher D; Miller, William C

    2011-10-06

    Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset.

  13. Topsoil moisture mapping using geostatistical techniques under different Mediterranean climatic conditions.

    Science.gov (United States)

    Martínez-Murillo, J F; Hueso-González, P; Ruiz-Sinoga, J D

    2017-10-01

    Soil mapping has been considered an important factor in the widening of Soil Science and in responding to many different environmental questions. Geostatistical techniques, through kriging and co-kriging, have made it possible to improve the understanding of eco-geomorphologic variables, e.g., soil moisture. This study focuses on mapping topsoil moisture using geostatistical techniques under different Mediterranean climatic conditions (humid, dry and semiarid) in three small watersheds, considering topography and soil properties as key factors. A Digital Elevation Model (DEM) with a resolution of 1×1 m was derived from a topographical survey, and soils were sampled to analyze the soil properties controlling topsoil moisture, which was measured over 4 years. Afterwards, some topographic attributes were derived from the DEM, the soil properties were analyzed in the laboratory, and the topsoil moisture was modeled for the entire watersheds applying three geostatistical techniques: i) ordinary kriging; ii) co-kriging considering topographic attributes as co-variates; and iii) co-kriging considering topographic attributes and gravel content as co-variates. The results indicated topsoil moisture was more accurately mapped in the dry and semiarid watersheds when the co-kriging procedure was performed. The study is a contribution to improving the efficiency and accuracy of studies of the Mediterranean eco-geomorphologic system and soil hydrology under field conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Science.gov (United States)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed

  15. A method of incident angle estimation for high resolution spectral recovery in filter-array-based spectrometers

    Science.gov (United States)

    Kim, Cheolsun; Lee, Woong-Bi; Ju, Gun Wu; Cho, Jeonghoon; Kim, Seongmin; Oh, Jinkyung; Lim, Dongsung; Lee, Yong Tak; Lee, Heung-No

    2017-02-01

    In recent years, there has been increasing interest in miniature spectrometers for research and development. In particular, filter-array-based spectrometers have the advantages of low cost and portability, and can be applied in various fields such as biology, chemistry and the food industry. Miniaturization of the optical filters causes degradation of spectral resolution due to limitations on the spectral responses and the number of filters. Many studies have reported that filter-array-based spectrometers can achieve resolution improvements by using digital signal processing (DSP) techniques. The performance of DSP-based spectral recovery highly depends on prior information about the transmission functions (TFs) of the filters. The TFs vary with the incident angle of light onto the filter array. Conventionally, it is assumed that the incident angle of light on the filters is fixed and the TFs are known to the DSP. However, the incident angle varies with the environment and application, and thus the TFs also vary, which leads to performance degradation of the spectral recovery. In this paper, we propose a method of incident angle estimation (IAE) for high resolution spectral recovery in filter-array-based spectrometers. By exploiting sparse signal reconstruction with L1-norm minimization, IAE selects, among all possible incident angles, the angle which minimizes the error of the reconstructed signal. Based on IAE, DSP effectively provides high resolution spectral recovery in filter-array-based spectrometers.
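
    The recovery step referred to above is a sparse reconstruction under an L1 penalty. The sketch below solves a toy version with ISTA (iterative soft-thresholding); the transmission matrix, sparsity pattern and regularization weight are stand-in assumptions, not the spectrometer's actual transmission functions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_filters, n_bands = 16, 64
T = rng.random((n_filters, n_bands))             # assumed filter transmission functions
s_true = np.zeros(n_bands); s_true[[10, 30, 45]] = [1.0, 0.6, 0.8]   # sparse "spectrum"
y = T @ s_true                                   # filter measurements

lam = 0.01                                       # L1 weight (assumed)
step = 1.0 / np.linalg.norm(T, 2) ** 2           # step size from the spectral norm of T
s = np.zeros(n_bands)
for _ in range(500):                             # ISTA iterations
    grad = T.T @ (T @ s - y)                     # gradient of the data-fit term
    s = s - step * grad
    s = np.sign(s) * np.maximum(np.abs(s) - lam * step, 0.0)   # soft-thresholding (L1 proximal step)
print(np.flatnonzero(s > 0.05))                  # indices of recovered spectral peaks
```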

  16. Estimating the numbers of malaria infections in blood samples using high-resolution genotyping data.

    Directory of Open Access Journals (Sweden)

    Amanda Ross

    Full Text Available People living in endemic areas often harbour several malaria infections at once. High-resolution genotyping can distinguish between infections by detecting the presence of different alleles at a polymorphic locus. However, the number of infections may not be accurately counted, since parasites from multiple infections may carry the same allele. We use simulation to determine the circumstances under which the number of observed genotypes is likely to be substantially less than the number of infections present, and investigate the performance of two methods for estimating the number of infections from high-resolution genotyping data. The simulations suggest that the problem is not substantial in most datasets: the disparity between the mean number of infections and the mean number of observed genotypes was small when there were 20 or more alleles, 20 or more blood samples, a mean number of infections of 6 or fewer, and a frequency of the most common allele no greater than 20%. The issue of multiple infections carrying the same allele is unlikely to be a major component of the errors in PCR-based genotyping. Simulations also showed that, with heterogeneity in allele frequencies, the observed frequencies are not a good approximation of the true allele frequencies. The first method that we proposed to estimate the number of infections assumes that they are a good approximation and hence did poorly in the presence of heterogeneity. In contrast, the second method, by Li et al., estimates both the number of infections and the true allele frequencies simultaneously and produced accurate estimates of the mean number of infections.
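
    The phenomenon being quantified, multiple infections carrying the same allele, can be reproduced with a few lines of simulation. The sketch below draws infection counts and alleles under an assumed uniform allele-frequency model (not the paper's simulation design) and compares the mean true number of infections with the mean number of distinct alleles observed.

```python
import numpy as np

rng = np.random.default_rng(5)
n_alleles, mean_infections, n_samples = 20, 6, 1000
freqs = np.full(n_alleles, 1.0 / n_alleles)        # uniform allele frequencies (assumption)

true_counts, observed_counts = [], []
for _ in range(n_samples):
    k = max(1, rng.poisson(mean_infections))       # true number of infections in this blood sample
    alleles = rng.choice(n_alleles, size=k, p=freqs)
    true_counts.append(k)
    observed_counts.append(len(set(alleles)))      # distinct alleles actually observed

print("mean infections:", np.mean(true_counts), " mean observed genotypes:", np.mean(observed_counts))
```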

  17. A review of surface energy balance models for estimating actual evapotranspiration with remote sensing at high spatiotemporal resolution over large extents

    Science.gov (United States)

    McShane, Ryan R.; Driscoll, Katelyn P.; Sando, Roy

    2017-09-27

    Many approaches have been developed for measuring or estimating actual evapotranspiration (ETa), and research over many years has led to the development of remote sensing methods that are reliably reproducible and effective in estimating ETa. Several remote sensing methods can be used to estimate ETa at the high spatial resolution of agricultural fields and the large extent of river basins. More complex remote sensing methods apply an analytical approach to ETa estimation using physically based models of varied complexity that require a combination of ground-based and remote sensing data, and are grounded in the theory behind the surface energy balance model. This report, funded through cooperation with the International Joint Commission, provides an overview of selected remote sensing methods used for estimating water consumed through ETa and focuses on Mapping Evapotranspiration at High Resolution with Internalized Calibration (METRIC) and Operational Simplified Surface Energy Balance (SSEBop), two energy balance models for estimating ETa that are currently applied successfully in the United States. The METRIC model can produce maps of ETa at high spatial resolution (30 meters using Landsat data) for specific areas smaller than several hundred square kilometers in extent, an improvement in practice over methods used more generally at larger scales. Many studies validating METRIC estimates of ETa against measurements from lysimeters have shown model accuracies on daily to seasonal time scales ranging from 85 to 95 percent. The METRIC model is accurate, but the greater complexity of METRIC results in greater data requirements, and the internalized calibration of METRIC leads to greater skill required for implementation. In contrast, SSEBop is a simpler model, having reduced data requirements and greater ease of implementation without a substantial loss of accuracy in estimating ETa. The SSEBop model has been used to produce maps of ETa over very large extents (the
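
    Both METRIC and SSEBop rest on the surface energy balance, in which the latent heat flux is obtained as the residual LE = Rn - G - H and then converted to an evapotranspiration depth. The back-of-envelope sketch below uses invented flux values purely to illustrate the arithmetic, not results from either model.

```python
# Surface energy balance residual and conversion to a water-equivalent ET depth.
rn, g, h = 600.0, 60.0, 180.0            # net radiation, soil heat, sensible heat (W m-2), invented values
le = rn - g - h                          # latent heat flux as the residual (W m-2)
lambda_v = 2.45e6                        # latent heat of vaporization (J kg-1)
et_mm_per_hour = le / lambda_v * 3600.0  # kg m-2 s-1 equals mm s-1 of water; scaled to mm per hour
print(le, round(et_mm_per_hour, 3))
```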

  18. Integrating address geocoding, land use regression, and spatiotemporal geostatistical estimation for groundwater tetrachloroethylene.

    Science.gov (United States)

    Messier, Kyle P; Akita, Yasuyuki; Serre, Marc L

    2012-03-06

    Geographic information systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques to address the need for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below-detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below-detect data as a truncated Gaussian probability distribution function. We increase available PCE data 4.7 times over previously available databases through multistage geocoding. The LUR model shows a significant influence of dry cleaners at short ranges. The integration of the LUR model as the mean trend in BME results in a 7.5% decrease in cross-validation mean square error compared to BME with a constant mean trend.

  19. Modelling Geomechanical Heterogeneity of Rock Masses Using Direct and Indirect Geostatistical Conditional Simulation Methods

    Science.gov (United States)

    Eivazy, Hesameddin; Esmaieli, Kamran; Jean, Raynald

    2017-12-01

    An accurate characterization and modelling of rock mass geomechanical heterogeneity can lead to more efficient mine planning and design. Using deterministic approaches and random field methods for modelling rock mass heterogeneity is known to be limited in simulating the spatial variation and spatial pattern of the geomechanical properties. Although the applications of geostatistical techniques have demonstrated improvements in modelling the heterogeneity of geomechanical properties, geostatistical estimation methods such as Kriging result in estimates of geomechanical variables that are not fully representative of field observations. This paper reports on the development of 3D models for spatial variability of rock mass geomechanical properties using geostatistical conditional simulation method based on sequential Gaussian simulation. A methodology to simulate the heterogeneity of rock mass quality based on the rock mass rating is proposed and applied to a large open-pit mine in Canada. Using geomechanical core logging data collected from the mine site, a direct and an indirect approach were used to model the spatial variability of rock mass quality. The results of the two modelling approaches were validated against collected field data. The study aims to quantify the risks of pit slope failure and provides a measure of uncertainties in spatial variability of rock mass properties in different areas of the pit.

  20. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping

    Directory of Open Access Journals (Sweden)

    Pilcher Christopher D

    2011-10-01

    Full Text Available Abstract Background Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. Results In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Conclusions Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset.

  1. Comparison of ArcGIS and SAS Geostatistical Analyst to Estimate Population-Weighted Monthly Temperature for US Counties.

    Science.gov (United States)

    Xiaopeng, Q I; Liang, Wei; Barker, Laurie; Lekiachvili, Akaki; Xingyou, Zhang

    Temperature changes are known to have significant impacts on human health. Accurate estimates of population-weighted average monthly air temperature for US counties are needed to evaluate temperature's association with health behaviours and disease, which are sampled or reported at the county level and measured on a monthly or 30-day basis. Most reported temperature estimates were calculated using ArcGIS; relatively few used SAS. We compared the performance of geostatistical models to estimate population-weighted average temperature in each month for counties in 48 states using ArcGIS v9.3 and SAS v9.2 on a CITGO platform. Monthly average temperature for Jan-Dec 2007 and elevation from 5435 weather stations were used to estimate the temperature at county population centroids. County estimates were produced with elevation as a covariate. Performance of the models was assessed by comparing adjusted R2, mean squared error, root mean squared error, and processing time. Prediction accuracy for split validation was above 90% for 11 months in ArcGIS and all 12 months in SAS. Cokriging in SAS achieved higher prediction accuracy and lower estimation bias compared to cokriging in ArcGIS. County-level estimates produced by both packages were positively correlated (adjusted R2 range = 0.95 to 0.99); accuracy and precision improved with elevation as a covariate. Both methods from ArcGIS and SAS are reliable for U.S. county-level temperature estimates; however, ArcGIS's merits in spatial data pre-processing and processing time may be important considerations for software selection, especially for multi-year or multi-state projects.

  2. Ultra-high resolution protein crystallography

    International Nuclear Information System (INIS)

    Takeda, Kazuki; Hirano, Yu; Miki, Kunio

    2010-01-01

    Many protein structures have been determined by X-ray crystallography and deposited with the Protein Data Bank. However, these structures at usual resolution (1.5 < d < 3.0 Å) are insufficient in their precision and quantity for elucidating the molecular mechanism of protein functions directly from structural information. Several studies at ultra-high resolution (d < 0.8 Å) have been performed with synchrotron radiation in the last decade. The highest resolution of the protein crystals was achieved at 0.54 Å resolution for a small protein, crambin. In such high resolution crystals, almost all of the hydrogen atoms of proteins and some hydrogen atoms of bound water molecules are experimentally observed. In addition, outer-shell electrons of proteins can be analyzed by the multipole refinement procedure. However, the influence of X-rays should be precisely estimated in order to derive meaningful information from the crystallographic results. In this review, we summarize refinement procedures, current status and perspectives for ultra-high resolution protein crystallography. (author)

  3. Improved Assimilation of Streamflow and Satellite Soil Moisture with the Evolutionary Particle Filter and Geostatistical Modeling

    Science.gov (United States)

    Yan, Hongxiang; Moradkhani, Hamid; Abbaszadeh, Peyman

    2017-04-01

    Assimilation of satellite soil moisture and streamflow data into hydrologic models has received increasing attention over the past few years. Currently, these observations are increasingly used to improve model streamflow and soil moisture predictions. However, the performance of this land data assimilation (DA) system still suffers from two limitations: 1) satellite data scarcity and quality; and 2) particle weight degeneration. In order to overcome these two limitations, we propose two possible solutions in this study. First, a general Gaussian geostatistical approach is proposed to overcome the limitation in the space/time resolution of satellite soil moisture products, thus improving their accuracy at uncovered/biased grid cells. Secondly, an evolutionary PF approach based on Genetic Algorithm (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC, is developed to further reduce weight degeneration and improve the robustness of the land DA system. This study provides a detailed analysis of the joint and separate assimilation of streamflow and satellite soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed EPF-MCMC and the general Gaussian geostatistical approach. Performance is assessed over several basins in the USA selected from the Model Parameter Estimation Experiment (MOPEX) and located in different climate regions. The results indicate that: 1) the general Gaussian approach can predict the soil moisture at uncovered grid cells within the expected satellite data quality threshold; 2) assimilation of satellite soil moisture inferred from the general Gaussian model can significantly improve the soil moisture predictions; and 3) in terms of both deterministic and probabilistic measures, the EPF-MCMC can achieve better streamflow predictions. These results suggest that the geostatistical model is a helpful tool to aid the remote sensing technique and the EPF-MCMC is a

  4. Spatial models for probabilistic prediction of wind power with application to annual-average and high temporal resolution data

    DEFF Research Database (Denmark)

    Lenzi, Amanda; Pinson, Pierre; Clemmensen, Line Katrine Harder

    2017-01-01

    average wind power generation, and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial correlation is captured by a latent Gaussian field. We explore how such models can be handled...... with stochastic partial differential approximations of Matérn Gaussian fields together with Integrated Nested Laplace Approximations. We demonstrate the proposed methods on wind farm data from Western Denmark, and compare the results to those obtained with standard geostatistical methods. The results show...

  5. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    International Nuclear Information System (INIS)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing

  6. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  7. Estimating uncertainty in resolution tests

    CSIR Research Space (South Africa)

    Goncalves, DP

    2006-05-01

    Full Text Available frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the results derived can be incorporated into a larger uncertainty analysis. © 2006 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.2202914] Subject terms: resolution testing; USAF 1951 test target; resolution uncertainty. Paper 050404R received May 20, 2005; revised manuscript received Sep. 2, 2005; accepted for publication Sep. 9, 2005; published online May 10, 2006.

  8. Estimation of the distribution of Tabebuia guayacan (Bignoniaceae) using high-resolution remote sensing imagery.

    Science.gov (United States)

    Sánchez-Azofeifa, Arturo; Rivard, Benoit; Wright, Joseph; Feng, Ji-Lu; Li, Peijun; Chong, Mei Mei; Bohlman, Stephanie A

    2011-01-01

    Species identification and characterization in tropical environments is an emerging field in tropical remote sensing. Significant efforts are currently aimed at the detection of tree species, of levels of forest successional stages, and of the extent of liana occurrence at the top of canopies. In this paper we describe our use of high resolution imagery from the Quickbird satellite to estimate the flowering population of Tabebuia guayacan trees at Barro Colorado Island (BCI), in Panama. The imagery was acquired on 29 April 2002 and 21 March 2004. Spectral Angle Mapping via a One-Class Support Vector Machine was used to detect the presence of 422 and 557 flowering trees in the April 2002 and March 2004 imagery, respectively. Of these, 273 flowering trees are common to both dates. This study presents a new perspective on the effectiveness of high resolution remote sensing for monitoring a phenological response and its use as a tool for potential conservation and management of natural resources in tropical environments.
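
    Spectral Angle Mapping reduces, per pixel, to the angle between the pixel spectrum and a reference spectrum. The sketch below computes that angle for a made-up four-band reference and pixel; the band values are illustrative assumptions, not Quickbird data.

```python
import numpy as np

reference = np.array([0.12, 0.18, 0.35, 0.60])     # assumed reference (flowering-crown) spectrum, 4 bands
pixel = np.array([0.10, 0.20, 0.30, 0.55])         # a pixel spectrum (stand-in)

cos_angle = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))   # spectral angle in radians
print(f"spectral angle: {np.degrees(angle):.2f} deg")   # small angles indicate a likely match
```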

  9. A geostatistical study of the uranium deposit at Kvanefjeld, the Ilimaussaq intrusion, South Greenland

    International Nuclear Information System (INIS)

    Lund Clausen, F.

    1982-05-01

    The uranium deposit at Kvanefjeld within the Ilimaussaq intrusion in South Greenland has been tested by diamond drilling, hole logging, chip sampling and field gamma-spectrometric surveys. Based on these different types of spatially distributed samples, the uranium variation within the deposit was studied. The spatial variation, which comprises a large random component, was modelled, and the intrinsic function was used to establish grade-tonnage curves by the best linear unbiased estimator of geostatistics (kriging). From data obtained by a ground surface gamma-spectrometric survey it is shown that the uranium variation is possibly subject to a spatial anisotropy consistent with the geology. The uranium variation exhibits second-order stationarity. A global estimation of the total reserves shows that single block grade values are always estimated with high errors. This is mainly caused by the poor spatial structure and the very sparse sampling pattern. The best way to solve this problem appears to be a selective type of kriging. The overall uranium reserves are estimated at 23600 tons with a mean grade of 297 ppm (cutoff grade 250 ppm U). Studies of data from a test adit show that local geostatistical estimation can be done with acceptably small errors provided that a close sampling pattern is used. A regression relationship is established to correct field gamma-spectrometric measures of bulk grades towards truer values. Multivariate cluster and discriminant analyses were used to classify lujavrite samples based on their trace element content. Misclassification is due to a possibly continuous transition between naujakasite lujavrite and arfvedsonite lujavrite. Some of the main mineralogical differences between the geological units are identified by the discriminating effect of the individual variables. (author)

  10. Estimating Aboveground Biomass and Carbon Stocks in Periurban Andean Secondary Forests Using Very High Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Nicola Clerici

    2016-07-01

    Full Text Available Periurban forests are key to offsetting anthropogenic carbon emissions, but they are under constant threat from urbanization. In particular, secondary Neotropical forest types in Andean periurban areas have a high potential to store carbon, but are currently poorly characterized. To address this lack of information, we developed a method to estimate periurban aboveground biomass (AGB), a proxy for multiple ecosystem services, of secondary Andean forests near Bogotá, Colombia, based on very high resolution (VHR) GeoEye-1 and Pleiades-1A imagery and field-measured plot data. Specifically, we tested a series of different pre-processing workflows to derive six vegetation indices that were regressed against in situ estimates of AGB. Overall, the coupling of linear models and the Ratio Vegetation Index produced the most satisfactory results. Atmospheric and topographic correction proved to be key in improving model fit, especially in high-aerosol and rugged terrain such as the Andes. Methods and findings provide baseline AGB and carbon stock information for these little-studied periurban Andean secondary forests. The methodological approach can also be used for integrating limited forest monitoring plot AGB data with very high resolution imagery for cost-effective modelling of ecosystem service provision from forests, monitoring reforestation and forest cover change, and for carbon offset assessments.
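
    The index-regression idea behind such AGB models can be sketched in a few lines: compute the Ratio Vegetation Index (RVI = NIR / Red) per plot and regress field-measured AGB on it. The plot values below are invented stand-ins, not data from the study.

```python
import numpy as np

nir = np.array([0.42, 0.50, 0.38, 0.55, 0.47])     # near-infrared reflectance per plot (stand-in)
red = np.array([0.08, 0.07, 0.10, 0.06, 0.09])     # red reflectance per plot (stand-in)
agb = np.array([95.0, 130.0, 70.0, 160.0, 105.0])  # field-measured AGB, e.g. t/ha (stand-in)

rvi = nir / red                                    # Ratio Vegetation Index
slope, intercept = np.polyfit(rvi, agb, deg=1)     # simple linear model AGB ~ RVI
print(f"AGB ~ {slope:.1f} * RVI + {intercept:.1f}")
```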

  11. Optimizing Groundwater Monitoring Networks Using Integrated Statistical and Geostatistical Approaches

    Directory of Open Access Journals (Sweden)

    Jay Krishna Thakur

    2015-08-01

    Full Text Available The aim of this work is to investigate new approaches using methods based on statistics and geo-statistics for spatio-temporal optimization of groundwater monitoring networks. The formulated and integrated methods were tested with the groundwater quality data set of Bitterfeld/Wolfen, Germany. Spatially, the monitoring network was optimized using geo-statistical methods. Temporal optimization of the monitoring network was carried out using Sen's method (1968). For geostatistical network optimization, a geostatistical spatio-temporal algorithm was used to identify redundant wells in 2- and 2.5-D Quaternary and Tertiary aquifers. The influences of interpolation block width, dimension, contaminant association, groundwater flow direction and aquifer homogeneity on statistical and geostatistical methods for monitoring network optimization were analysed. The integrated approach shows 37% and 28% redundancy in the monitoring network in the Quaternary aquifer and the Tertiary aquifer, respectively. The geostatistical method also recommends 41 and 22 new monitoring wells in the Quaternary and Tertiary aquifers, respectively. In the temporal optimization, an overall optimized sampling interval was recommended in terms of the lower quartile (238 days), median (317 days) and upper quartile (401 days) in the research area of Bitterfeld/Wolfen. The demonstrated methods for improving groundwater monitoring networks can be used in real monitoring network optimization with due consideration given to influencing factors.
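
    For the temporal part of the optimization, Sen's (1968) estimator is simply the median of all pairwise slopes of a time series. A minimal sketch with stand-in sampling dates and concentrations is given below.

```python
import numpy as np
from itertools import combinations

t = np.array([0., 90., 180., 300., 420., 540.])     # sampling days (stand-in)
c = np.array([1.8, 1.7, 1.9, 1.5, 1.4, 1.3])        # measured concentrations (stand-in)

# Sen's slope: median of the slopes over all pairs of observations
slopes = [(c[j] - c[i]) / (t[j] - t[i]) for i, j in combinations(range(len(t)), 2)]
print(f"Sen's slope: {np.median(slopes):.4f} per day")
```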

  12. Validating spatial structure in canopy water content using geostatistics

    Science.gov (United States)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

    Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average the reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analysis becomes more sophisticated, such as analyses for the detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been extensively used in geological or hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.

  13. Approaching bathymetry estimation from high resolution multispectral satellite images using a neuro-fuzzy technique

    Science.gov (United States)

    Corucci, Linda; Masini, Andrea; Cococcioni, Marco

    2011-01-01

    This paper addresses bathymetry estimation from high resolution multispectral satellite images by proposing an accurate supervised method based on a neuro-fuzzy approach. The method is applied to two Quickbird images of the same area, acquired in different years and under different meteorological conditions, and is validated using ground-truth data. Performance is studied in different realistic situations of in situ data availability. The method achieves a mean standard deviation of 36.7 cm for estimated water depths in the range [-18, -1] m. When only data collected along a closed path are used as a training set, a mean standard deviation of 45 cm is obtained. The effect of both meteorological conditions and training set size reduction on the overall performance is also investigated.

  14. A Merging Framework for Rainfall Estimation at High Spatiotemporal Resolution for Distributed Hydrological Modeling in a Data-Scarce Area

    Directory of Open Access Journals (Sweden)

    Yinping Long

    2016-07-01

    Full Text Available Merging satellite and rain gauge data, by combining accurate quantitative rainfall from stations with spatially continuous information from remote sensing observations, provides a practical method of estimating rainfall. However, generating high-spatiotemporal-resolution rainfall fields for distributed hydrological modelling of a catchment is a problem when only a sparse rain gauge network and coarse-spatial-resolution satellite data are available. The objective of this study is to present a satellite and rain gauge data-merging framework adapted to coarse-resolution, data-sparse settings. In the framework, a statistical spatial downscaling method based on the relationships among precipitation, topographical features, and weather conditions was used to downscale the 0.25° daily rainfall field derived from the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) precipitation product version 7. The nonparametric merging technique of double kernel smoothing, suited to data-sparse designs, was combined with the global optimization method of shuffled complex evolution to merge the downscaled TRMM and gauged rainfall with minimum cross-validation error. An indicator field representing the presence and absence of rainfall was generated using the indicator kriging technique and applied to the previously merged result to account for the spatial intermittency of daily rainfall. The framework was applied to estimate daily precipitation at 1 km resolution in the Qinghai Lake Basin, a data-scarce area in the northeast of the Qinghai-Tibet Plateau. The final estimates not only captured the spatial pattern of daily and annual precipitation with a relatively small estimation error, but also performed very well in streamflow simulation when used to force the geomorphology-based hydrological model (GBHM). The proposed framework thus appears feasible for rainfall estimation at high spatiotemporal resolution in data-scarce areas.
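
    A sketch of the rain/no-rain indicator step: gauge rainfall is thresholded into a 0/1 indicator, the indicator is interpolated to the target location, and the merged rainfall value is kept or zeroed accordingly. Indicator kriging is replaced here by simple inverse-distance weighting for brevity, and all gauge values are hypothetical.

    ```python
    # Indicator masking of a merged rainfall field (IDW used as a stand-in for
    # indicator kriging; gauge data are illustrative).
    import numpy as np

    gauge_xy = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 7.0], [9.0, 9.0]])  # km
    gauge_mm = np.array([0.0, 4.2, 0.0, 7.5])                              # daily rainfall
    indicator = (gauge_mm > 0.1).astype(float)       # 1 = rain, 0 = no rain

    def idw(xy_target, xy_obs, values, power=2.0):
        d = np.linalg.norm(xy_obs - xy_target, axis=1)
        if np.any(d < 1e-9):
            return values[np.argmin(d)]
        w = 1.0 / d ** power
        return np.sum(w * values) / np.sum(w)

    target = np.array([6.0, 6.0])
    wet_prob = idw(target, gauge_xy, indicator)
    merged_value = 3.1                               # from the downscaled/merged field
    final = merged_value if wet_prob >= 0.5 else 0.0
    print(f"P(wet) = {wet_prob:.2f} -> final estimate {final} mm")
    ```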

  15. Regional-scale geostatistical inverse modeling of North American CO2 fluxes: a synthetic data study

    Directory of Open Access Journals (Sweden)

    A. M. Michalak

    2010-07-01

    Full Text Available A series of synthetic data experiments is performed to investigate the ability of a regional atmospheric inversion to estimate grid-scale CO2 fluxes during the growing season over North America. The inversions are performed within a geostatistical framework without the use of any prior flux estimates or auxiliary variables, in order to focus on the atmospheric constraint provided by the nine towers collecting continuous, calibrated CO2 measurements in 2004. Using synthetic measurements and their associated concentration footprints, flux and model-data mismatch covariance parameters are first optimized, and then fluxes and their uncertainties are estimated at three different temporal resolutions. These temporal resolutions, which include a four-day average, a four-day-average diurnal cycle with 3-hourly increments, and 3-hourly fluxes, are chosen to help assess the impact of temporal aggregation errors on the estimated fluxes and covariance parameters. Estimating fluxes at a temporal resolution that can resolve the diurnal variability is found to be critical both for recovering covariance parameters directly from the atmospheric data and for inferring accurate ecoregion-scale fluxes. Accounting for both spatial and temporal a priori covariance in the flux distribution is also found to be necessary for recovering accurate a posteriori uncertainty bounds on the estimated fluxes. Overall, the results suggest that even a fairly sparse network of nine towers collecting continuous CO2 measurements across the continent, used with no auxiliary information or prior estimates of the flux distribution in time or space, can be used to infer relatively accurate monthly ecoregion-scale CO2 surface fluxes over North America within estimated uncertainty bounds. Simulated random transport error is shown to decrease the quality of flux estimates in under-constrained areas at the ecoregion scale, although the uncertainty bounds remain realistic. While these synthetic

  16. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical...... framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics are most developed. The theory is summarized and particulate surface...... contamination, sampled from small areas on a table, has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method...
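
    A minimal ordinary-kriging sketch of the estimation step described: given a fitted variogram model (exponential here) and a handful of sampled positions, estimate the contamination at an unmeasured point together with its kriging variance. All coordinates, values and variogram parameters are hypothetical.

    ```python
    # Ordinary kriging at one unmeasured position (synthetic data, exponential variogram).
    import numpy as np

    def gamma(h, nugget=0.0, sill=1.0, rng_=20.0):
        return nugget + sill * (1 - np.exp(-3 * h / rng_))   # exponential model

    xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 2]], float)  # sample positions (cm)
    z = np.array([3.0, 5.0, 2.5, 4.0, 4.5])                             # contamination values
    x0 = np.array([4.0, 6.0])                                           # target position

    n = len(z)
    D = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(D); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))

    w = np.linalg.solve(A, b)          # kriging weights plus Lagrange multiplier
    estimate = np.dot(w[:n], z)
    variance = np.dot(w, b)            # ordinary kriging (minimum mean square error) variance
    print(f"estimate = {estimate:.2f}, kriging variance = {variance:.3f}")
    ```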

  17. Using Very High Resolution Remotely Sensed Imagery to Estimate Agricultural Production: A comparison of food insecure and secure growing areas in Kenya

    Science.gov (United States)

    Grace, K.; Husak, G. J.; Bogle, S.

    2013-12-01

    Determining the amount of food produced in a food-insecure, isolated, subsistence farming community can help identify households or communities that may be in need of additional food resources. Measuring annual food production in developing countries, much less at a sub-national level, is complicated by lack of data. It can be difficult and costly to access all of the farming households engaged in subsistence farming. However, recent research has focused on the use of remotely sensed data to aid in the estimation of area under cultivation, and because food production is the measure of yield (production per hectare) multiplied by area (number of hectares), we can use the area measure to reduce uncertainty in food production estimates. One strategy for estimating cultivated area relies on a fairly time-intensive manual interpretation of very high resolution data. Given the availability of very high resolution data, it is possible to construct estimates of cultivated area even in communities where fields are small. While this strategy has been used to effectively estimate cultivated area in a timely manner, questions remain about the spatial and temporal generalizability of this approach. The purpose of this paper is to produce and compare estimates of cultivated area in two very different agricultural areas of Kenya, a highly food-insecure country in East Africa, during two different agricultural seasons. The areas selected represent two different livelihood zones: a marginal growing area where poor farmers rely on inconsistent rainfall, and a lush growing area near the mountainous region of the middle-West area of the country where rainfall is consistent and therefore more suited to cultivation. The overarching goal is to determine the effectiveness of very high resolution remotely sensed imagery in calculating estimates of cultivated area in areas where food production strategies are different. Additionally the results of this research will explore the

  18. The Relative Performance of High Resolution Quantitative Precipitation Estimates in the Russian River Basin

    Science.gov (United States)

    Bytheway, J. L.; Biswas, S.; Cifelli, R.; Hughes, M.

    2017-12-01

    The Russian River carves a 110 mile path through Mendocino and Sonoma counties in western California, providing water for thousands of residents and acres of agriculture as well as a home for several species of endangered fish. The Russian River basin receives almost all of its precipitation during the October through March wet season, and the systems bringing this precipitation are often impacted by atmospheric river events as well as the complex topography of the region. This study will examine the performance of several high-resolution (hourly) quantitative precipitation estimates and forecasts over the 2015-2016 and 2016-2017 wet seasons. Comparisons of event total rainfall as well as hourly rainfall will be performed using 1) rain gauges operated by the National Oceanic and Atmospheric Administration (NOAA) Physical Sciences Division (PSD), 2) products from the Multi-Radar/Multi-Sensor (MRMS) QPE dataset, and 3) quantitative precipitation forecasts from the High Resolution Rapid Refresh (HRRR) model at 1, 3, 6, and 12 hour lead times. Further attention will be given to cases or locations representing large disparities between the estimates.

  19. A geostatistical approach to identify and mitigate agricultural nitrous oxide emission hotspots.

    Science.gov (United States)

    Turner, P A; Griffis, T J; Mulla, D J; Baker, J M; Venterea, R T

    2016-12-01

    Anthropogenic emissions of nitrous oxide (N2O), a trace gas with severe environmental costs, are greatest from agricultural soils amended with nitrogen (N) fertilizer. However, accurate N2O emission estimates at fine spatial scales are made difficult by their high variability, which represents a critical challenge for the management of N2O emissions. Here, static chamber measurements (n=60) and soil samples (n=129) were collected at approximately weekly intervals (n=6) for 42 d immediately following the application of N in a southern Minnesota cornfield (15.6 ha), typical of the systems prevalent throughout the U.S. Corn Belt. These data were integrated into a geostatistical model that resolved N2O emissions at a high spatial resolution (1 m). Field-scale N2O emissions exhibited a high degree of spatial variability and were partitioned into three classes of emission strength: hotspots, intermediate, and coldspots. Rates of emission from hotspots were 2-fold greater than from non-hotspot locations. Consequently, 36% of the field-scale emissions could be attributed to hotspots, despite these representing only 21% of the total field area. Variations in elevation caused hotspots to develop in predictable locations, which were prone to nutrient and moisture accumulation caused by terrain focusing. Because these features are relatively static, our data and analyses indicate that targeted management of hotspots could efficiently reduce field-scale emissions by as much as 17%, a significant benefit considering the deleterious effects of atmospheric N2O. Copyright © 2016 Elsevier B.V. All rights reserved.
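
    A back-of-the-envelope sketch of the hotspot bookkeeping described above: from a gridded flux map, take the highest-flux cells as hotspots and compute the share of total emissions they account for. The flux field below is synthetic, not the study's data.

    ```python
    # Share of field-scale emissions attributable to hotspot cells (synthetic flux map).
    import numpy as np

    rng = np.random.default_rng(2)
    flux = rng.lognormal(mean=0.0, sigma=0.8, size=10_000)  # per-cell N2O flux, arbitrary units

    threshold = np.quantile(flux, 0.79)          # top ~21% of cells treated as hotspots
    hot = flux >= threshold
    area_share = hot.mean()
    emission_share = flux[hot].sum() / flux.sum()
    print(f"hotspots: {area_share:.0%} of area, {emission_share:.0%} of total emissions")
    ```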

  20. Risk Assessment of Sediment Pollution Using Geostatistical Simulations

    Science.gov (United States)

    Golay, J.; Kanevski, M.

    2012-04-01

    Environmental monitoring networks (EMN) discretely measure the intensities of continuous phenomena (e.g. pollution, temperature, etc.). Spatial prediction models, like kriging, are then used for modelling. However, they give rise to smooth representations of phenomena, which leads to over- or underestimation of extreme values. Moreover, they do not reproduce the spatial variability of the original data and the corresponding uncertainties. When dealing with risk assessment, this is unacceptable, since extreme values must be retrieved and probabilities of exceeding given thresholds must be computed [Kanevski et al., 2009]. In order to overcome these obstacles, geostatistics provides another approach: conditional stochastic simulations. Here, the basic idea is to generate multiple estimates of variable values (e.g. pollution concentration) at every location of interest, which are calculated as stochastic realizations of an unknown random function (see, for example, [Kanevski, 2008], where both theoretical concepts and real data case studies are presented in detail). Many algorithms implement this approach. The most widely used in spatial modelling are sequential Gaussian simulations/cosimulations, sequential indicator simulations/cosimulations and direct simulations. In the present study, several algorithms of geostatistical conditional simulation were applied to real data collected from Lake Geneva. The main objectives were to compare their effectiveness in reproducing global statistics (histograms, variograms) and the way they characterize the variability and uncertainty of the contamination patterns. The dataset is composed of 200 measurements of the contamination of the lake sediments by heavy metals (i.e. Cadmium, Mercury, Zinc, Copper, Titanium and Chromium). The results obtained show some differences, highlighting that risk assessment can be influenced by the algorithm it relies on. Moreover, hybrid models based on machine learning algorithms and
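
    A compact stand-in for the simulation idea: draw many Gaussian realizations from a covariance model and estimate, per location, the probability of exceeding a threshold. This sketch is unconditional and one-dimensional for brevity, whereas the study uses conditional sequential simulation honouring the 200 sediment samples; all numbers are illustrative.

    ```python
    # Multiple Gaussian realizations via a Cholesky factor of the covariance matrix,
    # then per-location exceedance probabilities (unconditional, 1-D toy example).
    import numpy as np

    x = np.linspace(0, 100, 60)                       # target locations (km)
    D = np.abs(x[:, None] - x[None, :])
    C = 1.0 * np.exp(-3 * D / 30.0)                   # exponential covariance, range 30 km
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

    rng = np.random.default_rng(3)
    mean, threshold = 2.0, 3.0                        # e.g. mg/kg of a heavy metal
    realizations = mean + rng.standard_normal((500, len(x))) @ L.T

    p_exceed = (realizations > threshold).mean(axis=0)   # exceedance probability per location
    print("max exceedance probability:", p_exceed.max().round(2))
    ```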

  1. A postprocessing method based on high-resolution spectral estimation for FDTD calculation of phononic band structures

    Energy Technology Data Exchange (ETDEWEB)

    Su Xiaoxing, E-mail: xxsu@bjtu.edu.c [School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing 100044 (China); Li Jianbao; Wang Yuesheng [Institute of Engineering Mechanics, Beijing Jiaotong University, Beijing 100044 (China)

    2010-05-15

    If the energy bands of a phononic crystal are calculated by the finite difference time domain (FDTD) method combined with the fast Fourier transform (FFT), good estimation of the eigenfrequencies can only be ensured by the postprocessing of sufficiently long time series generated by a large number of FDTD iterations. In this paper, a postprocessing method based on the high-resolution spectral estimation via the Yule-Walker method is proposed to overcome this difficulty. Numerical simulation results for three-dimensional acoustic and two-dimensional elastic systems show that, compared with the classic FFT-based postprocessing method, the proposed method can give much better estimation of the eigenfrequencies when the FDTD is run with relatively few iterations.
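
    A sketch of the postprocessing idea: fit an autoregressive (AR) model to a short time series via the Yule-Walker equations and read eigenfrequencies from peaks of the resulting AR power spectrum. The signal below is a synthetic two-tone stand-in for an FDTD probe record; the AR order and sampling frequency are arbitrary choices.

    ```python
    # Yule-Walker AR spectral estimation on a deliberately short record.
    import numpy as np
    from scipy.signal import find_peaks, freqz
    from statsmodels.regression.linear_model import yule_walker

    fs = 1000.0                                   # sampling frequency (arbitrary units)
    t = np.arange(512) / fs                       # short time series
    signal = (np.sin(2 * np.pi * 97 * t) + 0.6 * np.sin(2 * np.pi * 233 * t)
              + 0.05 * np.random.default_rng(4).standard_normal(t.size))

    rho, sigma = yule_walker(signal, order=20, method="mle")
    w, h = freqz(b=[sigma], a=np.r_[1, -rho], worN=4096, fs=fs)
    psd = np.abs(h) ** 2                          # AR power spectrum

    idx, _ = find_peaks(psd)
    top = idx[np.argsort(psd[idx])[-2:]]          # two strongest spectral peaks
    print("estimated eigenfrequencies:", np.sort(np.round(w[top], 1)))
    ```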

  2. A postprocessing method based on high-resolution spectral estimation for FDTD calculation of phononic band structures

    International Nuclear Information System (INIS)

    Su Xiaoxing; Li Jianbao; Wang Yuesheng

    2010-01-01

    If the energy bands of a phononic crystal are calculated by the finite difference time domain (FDTD) method combined with the fast Fourier transform (FFT), good estimation of the eigenfrequencies can only be ensured by the postprocessing of sufficiently long time series generated by a large number of FDTD iterations. In this paper, a postprocessing method based on the high-resolution spectral estimation via the Yule-Walker method is proposed to overcome this difficulty. Numerical simulation results for three-dimensional acoustic and two-dimensional elastic systems show that, compared with the classic FFT-based postprocessing method, the proposed method can give much better estimation of the eigenfrequencies when the FDTD is run with relatively few iterations.

  3. Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models

    Science.gov (United States)

    Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.

    2013-12-01

    We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and rock bulk properties of the target reservoir zone. To infer rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model must be taken into account, which links the unknown properties to the elastic parameters. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it can handle non-linear, complex and multi-step forward models and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To address this problem, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties we are interested in by performing statistical analysis on the collection of solutions.
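
    A skeleton of the accept/reject loop described above. The geostatistical prior sampler and the rock-physics/convolutional forward model are replaced by toy stand-ins; only the Metropolis structure (propose, evaluate misfit, accept or reject, collect posterior samples) is the point.

    ```python
    # Generic Metropolis sampler skeleton with toy proposal and forward model.
    import numpy as np

    rng = np.random.default_rng(5)
    observed = np.array([0.2, 0.5, 0.3])            # "measured" data (toy)
    sigma_noise = 0.1

    def propose(model):
        # stand-in for drawing a geologically plausible perturbation from the prior
        return model + rng.normal(0, 0.05, size=model.shape)

    def forward(model):
        # stand-in for rock physics + convolutional seismic modelling
        return np.tanh(model)

    def log_likelihood(model):
        resid = observed - forward(model)
        return -0.5 * np.sum(resid ** 2) / sigma_noise ** 2

    current = np.zeros(3)
    samples = []
    for _ in range(5000):
        candidate = propose(current)
        if np.log(rng.uniform()) < log_likelihood(candidate) - log_likelihood(current):
            current = candidate                      # accept
        samples.append(current.copy())

    posterior = np.array(samples[1000:])             # discard burn-in
    print("posterior mean model:", posterior.mean(axis=0).round(2))
    ```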

  4. Estimates of greenhouse gas and black carbon emissions from a major Australian wildfire with high spatiotemporal resolution

    Science.gov (United States)

    Surawski, N. C.; Sullivan, A. L.; Roxburgh, S. H.; Polglase, P. J.

    2016-08-01

    Estimates of greenhouse gases and particulate emissions are made with a high spatiotemporal resolution from the Kilmore East fire in Victoria, Australia, which burnt approximately 100,000 ha over a 12 h period. Altogether, 10,175 Gigagrams (Gg) of CO2 equivalent (CO2-e) emissions occurred, with CO2 (~68%) being the dominant chemical species emitted, followed by CH4 (~17%) and black carbon (BC) (~15%). About 63% of total CO2-e emissions were estimated to be from coarse woody debris, 22% were from surface fuels, 7% from bark, 6% from elevated fuels, and less than 2% from tree crown consumption. To assess the quality of our emissions estimates, we compared our results with previous estimates which used the Global Fire Emissions Database version 3.1 (GFEDv3.1) and the Fire INventory from the National Center for Atmospheric Research version 1.0 (FINNv1), as well as Australia's National Inventory System (and its revision). The uncertainty in emission estimates was addressed using truncated Monte Carlo analysis, which derived a probability density function for total emissions from the uncertainties in each input. The distribution of emission estimates from Monte Carlo analysis was lognormal with a mean of 10,355 Gigagrams (Gg) and a ±1 standard deviation (σ) uncertainty range of 7260-13,450 Gg. Results were in good agreement with the global data sets (when using the same burnt area), although they predicted lower total emissions by 15-37% due to underestimating fuel consumed. Emissions estimates can be improved by obtaining better estimates of fuel consumed and BC emission factors. Overall, this study presents a methodological template for high-resolution emissions accounting and its uncertainty, enabling a step toward process-based emissions accounting to be achieved.
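
    A sketch of the uncertainty-propagation idea: sample uncertain inputs (truncated to physically sensible ranges), combine them through the standard identity emissions = area x fuel load x combustion fraction x emission factor, and summarise the resulting distribution. All distributions and values are illustrative, not the study's inputs.

    ```python
    # Truncated Monte Carlo propagation of input uncertainty to total emissions.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    area_ha = 100_000.0
    fuel = np.clip(rng.normal(25.0, 6.0, n), 5.0, 60.0)       # fuel load (t/ha), truncated
    comb_frac = np.clip(rng.normal(0.5, 0.12, n), 0.1, 0.9)   # fraction of fuel consumed
    ef_co2e = np.clip(rng.normal(1.9, 0.3, n), 1.0, 3.0)      # t CO2-e per t fuel burnt

    total_gg = area_ha * fuel * comb_frac * ef_co2e / 1000.0  # Gigagrams CO2-e
    lo, hi = np.percentile(total_gg, [16, 84])
    print(f"total emissions ~ {total_gg.mean():.0f} Gg "
          f"(approx. ±1 sigma range {lo:.0f}-{hi:.0f} Gg)")
    ```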

  5. High Resolution/High Fidelity Seismic Imaging and Parameter Estimation for Geological Structure and Material Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Ru-Shan Wu; Xiao-Bi Xie

    2008-06-08

    Our proposed work on high resolution/high fidelity seismic imaging focused on three general areas: (1) development of new, more efficient, wave-equation-based propagators and imaging conditions, (2) developments towards amplitude-preserving imaging in the local angle domain, in particular, imaging methods that allow us to estimate the reflection as a function of angle at a layer boundary, and (3) studies of wave inversion for local parameter estimation. In this report we summarize the results and progress we made during the project period. The report is divided into three parts, totaling 10 chapters. The first part is on resolution analysis and its relation to directional illumination analysis. The second part, which is composed of 6 chapters, covers the main theme of our work, true-reflection imaging. True-reflection imaging is an advanced imaging technology which aims at keeping the image amplitude proportional to the reflection strength of the local reflectors, or at obtaining the reflection coefficient as a function of reflection angle. There are many factors which may influence the image amplitude, such as geometrical spreading, transmission loss, path absorption, acquisition aperture effect, etc. However, we can group these into two categories: one is the propagator effect (geometric spreading, path losses); the other is the acquisition-aperture effect. We have made significant progress in both categories. We studied the effects of different terms in the true-amplitude one-way propagators, especially the terms including lateral velocity variation of the medium. We also demonstrate the improvements by optimizing the expansion coefficients in different terms. Our research also includes directional illumination analysis for both the one-way propagators and full-wave propagators. We developed the fast acquisition-aperture correction method in the local angle domain, which is an important element in true-reflection imaging. Other developments include the super

  6. Estimating chlorophyll with thermal and broadband multispectral high resolution imagery from an unmanned aerial system using relevance vector machines for precision agriculture

    Science.gov (United States)

    Elarab, Manal; Ticlavilca, Andres M.; Torres-Rua, Alfonso F.; Maslova, Inga; McKee, Mac

    2015-12-01

    Precision agriculture requires high-resolution information to enable greater precision in the management of inputs to production. Actionable information about crop and field status must be acquired at high spatial resolution and at a temporal frequency appropriate for timely responses. In this study, high spatial resolution imagery was obtained through the use of a small, unmanned aerial system called AggieAirTM. Simultaneously with the AggieAir flights, intensive ground sampling for plant chlorophyll was conducted at precisely determined locations. This study reports the application of a relevance vector machine coupled with cross validation and backward elimination to a dataset composed of reflectance from high-resolution multi-spectral imagery (VIS-NIR), thermal infrared imagery, and vegetative indices, in conjunction with in situ SPAD measurements from which chlorophyll concentrations were derived, to estimate chlorophyll concentration from remotely sensed data at 15-cm resolution. The results indicate that a relevance vector machine with a thin plate spline kernel type and kernel width of 5.4, having LAI, NDVI, thermal and red bands as the selected set of inputs, can be used to spatially estimate chlorophyll concentration with a root-mean-squared-error of 5.31 μg cm-2, efficiency of 0.76, and 9 relevance vectors.

  7. Data analysis for radiological characterisation: Geostatistical and statistical complementarity

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    Radiological characterisation may cover a large range of evaluation objectives during a decommissioning and dismantling (D and D) project: removal of doubt, delineation of contaminated materials, monitoring of the decontamination work and final survey. At each stage, collecting data relevant enough to draw the conclusions needed is quite a challenge. In particular, two radiological characterisation stages require an advanced sampling process and data analysis, namely the initial categorization and optimisation of the materials to be removed, and the final survey to demonstrate compliance with clearance levels. On the one hand, the latter is widely used and well developed in national guides and norms, using random sampling designs and statistical data analysis. On the other hand, a more complex evaluation methodology has to be implemented for the initial radiological characterisation, both for sampling design and for data analysis. The geostatistical framework is an efficient way to satisfy the radiological characterisation requirements, providing a sound decision-making approach for the decommissioning and dismantling of nuclear premises. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. Geostatistics thus provides reliable methods for activity estimation, uncertainty quantification and risk analysis, leading to a sound classification of radiological waste (surfaces and volumes). The radiological characterization of contaminated premises can then be divided into three steps. First, an exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive or not) surface survey of the contamination is implemented on a regular grid. Finally, in order to assess activity levels and contamination depths, destructive samples are collected at several locations within the premises (based on the surface survey results) and analysed. Combined with

  8. Geostatistical analyses and hazard assessment on soil lead in Silvermines area, Ireland

    International Nuclear Information System (INIS)

    McGrath, David; Zhang Chaosheng; Carton, Owen T.

    2004-01-01

    Spatial distribution and hazard assessment of soil lead in the mining site of Silvermines, Ireland, were investigated using statistics, geostatistics and geographic information system (GIS) techniques. A positively skewed distribution and possible outlying values of Pb and other heavy metals were observed. A Box-Cox transformation was applied in order to achieve normality in the data set and to reduce the effect of outliers. Geostatistical analyses were carried out, including calculation of experimental variograms and model fitting. The ordinary point kriging estimates of Pb concentration were mapped. Kriging standard deviations were regarded as the standard deviations of the interpolated pixel values, and a second map was produced that quantified the probability of the Pb concentration being higher than a threshold value of 1000 mg/kg. These maps provide valuable information for hazard assessment and for decision support. - A probability map was produced that was useful for hazard assessment and decision support
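
    A sketch of the two steps described: Box-Cox normalisation of skewed Pb data, then conversion of a kriged estimate and kriging standard deviation (in transformed space) into a probability of exceeding 1000 mg/kg. The Pb values and kriging outputs below are hypothetical.

    ```python
    # Box-Cox transform plus exceedance probability from a kriging estimate.
    import numpy as np
    from scipy import stats

    pb = np.array([120, 85, 240, 95, 1800, 310, 60, 75, 520, 4000, 150, 230], float)  # mg/kg
    pb_bc, lam = stats.boxcox(pb)                 # transformed data and fitted lambda

    # Hypothetical kriging output at one pixel, expressed in Box-Cox space
    krig_mean_bc, krig_sd_bc = 6.0, 1.2
    threshold_bc = stats.boxcox(np.array([1000.0]), lmbda=lam)[0]

    p_exceed = stats.norm.sf(threshold_bc, loc=krig_mean_bc, scale=krig_sd_bc)
    print(f"lambda = {lam:.2f}, P(Pb > 1000 mg/kg) = {p_exceed:.2f}")
    ```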

  9. Geostatistical analyses and hazard assessment on soil lead in Silvermines area, Ireland

    Energy Technology Data Exchange (ETDEWEB)

    McGrath, David; Zhang Chaosheng; Carton, Owen T

    2004-01-01

    Spatial distribution and hazard assessment of soil lead in the mining site of Silvermines, Ireland, were investigated using statistics, geostatistics and geographic information system (GIS) techniques. A positively skewed distribution and possible outlying values of Pb and other heavy metals were observed. A Box-Cox transformation was applied in order to achieve normality in the data set and to reduce the effect of outliers. Geostatistical analyses were carried out, including calculation of experimental variograms and model fitting. The ordinary point kriging estimates of Pb concentration were mapped. Kriging standard deviations were regarded as the standard deviations of the interpolated pixel values, and a second map was produced that quantified the probability of the Pb concentration being higher than a threshold value of 1000 mg/kg. These maps provide valuable information for hazard assessment and for decision support. - A probability map was produced that was useful for hazard assessment and decision support.

  10. Gstat: a program for geostatistical modelling, prediction and simulation

    Science.gov (United States)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.

  11. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    International Nuclear Information System (INIS)

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

    The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; this report must therefore be considered a methodology demonstration, which would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given

  12. Exploring prediction uncertainty of spatial data in geostatistical and machine learning approaches

    Science.gov (United States)

    Klump, J. F.; Fouedjio, F.

    2017-12-01

    Geostatistical methods such as kriging with external drift, as well as machine learning techniques such as quantile regression forest, have been used intensively for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, by design, well suited to providing such prediction uncertainties, and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties for spatial data. In our comparison we use both simulated and real-world datasets. Apart from classical performance indicators, the comparisons make use of accuracy plots, probability interval width plots, and visual examination of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different from the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. in mineral exploration.
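
    A hedged stand-in for the machine-learning side of such a comparison: quantile regression forest is not part of scikit-learn, so gradient boosting with a quantile loss is used below to produce per-location prediction intervals from coordinates plus one covariate. Data, features and quantile levels are illustrative, not the study's setup.

    ```python
    # Quantile-loss gradient boosting as a stand-in for quantile regression forest.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(7)
    X = rng.uniform(0, 100, size=(300, 3))                 # x, y, covariate (e.g. elevation)
    y = 0.05 * X[:, 0] + 0.02 * X[:, 2] + rng.normal(0, 1.0, 300)

    models = {q: GradientBoostingRegressor(loss="quantile", alpha=q,
                                           n_estimators=200).fit(X, y)
              for q in (0.05, 0.5, 0.95)}

    X_new = np.array([[20.0, 60.0, 35.0]])                 # one target location
    lo, med, hi = (models[q].predict(X_new)[0] for q in (0.05, 0.5, 0.95))
    print(f"prediction {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
    ```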

  13. How to evaluate the risks of exceeding limits: geostatistical models and their application to air pollution

    International Nuclear Information System (INIS)

    Fouquet, Ch. de; Deraisme, J.; Bobbia, M.

    2007-01-01

    Geo-statistics is increasingly applied to the study of environmental risks in a variety of sectors, especially in the fields of soil decontamination and the evaluation of the risks due to air pollution. Geo-statistics offers a rigorous stochastic modeling approach that makes it possible to answer questions expressed in terms of uncertainty and risk. This article focuses on nonlinear geo-statistical methods, based on the Gaussian random function model, whose essential properties are summarised. We use two examples to characterize situations where direct and thus rapid methods provide appropriate solutions, and cases that inevitably require more laborious simulation techniques. Exposure of the population of the Rouen metropolitan area to the risk of NO2 pollution is assessed by simulations, but the surface area where the pollution exceeds the threshold limit can be easily estimated with nonlinear conditional expectation techniques. A second example is used to discuss the bias introduced by direct simulation, here of a percentile of daily SO2 concentration for one year in the city of Le Havre; an operational solution is proposed. (authors)

  14. Fast and accurate phylogenetic reconstruction from high-resolution whole-genome data and a novel robustness estimator.

    Science.gov (United States)

    Lin, Y; Rajan, V; Moret, B M E

    2011-09-01

    The rapid accumulation of whole-genome data has renewed interest in the study of genomic rearrangements. Comparative genomics, evolutionary biology, and cancer research all require models and algorithms to elucidate the mechanisms, history, and consequences of these rearrangements. However, even simple models lead to NP-hard problems, particularly in the area of phylogenetic analysis. Current approaches are limited to small collections of genomes and low-resolution data (typically a few hundred syntenic blocks). Moreover, whereas phylogenetic analyses from sequence data are deemed incomplete unless bootstrapping scores (a measure of confidence) are given for each tree edge, no equivalent to bootstrapping exists for rearrangement-based phylogenetic analysis. We describe a fast and accurate algorithm for rearrangement analysis that scales up, in both time and accuracy, to modern high-resolution genomic data. We also describe a novel approach to estimate the robustness of results - an equivalent to the bootstrapping analysis used in sequence-based phylogenetic reconstruction. We present the results of extensive testing on both simulated and real data showing that our algorithm returns very accurate results, while scaling linearly with the size of the genomes and cubically with their number. We also present extensive experimental results showing that our approach to robustness testing provides excellent estimates of confidence, which, moreover, can be tuned to trade off thresholds between false positives and false negatives. Together, these two novel approaches enable us to attack heretofore intractable problems, such as phylogenetic inference for high-resolution vertebrate genomes, as we demonstrate on a set of six vertebrate genomes with 8,380 syntenic blocks. A copy of the software is available on demand.

  15. Spatial distribution of Munida intermedia and M. sarsi (crustacea: Anomura) on the Galician continental shelf (NW Spain): Application of geostatistical analysis

    Science.gov (United States)

    Freire, J.; González-Gurriarán, E.; Olaso, I.

    1992-12-01

    Geostatistical methodology was used to analyse the spatial structure and distribution of the epibenthic crustaceans Munida intermedia and M. sarsi within sets of data collected during three survey cruises carried out on the Galician continental shelf (1983 and 1984). This study investigates the feasibility of using geostatistics for data collected according to traditional methods and of enhancing such methodology. The experimental variograms were calculated (pooled variance minus spatial covariance between samples taken one pair at a time vs. distance) and fitted to a 'spherical' model. The spatial structure model was used to estimate the abundance and distribution of the populations studied using the technique of kriging. The species display spatial structures which are well marked during high density periods and in some areas (especially the northern shelf). Geostatistical analysis allows identification of the density gradients in space as well as the patch grain along the continental shelf, of 16-25 km diameter for M. intermedia and 12-20 km for M. sarsi. Patches of both species have a consistent location throughout the different cruises. As in other geographical areas, M. intermedia and M. sarsi usually appear at depths ranging from 200 to 500 m, with the highest densities in the continental shelf area located between Fisterra and Estaca de Bares. Although sampling was not originally designed specifically for geostatistics, this analysis provides a measurement of spatial covariance, and shows variograms with variable structure depending on population density and geographical area. These ideas are useful in improving the design of future sampling cruises.
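
    A sketch of fitting a spherical variogram model to empirical semivariogram points by non-linear least squares, the model-fitting step mentioned above. The (lag, gamma) pairs and starting values are synthetic stand-ins for the survey data.

    ```python
    # Spherical variogram model fit with non-linear least squares (synthetic points).
    import numpy as np
    from scipy.optimize import curve_fit

    def spherical(h, nugget, sill, a):
        h = np.asarray(h, float)
        g = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h < a, g, nugget + sill)   # flat beyond the range a

    lags = np.array([2, 5, 8, 12, 16, 20, 25, 30], float)        # lag distances (km)
    gamma_emp = np.array([0.4, 0.9, 1.3, 1.7, 1.9, 2.0, 2.05, 2.0])

    (p_nugget, p_sill, p_range), _ = curve_fit(
        spherical, lags, gamma_emp, p0=[0.2, 1.8, 18.0],
        bounds=([0, 0, 1], [1, 5, 60]))
    print(f"nugget {p_nugget:.2f}, partial sill {p_sill:.2f}, range {p_range:.1f} km")
    ```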

  16. Biomass estimation with high resolution satellite images: A case study of Quercus rotundifolia

    Science.gov (United States)

    Sousa, Adélia M. O.; Gonçalves, Ana Cristina; Mesquita, Paulo; Marques da Silva, José R.

    2015-03-01

    Forest biomass has had a growing importance in the world economy as a global strategic reserve, due to applications in bioenergy, bioproduct development and issues related to reducing greenhouse gas emissions. Current techniques used for forest inventory are usually time consuming and expensive. Thus, there is an urgent need to develop reliable, low-cost methods that can be used for forest biomass estimation and monitoring. This study uses new techniques to process high spatial resolution satellite images (0.70 m) in order to assess and monitor forest biomass. A multi-resolution segmentation method and object-oriented classification are used to obtain the area of tree canopy horizontal projection for Quercus rotundifolia. Forest inventory allows for calculation of tree and canopy horizontal projection and biomass, the latter with allometric functions. The two data sets are used to develop linear functions to assess above-ground biomass, with crown horizontal projection as the independent variable. The functions for the cumulative values, both for inventory and satellite data, with a prediction error equal to or smaller than that of the Portuguese national forest inventory (7%), correspond to stand areas of 0.5 ha, which include most of the Q. rotundifolia stands.

  17. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    International Nuclear Information System (INIS)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-01-01

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into a new locations by using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon season (November - February) of 1975 until 2008. This study used the combination of geostatistics method (variance-reduction method) and simulated annealing as the algorithm of optimization during the redesigned proses. The result shows that the new rain gauge location provides minimum value of estimated variance. This shows that the combination of geostatistics method (variance-reduction method) and simulated annealing is successful in the development of the new optimum rain gauge system

  18. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Science.gov (United States)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-02-01

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations by using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November - February) of 1975 until 2008. This study used the combination of a geostatistical method (variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The result shows that the new rain gauge locations provide the minimum value of estimated variance. This shows that the combination of the geostatistical method (variance-reduction method) and simulated annealing is successful in the development of a new optimum rain gauge system.

  19. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, Mohd Khairul Bazli Mohd, E-mail: mkbazli@yahoo.com [Centre of Preparatory and General Studies, TATI University College, 24000 Kemaman, Terengganu, Malaysia and Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusof, Fadhilah, E-mail: fadhilahy@utm.my [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Daud, Zalina Mohd, E-mail: zalina@ic.utm.my [UTM Razak School of Engineering and Advanced Technology, Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia); Yusop, Zulkifli, E-mail: zulyusop@utm.my [Institute of Environmental and Water Resource Management (IPASA), Faculty of Civil Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Kasno, Mohammad Afif, E-mail: mafifkasno@gmail.com [Malaysia - Japan International Institute of Technology (MJIIT), Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia)

    2015-02-03

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations by using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November - February) of 1975 until 2008. This study used the combination of a geostatistical method (variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The result shows that the new rain gauge locations provide the minimum value of estimated variance. This shows that the combination of the geostatistical method (variance-reduction method) and simulated annealing is successful in the development of a new optimum rain gauge system.

  20. Geostatistics and GIS: tools for characterizing environmental contamination.

    Science.gov (United States)

    Henshaw, Shannon L; Curriero, Frank C; Shields, Timothy M; Glass, Gregory E; Strickland, Paul T; Breysse, Patrick N

    2004-08-01

    Geostatistics is a set of statistical techniques used in the analysis of georeferenced data that can be applied to environmental contamination and remediation studies. In this study, the 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) contamination at a Superfund site in western Maryland is evaluated. Concern about the site and its future clean-up has triggered interest within the community, because residential development surrounds the area. Spatial statistical methods, of which geostatistics is a subset, are becoming increasingly popular, in part due to the availability of geographic information system (GIS) software in a variety of application packages. In this article, the joint use of ArcGIS software and the R statistical computing environment is demonstrated as an approach for comprehensive geostatistical analyses. The spatial regression method, kriging, is used to provide predictions of DDE levels at unsampled locations both within the site and in the surrounding areas where residential development is ongoing.

  1. Estimating spatially distributed turbulent heat fluxes from high-resolution thermal imagery acquired with a UAV system.

    Science.gov (United States)

    Brenner, Claire; Thiem, Christina Elisabeth; Wizemann, Hans-Dieter; Bernhardt, Matthias; Schulz, Karsten

    2017-05-19

    In this study, high-resolution thermal imagery acquired with a small unmanned aerial vehicle (UAV) is used to map evapotranspiration (ET) at a grassland site in Luxembourg. The land surface temperature (LST) information from the thermal imagery is the key input to a one-source and two-source energy balance model. While the one-source model treats the surface as a single uniform layer, the two-source model partitions the surface temperature and fluxes into soil and vegetation components. It thus explicitly accounts for the different contributions of both components to surface temperature as well as turbulent flux exchange with the atmosphere. Contrary to the two-source model, the one-source model requires an empirical adjustment parameter in order to account for the effect of the two components. Turbulent heat flux estimates of both modelling approaches are compared to eddy covariance (EC) measurements using the high-resolution input imagery UAVs provide. In this comparison, the effect of different methods for energy balance closure of the EC data on the agreement between modelled and measured fluxes is also analysed. Additionally, the sensitivity of the one-source model to the derivation of the empirical adjustment parameter is tested. Due to the very dry and hot conditions during the experiment, pronounced thermal patterns developed over the grassland site. These patterns result in spatially variable turbulent heat fluxes. The model comparison indicates that both models are able to derive ET estimates that compare well with EC measurements under these conditions. However, the two-source model, with a more complex treatment of the energy and surface temperature partitioning between the soil and vegetation, outperformed the simpler one-source model in estimating sensible and latent heat fluxes. This is consistent with findings from prior studies. For the one-source model, a time-variant expression of the adjustment parameter (to account for the difference between
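
    A one-source energy-balance sketch for a single pixel: sensible heat from a bulk-transfer formula using the UAV-derived surface temperature, and latent heat (the evapotranspiration term) as the residual of the balance. All input values, including the aerodynamic resistance and the soil-heat-flux fraction, are illustrative assumptions.

    ```python
    # Single-pixel one-source energy balance: H from bulk transfer, LE as residual.
    rho_air = 1.15      # air density (kg m-3)
    cp = 1005.0         # specific heat of air (J kg-1 K-1)
    r_ah = 45.0         # aerodynamic resistance to heat transport (s m-1), assumed

    T_surface = 306.0   # radiometric surface temperature from thermal imagery (K)
    T_air = 298.0       # air temperature (K)
    Rn = 520.0          # net radiation (W m-2)
    G = 0.1 * Rn        # soil heat flux, simple fraction-of-Rn assumption

    H = rho_air * cp * (T_surface - T_air) / r_ah     # sensible heat flux (W m-2)
    LE = Rn - G - H                                   # latent heat flux as the residual
    print(f"H = {H:.0f} W m-2, LE = {LE:.0f} W m-2")
    ```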

  2. Monte Carlo full-waveform inversion of crosshole GPR data using multiple-point geostatistical a priori information

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2012-01-01

    We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear...... inverse problems, such as full-waveform inversion. Sequential Gibbs sampling is a method that allows efficient sampling of a priori probability densities described by geostatistical algorithms based on either two-point (e.g., Gaussian) or multiple-point statistics. We outline the theoretical framework...... (2) Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional...

  3. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling

    Directory of Open Access Journals (Sweden)

    Laura Grisotto

    2016-04-01

    Full Text Available In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling, with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling location intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater differences in predicted standard deviations in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling.

  4. Geostatistical enhancement of european hydrological predictions

    Science.gov (United States)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP deals with the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream network, which has attracted increasing scientific attention in the last decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations, and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a

  5. Characterisation and geostatistical analysis of clay rocks in underground facilities using hyper-spectral images

    International Nuclear Information System (INIS)

    Becker, J.K.; Marschall, P.; Brunner, P.; Cholet, C.; Renard, P.; Buckley, S.; Kurz, T.

    2012-01-01

    , and are readily available as spectral libraries for use in software processing packages. Since rocks are composites of minerals, their spectra represent a mixture of spectra of the constituent minerals concerning the reflectance. In general, imaging spectrometry allows a semi-quantitative analysis of mineral abundances from rock spectra, for example by analysing the intensity of absorption bands. In many cases a mineral with a unique absorption signature can be correlated to a specific lithological unit, which can be used to trace and map the lithology. Additionally, abundance and spatial variation can be determined from the rock spectra. Common reflection features in sedimentary rocks are typically related to carbonate and clay minerals, hydroxyl, water or iron-bearing material and weathering products. A number of physical properties can influence the intensity of features in the spectral curves of minerals and rocks, such as particle size, angle of incidence, porosity and surface roughness, though the wavelength positions of the absorption features are not changed. Next to the obvious ability to use the hyper-spectral images to 'visually' correlate layers within a rock over a certain distance they can also be used for a more rigorous approach of geostatistical correlation. We have developed a work flow for this approach using the hyper-spectral image classifications: 1. In a first step, image reconstruction must be performed. During the scanning and possibly also later during classification, some areas of the hyper-spectral images may not be completely usable or some pixels may not have been classified. In this case, the 'holes' should be filled using multiple-point geostatistical techniques. 2. In the present example, images at three different resolutions have been taken. It is envisaged to use the high resolution images and simulate the high resolution over the entire rock face in a way that the high resolution simulations are guided by the low resolution images

  6. Geostatistical regularization operators for geophysical inverse problems on irregular meshes

    Science.gov (United States)

    Jordi, C.; Doetsch, J.; Günther, T.; Schmelzbach, C.; Robertsson, J. O. A.

    2018-05-01

    Irregular meshes make it possible to include complicated subsurface structures in geophysical modelling and inverse problems. The non-uniqueness of these inverse problems requires appropriate regularization that can incorporate a priori information. However, defining regularization operators for irregular discretizations is not trivial. Different schemes for calculating smoothness operators on irregular meshes have been proposed. In contrast to classical regularization constraints that are defined using only the nearest neighbours of a cell, geostatistical operators include a larger neighbourhood around a particular cell. A correlation model defines the extent of the neighbourhood and allows information about geological structures to be incorporated. We propose an approach to calculate geostatistical operators for inverse problems on irregular meshes by eigendecomposition of a covariance matrix that contains the a priori geological information. Using our approach, the calculation of the operator matrix becomes tractable for 3-D inverse problems on irregular meshes. We tested the performance of the geostatistical regularization operators and compared them against the results of anisotropic smoothing in inversions of 2-D synthetic surface electrical resistivity tomography (ERT) data as well as in the inversion of a realistic 3-D cross-well synthetic ERT scenario. The inversions of 2-D ERT and seismic traveltime field data with geostatistical regularization provide results that are in good accordance with the expected geology and thus facilitate their interpretation. In particular, for layered structures the geostatistical regularization provides geologically more plausible results than the anisotropic smoothness constraints.
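
    A minimal sketch of the general idea described above is given below: a covariance matrix is built from a correlation model over irregular-mesh cell centroids, and a regularization operator is obtained from its eigendecomposition. The exponential correlation model, the C^{-1/2} construction and the toy centroids are assumptions made for illustration; the paper's exact recipe may differ.

```python
import numpy as np

def exponential_covariance(coords, corr_length, sill=1.0):
    """Covariance matrix from an exponential correlation model over
    irregular-mesh cell centroids (n_cells x n_dims array)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / corr_length)

def geostatistical_operator(coords, corr_length, eps=1e-8):
    """Regularization operator W with W^T W ~ C^{-1}, built by
    eigendecomposition of the covariance matrix (one common construction)."""
    C = exponential_covariance(coords, corr_length)
    eigval, eigvec = np.linalg.eigh(C)          # C is symmetric
    eigval = np.clip(eigval, eps, None)         # guard against tiny eigenvalues
    return eigvec @ np.diag(eigval ** -0.5) @ eigvec.T   # C^{-1/2}

# Toy "irregular mesh": random 2-D cell centroids.
centroids = np.random.default_rng(1).uniform(0, 100, size=(50, 2))
W = geostatistical_operator(centroids, corr_length=20.0)
print(W.shape)  # (50, 50) operator usable as a model-space constraint matrix
```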

  7. Air-sea exchange over Black Sea estimated from high resolution regional climate simulations

    Science.gov (United States)

    Velea, Liliana; Bojariu, Roxana; Cica, Roxana

    2013-04-01

    The Black Sea is an important influencing factor for the climate of the bordering countries, showing cyclogenetic activity (Trigo et al, 1999) and influencing Mediterranean cyclones passing over. As for other seas, standard observations of the atmosphere are limited in time and space, and available observation-based estimates of air-sea exchange terms present quite large ranges of uncertainty. The reanalysis datasets (e.g. ERA produced by ECMWF) provide promising validation estimates of climatic characteristics against those in available climatic data (Schrum et al, 2001), while they cannot reproduce some local features due to their relatively coarse horizontal resolution. Detailed and realistic information on smaller-scale processes is expected to be provided by regional climate models, owing to continuous improvements in physical parameterizations and numerical solutions that afford simulations at high spatial resolution. The aim of the study is to assess the potential of three regional climate models in reproducing known climatological characteristics of air-sea exchange over the Black Sea, as well as to explore the added value of the models compared to the input (reanalysis) data. We employ results of long-term (1961-2000) simulations performed within the ENSEMBLES project (http://ensemblesrt3.dmi.dk/) using the models ETHZ-CLM, CNRM-ALADIN and METO-HadCM, for which the integration domain covers the whole area of interest. The analysis is performed for the entire basin for several variables entering the heat and water budget terms and available as direct output from the models, at seasonal and annual scale. A comparison with independent data (ERA-INTERIM) and findings from other studies (e.g. Schrum et al, 2001) is also presented. References: Schrum, C., Staneva, J., Stanev, E. and Ozsoy, E., 2001: Air-sea exchange in the Black Sea estimated from atmospheric analysis for the period 1979-1993, J. Marine Systems, 31, 3-19 Trigo, I. F., T. D. Davies, and G. R. Bigg (1999): Objective

  8. The Fire INventory from NCAR (FINN): a high resolution global model to estimate the emissions from open burning

    Directory of Open Access Journals (Sweden)

    C. Wiedinmyer

    2011-07-01

    The Fire INventory from NCAR version 1.0 (FINNv1) provides daily, 1 km resolution, global estimates of the trace gas and particle emissions from open burning of biomass, which includes wildfires, agricultural fires, and prescribed burning, but does not include biofuel use and trash burning. Emission factors used in the calculations have been updated with recent data, particularly for the non-methane organic compounds (NMOC). The resulting global annual NMOC emission estimates are as much as a factor of 5 greater than some prior estimates. Chemical speciation profiles, necessary to allocate the total NMOC emission estimates to lumped species for use by chemical transport models, are provided for three widely used chemical mechanisms: SAPRC99, GEOS-CHEM, and MOZART-4. Using these profiles, FINNv1 also provides global estimates of key organic compounds, including formaldehyde and methanol. Uncertainties in the emission estimates arise from several of the method steps. The use of fire hot spots, assumed area burned, land cover maps, biomass consumption estimates, and emission factors all introduce error into the model estimates. The uncertainty in the FINNv1 emission estimates is about a factor of two; however, the global estimates agree reasonably well with other global inventories of biomass burning emissions for CO, CO2, and other species with less variable emission factors. FINNv1 emission estimates have been developed specifically for modeling atmospheric chemistry and air quality in a consistent framework at scales from local to global. The product is unique because of its high temporal and spatial resolution, global coverage, and the number of species estimated. FINNv1 can be used for both hindcast and forecast or near-real-time model applications, and the results are being critically evaluated with models and observations whenever possible.
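
    The model steps listed above (area burned, land cover, biomass consumption and emission factors) combine in the generic bottom-up emission equation used by fire inventories of this kind. The sketch below illustrates that arithmetic only; the function name and the numeric values are placeholders, not FINNv1 parameters.

```python
def fire_emission_kg(area_burned_m2, biomass_loading_kg_m2,
                     fraction_burned, emission_factor_g_per_kg):
    """Open-burning emission estimate in the generic bottom-up form:
        E = area burned x fuel loading x combustion fraction x emission factor
    All inputs below are illustrative placeholders."""
    dry_matter_kg = area_burned_m2 * biomass_loading_kg_m2 * fraction_burned
    return dry_matter_kg * emission_factor_g_per_kg / 1000.0  # g -> kg

# One hypothetical 1 km^2 savanna fire pixel with a CO emission factor of ~65 g/kg.
print(fire_emission_kg(1e6, 0.5, 0.8, 65.0), "kg CO")
```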

  9. Geostatistical modeling of groundwater properties and assessment of their uncertainties

    International Nuclear Information System (INIS)

    Honda, Makoto; Yamamoto, Shinya; Sakurai, Hideyuki; Suzuki, Makoto; Sanada, Hiroyuki; Matsui, Hiroya; Sugita, Yutaka

    2010-01-01

    The distribution of groundwater properties is important for understanding deep underground hydrogeological environments. This paper proposes a geostatistical system for modeling groundwater properties which correlate with the ground resistivity data obtained from widespread and exhaustive surveys. That is, a methodology for the integration of resistivity data measured by various methods and a methodology for modeling the groundwater properties using the integrated resistivity data have been developed. The proposed system has also been validated using the data obtained in the Horonobe Underground Research Laboratory project. Additionally, quantification of the uncertainties in the estimated model has been attempted by numerical simulations based on the data. As a result, the uncertainties of the proposed model were estimated to be lower than those of traditional models. (author)

  10. Empirical Estimation of Total Nitrogen and Total Phosphorus Concentration of Urban Water Bodies in China Using High Resolution IKONOS Multispectral Imagery

    Directory of Open Access Journals (Sweden)

    Jiaming Liu

    2015-11-01

    Measuring total nitrogen (TN) and total phosphorus (TP) is important in managing heavily polluted urban waters in China. This study uses high spatial resolution IKONOS imagery with four multispectral bands, which roughly correspond to Landsat/TM bands 1-4, to determine TN and TP in small urban rivers and lakes in China. Using Lake Cihu and the lower reaches of the Wen-Rui Tang (WRT) River as examples, this paper develops both multiple linear regression (MLR) and artificial neural network (ANN) models to estimate TN and TP concentrations from high spatial resolution remote sensing imagery and in situ water samples collected concurrently with the satellite overpass. The measured and estimated values of both MLR and ANN models are in good agreement (R² > 0.85 and RMSE < 2.50). The empirical equations selected by MLR are more straightforward, whereas the estimation accuracy of the ANN model is better (R² > 0.86 and RMSE < 0.89). Results validate the potential of using high resolution IKONOS multispectral imagery to study the chemical state of small urban water bodies. The spatial distribution maps of TN and TP concentrations generated by the ANN model can inform decision makers of variations in water quality in Lake Cihu and the lower reaches of the WRT River. The approaches and equations developed in this study could be applied to other urban water bodies for water quality monitoring.
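
    A minimal sketch of the MLR step described above is given below, fitting a linear model between band reflectances and a water-quality variable and reporting R² and RMSE. The synthetic band values, the TN relationship and the scikit-learn workflow are assumptions for illustration; they are not the Lake Cihu / WRT River data or the paper's selected equations.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical training set: IKONOS band reflectances (B1-B4) for pixels
# matched to in situ samples, and measured TN concentration (mg/L).
rng = np.random.default_rng(42)
bands = rng.uniform(0.02, 0.25, size=(40, 4))          # B1..B4
tn = 5.0 + 12.0 * bands[:, 2] - 8.0 * bands[:, 0] + rng.normal(0, 0.3, 40)

mlr = LinearRegression().fit(bands, tn)                # multiple linear regression
tn_hat = mlr.predict(bands)
print("R2 =", round(r2_score(tn, tn_hat), 3),
      "RMSE =", round(mean_squared_error(tn, tn_hat) ** 0.5, 3))
```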

  11. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    Science.gov (United States)

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to the diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted by Matérn models using weighted least squares and maximum likelihood estimation methods as a way to detect eventual discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps show clearly the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring if a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
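
    The sketch below illustrates the weighted-least-squares variogram fitting step in its simplest form, using the exponential model (the Matérn model with smoothness 0.5) and weighting each lag by its number of point pairs. The lag distances, semivariances, pair counts and weighting scheme are placeholders chosen for the example, not the AUV salinity data or the paper's fitted parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_variogram(h, nugget, psill, rng_):
    """Exponential variogram, i.e. the Matern model with smoothness 0.5."""
    return nugget + psill * (1.0 - np.exp(-h / rng_))

# Hypothetical sample variogram: lag distances (m), semivariances and the
# number of point pairs per lag.
lags   = np.array([ 10.,  25.,  50., 100., 200., 400., 800.])
gamma  = np.array([0.05, 0.12, 0.22, 0.35, 0.45, 0.50, 0.52])
npairs = np.array([ 120,  300,  650, 1100, 1500, 1800, 2000])

# Weighted least squares: lags with more pairs get more weight.
params, _ = curve_fit(exp_variogram, lags, gamma,
                      p0=[0.01, 0.5, 100.0],
                      sigma=1.0 / np.sqrt(npairs), maxfev=10000)
print("nugget, partial sill, range:", np.round(params, 3))
```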

  12. Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model

    Science.gov (United States)

    Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef

    2016-10-01

    We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many and which training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate-of-change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.

  13. Application of Geostatistics to the resolution of structural problems in homogeneous rocky massifs

    International Nuclear Information System (INIS)

    Lucero Michaut, H.N.

    1985-01-01

    The nature and possibilities of application of intrinsic functions to structural research and the delimitation of the areas of influence in an ore deposit are briefly described. The main models to which the different distributions may be assimilated are presented: 'logarithmic' and 'linear' among those with no sill value, and 'spherical', 'exponential' and 'Gaussian' among those having a sill level, which allows the establishment of a range value that separates the field of independent samples from that of non-independent ones. Thereafter, as an original contribution to applied geostatistics, the author postulates 1) the application of the 'fracturing rank' as a regionalized variable, after verifying its validity through strict probabilistic methodologies, and 2) a methodological extension of the conventional criterion of 'rock quality designation' to the analysis of the quality and degree of structural discontinuity in the rock surface. Finally, some examples are given of these applications. (M.E.L.)
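
    For reference, the bounded and unbounded variogram model families mentioned above can be written out directly; the sketch below gives standard textbook forms of the spherical, exponential, Gaussian and linear models. The parameter values and the use of the "practical range" convention for the exponential and Gaussian models are illustrative assumptions, not taken from the record.

```python
import numpy as np

def spherical(h, sill, a):
    """Bounded model: reaches the sill at range a."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def exponential(h, sill, a):
    """Bounded model approaching the sill asymptotically (practical range a)."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, float) / a))

def gaussian(h, sill, a):
    """Bounded model with parabolic behaviour near the origin."""
    return sill * (1.0 - np.exp(-3.0 * (np.asarray(h, float) / a) ** 2))

def linear(h, slope):
    """Unbounded model: no sill, semivariance keeps growing with lag."""
    return slope * np.asarray(h, dtype=float)

h = np.linspace(0, 150, 7)
print(np.round(spherical(h, sill=1.0, a=100.0), 2))
print(np.round(linear(h, slope=0.01), 2))
```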

  14. Evaluating the effect of sampling and spatial correlation on ground-water travel time uncertainty coupling geostatistical, stochastic, and first order, second moment methods

    International Nuclear Information System (INIS)

    Andrews, R.W.; LaVenue, A.M.; McNeish, J.A.

    1989-01-01

    Ground-water travel time predictions at potential high-level waste repositories are subject to a degree of uncertainty due to the scale of averaging incorporated in conceptual models of the ground-water flow regime as well as the lack of data on the spatial variability of the hydrogeologic parameters. The present study describes the effect of limited observations of a spatially correlated permeability field on the predicted ground-water travel time uncertainty. Varying permeability correlation lengths have been used to investigate the importance of this geostatistical property on the tails of the travel time distribution. This study uses both geostatistical and differential analysis techniques. Following the generation of a spatially correlated permeability field which is taken to represent reality, semivariogram analyses are performed upon small random subsets of the generated field to determine the geostatistical properties of the field represented by the observations. Kriging is then employed to generate a kriged permeability field and the corresponding standard deviation of the estimated field conditioned by the limited observations. Using both the real and kriged fields, the ground-water flow regime is simulated and ground-water travel paths and travel times are determined for various starting points. These results are used to define the ground-water travel time uncertainty due to path variability. The variance of the ground-water travel time along particular paths due to the variance of the permeability field estimated using kriging is then calculated using the first order, second moment method. The uncertainties in predicted travel time due to path and parameter uncertainties are then combined into a single distribution.

  15. Derivation and analysis of a high-resolution estimate of global permafrost zonation

    Directory of Open Access Journals (Sweden)

    S. Gruber

    2012-02-01

    Permafrost underlies much of Earth's surface and interacts with climate, ecosystems and human systems. It is a complex phenomenon controlled by climate and (sub-)surface properties and reacts to change with variable delay. Heterogeneity and sparse data challenge the modeling of its spatial distribution. Currently, there is no data set that adequately informs global studies of permafrost. The available data set for the Northern Hemisphere is frequently used for model evaluation, but its quality and consistency are difficult to assess. Here, a global model of permafrost extent and a dataset of permafrost zonation are presented and discussed, extending earlier studies by including the Southern Hemisphere, by consistent data and methods, and by attention to uncertainty and scaling. Established relationships between air temperature and the occurrence of permafrost are re-formulated into a model that is parametrized using published estimates. It is run with high-resolution (<1 km) global elevation data and air temperatures based on the NCAR-NCEP reanalysis and CRU TS 2.0. The resulting data provide more spatial detail and a consistent extrapolation to remote regions, while aggregated values resemble previous studies. The estimated uncertainties affect regional patterns and aggregate numbers, and provide interesting insight. The permafrost area, i.e. the actual surface area underlain by permafrost, north of 60° S is estimated to be 13–18 × 10⁶ km² or 9–14 % of the exposed land surface. The global permafrost area including Antarctic and sub-sea permafrost is estimated to be 16–21 × 10⁶ km². The global permafrost region, i.e. the exposed land surface below which some permafrost can be expected, is estimated to be 22 ± 3 × 10⁶ km². A large proportion of this exhibits considerable topography and spatially-discontinuous permafrost, underscoring the importance of attention to scaling issues.

  16. Estimation of mean tree stand volume using high-resolution aerial RGB imagery and digital surface model, obtained from sUAV and Trestima mobile application

    Directory of Open Access Journals (Sweden)

    G. K. Rybakov

    2017-06-01

    This study considers a remote sensing technique for mean volume estimation based on very high-resolution (VHR) aerial RGB imagery obtained using a small-sized unmanned aerial vehicle (sUAV) and a high-resolution photogrammetric digital surface model (DSM), as well as an innovative technology for field measurements (Trestima). The study area covers approx. 220 ha of forestland in Finland. The work concerns the entire process from remote sensing and field data acquisition to statistical analysis and wall-to-wall forest volume mapping. The study showed that the VHR aerial imagery and the high-resolution DSM produced with the sUAV have good prospects for forest inventory. For the sUAV-based estimation of forest variables such as height, basal area and mean volume, the root mean square error was 6.6 %, 22.6 % and 26.7 %, respectively. Application of Trestima for estimation of the mean volume of the standing forest showed only minor differences from the existing Forest Management Plan in all the selected forest compartments. Simultaneously, the results of the study confirmed that the technologies and tools applied in this work could be a reliable and potentially cost-effective means of forest data acquisition with high potential for operational use.

  17. Combining photorealistic immersive geovisualization and high-resolution geospatial data to enhance human-scale viewshed modelling

    Science.gov (United States)

    Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Viewshed modelling, the process of defining, parsing and analysing the structure of landscape visual space within GIS, has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether and to what extent these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human-scale level requires incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., land cover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant error in predicting visual attributes. This poster illustrates how photorealistic Immersive Virtual Environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human-scale level. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution landcover (0.5 m) derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation of visual characteristics were selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GIGA PAN robot. We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random representation of selected environments via a head-mounted display (Oculus Rift CV1), and

  18. 3D vadose zone modeling using geostatistical inferences

    International Nuclear Information System (INIS)

    Knutson, C.F.; Lee, C.B.

    1991-01-01

    In developing a 3D model of the 600 ft thick interbedded basalt and sediment complex that constitutes the vadose zone at the Radioactive Waste Management Complex (RWMC) at the Idaho National Engineering Laboratory (INEL), geostatistical data were captured for 12-15 parameters (e.g. permeability, porosity, saturation, etc. and flow height, flow width, flow internal zonation, etc.). This two-scale data set was generated from studies of subsurface core and geophysical log suites at RWMC and from surface outcrop exposures located at the Box Canyon of the Big Lost River and at Hell's Half Acre lava field, all located in the general RWMC area. Based on these currently available data, it is possible to build a 3D stochastic model that utilizes: cumulative distribution functions obtained from the geostatistical data; backstripping and rebuilding of stratigraphic units; and an 'expert' system that incorporates rules based on expert geologic analysis and experimentally derived geostatistics, providing: (a) a structural and isopach map of each layer, (b) a realization of the flow geometry of each basalt flow unit, and (c) a realization of the internal flow parameters (e.g. permeability, porosity, and saturation) for each flow. 10 refs., 4 figs., 1 tab

  19. High-Resolution Spatial Distribution and Estimation of Access to Improved Sanitation in Kenya.

    Science.gov (United States)

    Jia, Peng; Anderson, John D; Leitner, Michael; Rheingans, Richard

    2016-01-01

    Access to sanitation facilities is imperative in reducing the risk of multiple adverse health outcomes. A distinct disparity in sanitation exists among different wealth levels in many low-income countries, which may hinder progress across each of the Millennium Development Goals. The surveyed households in 397 clusters from the 2008-2009 Kenya Demographic and Health Surveys were divided into five wealth quintiles based on their national asset scores. A series of spatial analysis methods including excess risk, local spatial autocorrelation, and spatial interpolation were applied to observe disparities in coverage of improved sanitation among different wealth categories. The total number of the population with improved sanitation was estimated by interpolating, time-adjusting, and multiplying the surveyed coverage rates by high-resolution population grids. A comparison was then made with the annual estimates from the United Nations Population Division and the World Health Organization/United Nations Children's Fund Joint Monitoring Program for Water Supply and Sanitation. The Empirical Bayesian Kriging interpolation produced minimal root mean squared error for all clusters and five quintiles while predicting the raw and spatial coverage rates of improved sanitation. The coverage in southern regions was generally higher than in the north and east, and the coverage in the south decreased from Nairobi in all directions, while Nyanza and North Eastern Province had relatively poor coverage. The general clustering trend of high and low sanitation improvement among surveyed clusters was confirmed after spatial smoothing. There exists an apparent disparity in sanitation among different wealth categories across Kenya, and spatially smoothed coverage rates resulted in a closer estimation of the available statistics than raw coverage rates. Future intervention activities need to be tailored for both different wealth categories and nationally where there are areas of greater needs when

  20. High-Resolution Spatial Distribution and Estimation of Access to Improved Sanitation in Kenya.

    Directory of Open Access Journals (Sweden)

    Peng Jia

    Access to sanitation facilities is imperative in reducing the risk of multiple adverse health outcomes. A distinct disparity in sanitation exists among different wealth levels in many low-income countries, which may hinder the progress across each of the Millennium Development Goals. The surveyed households in 397 clusters from 2008-2009 Kenya Demographic and Health Surveys were divided into five wealth quintiles based on their national asset scores. A series of spatial analysis methods including excess risk, local spatial autocorrelation, and spatial interpolation were applied to observe disparities in coverage of improved sanitation among different wealth categories. The total number of the population with improved sanitation was estimated by interpolating, time-adjusting, and multiplying the surveyed coverage rates by high-resolution population grids. A comparison was then made with the annual estimates from United Nations Population Division and World Health Organization/United Nations Children's Fund Joint Monitoring Program for Water Supply and Sanitation. The Empirical Bayesian Kriging interpolation produced minimal root mean squared error for all clusters and five quintiles while predicting the raw and spatial coverage rates of improved sanitation. The coverage in southern regions was generally higher than in the north and east, and the coverage in the south decreased from Nairobi in all directions, while Nyanza and North Eastern Province had relatively poor coverage. The general clustering trend of high and low sanitation improvement among surveyed clusters was confirmed after spatial smoothing. There exists an apparent disparity in sanitation among different wealth categories across Kenya and spatially smoothed coverage rates resulted in a closer estimation of the available statistics than raw coverage rates. Future intervention activities need to be tailored for both different wealth categories and nationally where there are areas of

  1. ESTIMATION OF STAND HEIGHT AND FOREST VOLUME USING HIGH RESOLUTION STEREO PHOTOGRAPHY AND FOREST TYPE MAP

    Directory of Open Access Journals (Sweden)

    K. M. Kim

    2016-06-01

    Traditional field methods for measuring tree heights are often too costly and time consuming. An alternative remote sensing approach is to measure tree heights from digital stereo photographs, which is more practical for forest managers and less expensive than LiDAR or synthetic aperture radar. This work proposes an estimation of stand height and forest volume (m³/ha) using a normalized digital surface model (nDSM) from high resolution stereo photography (25 cm resolution) and a forest type map. The study area was located in the Mt. Maehwa model forest in Hong Chun-Gun, South Korea. The forest type map has four attributes by stand: major species, age class, DBH class and crown density class. Overlapping aerial photos were taken in September 2013 and a digital surface model (DSM) was created by photogrammetric methods (aerial triangulation, digital image matching). Then, a digital terrain model (DTM) was created by filtering the DSM, and the DTM was subtracted from the DSM pixel by pixel, resulting in the nDSM, which represents object heights (buildings, trees, etc.). Two independent variables from the nDSM were used to estimate forest stand volume: crown density (%) and stand height (m). First, crown density was calculated using a canopy segmentation method considering live crown ratio. Next, stand height was produced by averaging individual tree heights in a stand using Esri's ArcGIS and the USDA Forest Service's FUSION software. Finally, stand volume was estimated and mapped using aerial photo stand volume equations by species, which have two independent variables, crown density and stand height. South Korea has a historical imagery archive which can show forest change over 40 years of successful forest rehabilitation. For a future study, a forest volume change map (1970s-present) will be produced using this stand volume estimation method and the historical imagery archive.
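
    The nDSM differencing and stand-height averaging steps described above are simple raster operations; the sketch below illustrates them with NumPy arrays. The clipping range, the minimum tree height and the toy 5x5 rasters are assumptions for the example, not values from the study.

```python
import numpy as np

def normalized_dsm(dsm, dtm, max_height=60.0):
    """Object heights from photogrammetric surface and terrain models:
    nDSM = DSM - DTM, clipped to a plausible tree-height range."""
    ndsm = np.asarray(dsm, float) - np.asarray(dtm, float)
    return np.clip(ndsm, 0.0, max_height)

def stand_height(ndsm, stand_mask, min_tree_height=2.0):
    """Mean height of canopy pixels inside one stand polygon (boolean mask)."""
    canopy = ndsm[stand_mask & (ndsm >= min_tree_height)]
    return float(canopy.mean()) if canopy.size else 0.0

# Toy 5x5 rasters; real DSM/DTM grids would come from the stereo photographs.
dsm = np.full((5, 5), 120.0) + np.random.default_rng(3).uniform(0, 20, (5, 5))
dtm = np.full((5, 5), 118.0)
mask = np.ones((5, 5), dtype=bool)
print("stand height (m):", round(stand_height(normalized_dsm(dsm, dtm), mask), 1))
```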

  2. High-Resolution Forest Canopy Height Estimation in an African Blue Carbon Ecosystem

    Science.gov (United States)

    Lagomasino, David; Fatoyinbo, Temilola; Lee, Seung-Kuk; Simard, Marc

    2015-01-01

    Mangrove forests are among the most productive and carbon-dense ecosystems and are found only in tidally inundated coastal areas. Forest canopy height is an important measure for modeling carbon and biomass dynamics, as well as land cover change. By taking advantage of the flat terrain and dense canopy cover, the present study derived digital surface models (DSMs) using stereophotogrammetric techniques on high-resolution spaceborne imagery (HRSI) for southern Mozambique. A mean-weighted ground surface elevation factor was subtracted from the HRSI DSM to accurately estimate the canopy height in mangrove forests in southern Mozambique. The mean and H100 tree heights measured both in the field and with the digital canopy model provided the most accurate results, with vertical errors of 1.18 and 1.84 m, respectively. Distinct patterns were identified in the HRSI canopy height map that could not be discerned from coarse Shuttle Radar Topography Mission canopy maps, even though the mode and distribution of canopy heights were similar over the same area. Through further investigation, HRSI DSMs have the potential of providing a new type of three-dimensional dataset that could serve as calibration/validation data for other DSMs generated from spaceborne datasets with much larger global coverage. HRSI DSMs could be used in lieu of lidar acquisitions for canopy height and forest biomass estimation, and be combined with passive optical data to improve land cover classifications.

  3. 2nd European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Soares, Amílcar; Froidevaux, Roland

    1999-01-01

    The Second European Conference on Geostatistics for Environmental Applications took place in Valencia, November 18-20, 1998. Two years have passed since the first meeting in Lisbon and the geostatistical community has kept active in the environmental field. In these days of congress inflation, we feel that continuity can only be achieved by ensuring quality in the papers. For this reason, all papers in the book have been reviewed by at least two referees, and care has been taken to ensure that the reviewer comments have been incorporated in the final version of the manuscript. We are thankful to the members of the scientific committee for their timely review of the scripts. All in all, there are three keynote papers from experts in soil science, climatology and ecology and 43 contributed papers providing a good indication of the status of geostatistics as applied in the environmental field all over the world. We feel now confident that the geoENV conference series, seeded around a coffee table almost six...

  4. Model validation and error estimation of tsunami runup using high resolution data in Sadeng Port, Gunungkidul, Yogyakarta

    Science.gov (United States)

    Basith, Abdul; Prakoso, Yudhono; Kongko, Widjo

    2017-07-01

    A tsunami model using high resolution geometric data is indispensable in tsunami mitigation efforts, especially in tsunami-prone areas. It is one of the factors that affect the accuracy of tsunami numerical modeling results. Sadeng Port is a new infrastructure on the southern coast of Java which could potentially be hit by a massive tsunami originating from the seismic gap. This paper discusses validation and error estimation of a tsunami model created using high resolution geometric data in Sadeng Port. The tsunami model validation uses the wave height of the 2006 Pangandaran tsunami recorded by the Sadeng tide gauge. The tsunami model is used for numerical modeling involving earthquake-tsunami parameters derived from the seismic gap. The validation results using the Student's t-test show that the tsunami heights from the model and the observations at the Sadeng tide gauge are statistically equal at the 95% confidence level; the RMSE and NRMSE are 0.428 m and 22.12%, while the difference in tsunami wave travel time is 12 minutes.
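
    The error metrics and significance test named above are standard and easy to reproduce; the sketch below shows one way to compute RMSE, a range-normalized NRMSE and a paired Student's t-test with SciPy. The sample values and the choice of normalizing by the observed range are illustrative assumptions, not the Sadeng tide-gauge record or the paper's exact NRMSE definition.

```python
import numpy as np
from scipy import stats

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nrmse(obs, sim):
    """RMSE normalized by the observed range, in percent."""
    return 100.0 * rmse(obs, sim) / (np.max(obs) - np.min(obs))

# Placeholder observed vs. modeled tsunami heights (m).
obs = np.array([0.9, 1.4, 2.1, 1.7, 1.1])
sim = np.array([1.0, 1.2, 2.4, 1.9, 0.8])

t_stat, p_value = stats.ttest_rel(sim, obs)   # paired Student's t-test
print(f"RMSE={rmse(obs, sim):.3f} m, NRMSE={nrmse(obs, sim):.1f}%, p={p_value:.2f}")
# p > 0.05 -> modeled and observed heights cannot be distinguished
# at the 95% confidence level.
```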

  5. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  6. Rapid non-destructive quantitative estimation of urania/ thoria in mixed thorium uranium di-oxide pellets by high-resolution gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Shriwastwa, B.B.; Kumar, Anil; Raghunath, B.; Nair, M.R.; Abani, M.C.; Ramachandran, R.; Majumdar, S.; Ghosh, J.K

    2001-06-01

    A non-destructive technique using high-resolution gamma-ray spectrometry has been standardised for quantitative estimation of uranium/thorium in mixed (ThO₂-UO₂) fuel pellets of varying composition. Four gamma energies were selected; two each from the uranium and thorium series and the time of counting has been optimised. This technique can be used for rapid estimation of U/Th percentage in a large number of mixed fuel pellets from a production campaign.

  7. Rapid non-destructive quantitative estimation of urania/ thoria in mixed thorium uranium di-oxide pellets by high-resolution gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Shriwastwa, B.B.; Kumar, Anil; Raghunath, B.; Nair, M.R.; Abani, M.C.; Ramachandran, R.; Majumdar, S.; Ghosh, J.K.

    2001-01-01

    A non-destructive technique using high-resolution gamma-ray spectrometry has been standardised for quantitative estimation of uranium/thorium in mixed (ThO₂-UO₂) fuel pellets of varying composition. Four gamma energies were selected; two each from the uranium and thorium series and the time of counting has been optimised. This technique can be used for rapid estimation of U/Th percentage in a large number of mixed fuel pellets from a production campaign.

  8. Comparison of the large-scale radon risk map for southern Belgium with results of high resolution surveys

    International Nuclear Information System (INIS)

    Zhu, H.-C.; Charlet, J.M.; Poffijn, A.

    2000-01-01

    A large-scale radon survey consisting of long-term measurements in about 5200 single-family houses in the southern part of Belgium was carried out from 1995 to 1999. A radon risk map for the region was produced using geostatistical and GIS approaches. Some communes or villages situated within high risk areas were chosen for detailed surveys. A high resolution radon survey with about 330 measurements was performed in half of the commune of Burg-Reuland. Comparison of radon maps at quite different scales shows that the general Rn risk map has a similar pattern to the radon map for the detailed study area. Another detailed radon survey, in the village of Hatrival, situated in a high radon area, found a very high proportion of houses with elevated radon concentrations. The results of this detailed survey are comparable to the expectation for high risk areas on the large-scale radon risk map. The good correspondence between the findings of the general risk map and the analysis of the limited detailed surveys suggests that the large-scale radon risk map is likely reliable. (author)

  9. Toward an estimation of daily european CO2 fluxes at high spatial resolution by inversion of atmospheric transport

    International Nuclear Information System (INIS)

    Carouge, C.

    2006-04-01

    distribution over Europe. To study the potential of this method, we used synthetic data generated from forward simulations of LMDZt (driven by flux fields generated from the biosphere model ORCHIDEE). We have found that the current network is not dense enough to constrain fluxes at model resolution. However, fluxes that are aggregated spatially over a region of 850 x 850 km in Western Europe and temporally over 8-10 days compare very well with the ORCHIDEE fluxes. Preliminary inversion results using real data indicate that synoptic variations of the estimated fluxes are in phase with the variations of the ORCHIDEE biosphere model flux and the variations observed in atmospheric concentrations. However, the quality of the flux estimates is highly dependent on transport model errors and, in particular, on the quality of modelling small-scale transport. Moreover, fossil fuel emissions are prescribed in this inverse model and the quality of their distribution is shown to be crucial. Data selection also has a large impact on the estimated fluxes. The use of daytime-only data to calculate daily averaged concentrations greatly improves the estimated fluxes by reducing the bias inferred from model transport errors. (author)

  10. Spatiotemporal estimation of air temperature patterns at the street level using high resolution satellite imagery.

    Science.gov (United States)

    Pelta, Ran; Chudnovsky, Alexandra A

    2017-02-01

    Although meteorological monitoring stations provide accurate measurements of Air Temperature (AT), their spatial coverage within a given region is limited and thus is often insufficient for exposure and epidemiological studies. In many applications, satellite imagery, which measures energy flux and is spatially continuous, is used to calculate Brightness Temperature (BT), which serves as an input parameter. Although the two quantities (AT-BT) are physically related, the correlation between them is not straightforward and varies daily due to parameters such as meteorological conditions, surface moisture, land use, satellite-surface geometry and others. In this paper we first investigate the relationship between AT and BT as measured by 39 meteorological stations in Israel during 1984-2015. Thereafter, we apply mixed regression models with daily random slopes to calibrate Landsat BT data with monitored AT measurements for the period 1984-2015. Results show that AT can be predicted with high accuracy using BT with high spatial resolution. The model shows relatively high accuracy in estimating AT (R² = 0.92, RMSE = 1.58 °C, slope = 0.90). Incorporating meteorological parameters into the model generates better accuracy (R² = 0.935) than the AT-BT model alone (R² = 0.92). Furthermore, based on the relatively high model accuracy, we investigated the spatial patterns of AT within the study domain. In the latter we focused on July-August, as these two months are characterized by relatively stable synoptic conditions in the study area. In addition, the temporal change in AT during the last 30 years was estimated and verified using available meteorological stations and two additional remote sensing platforms. Finally, the impact of different land coverage on AT was estimated, as an example of a future application of the presented approach.
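
    A minimal sketch of a mixed-effects calibration with a random slope per day, in the spirit of the AT-BT model described above, is given below using statsmodels. The synthetic station data, the random-number seed and the specific formula are assumptions for illustration; they are not the Israeli station records or the paper's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic station data: per-scene brightness temperature (BT) and measured
# air temperature (AT); placeholders only.
rng = np.random.default_rng(7)
days = np.repeat(np.arange(20), 15)                 # 20 scenes, 15 stations each
bt = rng.uniform(20, 45, size=days.size)
day_slope = rng.normal(0.9, 0.05, size=20)[days]    # slope varies by day
at = 2.0 + day_slope * bt + rng.normal(0, 1.0, size=days.size)
df = pd.DataFrame({"day": days, "BT": bt, "AT": at})

# Mixed-effects regression with a random intercept and slope per day.
model = smf.mixedlm("AT ~ BT", df, groups=df["day"], re_formula="~BT")
result = model.fit()
print(result.params[["Intercept", "BT"]])           # fixed-effect calibration
```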

  11. Vineyard Yield Estimation Based on the Analysis of High Resolution Images Obtained with Artificial Illumination at Night

    Directory of Open Access Journals (Sweden)

    Davinia Font

    2015-04-01

    This paper presents a method for vineyard yield estimation based on the analysis of high-resolution images obtained with artificial illumination at night. First, the paper assesses different pixel-based segmentation methods for detecting reddish grapes: threshold-based segmentation, Mahalanobis distance, a Bayesian classifier, linear color model segmentation and histogram segmentation, in order to obtain the best estimate of the area of the grape clusters under these illumination conditions. The color spaces tested were the original RGB and Hue-Saturation-Value (HSV). The best segmentation method in the case of a non-occluded reddish table-grape variety was threshold segmentation applied to the H layer, with an area estimation error of 13.55%, improved to 10.01% by morphological filtering. Secondly, after segmentation, two procedures for yield estimation based on a prior calibration procedure are proposed: (1) the number of pixels corresponding to a cluster of grapes is computed and converted directly into a yield estimate; and (2) the area of a cluster of grapes is converted into a volume by means of a solid of revolution, and this volume is converted into a yield estimate; the yield errors obtained were 16% and -17%, respectively.
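
    The sketch below illustrates the general pattern of Hue-channel thresholding followed by morphological filtering using OpenCV. The hue window, the saturation/value limits, the kernel size and the file name are assumptions for the example; they are not the thresholds or filters reported in the paper.

```python
import cv2
import numpy as np

def grape_mask_hsv(image_bgr, h_low=160, h_high=179):
    """Threshold the Hue channel to isolate reddish grape pixels in a
    night-time, artificially illuminated image, then clean the mask with
    morphological opening and closing."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (h_low, 40, 40), (h_high, 255, 255))
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small gaps
    return mask

# Usage: the segmented cluster area (pixels) is later converted to a yield
# estimate through a calibration step, as described above.
image = cv2.imread("vineyard_night.jpg")   # hypothetical file name
if image is not None:
    area_px = int(np.count_nonzero(grape_mask_hsv(image)))
    print("segmented grape area:", area_px, "pixels")
```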

  12. Geostatistical Investigations of Displacements on the Basis of Data from the Geodetic Monitoring of a Hydrotechnical Object

    Science.gov (United States)

    Namysłowska-Wilczyńska, Barbara; Wynalek, Janusz

    2017-12-01

    Geostatistical methods make the analysis of measurement data possible. This article addresses the use of geostatistics in the spatial analysis of displacements based on geodetic monitoring. Using methods of applied (spatial) statistics, the research deals with interesting and current issues connected to space-time analysis and the modeling of displacements and deformations, as applied to any large-area objects on which geodetic monitoring is conducted (e.g., water dams, urban areas in the vicinity of deep excavations, areas at a macro-regional scale subject to anthropogenic influences caused by mining, etc.). These problems are crucial, especially for safety assessment of important hydrotechnical constructions, as well as for modeling and estimating mining damage. Based on the geodetic monitoring data, a substantial body of empirical material was created, comprising many years of research results concerning displacements of controlled points situated on the crown and foreland of an exemplary earth dam, used to assess the behaviour and safety of the object during its whole operating period. A research method at a macro-regional scale was applied to investigate some phenomena connected with the operation of the analysed large hydrotechnical construction. Applying a semivariogram function enabled the spatial variability analysis of the displacements. Isotropic empirical semivariograms were calculated and then the theoretical parameters of analytical functions approximating the courses of this empirical variability measure were determined. Using ordinary (block) kriging at the grid nodes of an elementary spatial grid covering the analysed object, the estimated mean displacement values Z* were calculated, together with the accompanying uncertainty assessment, the standard deviation of estimation σk. Raster maps of the distribution of the estimated means Z* and raster maps of the estimation standard deviations σk (in perspective
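
    The sketch below illustrates the ordinary-kriging step in its simplest form, producing both the estimated means Z* and the estimation standard deviation σk on a regular grid. It assumes the open-source pykrige package, a spherical variogram model and synthetic displacement data; none of these are stated to be what the study actually used.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical controlled-point data: x, y positions (m) on the dam crown and
# foreland, and measured vertical displacements (mm). Placeholder values.
rng = np.random.default_rng(11)
x = rng.uniform(0, 500, 60)
y = rng.uniform(0, 120, 60)
z = 0.01 * x - 0.02 * y + rng.normal(0, 0.5, 60)

ok = OrdinaryKriging(x, y, z, variogram_model="spherical",
                     verbose=False, enable_plotting=False)
gridx = np.arange(0, 501, 25.0)
gridy = np.arange(0, 121, 10.0)
z_est, var_est = ok.execute("grid", gridx, gridy)   # kriged means and variances
sigma_k = np.sqrt(var_est)                          # estimation std. deviation
print(z_est.shape, float(sigma_k.mean()))
```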

  13. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others]

    1997-08-01

    This is China's first case study of high resolution reservoir geological modelling using outcrop information. The key to the modelling process is to build a prototype model and use the model as a geological knowledge bank. Outcrop information used in geological modelling includes seven aspects: (1) Determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) Horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) Determining the model's direction based on the paleocurrent statistics; (4) Estimating the sandbody communication by photomosaic and profiles; (6) Estimating reservoir properties distribution within sandbody by lithofacies analysis; and (7) Building the reservoir model in sandbody scale by architectural element analysis and 3-D sampling. A high resolution reservoir geological model of the Youshashan oil field has been built using this method.

  14. Merging Radar Quantitative Precipitation Estimates (QPEs) from the High-resolution NEXRAD Reanalysis over CONUS with Rain-gauge Observations

    Science.gov (United States)

    Prat, O. P.; Nelson, B. R.; Stevens, S. E.; Nickl, E.; Seo, D. J.; Kim, B.; Zhang, J.; Qi, Y.

    2015-12-01

    The processing of radar-only precipitation via the reanalysis from the National Mosaic and Multi-Sensor Quantitative Precipitation Estimation system (NMQ/Q2), based on the WSR-88D Next-Generation Radar (NEXRAD) network over the Continental United States (CONUS), is completed for the period from 2002 to 2011. While this constitutes a unique opportunity to study precipitation processes at higher resolution than conventionally possible (1 km, 5 min), the long-term radar-only product needs to be merged with in-situ information in order to be suitable for hydrological, meteorological and climatological applications. The radar-gauge merging is performed by using rain gauge information at daily (Global Historical Climatology Network-Daily: GHCN-D), hourly (Hydrometeorological Automated Data System: HADS), and 5-min (Automated Surface Observing Systems: ASOS; Climate Reference Network: CRN) resolution. The challenges related to incorporating networks of differing resolution and quality to generate long-term, large-scale gridded estimates of precipitation are enormous. In that perspective, we are implementing techniques for merging the rain gauge datasets and the radar-only estimates such as Inverse Distance Weighting (IDW), Simple Kriging (SK), Ordinary Kriging (OK), and Conditional Bias-Penalized Kriging (CBPK). An evaluation of the different radar-gauge merging techniques is presented and we provide an estimate of uncertainty for the gridded estimates. In addition, comparisons with a suite of lower resolution QPEs derived from ground-based radar measurements (Stage IV) are provided in order to give a detailed picture of the improvements and remaining challenges.
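
    The simplest of the merging techniques listed above, Inverse Distance Weighting, is sketched below in a mean-field-bias style merge: gauge/radar ratios are interpolated onto the grid and used to rescale the radar QPE. This is one generic option under assumed inputs, not the project's actual implementation; the gauge positions, values and the 0.01 offset guarding against division by zero are placeholders.

```python
import numpy as np

def idw(gauge_xy, gauge_vals, grid_xy, power=2.0, eps=1e-6):
    """Inverse Distance Weighting of gauge-based values onto grid points."""
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=-1)
    w = 1.0 / (d + eps) ** power
    return (w * gauge_vals).sum(axis=1) / w.sum(axis=1)

def merge_radar_gauge(radar_grid_vals, grid_xy, gauge_xy, gauge_vals,
                      radar_at_gauges):
    """Rescale the radar field by IDW-interpolated gauge/radar ratios."""
    ratio = (gauge_vals + 0.01) / (radar_at_gauges + 0.01)
    return radar_grid_vals * idw(gauge_xy, ratio, grid_xy)

gauges = np.array([[5., 5.], [20., 12.], [35., 30.]])
grid = np.stack(np.meshgrid(np.arange(0, 40, 10.),
                            np.arange(0, 40, 10.)), axis=-1).reshape(-1, 2)
print(merge_radar_gauge(np.full(len(grid), 2.0), grid, gauges,
                        np.array([2.4, 1.8, 2.9]), np.array([2.0, 2.0, 2.0])))
```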

  15. Estimation of residual stress in cold rolled iron-disks from strain measurements on the high resolution Fourier diffractometer

    International Nuclear Information System (INIS)

    Aksenov, V.L.; Balagurov, A.M.; Taran, Yu.V.

    1995-01-01

    The results of estimating residual stresses in cold-rolled iron disks by measurements with the high resolution Fourier diffractometer (HRFD) at the IBR-2 pulsed reactor are presented. These measurements were made for calibration of magnetic and ultrasonic measurements carried out at the Fraunhofer-Institute for Nondestructive Testing in Saarbrucken (Germany). The tested objects were cold-rolled steel disks of 2.5 mm thickness and about 500 mm diameter, used for forming small gas pressure tanks. Neutron diffraction experiments were carried out at the scattering angle 2θ=+152° with resolution Δd/d=1.5·10⁻³. The gauge volume was chosen according to the lateral resolution of the magnetic measurements, 20x20 mm². In the near future, neutron diffraction measurements on cold-rolled iron disks at the scattering angle 2θ=±90° are planned. The texture analysis will also be included in the Rietveld refinement procedure for a more correct calculation of residual stress fields in the cold-rolled materials. 8 refs., 10 figs., 1 tab

  16. Improved automatic optic nerve radius estimation from high resolution MRI

    Science.gov (United States)

    Harrigan, Robert L.; Smith, Alex K.; Mawn, Louise A.; Smith, Seth A.; Landman, Bennett A.

    2017-02-01

    The optic nerve (ON) is a vital structure in the human visual system and transports all visual information from the retina to the cortex for higher order processing. Due to the lack of redundancy in the visual pathway, measures of ON damage have been shown to correlate well with visual deficits. These measures are typically taken at an arbitrary anatomically defined point along the nerve and do not characterize changes along the length of the ON. We propose a fully automated, three-dimensionally consistent technique building upon a previous independent slice-wise technique to estimate the radius of the ON and surrounding cerebrospinal fluid (CSF) on high-resolution heavily T2-weighted isotropic MRI. We show that by constraining results to be three-dimensionally consistent this technique produces more anatomically viable results. We compare this technique with the previously published slice-wise technique using a short-term reproducibility data set, 10 subjects, follow-up <1 month, and show that the new method is more reproducible in the center of the ON. The center of the ON contains the most accurate imaging because it lacks confounders such as motion and frontal lobe interference. Long-term reproducibility, 5 subjects, follow-up of approximately 11 months, is also investigated with this new technique and shown to be similar to short-term reproducibility, indicating that the ON does not change substantially within 11 months. The increased accuracy of this new technique provides increased power when searching for anatomical changes in ON size amongst patient populations.

  17. High-resolution refinement of a storm loss model and estimation of return periods of loss-intensive storms over Germany

    Directory of Open Access Journals (Sweden)

    M. G. Donat

    2011-10-01

    A refined model for the calculation of storm losses is presented, making use of high-resolution insurance loss records for Germany and allowing loss estimates at the spatial level of administrative districts and for single storm events. Storm losses are calculated on the basis of wind speeds from both the ERA-Interim and NCEP reanalyses. The loss model reproduces the spatial distribution of observed losses well by taking specific regional loss characteristics into account. This also permits high-accuracy estimates of total cumulated losses, though slightly underestimating the country-wide loss sums for storm "Kyrill", the most severe event in the insurance loss records from 1997 to 2007. A larger deviation, which is attributed to the relatively coarse resolution of the NCEP reanalysis, is only found for one specific, rather small-scale event not adequately captured by this dataset.

    The loss model is subsequently applied to the complete reanalysis period to extend the storm event catalogue to cover years when no systematic insurance records are available. This allows the consideration of loss-intensive storm events back to 1948, enlarging the event catalogue to cover the recent 60+ years, and to investigate the statistical characteristics of severe storm loss events in Germany based on a larger sample than provided by the insurance records only. Extreme value analysis is applied to the loss data to estimate the return periods of loss-intensive storms, yielding a return period for storm "Kyrill", for example, of approximately 15 to 21 years.
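
    The return-period estimation described above can be illustrated with a generic extreme value analysis: fit a GEV distribution to annual maxima and convert an event's loss to a return period via T = 1 / (1 - F(loss)). The synthetic loss series, the use of scipy's genextreme and the 95th-percentile stand-in for a "Kyrill"-like event are assumptions for the example, not the study's data or exact statistical model.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum loss indices for Germany (arbitrary units).
rng = np.random.default_rng(5)
annual_max_loss = genextreme.rvs(c=-0.1, loc=1.0, scale=0.4, size=60,
                                 random_state=rng)

# Fit a GEV distribution to the annual maxima, then convert one event's loss
# into a return period: T = 1 / (1 - F(loss)).
c, loc, scale = genextreme.fit(annual_max_loss)
event_loss = np.quantile(annual_max_loss, 0.95)   # stand-in for a severe event
return_period = 1.0 / (1.0 - genextreme.cdf(event_loss, c, loc=loc, scale=scale))
print(f"estimated return period: {return_period:.1f} years")
```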

  18. Reservoir Characterization using geostatistical and numerical modeling in GIS with noble gas geochemistry

    Science.gov (United States)

    Vasquez, D. A.; Swift, J. N.; Tan, S.; Darrah, T. H.

    2013-12-01

    The integration of precise geochemical analyses with quantitative engineering modeling into an interactive GIS system allows for a sophisticated and efficient method of reservoir engineering and characterization. A Geographic Information System (GIS) is utilized as an advanced technique for oil field reservoir analysis by combining field engineering and geological/geochemical spatial datasets with the available systematic modeling and mapping methods to integrate the information into a spatially correlated first-hand approach to defining surface and subsurface characteristics. Three key methods of analysis include: 1) Geostatistical modeling to create a static and volumetric 3-dimensional representation of the geological body, 2) Numerical modeling to develop a dynamic and interactive 2-dimensional model of fluid flow across the reservoir and 3) Noble gas geochemistry to further define the physical conditions, components and history of the geologic system. Results thus far include using engineering algorithms for interpolating electrical well log properties across the field (spontaneous potential, resistivity), yielding a highly accurate and high-resolution 3D model of rock properties. Results so far also include using numerical finite difference methods (Crank-Nicolson) to solve the equations describing the distribution of pressure across the field, yielding a 2D simulation model of fluid flow across the reservoir. Ongoing noble gas geochemistry results will also include determination of the source, thermal maturity and the extent/style of fluid migration (connectivity, continuity and directionality). Future work will include developing an inverse engineering algorithm to model permeability, porosity and water saturation. This combination of new and efficient technological and analytical capabilities is geared to provide a better understanding of the field geology and hydrocarbon dynamics system with applications to determine the presence of hydrocarbon pay zones (or
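
    The Crank-Nicolson finite-difference approach mentioned above can be illustrated on the simplest relevant problem, 1-D pressure diffusion with fixed-pressure boundaries. The grid size, diffusivity, time step and boundary pressures below are toy assumptions; the study's 2-D reservoir simulation would be considerably more involved.

```python
import numpy as np

def crank_nicolson_pressure(p0, diffusivity, dx, dt, steps):
    """Crank-Nicolson solution of dp/dt = eta * d2p/dx2 in 1-D with
    Dirichlet (fixed-pressure) boundary conditions."""
    n = p0.size
    r = diffusivity * dt / (2.0 * dx ** 2)
    # Tridiagonal matrices for the implicit (A) and explicit (B) halves.
    A = (np.diag((1 + 2 * r) * np.ones(n)) + np.diag(-r * np.ones(n - 1), 1)
         + np.diag(-r * np.ones(n - 1), -1))
    B = (np.diag((1 - 2 * r) * np.ones(n)) + np.diag(r * np.ones(n - 1), 1)
         + np.diag(r * np.ones(n - 1), -1))
    for M in (A, B):                   # keep boundary pressures fixed
        M[0, :], M[-1, :] = 0.0, 0.0
        M[0, 0], M[-1, -1] = 1.0, 1.0
    p = p0.copy()
    for _ in range(steps):
        p = np.linalg.solve(A, B @ p)
    return p

p_init = np.full(50, 20.0)             # MPa, uniform initial pressure
p_init[0], p_init[-1] = 30.0, 10.0     # injector / producer boundary pressures
print(np.round(crank_nicolson_pressure(p_init, 1e-2, 1.0, 10.0, 200), 1))
```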

  19. The implementation of sea ice model on a regional high-resolution scale

    Science.gov (United States)

    Prasad, Siva; Zakharov, Igor; Bobby, Pradeep; McGuire, Peter

    2015-09-01

    The availability of high-resolution atmospheric/ocean forecast models, satellite data and access to high-performance computing clusters has provided the capability to build high-resolution models for regional ice condition simulation. The paper describes the implementation of the Los Alamos sea ice model (CICE) on a regional scale at high resolution. The advantage of the model is its ability to include oceanographic parameters (e.g., currents) to provide accurate results. The sea ice simulation was performed over Baffin Bay and the Labrador Sea to retrieve important parameters such as ice concentration, thickness, ridging, and drift. Two different forcing models, one with low resolution and another with high resolution, were used to estimate the sensitivity of the model results. Sea ice behavior over 7 years was simulated to analyze ice formation, melting, and conditions in the region. Validation was based on comparing model results with remote sensing data. The simulated ice concentration correlated well with Advanced Microwave Scanning Radiometer for EOS (AMSR-E) and Ocean and Sea Ice Satellite Application Facility (OSI-SAF) data. Visual comparison of ice thickness trends estimated from the Soil Moisture and Ocean Salinity satellite (SMOS) showed agreement with the simulation for the year 2010-2011.

  20. First Top-Down Estimates of Anthropogenic NOx Emissions Using High-Resolution Airborne Remote Sensing Observations

    Science.gov (United States)

    Souri, Amir H.; Choi, Yunsoo; Pan, Shuai; Curci, Gabriele; Nowlan, Caroline R.; Janz, Scott J.; Kowalewski, Matthew G.; Liu, Junjie; Herman, Jay R.; Weinheimer, Andrew J.

    2018-03-01

    A number of satellite-based instruments have become an essential part of monitoring emissions. Despite sound theoretical inversion techniques, the insufficient sampling and footprint size of current observations have introduced an obstacle to narrowing the inversion window for regional models. These key limitations can be partially resolved by a set of modest high-quality measurements from airborne remote sensing. This study illustrates the feasibility of using nitrogen dioxide (NO2) columns from the Geostationary Coastal and Air Pollution Events Airborne Simulator (GCAS) to constrain anthropogenic NOx emissions in the Houston-Galveston-Brazoria area. We convert slant column densities to vertical columns using a radiative transfer model with (i) NO2 profiles from a high-resolution regional model (1 × 1 km2) constrained by P-3B aircraft measurements, (ii) consideration of aerosol optical thickness impacts on radiance at the NO2 absorption line, and (iii) high-resolution surface albedo constrained by ground-based spectrometers. We characterize errors in the GCAS NO2 columns by comparing them to Pandora measurements and find a striking correlation (r > 0.74) with an uncertainty of 3.5 × 10^15 molecules cm^-2. On 9 of 10 total days, the anthropogenic emissions constrained by a Kalman filter yield an overall 2-50% reduction in polluted areas, partly counterbalancing the well-documented positive bias of the model. The inversion, however, boosts emissions by 94% in the same areas on a day when an unprecedented local emissions event potentially occurred, significantly mitigating the bias of the model. The capability of GCAS to detect such an event underlines the significance of forthcoming geostationary satellites for timely top-down emission estimates.
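
    The Kalman-filter constraint mentioned above can be sketched generically as a single analysis step that nudges prior (bottom-up) emissions toward the observed columns; all matrices and values below are illustrative and do not reproduce the study's setup.

```python
import numpy as np

def kalman_update(x_b, B, y, R, H):
    """One Kalman-filter analysis step: update prior emissions x_b with observations y.

    x_b : prior (bottom-up) emission vector
    B   : prior error covariance
    y   : observed column amounts
    R   : observation error covariance
    H   : Jacobian mapping emissions to simulated columns (from a regional model)
    """
    S = H @ B @ H.T + R                       # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_a = x_b + K @ (y - H @ x_b)             # posterior (top-down) emissions
    A = (np.eye(len(x_b)) - K @ H) @ B        # posterior error covariance
    return x_a, A

# Illustrative numbers only: two emission regions, three column observations.
x_b = np.array([10.0, 5.0])                           # hypothetical prior emissions
B = np.diag([4.0, 1.0])
H = np.array([[0.8, 0.1], [0.2, 0.6], [0.5, 0.5]])    # hypothetical sensitivities
y = np.array([7.5, 4.2, 6.8])
R = 0.5 * np.eye(3)
x_a, A = kalman_update(x_b, B, y, R, H)
```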

  1. Deep fracturing of granite bodies. Literature survey, geostructural and geostatistic investigations

    International Nuclear Information System (INIS)

    Bles, J.L.; Blanchin, R.

    1986-01-01

    This report deals with investigations of the deep fracturing of granite bodies, which were performed within two cost-sharing contracts between the Commission of the European Communities, the Commissariat a l'Energie Atomique and the Bureau de Recherches Geologiques et Minieres. The aim of this work was to study the evolution of fracturing in granite from the surface to larger depths, so that guidelines can be identified for extrapolating, at depth, the data obtained from surface investigations. These guidelines could eventually be used for feasibility studies of radioactive waste disposal. The results of structural and geostatistical investigations of the St. Sylvestre granite, as well as the literature survey of fractures encountered in two long Alpine galleries (Mont-Blanc tunnel and Arc-Isere water gallery), in the 1000 m deep borehole at Auriat, and in the Bassies granite body (Pyrenees), are presented. These results show that, for radioactive waste disposal feasibility studies: 1. The deep state of fracturing in a granite body can be estimated from results obtained at the surface; 2. Studying only the large fault network would be insufficient, both for surface investigations and for studies in deep boreholes and/or in underground galleries; 3. It is necessary to study the orientations and frequencies of small fractures, so that structural mapping and statistical/geostatistical methods can be used to identify zones of higher and lower fracturing

  2. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based...

  3. Satellite Magnetic Residuals Investigated With Geostatistical Methods

    DEFF Research Database (Denmark)

    Fox Maule, Chaterine; Mosegaard, Klaus; Olsen, Nils

    2005-01-01

    (which consists of measurement errors and unmodeled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyze the residuals of the Oersted (09d/04) field model (www.dsri.dk/Oersted/Field models/IGRF 2005 candidates/), which is based...

  4. High-resolution X-ray television and high-resolution video recorders

    International Nuclear Information System (INIS)

    Haendle, J.; Horbaschek, H.; Alexandrescu, M.

    1977-01-01

    The improved transmission properties of the high-resolution X-ray television chain described here make it possible to transmit more information per television image. The visually determined resolution in the fluoroscopic image depends on the dose rate and the inertia of the television pick-up tube; this relationship is discussed. In the last few years, video recorders have been increasingly used in X-ray diagnostics. The video recorder is a further quality-limiting element in X-ray television. The development of functional prototypes of high-resolution magnetic video recorders shows that this quality loss can be largely overcome. The influence of electrical bandwidth and the number of lines on the resolution in the stored X-ray television image is explained in more detail. (orig.) [de

  5. Characterizing the spatial structure of endangered species habitat using geostatistical analysis of IKONOS imagery

    Science.gov (United States)

    Wallace, C.S.A.; Marsh, S.E.

    2005-01-01

    Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.
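
    A moving-window variogram analysis of the kind described here might look as follows; the lag spacing, window handling and the spherical-model fit are illustrative choices rather than the authors' exact implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.optimize import curve_fit
from scipy.spatial.distance import pdist

def empirical_variogram(coords, values, lags):
    """Isotropic experimental variogram: mean half-squared difference per lag bin."""
    d = pdist(coords)
    g = 0.5 * pdist(values[:, None], metric="sqeuclidean")
    gamma = np.array([g[(d >= lo) & (d < hi)].mean() for lo, hi in zip(lags[:-1], lags[1:])])
    centers = 0.5 * (lags[:-1] + lags[1:])
    return centers, gamma

def spherical(h, nugget, sill, rng):
    """Spherical variogram model."""
    h = np.asarray(h, dtype=float)
    return np.where(h < rng,
                    nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                    sill)

def window_parameters(window):
    """Estimate nugget, sill and range for one image window (2-D array of pixel values)."""
    rows, cols = np.indices(window.shape)
    coords = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
    h, gamma = empirical_variogram(coords, window.ravel().astype(float),
                                   lags=np.arange(1.0, 15.0, 1.0))
    p0 = [0.1 * gamma.max(), gamma.max(), 10.0]
    bounds = ([0.0, 0.0, 1e-3], [np.inf, np.inf, np.inf])
    (nugget, sill, rng), _ = curve_fit(spherical, h, gamma, p0=p0, bounds=bounds)
    return nugget, sill, rng

# Example: derive parameter maps by sliding a 25 x 25 pixel window over an image.
image = gaussian_filter(np.random.rand(100, 100), sigma=3)   # stand-in for an IKONOS band
step, size = 25, 25
maps = np.array([[window_parameters(image[i:i + size, j:j + size])
                  for j in range(0, image.shape[1] - size + 1, step)]
                 for i in range(0, image.shape[0] - size + 1, step)])
```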

  6. Effective property determination for input to a geostatistical model of regional groundwater flow: Wellenberg T→K

    International Nuclear Information System (INIS)

    Lanyon, G.W.; Marschall, P.; Vomvoris, S.; Jaquet, O.; Mazurek, M.

    1998-01-01

    This paper describes the methodology used to estimate effective hydraulic properties for input into a regional geostatistical model of groundwater flow at the Wellenberg site in Switzerland. The methodology uses a geologically-based discrete fracture network model to calculate effective hydraulic properties for 100 m blocks along each borehole. A description of the most transmissive features (Water Conducting Features or WCFs) in each borehole is used to determine local transmissivity distributions, which are combined with descriptions of WCF extent, orientation and channelling to create fracture network models. WCF geometry is dependent on the class of WCF. WCF classes are defined for each type of geological structure associated with identified borehole inflows. Local to each borehole, models are conditioned on the observed transmissivity and occurrence of WCFs. Multiple realisations are calculated for each 100 m block over approximately 400 m of borehole. The results from the numerical upscaling are compared with conservative estimates of hydraulic conductivity. Results from unconditioned models are also compared, to identify the consequences of conditioning and intervals of boreholes that appear to be atypical. An inverse method is also described by which realisations of the geostatistical model can be used to condition discrete fracture network models away from the boreholes. The method can be used as a verification of the modelling approach by prediction of data at borehole locations. Applications of the models to the estimation of post-closure repository performance, including cavern inflow and seal zone modelling, are illustrated.

  7. Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding

    Science.gov (United States)

    Dung, Lan-Rong; Lin, Meng-Chun

    This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory access, especially when local memory resources are limited. Reducing memory accesses in turn saves considerable power. The key to reducing memory accesses is a center-biased algorithm that performs the motion vector (MV) search with a minimum of search data. To exploit data reusability, the proposed dual-search-windowing (DSW) approach loads the secondary search window only when the search requires it. By doing so, the loading of search windows is alleviated, reducing the required external memory bandwidth. The proposed techniques can save up to 81% of external memory bandwidth and require only 135 MBytes/s, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/s.

  8. New DOI identification approach for high-resolution PET detectors

    International Nuclear Information System (INIS)

    Choghadi, Amin; Takahashi, Hiroyuki; Shimazoe, Kenji

    2016-01-01

    Depth-of-interaction (DOI) identification in positron emission tomography (PET) detectors is gaining importance, as it improves spatial resolution in both conventional and time-of-flight (TOF) PET, and coincidence time resolution (CTR) in TOF-PET. In both cases, spatial resolution is affected by the parallax error caused by the length of the scintillator crystals. This long crystal length also contributes substantial timing uncertainty to the time resolution of TOF-PET. Through DOI identification, both the parallax error and the timing uncertainty caused by the crystal length can be resolved. In this work, a novel approach to estimating DOI was investigated, exploiting the overlap of the absorbance spectrum of the scintillator crystals with their emission spectrum. Because the absorption length is close to zero for the shorter wavelengths of the crystal emission spectrum, the counts in this range of the spectrum depend strongly on DOI; that is, higher counts correspond to deeper interactions. The ratio of the counts in this range to the total counts is a good measure for estimating DOI. In order to extract such a ratio, two photodetectors are used for each crystal and an optical filter is mounted on top of only one of them. The ratio of the filtered output to the non-filtered output can then be utilized as the DOI estimator. For a 2×2×20 mm3 GAGG:Ce scintillator, an 8-mm DOI resolution was achieved in our simulations. (author)

  9. Geostatistical estimates of future recharge for the Death Valley region

    International Nuclear Information System (INIS)

    Hevesi, J.A.; Flint, A.L.

    1998-01-01

    Spatially distributed estimates of regional ground water recharge rates under both current and potential future climates are needed to evaluate a potential geologic repository for high-level nuclear waste at Yucca Mountain, Nevada, which is located within the Death Valley ground-water region (DVGWR). Determining the spatial distribution of recharge is important for regional saturated-zone ground-water flow models. In the southern Nevada region, the Maxey-Eakin method has been used for estimating recharge based on average annual precipitation. Although this method does not directly account for a variety of location-specific factors which control recharge (such as bedrock permeability, soil cover, and net radiation), precipitation is the primary factor that controls recharge in the region. Estimates of recharge obtained by using the Maxey-Eakin method are comparable to estimates of recharge obtained by using chloride balance studies. The authors consider the Maxey-Eakin approach a relatively simple method of obtaining preliminary estimates of recharge on a regional scale.
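
    A minimal sketch of the Maxey-Eakin calculation is shown below; the precipitation zones and recharge coefficients are the commonly quoted textbook values and are included for illustration only, since the study may apply different ones.

```python
import numpy as np

# Commonly quoted Maxey-Eakin recharge coefficients (fraction of annual precipitation
# that becomes recharge), per precipitation zone in inches/yr. Illustrative values.
ZONES_IN = [(0, 8, 0.00), (8, 12, 0.03), (12, 15, 0.07), (15, 20, 0.15), (20, np.inf, 0.25)]

def maxey_eakin_recharge(precip_in):
    """Estimate recharge (inches/yr) from mean annual precipitation (inches/yr)."""
    precip_in = np.asarray(precip_in, dtype=float)
    recharge = np.zeros_like(precip_in)
    for lo, hi, coef in ZONES_IN:
        mask = (precip_in >= lo) & (precip_in < hi)
        recharge[mask] = coef * precip_in[mask]
    return recharge

# Example: a gridded precipitation map (inches/yr) yields a distributed recharge map.
precip_map = np.array([[6.0, 10.0], [16.0, 22.0]])
recharge_map = maxey_eakin_recharge(precip_map)   # [[0.0, 0.3], [2.4, 5.5]]
```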

  10. Geostatistical characterization of soil pollution at industrial sites Case of polycyclic aromatic hydrocarbons at former coking plants; Caracterisation geostatistique de pollutions industrielles de sols cas des hydrocarbures aromatiques polycycliques sur d'anciens sites de cokeries

    Energy Technology Data Exchange (ETDEWEB)

    Jeannee, N

    2001-05-15

    Estimating polycyclic aromatic hydrocarbon concentrations in soil at former industrial sites poses several practical problems on account of the properties of the contaminants and the history of the site: 1) collection and preparation of samples from highly heterogeneous material, 2) high short-scale variability, particularly in the presence of backfill, 3) highly contrasted grades making variogram inference complicated. The sampling strategy generally adopted for contaminated sites is based on the historical information. The systematic sampling recommended for geostatistical estimation is often considered to be excessive and unnecessary. Two former coking plants are used as test cases for comparing several geostatistical methods for estimating (i) in situ concentrations and (ii) the probability that they are above a pollution threshold. Several practical and methodological questions are considered: 1) the properties of various estimators of the experimental variogram and the validity of the results; 2) the use of soft data, such as historical information, organoleptic observations and semi-quantitative methods, with a view to improving the precision of the estimates; 3) the comparison of standard sampling strategies, taking into account the vertical distribution of grades and the history of the site. Multiple analyses of the same sample give an approximation of the sampling error. Short-scale sampling shows the difficulty of selecting soils in the absence of a spatial structure. Sensitivity studies are carried out to assess how densely sampled soft data can improve estimates. By using mainly existing models, this work aims at giving practical recommendations for the characterization of soil pollution. (author)

  11. The use of sequential indicator simulation to characterize geostatistical uncertainty

    International Nuclear Information System (INIS)

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds

  12. High-resolution mapping of European fishing pressure on the benthic habitats

    DEFF Research Database (Denmark)

    Eigaard, Ole Ritzau; Bastardie, Francois; Hintzen, Niels T.

    effort. Consequently, most logbook information is not well suited for quantitative estimation of the seafloor impact (swept area and impact severity) of the different gears and trips. We present a method to overcome this information deficiency of official statistics and develop high-resolution large......) and gear width estimates were assigned to individual interpolated vessel tracks based on VMS data. The outcome was Europe-wide high-resolution fishing intensity maps (total yearly swept area within grid cells of 1*1 minutes longitude and latitude) for 2010, 2011 and 2012. Finally, the high-resolution...... fishing pressure maps were overlaid with existing marine habitat maps to identify areas of potential ecosystem service conflicts...

  13. High-Resolution Near Real-Time Drought Monitoring in South Asia

    Science.gov (United States)

    Aadhar, S.; Mishra, V.

    2017-12-01

    Droughts in South Asia affect food and water security and pose challenges for millions of people. For policy-making, planning and management of water resources at the sub-basin or administrative levels, high-resolution datasets of precipitation and air temperature are required in near-real time. Here we develop high-resolution (0.05 degree) bias-corrected precipitation and temperature data that can be used to monitor near real-time drought conditions over South Asia. Moreover, the dataset can be used to monitor climatic extremes (heat waves, cold waves, dry and wet anomalies) in South Asia. A distribution mapping method was applied to correct bias in precipitation and air temperature (maximum and minimum), and it performed well compared to an alternative bias-correction method based on linear scaling. The bias-corrected precipitation and temperature data were used to estimate the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI) to assess historical and current drought conditions in South Asia. We evaluated drought severity and extent against satellite-based Normalized Difference Vegetation Index (NDVI) anomalies and the satellite-driven Drought Severity Index (DSI) at 0.05˚. We find that the bias-corrected high-resolution data can effectively capture observed drought conditions as shown by the satellite-based drought estimates. This high-resolution near real-time dataset can provide valuable information for decision-making at district and sub-basin levels.
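
    The distribution mapping used for bias correction can be sketched with a simple empirical quantile-mapping routine; this generic implementation is an assumption about the method's form, not the exact variant used in the study.

```python
import numpy as np

def quantile_map(raw_model, obs_ref, model_ref, n_quantiles=100):
    """Empirical distribution mapping: correct raw model values so that the model's
    reference-period distribution matches the observed reference distribution.

    raw_model : values to correct (e.g., near real-time precipitation)
    obs_ref   : observations over the calibration period
    model_ref : model values over the same calibration period
    """
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_ref, q)
    obs_q = np.quantile(obs_ref, q)
    # Position of each raw value in the model's reference CDF ...
    cdf_vals = np.interp(raw_model, model_q, q)
    # ... mapped onto the observed quantiles.
    return np.interp(cdf_vals, q, obs_q)

# Illustrative example with synthetic daily precipitation (mm/day).
rng = np.random.default_rng(0)
obs_ref = rng.gamma(shape=0.8, scale=6.0, size=3000)
model_ref = rng.gamma(shape=0.8, scale=8.0, size=3000)   # wet-biased model
corrected = quantile_map(model_ref[:10], obs_ref, model_ref)
```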

  14. Massively Parallel Geostatistical Inversion of Coupled Processes in Heterogeneous Porous Media

    Science.gov (United States)

    Ngo, A.; Schwede, R. L.; Li, W.; Bastian, P.; Ippisch, O.; Cirpka, O. A.

    2012-04-01

    The quasi-linear geostatistical approach is an inversion scheme that can be used to estimate the spatial distribution of a heterogeneous hydraulic conductivity field. The estimated parameter field is considered to be a random variable that varies continuously in space, meets the measurements of dependent quantities (such as the hydraulic head, the concentration of a transported solute or its arrival time) and shows the required spatial correlation (described by certain variogram models). This is a method of conditioning a parameter field to observations. Upon discretization, this results in as many parameters as elements of the computational grid. For a full three-dimensional representation of the heterogeneous subsurface, the model resolutions achievable on a serial computer (up to one million parameters) are hardly sufficient. The forward problems to be solved within the inversion procedure consist of the elliptic steady-state groundwater flow equation and the formally elliptic but nearly hyperbolic steady-state advection-dominated solute transport equation in a heterogeneous porous medium. Both equations are discretized by Finite Element Methods (FEM) using fully scalable domain decomposition techniques. Whereas standard conforming FEM is sufficient for the flow equation, for the advection-dominated transport equation, which raises well-known numerical difficulties at sharp fronts or boundary layers, we use the streamline diffusion approach. The arising linear systems are solved using efficient iterative solvers with an AMG (algebraic multigrid) pre-conditioner. During each iteration step of the inversion scheme one needs to solve a multitude of forward and adjoint problems in order to calculate the sensitivities of each measurement and the related cross-covariance matrix of the unknown parameters and the observations. In order to reduce interprocess communications and to improve the scalability of the code on larger clusters

  15. High Resolution Insights into Snow Distribution Provided by Drone Photogrammetry

    Science.gov (United States)

    Redpath, T.; Sirguey, P. J.; Cullen, N. J.; Fitzsimons, S.

    2017-12-01

    Dynamic in time and space, New Zealand's seasonal snow is largely confined to remote alpine areas, complicating ongoing in situ measurement and characterisation. Improved understanding and modeling of the seasonal snowpack requires fine-scale resolution of snow distribution and spatial variability. The potential of remotely piloted aircraft system (RPAS) photogrammetry to resolve spatial and temporal variability of snow depth and water equivalent in a New Zealand alpine catchment is assessed in the Pisa Range, Central Otago. This approach yielded orthophotomosaics and digital surface models (DSM) at 0.05 and 0.15 m spatial resolution, respectively. An autumn reference DSM allowed mapping of winter (02/08/2016) and spring (10/09/2016) snow depth at 0.15 m spatial resolution, via DSM differencing. The consistency and accuracy of the RPAS-derived surface was assessed by comparison of snow-free regions of the spring and autumn DSMs, while the accuracy of RPAS-retrieved snow depth was assessed with 86 in situ snow probe measurements. Results show a mean vertical residual of 0.024 m between the DSMs acquired in autumn and spring. This residual approximated a Laplace distribution, reflecting the influence of large outliers on the small overall bias. Propagation of errors associated with successive DSMs saw snow depth mapped with an accuracy of ± 0.09 m (95% c.l.). Comparing RPAS and in situ snow depth measurements revealed the influence of geo-location uncertainty and interactions between vegetation and the snowpack on snow depth uncertainty and bias. Semi-variogram analysis revealed that the RPAS outperformed systematic in situ measurements in resolving fine-scale spatial variability. Despite the limitations accompanying RPAS photogrammetry, this study demonstrates a repeatable means of accurately mapping snow depth for an entire, yet relatively small, hydrological basin (about 0.5 km2), at high resolution. Resolving snowpack features associated with re-distribution and preferential
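
    The DSM-differencing step and the propagation of the vertical errors of the two surfaces can be sketched as follows; the grid values and per-surface uncertainties below are hypothetical.

```python
import numpy as np

def snow_depth_from_dsms(dsm_snow, dsm_reference, sigma_snow, sigma_ref):
    """Map snow depth as the difference of two co-registered DSMs and propagate
    the independent vertical errors of the two surfaces.

    dsm_snow, dsm_reference : 2-D elevation arrays (m) on the same grid
    sigma_snow, sigma_ref   : 1-sigma vertical uncertainties of each DSM (m)
    """
    depth = dsm_snow - dsm_reference
    depth = np.where(depth < 0.0, 0.0, depth)          # clip spurious negative depths
    sigma_depth = np.hypot(sigma_snow, sigma_ref)      # sqrt(s1**2 + s2**2)
    ci95 = 1.96 * sigma_depth                          # ~95% confidence half-width
    return depth, sigma_depth, ci95

# Illustrative 3 x 3 grids (values in metres, not from the study).
autumn = np.array([[100.0, 100.2, 100.4],
                   [100.1, 100.3, 100.5],
                   [100.2, 100.4, 100.6]])
winter = autumn + np.array([[0.35, 0.60, 0.20],
                            [0.50, 0.80, 0.10],
                            [0.40, 0.70, 0.00]])
depth, sigma, ci95 = snow_depth_from_dsms(winter, autumn, sigma_snow=0.03, sigma_ref=0.03)
```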

  16. ROBUST MOTION SEGMENTATION FOR HIGH DEFINITION VIDEO SEQUENCES USING A FAST MULTI-RESOLUTION MOTION ESTIMATION BASED ON SPATIO-TEMPORAL TUBES

    OpenAIRE

    Brouard , Olivier; Delannay , Fabrice; Ricordel , Vincent; Barba , Dominique

    2007-01-01

    4 pages; International audience; Motion segmentation methods are effective for tracking video objects. However, object segmentation methods based on motion need to know the global motion of the video in order to back-compensate it before computing the segmentation. In this paper, we propose a method which estimates the global motion of a High Definition (HD) video shot and then segments it using the remaining motion information. First, we develop a fast method for multi-resolution motion est...

  17. Real-time underwater object detection based on an electrically scanned high-resolution sonar

    DEFF Research Database (Denmark)

    Henriksen, Lars

    1994-01-01

    The paper describes an approach to real time detection and tracking of underwater objects, using image sequences from an electrically scanned high-resolution sonar. The use of a high resolution sonar provides a good estimate of the location of the objects, but strains the computers on board, beca...

  19. Ecosystem services - from assessements of estimations to quantitative, validated, high-resolution, continental-scale mapping via airborne LIDAR

    Science.gov (United States)

    Zlinszky, András; Pfeifer, Norbert

    2016-04-01

    service potential" which is the ability of the local ecosystem to deliver various functions (water retention, carbon storage etc.), but can't quantify how much of these are actually used by humans or what the estimated monetary value is. Due to its ability to measure both terrain relief and vegetation structure in high resolution, airborne LIDAR supports direct quantification of the properties of an ecosystem that lead to it delivering a given service (such as biomass, water retention, micro-climate regulation or habitat diversity). In addition, its high resolution allows direct calibration with field measurements: routine harvesting-based ecological measurements, local biodiversity indicator surveys or microclimate recordings all take place at the human scale and can be directly linked to the local value of LIDAR-based indicators at meter resolution. Therefore, if some field measurements with standard ecological methods are performed on site, the accuracy of LIDAR-based ecosystem service indicators can be rigorously validated. With this conceptual and technical approach high resolution ecosystem service assessments can be made with well established credibility. These would consolidate the concept of ecosystem services and support both scientific research and evidence-based environmental policy at local and - as data coverage is continually increasing - continental scale.

  20. A test for Improvement of high resolution Quantitative Precipitation Estimation for localized heavy precipitation events

    Science.gov (United States)

    Lee, Jung-Hoon; Roh, Joon-Woo; Park, Jeong-Gyun

    2017-04-01

    Accurate estimation of precipitation is one of the most difficult and significant tasks in weather diagnostics and forecasting. On the Korean Peninsula, heavy precipitation is caused by various physical mechanisms, which are affected by shortwave troughs, quasi-stationary moisture convergence zones among varying air masses, and the direct/indirect effects of tropical cyclones. In addition, various geographical and topographical elements make the temporal and spatial distribution of precipitation very complicated. In particular, localized heavy rainfall events in South Korea generally arise from mesoscale convective systems embedded in these synoptic-scale disturbances. Even with weather radar data of high temporal and spatial resolution, accurate estimation of the rain rate from radar reflectivity is difficult. The Z-R relationship (Marshall and Palmer 1948) has typically been adopted. In addition, several methods such as support vector machines (SVM), neural networks, fuzzy logic and kriging have been utilized in order to improve the accuracy of the rain rate. These methods produce different quantitative precipitation estimates (QPE), and their accuracy differs for heavy precipitation cases. In this study, in order to improve the accuracy of QPE for localized heavy precipitation, an ensemble method combining the Z-R relationship with various calibration techniques was tested. This QPE ensemble method was developed on the concept of utilizing the respective advantages of the precipitation calibration methods. The ensemble members were produced from combinations of different Z-R coefficients and calibration methods.
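
    A minimal sketch of rain-rate retrieval from the Z-R power law, and of a trivial ensemble over alternative coefficient pairs, is given below; the coefficient values are common textbook choices, not those tested in the study.

```python
import numpy as np

def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Invert the Z-R power law Z = a * R**b for rain rate R (mm/h).

    dbz  : radar reflectivity in dBZ
    a, b : Z-R coefficients; the Marshall-Palmer values (a=200, b=1.6) are used as
           defaults, but the coefficients vary with precipitation type.
    """
    z = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)   # dBZ -> Z (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

# A simple ensemble of Z-R members (illustrative coefficient pairs only).
members = {"stratiform": (200.0, 1.6), "convective": (300.0, 1.4)}
dbz = np.array([20.0, 35.0, 50.0])
estimates = {name: rain_rate_from_reflectivity(dbz, a, b) for name, (a, b) in members.items()}
ensemble_mean = np.mean(list(estimates.values()), axis=0)
```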

  1. Risk mapping of clonorchiasis in the People's Republic of China: A systematic review and Bayesian geostatistical analysis.

    Directory of Open Access Journals (Sweden)

    Ying-Si Lai

    2017-03-01

    Full Text Available Clonorchiasis, one of the most important food-borne trematodiases, affects more than 12 million people in the People's Republic of China (P.R. China). Spatially explicit risk estimates of Clonorchis sinensis infection are needed in order to target control interventions. Georeferenced survey data pertaining to infection prevalence of C. sinensis in P.R. China from 2000 onwards were obtained via a systematic review in PubMed, ISI Web of Science, Chinese National Knowledge Internet, and Wanfang Data from January 1, 2000 until January 10, 2016, with no restriction of language or study design. Additional disease data were provided by the National Institute of Parasitic Diseases, Chinese Center for Diseases Control and Prevention in Shanghai. Environmental and socioeconomic proxies were extracted from remote-sensing and other data sources. Bayesian variable selection was carried out to identify the most important predictors of C. sinensis risk. Geostatistical models were applied to quantify the association between infection risk and the predictors of the disease, and to predict the risk of infection across P.R. China at high spatial resolution (over a grid with a cell size of 5×5 km). We obtained clonorchiasis survey data at 633 unique locations in P.R. China. We observed that the risk of C. sinensis infection increased over time, particularly from 2005 onwards. We estimate that around 14.8 million (95% Bayesian credible interval 13.8-15.8 million) people in P.R. China were infected with C. sinensis in 2010. Highly endemic areas (≥ 20%) were concentrated in southern and northeastern parts of the country. The provinces with the highest risk of infection and the largest number of infected people were Guangdong, Guangxi, and Heilongjiang. Our results provide spatially relevant information for guiding clonorchiasis control interventions in P.R. China. The trend toward higher risk of C. sinensis infection in the recent past urges the Chinese government to

  2. Towards breaking the spatial resolution barriers: An optical flow and super-resolution approach for sea ice motion estimation

    Science.gov (United States)

    Petrou, Zisis I.; Xian, Yang; Tian, YingLi

    2018-04-01

    Estimation of sea ice motion at fine scales is important for a number of regional and local level applications, including modeling of sea ice distribution, ocean-atmosphere and climate dynamics, as well as safe navigation and sea operations. In this study, we propose an optical flow and super-resolution approach to accurately estimate motion from remote sensing images at a higher spatial resolution than the original data. First, an external example learning-based super-resolution method is applied to the original images to generate higher resolution versions. Then, an optical flow approach is applied to the higher resolution images, identifying sparse correspondences and interpolating them to extract a dense motion vector field with continuous values and subpixel accuracy. Our proposed approach is successfully evaluated on passive microwave, optical, and Synthetic Aperture Radar data, proving appropriate for multi-sensor applications and different spatial resolutions. The approach estimates motion with similar or higher accuracy than the original data, while increasing the spatial resolution by up to eight times. In addition, the adopted optical flow component outperforms a state-of-the-art pattern matching method. Overall, the proposed approach results in accurate motion vectors with unprecedented spatial resolutions of up to 1.5 km for passive microwave data covering the entire Arctic and 20 m for radar data, and proves promising for numerous scientific and operational applications.
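
    The overall pipeline (upsample the image pair, then compute motion on the upsampled grid) can be sketched as below; note that bicubic interpolation and OpenCV's dense Farneback optical flow are stand-ins for the example-learning super-resolution and sparse-correspondence flow actually used, and the pixel size and time step are assumed values.

```python
import cv2
import numpy as np

def upsampled_motion(img_t0, img_t1, scale=2, pixel_size_km=6.25, dt_hours=24.0):
    """Estimate a dense motion field between two consecutive images after upsampling.

    Bicubic resizing and Farneback dense optical flow are used here purely to
    illustrate the pipeline; they replace the study's super-resolution and
    sparse-correspondence components.
    """
    up0 = cv2.resize(img_t0, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)
    up1 = cv2.resize(img_t1, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)
    flow = cv2.calcOpticalFlowFarneback(up0, up1, None,
                                        pyr_scale=0.5, levels=4, winsize=21,
                                        iterations=3, poly_n=7, poly_sigma=1.5, flags=0)
    # Convert displacement (pixels on the upsampled grid) to speed (km/day).
    km_per_pixel = pixel_size_km / scale
    speed = np.linalg.norm(flow, axis=2) * km_per_pixel * (24.0 / dt_hours)
    return flow, speed

# Illustrative call with two 8-bit single-channel images and a synthetic 2-pixel drift.
img0 = (np.random.rand(128, 128) * 255).astype(np.uint8)
img1 = np.roll(img0, shift=2, axis=1)
flow, speed_km_day = upsampled_motion(img0, img1)
```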

  3. Estimating Hydraulic Resistance for Floodplain Mapping and Hydraulic Studies from High-Resolution Topography: Physical and Numerical Simulations

    Science.gov (United States)

    Minear, J. T.

    2017-12-01

    One of the primary unknown variables in hydraulic analyses is hydraulic resistance, values for which are typically set using broad assumptions or calibration, with very few methods available for independent and robust determination. A better understanding of hydraulic resistance would be highly useful for understanding floodplain processes, forecasting floods, advancing sediment transport and hydraulic coupling, and improving higher dimensional flood modeling (2D+), as well as correctly calculating flood discharges for floods that are not directly measured. The relationship of observed features to hydraulic resistance is difficult to objectively quantify in the field, partially because resistance occurs at a variety of scales (i.e. grain, unit and reach) and because individual resistance elements, such as trees, grass and sediment grains, are inherently difficult to measure. Similar to photogrammetric techniques, Terrestrial Laser Scanning (TLS, also known as Ground-based LiDAR) has shown great ability to rapidly collect high-resolution topographic datasets for geomorphic and hydrodynamic studies and could be used to objectively quantify the features that collectively create hydraulic resistance in the field. Because of its speed in data collection and remote sensing ability, TLS can be used both for pre-flood and post-flood studies that require relatively quick response in relatively dangerous settings. Using datasets collected from experimental flume runs and numerical simulations, as well as field studies of several rivers in California and post-flood rivers in Colorado, this study evaluates the use of high-resolution topography to estimate hydraulic resistance, particularly from grain-scale elements. Contrary to conventional practice, experimental laboratory runs with bed grain size held constant but with varying grain-scale protrusion create a nearly twenty-fold variation in measured hydraulic resistance. The ideal application of this high-resolution topography

  4. Constrained optimisation of spatial sampling : a geostatistical approach

    NARCIS (Netherlands)

    Groenigen, van J.W.

    1999-01-01

    Aims

    This thesis aims at the development of optimal sampling strategies for geostatistical studies. Special emphasis is on the optimal use of ancillary data, such as co-related imagery, preliminary observations and historic knowledge. Although the object of all studies

  5. Estimation of Evapotraspiration of Tamarisk using Energy Balance Models with High Resolution Airborne Imagery and LIDAR Data

    Science.gov (United States)

    Geli, H. M.; Taghvaeian, S.; Neale, C. M.; Pack, R.; Watts, D. R.; Osterberg, J.

    2010-12-01

    The wide, uncontrolled spread of the invasive Tamarisk (salt cedar) in the riparian areas of the southwestern United States has become a source of concern to the water resource management community. This tree, which was imported for ornamental purposes and to control bank erosion during the 1800's, later became problematic and unwanted due to its biophysical properties: its vigorous growth out-competes native species for moisture, lowers water tables and increases soil salinity, and hence it becomes the dominant riparian vegetation, especially in arid to semi-arid floodplain environments. Most importantly, these trees consume large amounts of water, leading to a reduction of river flows and a lowering of the groundwater table. We implemented this study in an effort to provide reliable estimates of the amount of water consumed or "lost" by such species through evapotranspiration (ET), as well as to better understand the related land surface and near-atmosphere interactions. Recent advances in remote sensing techniques and the related data quality have made it possible to provide spatio-temporal estimates of ET at considerably higher resolution and reliable accuracy over a wide range of surface heterogeneity. We tested two different soil-vegetation-atmosphere transfer (SVAT) models based on thermal remote sensing, namely the two-source model (TSM) of Norman et al. (1995), with its recent modifications, and the Surface Energy Balance Algorithm for Land (SEBAL) of Bastiaanssen et al. (1998), to estimate the different surface energy balance components and evapotranspiration (ET) spatially. We used high-resolution (1.0 m pixel size) shortwave reflectance and longwave thermal airborne imagery acquired by the research aircraft of the Remote Sensing Services Lab at Utah State University (USU), a land use map classified from these images, and a detailed vegetation height image acquired by the LASSI lidar, also developed at USU. We also compared estimates
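
    Both TSM and SEBAL ultimately obtain the latent heat flux as the residual of the surface energy balance; a generic sketch of that residual step and its conversion to an ET rate is given below (it is not either model's full formulation, and the flux values are hypothetical).

```python
import numpy as np

LAMBDA_V = 2.45e6   # latent heat of vaporization, J/kg (approx. near 20 degC)
RHO_W = 1000.0      # density of water, kg/m^3

def residual_et(rn, g, h):
    """Latent heat flux as the energy-balance residual, LE = Rn - G - H (W/m^2),
    converted to an instantaneous evapotranspiration rate in mm/hour."""
    le = np.asarray(rn) - np.asarray(g) - np.asarray(h)      # W/m^2
    et_m_per_s = le / (LAMBDA_V * RHO_W)                      # metres of water per second
    return le, et_m_per_s * 1000.0 * 3600.0                   # mm per hour

# Illustrative per-pixel fluxes (W/m^2); values are hypothetical.
rn = np.array([[620.0, 580.0], [540.0, 450.0]])   # net radiation
g = 0.1 * rn                                      # soil heat flux as a simple fraction of Rn
h = np.array([[150.0, 220.0], [260.0, 300.0]])    # sensible heat flux
le, et_mm_hr = residual_et(rn, g, h)
```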

  6. Geostatistical interpolation for modelling SPT data in northern Izmir

    Indian Academy of Sciences (India)

    data scatter' stems from the natural randomness of the system under con- ... Geostatistical methods were originally used for ore reserve calculations by the ... ing grain size distribution, plasticity, strength parameters and water content, for ...

  7. Application of covariance clouds for estimating the anisotropy ellipsoid eigenvectors, with case study in uranium deposit

    International Nuclear Information System (INIS)

    Jamali Esfahlan, D.; Madani, H.; Tahmaseb Nazemi, M. T.; Mahdavi, F.; Ghaderi, M. R.; Najafi, M.

    2010-01-01

    Kriging variants and nonlinear geostatistical methods, which are considered acceptable methods for resource and reserve estimation, have characteristics such as minimum estimation variance, and accurate results within an acceptable confidence-level range can be achieved if the parameters required for the estimation are determined accurately. If the determined parameters are not sufficiently accurate, 3-D geostatistical estimates will no longer be reliable, and consequently all the quantitative parameters of the mineral deposit (e.g. grade-tonnage variations) will be misinterpreted. One of the most significant parameters for 3-D geostatistical estimation is the anisotropy ellipsoid. The anisotropy ellipsoid is important for geostatistical estimation because it determines the samples, in different directions, required for accomplishing the estimation. The aim of this paper is to illustrate a simpler and more time-saving analytical method that can use geophysical or geochemical analysis data from the core length of boreholes to model the anisotropy ellipsoid. With this method, which is based on the distribution of covariance clouds in the 3-D sampling space of a deposit, the magnitudes, ratios, azimuth and plunge of the major, semi-major and minor axes describe the ore-grade continuity within the deposit, and finally the anisotropy ellipsoid of the deposit is constructed. A case study of a uranium deposit is also discussed analytically to illustrate the application of this method.

  8. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    International Nuclear Information System (INIS)

    Li Yupeng; Deutsch, Clayton V.

    2012-01-01

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimate of the multivariate probability, using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. The algorithm can be extended to higher-order marginal probability constraints, as used in multiple-point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
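
    A minimal two-variable illustration of the iterative proportion fitting idea, adjusting an initial joint probability table until its marginals match imposed constraints, is sketched below; the article's implementation handles many grouped locations and sparse storage, which this toy version omits.

```python
import numpy as np

def ipf(joint0, row_marginal, col_marginal, n_iter=100, tol=1e-10):
    """Iterative proportion fitting for a 2-D probability table.

    Rescales an initial joint probability so that its row and column sums match the
    imposed marginals. The article applies the same idea to a multivariate facies
    probability constrained by bivariate marginals; this is the minimal illustration.
    """
    p = joint0 / joint0.sum()
    for _ in range(n_iter):
        p *= (row_marginal / p.sum(axis=1))[:, None]   # match row sums
        p *= (col_marginal / p.sum(axis=0))[None, :]   # match column sums
        if (np.abs(p.sum(axis=1) - row_marginal).max() < tol and
                np.abs(p.sum(axis=0) - col_marginal).max() < tol):
            break
    return p

# Facies proportions at two locations (illustrative numbers only).
row_marginal = np.array([0.5, 0.3, 0.2])    # facies proportions at location A
col_marginal = np.array([0.6, 0.4])         # facies proportions at location B
joint0 = np.ones((3, 2))                    # non-informative starting table
joint = ipf(joint0, row_marginal, col_marginal)
```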

  9. Evaluating a Local Ensemble Transform Kalman Filter snow cover data assimilation method to estimate SWE within a high-resolution hydrologic modeling framework across Western US mountainous regions

    Science.gov (United States)

    Oaida, C. M.; Andreadis, K.; Reager, J. T., II; Famiglietti, J. S.; Levoe, S.

    2017-12-01

    Accurately estimating how much snow water equivalent (SWE) is stored in mountainous regions characterized by complex terrain and snowmelt-driven hydrologic cycles is not only greatly desirable, but also a big challenge. Mountain snowpack exhibits high spatial variability across a broad range of spatial and temporal scales due to a multitude of physical and climatic factors, making it difficult to observe or estimate in its entirety. Combining remotely sensed data and high-resolution hydrologic modeling through data assimilation (DA) has the potential to provide a spatially and temporally continuous SWE dataset at horizontal scales that capture sub-grid snow spatial variability and are also relevant to stakeholders such as water resource managers. Here, we present the evaluation of a new snow DA approach that uses a Local Ensemble Transform Kalman Filter (LETKF) in tandem with the Variable Infiltration Capacity macro-scale hydrologic model across the Western United States, at a daily temporal resolution and a horizontal resolution of 1.75 km x 1.75 km. The LETKF is chosen for its relative simplicity, ease of implementation, and computational efficiency and scalability. The modeling/DA system assimilates daily MODIS Snow Covered Area and Grain Size (MODSCAG) fractional snow cover, and has been developed to efficiently calculate SWE estimates over extended periods of time and over large regional-scale areas at relatively high spatial resolution, ultimately producing a snow reanalysis-type dataset. Here we focus on the assessment of SWE produced by the DA scheme over several basins in California's Sierra Nevada Mountain range where Airborne Snow Observatory data are available, during the last five water years (2013-2017), which include both one of the driest and one of the wettest years. Comparison against such a spatially distributed SWE observational product provides a greater understanding of the model's ability to estimate SWE and SWE spatial variability
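
    The analysis step of an ensemble transform Kalman filter can be written compactly; the sketch below is the generic (global) ETKF update for a single region, with toy array sizes, and is not the study's full localised LETKF implementation coupled to VIC.

```python
import numpy as np

def etkf_update(X, Y, y_obs, R):
    """Ensemble transform Kalman filter analysis step.

    X     : state ensemble, shape (n_state, k)        e.g., gridded SWE
    Y     : predicted observations, shape (n_obs, k)  e.g., modelled fractional snow cover
    y_obs : observation vector, shape (n_obs,)        e.g., MODSCAG fractional snow cover
    R     : observation error covariance, shape (n_obs, n_obs)
    """
    k = X.shape[1]
    x_mean, y_mean = X.mean(axis=1), Y.mean(axis=1)
    Xp = X - x_mean[:, None]                          # state perturbations
    Yp = Y - y_mean[:, None]                          # observation perturbations

    Rinv = np.linalg.inv(R)
    Pa_tilde = np.linalg.inv((k - 1) * np.eye(k) + Yp.T @ Rinv @ Yp)
    w_mean = Pa_tilde @ Yp.T @ Rinv @ (y_obs - y_mean)

    # Symmetric square root of (k-1) * Pa_tilde gives the perturbation weights.
    vals, vecs = np.linalg.eigh((k - 1) * Pa_tilde)
    W = vecs @ np.diag(np.sqrt(np.maximum(vals, 0.0))) @ vecs.T

    return x_mean[:, None] + Xp @ (w_mean[:, None] + W)   # analysis ensemble

# Illustrative sizes only: 50 SWE grid cells, 20 members, 10 snow-cover observations.
rng = np.random.default_rng(1)
X = rng.normal(0.3, 0.1, size=(50, 20))
Y = np.clip(X[:10] * 2.0, 0.0, 1.0)                 # toy observation operator
y_obs = np.clip(Y.mean(axis=1) + 0.05, 0.0, 1.0)
Xa = etkf_update(X, Y, y_obs, 0.01 * np.eye(10))
```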

  10. DMPDS: A Fast Motion Estimation Algorithm Targeting High Resolution Videos and Its FPGA Implementation

    Directory of Open Access Journals (Sweden)

    Gustavo Sanchez

    2012-01-01

    Full Text Available This paper presents a new fast motion estimation (ME) algorithm targeting high-resolution digital videos, together with its efficient hardware architecture design. The new Dynamic Multipoint Diamond Search (DMPDS) algorithm is a fast algorithm which increases ME quality when compared with other fast ME algorithms. The DMPDS achieves better digital video quality by reducing the occurrence of falls into local minima, especially in high-definition videos. The quality results show that the DMPDS is able to reach an average PSNR gain of 1.85 dB when compared with the well-known Diamond Search (DS) algorithm. When compared to the optimum results generated by the Full Search (FS) algorithm, the DMPDS shows a loss of only 1.03 dB in PSNR. On the other hand, the DMPDS reaches a complexity reduction of more than 45 times compared to FS. The quality gains relative to DS come at the cost of an expected increase in complexity: the DMPDS uses 6.4 times more calculations than DS. The DMPDS architecture was designed with a focus on high performance and low cost, targeting the processing of Quad Full High Definition (QFHD) videos in real time (30 frames per second). The architecture was described in VHDL and synthesized to Altera Stratix 4 and Xilinx Virtex 5 FPGAs. The synthesis results show that the architecture is able to achieve processing rates higher than 53 QFHD fps, meeting the real-time requirement. The DMPDS architecture achieved the highest processing rate when compared to related works in the literature. This high processing rate was obtained by designing an architecture with a high operating frequency and a low number of cycles needed to process each block.
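
    The DMPDS builds on the classic Diamond Search; the sketch below implements plain Diamond Search block matching with a sum-of-absolute-differences criterion to illustrate the family of fast ME algorithms being improved upon (it is not the DMPDS itself).

```python
import numpy as np

LDSP = [(0, 0), (0, -2), (0, 2), (-2, 0), (2, 0), (-1, -1), (-1, 1), (1, -1), (1, 1)]
SDSP = [(0, 0), (0, -1), (0, 1), (-1, 0), (1, 0)]

def sad(block, ref, y, x):
    """Sum of absolute differences between a block and a candidate reference position."""
    h, w = block.shape
    if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
        return np.inf
    return np.abs(block.astype(int) - ref[y:y + h, x:x + w].astype(int)).sum()

def diamond_search(cur, ref, by, bx, bsize=16, max_iter=32):
    """Classic Diamond Search: follow the large diamond pattern until the best match
    is at the centre, then do one small diamond refinement. Returns (dy, dx)."""
    block = cur[by:by + bsize, bx:bx + bsize]
    cy, cx = by, bx
    for _ in range(max_iter):
        costs = [(sad(block, ref, cy + dy, cx + dx), dy, dx) for dy, dx in LDSP]
        _, dy, dx = min(costs)
        if (dy, dx) == (0, 0):                      # minimum at centre -> refine
            break
        cy, cx = cy + dy, cx + dx
    costs = [(sad(block, ref, cy + dy, cx + dx), dy, dx) for dy, dx in SDSP]
    _, dy, dx = min(costs)
    return (cy + dy - by, cx + dx - bx)

# Illustrative: a frame shifted by (3, 5) pixels yields a motion vector near (-3, -5).
ref = (np.random.rand(64, 64) * 255).astype(np.uint8)
cur = np.roll(np.roll(ref, 3, axis=0), 5, axis=1)
mv = diamond_search(cur, ref, by=32, bx=32)
```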

  11. ANL high resolution injector

    International Nuclear Information System (INIS)

    Minehara, E.; Kutschera, W.; Hartog, P.D.; Billquist, P.

    1985-01-01

    The ANL (Argonne National Laboratory) high-resolution injector has been installed to obtain higher mass resolution and higher preacceleration, and to utilize effectively the full mass range of ATLAS (Argonne Tandem Linac Accelerator System). Preliminary results of the first beam test are reported briefly. The design and performance, in particular a high-mass-resolution magnet with aberration compensation, are discussed. 7 refs., 5 figs., 2 tabs

  12. Benchmarking a geostatistical procedure for the homogenisation of annual precipitation series

    Science.gov (United States)

    Caineta, Júlio; Ribeiro, Sara; Henriques, Roberto; Soares, Amílcar; Costa, Ana Cristina

    2014-05-01

    The European project COST Action ES0601, Advances in homogenisation methods of climate series: an integrated approach (HOME), has brought to attention the importance of establishing reliable homogenisation methods for climate data. In order to achieve that, a benchmark data set, containing monthly and daily temperature and precipitation data, was created to be used as a comparison basis for the effectiveness of those methods. Several contributions were submitted and evaluated by a number of performance metrics, validating the results against realistic inhomogeneous data. HOME also led to the development of new homogenisation software packages, which incorporated feedback and lessons learned during the project. Preliminary studies have suggested a geostatistical stochastic approach, which uses Direct Sequential Simulation (DSS), as a promising methodology for the homogenisation of precipitation data series. Based on the spatial and temporal correlation between the neighbouring stations, DSS calculates local probability density functions at a candidate station to detect inhomogeneities. The purpose of the current study is to test and compare this geostatistical approach with the methods previously presented in the HOME project, using surrogate precipitation series from the HOME benchmark data set. The benchmark data set contains monthly precipitation surrogate series, from which annual precipitation data series were derived. These annual precipitation series were subject to exploratory analysis and to a thorough variography study. The geostatistical approach was then applied to the data set, based on different scenarios for the spatial continuity. Implementing this procedure also promoted the development of a computer program that aims to assist in the homogenisation of climate data, while minimising user interaction. Finally, in order to compare the effectiveness of this methodology with the homogenisation methods submitted during the HOME project, the obtained results

  13. On the optical stability of high-resolution transmission electron microscopes

    International Nuclear Information System (INIS)

    Barthel, J.; Thust, A.

    2013-01-01

    In the last two decades, the technique of high-resolution transmission electron microscopy has experienced unprecedented progress through the introduction of hardware aberration correctors and the improvement of the achievable resolution to the sub-Ångström level. The important aspect that aberration correction at a given resolution also requires a well-defined amount of optical stability has received little attention so far. We therefore investigate the ability of a variety of high-resolution electron microscopes to maintain an aberration-corrected optical state, expressed in terms of an optical lifetime. We develop a comprehensive statistical framework for the estimation of the optical lifetime and find remarkably low values between tens of seconds and a couple of minutes. Probability curves are introduced, which inform the operator about the chance of still working in the fully aberration-corrected state. - Highlights: • We investigate the temporal stability of optical aberrations in HRTEM. • We develop a statistical framework for the estimation of optical lifetimes. • We introduce plots showing the success probability for aberration-free work. • Optical lifetimes in sub-Ångström electron microscopy are surprisingly low. • The success of aberration correction depends strongly on the optical stability

  14. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    Science.gov (United States)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted from both industrial and domestic sources (i.e. small household furnaces). Small sources make a significant contribution to the total emission of mercury. Official and statistical analyses, including those prepared for international purposes [2], did not provide data on the spatial distribution of the mercury emitted to air; however, a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economic and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized at two independent levels: individual, bottom-up data derived from the national emission reporting system [5; 6], and top-down regional data calculated from official statistics [7]. The analysis presented will include a comparison of the spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation covers three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http
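
    Ordinary kriging of point emission estimates onto a prediction grid can be sketched as follows; the exponential variogram parameters and the emission values are illustrative, standing in for a model that would in practice be fitted to the data.

```python
import numpy as np
from scipy.spatial.distance import cdist

def exponential_gamma(h, nugget=0.1, sill=1.0, rng=20.0):
    """Exponential variogram model (illustrative parameters, not fitted values)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy_obs, z_obs, xy_new):
    """Ordinary kriging prediction and kriging variance at xy_new."""
    n = len(z_obs)
    # Kriging system in variogram form: [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1]
    G = np.ones((n + 1, n + 1))
    G[:n, :n] = exponential_gamma(cdist(xy_obs, xy_obs))
    np.fill_diagonal(G[:n, :n], 0.0)        # gamma(0) = 0 by definition
    G[-1, -1] = 0.0
    g0 = np.ones((n + 1, len(xy_new)))
    g0[:n, :] = exponential_gamma(cdist(xy_obs, xy_new))
    sol = np.linalg.solve(G, g0)
    weights, mu = sol[:n, :], sol[-1, :]
    z_hat = weights.T @ z_obs
    variance = np.einsum("ij,ij->j", weights, g0[:n, :]) + mu   # kriging variance
    return z_hat, variance

# Illustrative emission point estimates (arbitrary units) at reporting locations.
xy_obs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
z_obs = np.array([2.1, 3.4, 1.8, 2.9, 2.5])
xy_new = np.array([[2.0, 3.0], [8.0, 7.0]])
z_hat, krig_var = ordinary_kriging(xy_obs, z_obs, xy_new)
```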

  15. Geostatistical mapping of Cs-137 contamination depth in building structures by integrating ISOCS measurements of different spatial supports

    Energy Technology Data Exchange (ETDEWEB)

    Boden, S.; Jacques, D. [Institute for Environment, Health and Safety, Belgian Nuclear Research Centre (SCK-CEN), Boeretang 200, BE-2400, Mol (Belgium); Rogiers, B. [Dept. of Earth and Environmental Sciences, KU Leuven - University of Leuven Celestijnenlaan 200e - bus 2410, BE-3001, Leuven (Belgium)

    2013-07-01

    Reliable methods to determine the contamination depth in nuclear building structures are very much needed for minimizing the radioactive waste volume and the decontamination workload. This paper investigates the geostatistical integration of in situ gamma-ray spectroscopy measurements of different spatial supports. A case study is presented from the BR3 decommissioning project, yielding an estimated reduction of waste volume of ∼35%, and recommendations are made for future application of the proposed methodology. (authors)

  16. Novel high resolution tactile robotic fingertips

    DEFF Research Database (Denmark)

    Drimus, Alin; Jankovics, Vince; Gorsic, Matija

    2014-01-01

    This paper describes a novel robotic fingertip based on piezoresistive rubber that can sense pressure tactile stimuli with a high spatial resolution over curved surfaces. The working principle is based on a three-layer sandwich structure (conductive electrodes on top and bottom and piezoresistive...... with specialized data acquisition electronics that acquire 500 frames per second provides rich information regarding contact force, shape and angle for bio-inspired robotic fingertips. Furthermore, a model for estimating the contact force based on the values of the cells is proposed.

  17. Spatio-temporal interpolation of daily temperatures for global land areas at 1 km resolution

    NARCIS (Netherlands)

    Kilibarda, M.; Hengl, T.; Heuvelink, G.B.M.; Graler, B.; Pebesma, E.; Tadic, M.P.; Bajat, B.

    2014-01-01

    Combined Global Surface Summary of Day and European Climate Assessment and Dataset daily meteorological data sets (around 9000 stations) were used to build spatio-temporal geostatistical models and predict daily air temperature at a ground resolution of 1 km for the global land mass. Predictions in

  18. Multivariate analysis and geostatistics of the fertility of a humic rhodic hapludox under coffee cultivation

    Directory of Open Access Journals (Sweden)

    Samuel de Assis Silva

    2012-04-01

    Full Text Available The spatial variability of soil and plant properties exerts great influence on the yield of agricultural crops. This study analyzed the spatial variability of the fertility of a Humic Rhodic Hapludox under Arabica coffee, using principal component analysis, cluster analysis and geostatistics in combination. The experiment was carried out in an area under Coffea arabica L., variety Catucai 20/15 - 479. The soil was sampled at a depth of 0.20 m, at 50 points of a sampling grid. The following chemical properties were determined: P, K+, Ca2+, Mg2+, Na+, S, Al3+, pH, H + Al, SB, t, T, V, m, OM, Na saturation index (SSI), remaining phosphorus (P-rem), and micronutrients (Zn, Fe, Mn, Cu and B). The data were analyzed with descriptive statistics, followed by principal component and cluster analyses. Geostatistics was used to check and quantify the degree of spatial dependence of the properties represented by the principal components. The principal component analysis allowed a dimensional reduction of the problem, providing interpretable components with little information loss. Despite the information loss characteristic of principal component analysis, the combination of this technique with geostatistical analysis was efficient for quantifying and determining the structure of spatial dependence of soil fertility. In general, the availability of soil mineral nutrients was low, and the levels of acidity and exchangeable Al were high.

  19. Toward an estimation of daily European CO2 fluxes at high spatial resolution by inversion of atmospheric transport; Vers une estimation des flux de CO2 journaliers europeens a haute resolution par inversion du transport atmospherique

    Energy Technology Data Exchange (ETDEWEB)

    Carouge, C

    2006-04-15

    distance between pixels, climate and vegetation distribution over Europe. To study the potential of this method, we used synthetic data generated from forward simulations of LMDZt (driven by flux fields generated from the biosphere model ORCHIDEE). We found that the current network is not dense enough to constrain fluxes at model resolution. However, fluxes aggregated spatially over a region of 850 x 850 km in Western Europe and temporally over 8-10 days compare very well with the ORCHIDEE fluxes. Preliminary inversion results using real data indicate that synoptic variations of the estimated fluxes are in phase with the variations of the ORCHIDEE biosphere model flux and the variations observed in atmospheric concentrations. However, the quality of the flux estimates is highly dependent on transport model errors and, in particular, on how well small-scale transport is modelled. Moreover, fossil fuel emissions are prescribed in this inverse model and the quality of their distribution is shown to be crucial. Data selection also has a large impact on the estimated fluxes. The use of daytime-only data to calculate daily averaged concentrations greatly improves the estimated fluxes by reducing the bias inferred from model transport errors. (author)

  20. Estimation of red-light running frequency using high-resolution traffic and signal data.

    Science.gov (United States)

    Chen, Peng; Yu, Guizhen; Wu, Xinkai; Ren, Yilong; Li, Yueguang

    2017-05-01

    Red-light running (RLR) is a major cause of intersection-related crashes and a threat to intersection safety. To reduce RLR violations, it is critical to identify the influential factors associated with RLR and to estimate RLR frequency. Without resorting to video camera recordings, this study investigates this issue by utilizing high-resolution traffic and signal event data collected from loop detectors at five intersections on Trunk Highway 55, Minneapolis, MN. First, a simple method is proposed to identify RLR by fully utilizing the information obtained from stop bar detectors, downstream entrance detectors and advance detectors. Using 12 months of event data, a total of 6550 RLR cases were identified. Defining RLR frequency as the conditional probability of RLR under a certain traffic or signal condition (veh/1000 veh), the relationships between RLR frequency and influential factors including arrival time at the advance detector, approaching speed, headway, gap to the preceding vehicle on the adjacent lane, cycle length, geometric characteristics and even snowy weather were empirically investigated. Statistical analysis shows good agreement with traffic engineering practice; e.g., RLR is most likely to occur on weekdays during peak periods under large traffic demands and longer signal cycles, and 95.24% of RLR events occurred within the first 1.5 s after the onset of the red phase. The findings confirm that vehicles tend to run the red light when they are close to the intersection during the phase transition, and that vehicles following a leading vehicle at short headways are also likely to run the red light. Last, a simplified nonlinear regression model is proposed to estimate RLR frequency based on data from the advance detector. The study is expected to help better understand RLR occurrence and to contribute to the future improvement of intersection safety. Copyright © 2017 Elsevier Ltd. All rights reserved.
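
    The record's final step, a simplified nonlinear regression of RLR frequency on advance-detector variables, could look roughly like the sketch below. The exponential functional form, the variable names and the sample values are assumptions for illustration, not the paper's fitted model.

```python
# Illustrative sketch (not the paper's model): fit a simple nonlinear decay of
# red-light-running frequency against arrival time at the advance detector
# measured from red onset. Data arrays and the exponential form are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def rlr_model(t, a, b):
    """RLR frequency (violations per 1000 vehicles) decaying with arrival time t (s)."""
    return a * np.exp(-b * t)

t_arrival = np.array([0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 2.0, 3.0])   # s after red onset
rlr_freq = np.array([38.0, 31.0, 24.0, 18.0, 12.0, 8.0, 3.0, 1.0])  # veh/1000 veh

params, _ = curve_fit(rlr_model, t_arrival, rlr_freq, p0=(40.0, 1.0))
print("fitted a, b:", params)
```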

  1. Ore reserve estimation: a summary of principles and methods

    International Nuclear Information System (INIS)

    Marques, J.P.M.

    1985-01-01

    The mining industry has experienced substantial improvements with the increasing utilization of computerized and electronic devices throughout the last few years. In the ore reserve estimation field, the main methods have undergone recent advances in order to improve their overall efficiency. This paper presents the three main groups of ore reserve estimation methods presently used worldwide: Conventional, Statistical and Geostatistical, and provides a detailed description and comparative analysis of each. The Conventional Methods are the oldest, least complex and most widely employed ones. The Geostatistical Methods are the most recent, most precise and most complex ones. The Statistical Methods are intermediate to the others in complexity, diffusion and chronological order. (D.J.M.) [pt

  2. Studies of high resolution array processing algorithms for multibeam bathymetric applications

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Schenke, H.W.

    In this paper a study is initiated to assess the usefulness of directional spectral estimation techniques for underwater bathymetric applications. High-resolution techniques like the Maximum Likelihood (ML) method and the Maximum Entropy (ME...

  3. High Spatial Resolution Visual Band Imagery Outperforms Medium Resolution Spectral Imagery for Ecosystem Assessment in the Semi-Arid Brazilian Sertão

    Directory of Open Access Journals (Sweden)

    Ran Goldblatt

    2017-12-01

    Full Text Available Semi-arid ecosystems play a key role in global agricultural production, seasonal carbon cycle dynamics, and longer-run climate change. Because semi-arid landscapes are heterogeneous and often sparsely vegetated, repeated and large-scale ecosystem assessments of these regions have to date been impossible. Here, we assess the potential of high-spatial-resolution visible band imagery for semi-arid ecosystem mapping. We use WorldView satellite imagery at 0.3–0.5 m resolution to develop a reference data set of nearly 10,000 labeled examples of three classes (trees, shrubs/grasses, and bare land) across 1000 km² of the semi-arid Sertão region of northeast Brazil. Using Google Earth Engine, we show that classification with low-spectral but high-spatial resolution input (WorldView) outperforms classification with the full spectral information available from Landsat 30 m resolution imagery as input. Classification with high spatial resolution input improves detection of sparse vegetation and distinction between trees and seasonal shrubs and grasses, two features which are lost at coarser spatial (but higher spectral) resolution. Our total tree cover estimates for the study area disagree with recent estimates using other methods, which may underestimate tree cover because they confuse trees with seasonal vegetation (shrubs and grasses). This distinction is important for monitoring the seasonal and long-run carbon cycle and ecosystem health. Our results suggest that newer remote sensing products that promise high-frequency global coverage at high spatial but lower spectral resolution may offer new possibilities for direct monitoring of the world's semi-arid ecosystems, and we provide methods that could be scaled to do so.

  4. Local Geostatistical Models and Big Data in Hydrological and Ecological Applications

    Science.gov (United States)

    Hristopulos, Dionissios

    2015-04-01

    The advent of the big data era creates new opportunities for environmental and ecological modelling but also presents significant challenges. The availability of remote sensing images and low-cost wireless sensor networks means that spatiotemporal environmental data now cover larger spatial domains at higher spatial and temporal resolution and for longer time windows. Handling such voluminous data presents several technical and scientific challenges. In particular, the geostatistical methods used to process spatiotemporal data need to overcome the dimensionality curse associated with the need to store and invert large covariance matrices. There are various mathematical approaches for addressing the dimensionality problem, including change of basis, dimensionality reduction, hierarchical schemes, and local approximations. We present a Stochastic Local Interaction (SLI) model that can be used to model local correlations in spatial data. SLI is a random field model suitable for data on discrete supports (i.e., regular lattices or irregular sampling grids). The degree of localization is determined by means of kernel functions and appropriate bandwidths, and the strength of the correlations is determined by means of coefficients. In the "plain vanilla" version the parameter set involves scale and rigidity coefficients as well as a characteristic length; the latter, in connection with the rigidity coefficient, determines the correlation length of the random field. The SLI model is based on statistical field theory and extends previous research on Spartan spatial random fields [2,3] from continuum spaces to explicitly discrete supports. The SLI kernel functions employ adaptive bandwidths learned from the sampling spatial distribution [1]. The SLI precision matrix is expressed explicitly in terms of the model parameters and the kernel function. Hence, covariance matrix inversion is not necessary for parameter inference, which is based on leave-one-out cross validation. This property
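
    The following is a generic sketch of the idea of a kernel-based local-interaction precision matrix with adaptive bandwidths; it is not the exact SLI parameterization, and the coefficient names (lambda_, eta) and the neighbour count are illustrative assumptions.

```python
# Generic sketch of a local-interaction precision matrix built from a
# compactly supported kernel over k-nearest neighbours. The parameterization
# (lambda_, eta, bandwidths) is illustrative and not the exact SLI formulation.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix

def local_precision(coords, k=8, lambda_=1.0, eta=0.5):
    n = len(coords)
    tree = cKDTree(coords)
    dist, idx = tree.query(coords, k=k + 1)           # first neighbour is the point itself
    bw = dist[:, -1]                                   # adaptive bandwidth per point
    W = lil_matrix((n, n))
    for i in range(n):
        for d, j in zip(dist[i, 1:], idx[i, 1:]):
            h = 0.5 * (bw[i] + bw[j])
            w = max(0.0, 1.0 - (d / h) ** 2)           # compactly supported kernel weight
            W[i, j] = max(W[i, j], w)
            W[j, i] = W[i, j]                          # keep the weights symmetric
    D = np.asarray(W.sum(axis=1)).ravel()
    L = np.diag(D) - W.toarray()                       # graph-Laplacian-like interaction term
    return lambda_ * (np.eye(n) + eta * L)             # positive-definite precision matrix

coords = np.random.rand(200, 2)                        # placeholder sampling locations
Q = local_precision(coords)                            # no covariance inversion required
```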

  5. Study of the permeability up-scaling by direct filtering of geostatistical model; Etude du changement d'echelle des permeabilites par filtrage direct du modele geostatistique

    Energy Technology Data Exchange (ETDEWEB)

    Zargar, G

    2005-10-15

    In this thesis, we present a new approach, which consists in directly up-scaling the geostatistical permeability distribution rather than the individual realizations. Practically, filtering techniques based on the FFT (Fast Fourier Transform) allow us to generate geostatistical images which sample the up-scaled distributions. In the log-normal case, a hydraulic equivalence criterion is proposed, allowing the geometric mean of the permeabilities to be re-estimated. In the anisotropic case, the effective geometric mean becomes a tensor which depends on the level of filtering used, and it can be calculated by a renormalisation method. The method was then generalized to the categorical model. Numerical tests were set up for isotropic, anisotropic and categorical models, and show good agreement with theory. (author)
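
    A minimal sketch of the underlying idea, under assumed grid size, correlation length and cutoff, is given below: a log-normal permeability field is generated spectrally and then low-pass filtered in Fourier space to mimic up-scaling, after which the geometric means and log-variances can be compared.

```python
# Illustrative sketch of the idea: generate a log-normal permeability field
# spectrally (FFT) and low-pass filter it to mimic up-scaling, then compare
# geometric means. Grid size, correlation length and cutoff are assumptions.
import numpy as np

n, corr_len, cutoff = 128, 10.0, 0.1              # grid, correlation length, filter cutoff
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k2 = kx**2 + ky**2

# Gaussian-correlated random field via spectral (FFT) simulation
spectrum = np.exp(-0.5 * (2 * np.pi * corr_len) ** 2 * k2)
noise = np.fft.fft2(np.random.randn(n, n))
log_k = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
log_k = (log_k - log_k.mean()) / log_k.std()      # standardized log-permeability

# Direct filtering of the (log-)field: keep only low wavenumbers
mask = np.sqrt(k2) <= cutoff
log_k_up = np.real(np.fft.ifft2(np.fft.fft2(log_k) * mask))

print("geometric means:", np.exp(log_k.mean()), np.exp(log_k_up.mean()))
print("log-k variance before / after filtering:", log_k.var(), log_k_up.var())
```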

  6. Do Red Edge and Texture Attributes from High-Resolution Satellite Data Improve Wood Volume Estimation in a Semi-Arid Mountainous Region?

    DEFF Research Database (Denmark)

    Schumacher, Paul; Mislimshoeva, Bunafsha; Brenning, Alexander

    2016-01-01

    to overcome this issue. However, clear recommendations on the suitability of specific proxies to provide accurate biomass information in semi-arid to arid environments are still lacking. This study contributes to the understanding of using multispectral high-resolution satellite data (RapidEye), specifically...... red edge and texture attributes, to estimate wood volume in semi-arid ecosystems characterized by scarce vegetation. LASSO (Least Absolute Shrinkage and Selection Operator) and random forest were used as predictive models relating in situ-measured aboveground standing wood volume to satellite data...

  7. Performance of a high resolution cavity beam position monitor system

    Science.gov (United States)

    Walston, Sean; Boogert, Stewart; Chung, Carl; Fitsos, Pete; Frisch, Joe; Gronberg, Jeff; Hayano, Hitoshi; Honda, Yosuke; Kolomensky, Yury; Lyapin, Alexey; Malton, Stephen; May, Justin; McCormick, Douglas; Meller, Robert; Miller, David; Orimoto, Toyoko; Ross, Marc; Slater, Mark; Smith, Steve; Smith, Tonee; Terunuma, Nobuhiro; Thomson, Mark; Urakawa, Junji; Vogel, Vladimir; Ward, David; White, Glen

    2007-07-01

    It has been estimated that an RF cavity Beam Position Monitor (BPM) could provide a position measurement resolution of less than 1 nm. We have developed a high resolution cavity BPM and associated electronics. A triplet comprised of these BPMs was installed in the extraction line of the Accelerator Test Facility (ATF) at the High Energy Accelerator Research Organization (KEK) for testing with its ultra-low emittance beam. The three BPMs were each rigidly mounted inside an alignment frame on six variable-length struts which could be used to move the BPMs in position and angle. We have developed novel methods for extracting the position and tilt information from the BPM signals including a robust calibration algorithm which is immune to beam jitter. To date, we have demonstrated a position resolution of 15.6 nm and a tilt resolution of 2.1 μrad over a dynamic range of approximately ±20 μm.

  8. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik

    2010-01-01

    (TPROGS) of alternating geological facies. The second method, multiple-point statistics, uses training images to estimate the conditional probability of sand-lenses at a certain location. Both methods respect field observations such as local stratigraphy, however, only the multiple-point statistics can...... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...

  9. Simulation of high-resolution MFM tip using exchange-spring magnet

    Energy Technology Data Exchange (ETDEWEB)

    Saito, H. [Faculty of Resource Science and Engineering, Akita University, Akita 010-8502 (Japan)]. E-mail: hsaito@ipc.akita-u.ac.jp; Yatsuyanagi, D. [Faculty of Resource Science and Engineering, Akita University, Akita 010-8502 (Japan); Ishio, S. [Faculty of Resource Science and Engineering, Akita University, Akita 010-8502 (Japan); Ito, A. [Nitto Optical Co. Ltd., Misato, Akita 019-1403 (Japan); Kawamura, H. [Nitto Optical Co. Ltd., Misato, Akita 019-1403 (Japan); Ise, K. [Research Institute of Advanced Technology Akita, Akita 010-1623 (Japan); Taguchi, K. [Research Institute of Advanced Technology Akita, Akita 010-1623 (Japan); Takahashi, S. [Research Institute of Advanced Technology Akita, Akita 010-1623 (Japan)

    2007-03-15

    The transfer function of magnetic force microscope (MFM) tips using an exchange-spring trilayer composed of a centered soft magnetic layer and two hard magnetic layers was calculated, and the resolution was estimated by considering the thermodynamic noise limit of an MFM cantilever. It was found that reducing the thickness of the centered soft magnetic layer and the magnetization of the hard magnetic layers is important to obtain high resolution. Tips using an exchange-spring trilayer with a very thin FeCo layer and isotropic hard magnetic layers, such as CoPt and FePt, are found to be suitable for obtaining a resolution of less than 10 nm at room temperature.

  10. An ML-Based Radial Velocity Estimation Algorithm for Moving Targets in Spaceborne High-Resolution and Wide-Swath SAR Systems

    Directory of Open Access Journals (Sweden)

    Tingting Jin

    2017-04-01

    Full Text Available Multichannel synthetic aperture radar (SAR) is a significant breakthrough that overcomes the inherent trade-off between high resolution and wide swath (HRWS) in conventional SAR. Moving target indication (MTI) is an important application of spaceborne HRWS SAR systems. In contrast to previous studies of SAR MTI, HRWS SAR mainly faces the problem of under-sampled data in each channel, which makes single-channel imaging and processing infeasible. In this study, the estimation of velocity is shown to be equivalent to the estimation of the cone angle according to their relationship. A maximum likelihood (ML) based algorithm is proposed to estimate the radial velocity in the presence of Doppler ambiguities. After that, the signal reconstruction and the compensation for the phase offset caused by the radial velocity are performed for a moving target. Finally, a traditional imaging algorithm is applied to obtain a focused moving target image. Experiments are conducted to evaluate the accuracy and effectiveness of the estimator under different signal-to-noise ratios (SNR). Furthermore, the performance is analyzed with respect to a moving ship that experiences interference from different distributions of sea clutter. The results verify that the proposed algorithm is accurate and efficient with low computational complexity. This paper aims at providing a solution to the velocity estimation problem in future HRWS SAR systems with multiple receive channels.

  11. 4th European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Carrera, Jesus; Gómez-Hernández, José

    2004-01-01

    The fourth edition of the European Conference on Geostatistics for Environmental Applications (geoENV IV) took place in Barcelona, November 27-29, 2002. As proof that there is an increasing interest in environmental issues in the geostatistical community, the conference attracted over 100 participants, mostly Europeans (up to 10 European countries were represented), but also from other countries in the world. Only 46 contributions, selected out of around 100 submitted papers, were invited to be presented orally during the conference. Additionally, 30 authors were invited to present their work in poster format during a special session. All oral and poster contributors were invited to submit their work to be considered for publication in this Kluwer series. All papers underwent a reviewing process, which consisted of two reviewers for oral presentations and one reviewer for posters. The book opens with one keynote paper by Philippe Naveau. It is followed by 40 papers that correspond to those presented orally d...

  12. Robust Hydrological Forecasting for High-resolution Distributed Models Using a Unified Data Assimilation Approach

    Science.gov (United States)

    Hernandez, F.; Liang, X.

    2017-12-01

    Reliable real-time hydrological forecasting, to predict important phenomena such as floods, is invaluable to the society. However, modern high-resolution distributed models have faced challenges when dealing with uncertainties that are caused by the large number of parameters and initial state estimations involved. Therefore, to rely on these high-resolution models for critical real-time forecast applications, considerable improvements on the parameter and initial state estimation techniques must be made. In this work we present a unified data assimilation algorithm called Optimized PareTo Inverse Modeling through Inverse STochastic Search (OPTIMISTS) to deal with the challenge of having robust flood forecasting for high-resolution distributed models. This new algorithm combines the advantages of particle filters and variational methods in a unique way to overcome their individual weaknesses. The analysis of candidate particles compares model results with observations in a flexible time frame, and a multi-objective approach is proposed which attempts to simultaneously minimize differences with the observations and departures from the background states by using both Bayesian sampling and non-convex evolutionary optimization. Moreover, the resulting Pareto front is given a probabilistic interpretation through kernel density estimation to create a non-Gaussian distribution of the states. OPTIMISTS was tested on a low-resolution distributed land surface model using VIC (Variable Infiltration Capacity) and on a high-resolution distributed hydrological model using the DHSVM (Distributed Hydrology Soil Vegetation Model). In the tests streamflow observations are assimilated. OPTIMISTS was also compared with a traditional particle filter and a variational method. Results show that our method can reliably produce adequate forecasts and that it is able to outperform those resulting from assimilating the observations using a particle filter or an evolutionary 4D variational

  13. Estimating Gross Primary Production in Cropland with High Spatial and Temporal Scale Remote Sensing Data

    Science.gov (United States)

    Lin, S.; Li, J.; Liu, Q.

    2018-04-01

    Satellite remote sensing data provide spatially continuous and temporally repetitive observations of land surfaces, and they have become increasingly important for monitoring vegetation photosynthetic dynamics over large regions. But remote sensing data have their limitations in spatial and temporal scale: for example, higher spatial resolution data such as Landsat data have 30-m spatial resolution but a 16-day revisit period, while high temporal resolution data such as geostationary data have a 30-minute imaging period but lower spatial resolution (> 1 km). The objective of this study is to investigate whether combining high spatial and temporal resolution remote sensing data can improve the gross primary production (GPP) estimation accuracy in cropland. For this analysis we used three years (from 2010 to 2012) of Landsat-based NDVI data, the MOD13 vegetation index product and Geostationary Operational Environmental Satellite (GOES) geostationary data as input parameters to estimate GPP in a small cropland region of Nebraska, US. We then validated the remote sensing based GPP with in-situ carbon flux measurements. Results showed that: 1) the overall correlation between the GOES visible band and in-situ measured photosynthetically active radiation (PAR) is about 50 % (R2 = 0.52), and the European Centre for Medium-Range Weather Forecasts ERA-Interim reanalysis data can explain 64 % of the PAR variance (R2 = 0.64); 2) estimating GPP with Landsat 30-m spatial resolution data and ERA daily meteorology data has the highest accuracy (R2 = 0.85, RMSE < 3 gC/m2/day), performing better than using the MODIS 1-km NDVI/EVI product as input; 3) using daily meteorology data as input for GPP estimation with high spatial resolution data is more relevant than 8-day and 16-day input. Generally speaking, using high spatial resolution and high frequency satellite based remote sensing data can improve GPP estimation accuracy in cropland.
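
    For orientation only, the sketch below shows a schematic light-use-efficiency style GPP calculation driven by a vegetation index and PAR; the epsilon_max value, the fPAR-NDVI relation, the temperature scalar and all data values are placeholders rather than the model used in this record.

```python
# Schematic light-use-efficiency style GPP calculation from a vegetation index
# and PAR, illustrating how high-frequency PAR (e.g. geostationary-derived or
# reanalysis) can be combined with high spatial resolution NDVI. The eps_max
# value, fPAR relation and data arrays are placeholders, not the study's model.
import numpy as np

eps_max = 3.0                               # gC per MJ APAR, assumed maximum efficiency
ndvi = np.array([0.35, 0.55, 0.72, 0.80])   # Landsat-scale NDVI for four dates
par = np.array([8.1, 9.4, 10.2, 9.8])       # daily PAR (MJ m-2 day-1), e.g. from GOES/ERA
t_scalar = np.array([0.7, 0.9, 1.0, 0.95])  # temperature down-regulation scalar (0-1)

fpar = np.clip(1.24 * ndvi - 0.168, 0.0, 1.0)   # simple linear fPAR-NDVI relation (assumed)
gpp = eps_max * t_scalar * fpar * par            # gC m-2 day-1
print(gpp)
```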

  15. A New Hybrid Spatio-temporal Model for Estimating Daily Multi-year PM2.5 Concentrations Across Northeastern USA Using High Resolution Aerosol Optical Depth Data

    Science.gov (United States)

    Kloog, Itai; Chudnovsky, Alexandra A.; Just, Allan C.; Nordio, Francesco; Koutrakis, Petros; Coull, Brent A.; Lyapustin, Alexei; Wang, Yujie; Schwartz, Joel

    2014-01-01

    The use of satellite-based aerosol optical depth (AOD) to estimate fine particulate matter (PM2.5) for epidemiology studies has increased substantially over the past few years. These recent studies often report moderate predictive power, which can generate downward bias in effect estimates. In addition, AOD measurements have only moderate spatial resolution and have substantial missing data. We make use of recent advances in MODIS satellite data processing algorithms (Multi-Angle Implementation of Atmospheric Correction, MAIAC), which allow us to use 1 km (versus the currently available 10 km) resolution AOD data. We developed and cross-validated models to predict daily PM2.5 at a 1 x 1 km resolution across the northeastern USA (New England, New York and New Jersey) for the years 2003-2011, allowing us to better differentiate daily and long-term exposure between urban, suburban, and rural areas. Additionally, we developed an approach that allows us to generate daily high-resolution 200 m localized predictions representing deviations from the area 1 x 1 km grid predictions. We used mixed models regressing PM2.5 measurements against day-specific random intercepts, and fixed and random AOD and temperature slopes. We then used generalized additive mixed models with spatial smoothing to generate grid cell predictions when AOD was missing. Finally, to get 200 m localized predictions, we regressed the residuals from the final model for each monitor against the local spatial and temporal variables at each monitoring site. Our model performance was excellent (mean out-of-sample R2 = 0.88). The spatial and temporal components of the out-of-sample results also presented very good fits to the withheld data (R2 = 0.87, R2 = 0.87). In addition, our results revealed very little bias in the predicted concentrations (slope of predictions versus withheld observations = 0.99). Our daily model results show high predictive accuracy at high spatial resolutions
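
    A minimal sketch of the first-stage idea, a mixed model with day-specific random intercepts and AOD slopes, is given below using statsmodels; the column names and synthetic data are illustrative, and the published model includes additional terms (temperature slopes, spatial smoothing, and the 200 m local stage).

```python
# Hedged sketch of the core idea: a mixed model regressing PM2.5 on AOD with
# day-specific random intercepts and AOD slopes. Column names and the example
# DataFrame are illustrative placeholders, not the study's data or full model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "day": rng.integers(0, 30, n).astype(str),   # calendar-day identifier (grouping factor)
    "aod": rng.uniform(0.05, 1.0, n),            # 1 km MAIAC AOD (placeholder values)
    "temp": rng.uniform(-5, 30, n),              # air temperature covariate
})
df["pm25"] = 10 + 40 * df["aod"] + 0.2 * df["temp"] + rng.normal(0, 5, n)

# Random intercept and random AOD slope by day
model = smf.mixedlm("pm25 ~ aod + temp", df, groups="day", re_formula="~aod")
result = model.fit()
print(result.params[["aod", "temp"]])
```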

  16. Added-values of high spatiotemporal remote sensing data in crop yield estimation

    Science.gov (United States)

    Gao, F.; Anderson, M. C.

    2017-12-01

    Timely and accurate estimation of crop yield before harvest is critical for food markets and administrative planning. Remote sensing derived parameters have been used for estimating crop yield with either empirical or crop growth models. The use of remote sensing vegetation indices (VI) in crop yield modeling has typically been evaluated at regional and country scales using coarse spatial resolution data (a few hundred meters to kilometers), or assessed over small regions at the field level using moderate spatial resolution data (10-100 m). Both data sources have shown great potential in capturing spatial and temporal variability in crop yield. However, the added value of data with both high spatial and high temporal resolution has not been evaluated, due to the lack of such a data source with routine, global coverage. In recent years, more moderate resolution data have become freely available, and data fusion approaches that combine data acquired at different spatial and temporal resolutions have been developed. These make monitoring crop condition and estimating crop yield at the field scale possible. Here we investigate the added value of high spatial and temporal resolution VI for describing variability in crop yield. The ability to explain crop yield variability with high spatial and temporal resolution remote sensing data was evaluated in a rain-fed agricultural area in the U.S. Corn Belt. Results show that the fused Landsat-MODIS (high spatial and temporal resolution) VI explains yield variability better than either single data source (Landsat or MODIS alone), with EVI2 performing slightly better than NDVI. The maximum VI describes yield variability better than the cumulative VI. Even though VI is effective in explaining yield variability within a season, the inter-annual variability is more complex and needs additional information (e.g. weather, water use and management). Our findings underscore the importance of high spatiotemporal remote sensing data and support new moderate

  17. Indoor radon variations in central Iran and its geostatistical map

    Science.gov (United States)

    Hadad, Kamal; Mokhtari, Javad

    2015-02-01

    We present the results of a 2-year indoor radon survey in 10 cities of Yazd province in Central Iran (covering an area of 80,000 km2). We used passive diffusive samplers with LATEX polycarbonate films as Solid State Nuclear Track Detectors (SSNTD). The study was carried out in central Iran, where there are major mineral and uranium mines. Our results indicate that, despite a few extraordinarily high concentrations, average annual concentrations of indoor radon are within ICRP guidelines. When the geostatistical spatial distribution of radon was mapped onto the geographical features of the province, it was observed that the risk of high radon concentration increases near the cities of Saqand, Bafq, Harat and Abarkooh, depending on the elevation and the vicinity of ores and mines.

  18. Geostatistical analysis of prevailing groundwater conditions and potential solute migration at Elstow, Bedfordshire

    International Nuclear Information System (INIS)

    MacKay, R.; Cooper, T.A.; Porter, J.D.; O'Connell, P.E.; Metcalfe, A.V.

    1988-06-01

    A geostatistical approach is applied in a study of the potential migration of contaminants from a hypothetical waste disposal facility near Elstow, Bedfordshire. A deterministic numerical model of groundwater flow in the Kellaways Sands formation and adjacent layers is coupled with geostatistical simulation of the heterogeneous transmissivity field of this principal formation. A particle tracking technique is used to predict the migration pathways for alternative realisations of flow. Alternative statistical descriptions of the spatial structure of the transmissivity field are implemented and the temporal and spatial distributions of escape of contaminants to the biosphere are investigated. (author)

  19. Three-dimensional imaging of aquifer and aquitard heterogeneity via transient hydraulic tomography at a highly heterogeneous field site

    Science.gov (United States)

    Zhao, Zhanfeng; Illman, Walter A.

    2018-04-01

    Previous studies have shown that geostatistics-based transient hydraulic tomography (THT) is robust for subsurface heterogeneity characterization through the joint inverse modeling of multiple pumping tests. However, the hydraulic conductivity (K) and specific storage (Ss) estimates can be smooth or even erroneous for areas where pumping/observation densities are low. This renders the imaging of interlayer and intralayer heterogeneity of highly contrasting materials including their unit boundaries difficult. In this study, we further test the performance of THT by utilizing existing and newly collected pumping test data of longer durations that showed drawdown responses in both aquifer and aquitard units at a field site underlain by a highly heterogeneous glaciofluvial deposit. The robust performance of the THT is highlighted through the comparison of different degrees of model parameterization including: (1) the effective parameter approach; (2) the geological zonation approach relying on borehole logs; and (3) the geostatistical inversion approach considering different prior information (with/without geological data). Results reveal that the simultaneous analysis of eight pumping tests with the geostatistical inverse model yields the best results in terms of model calibration and validation. We also find that the joint interpretation of long-term drawdown data from aquifer and aquitard units is necessary in mapping their full heterogeneous patterns including intralayer variabilities. Moreover, as geological data are included as prior information in the geostatistics-based THT analysis, the estimated K values increasingly reflect the vertical distribution patterns of permeameter-estimated K in both aquifer and aquitard units. Finally, the comparison of various THT approaches reveals that differences in the estimated K and Ss tomograms result in significantly different transient drawdown predictions at observation ports.

  20. Near-field electromagnetic holography for high-resolution analysis of network interactions in neuronal tissue.

    Science.gov (United States)

    Kjeldsen, Henrik D; Kaiser, Marcus; Whittington, Miles A

    2015-09-30

    Brain function is dependent upon the concerted, dynamical interactions between a great many neurons distributed over many cortical subregions. Current methods of quantifying such interactions are limited by consideration only of single direct or indirect measures of a subsample of all neuronal population activity. Here we present a new derivation of the electromagnetic analogy to near-field acoustic holography allowing high-resolution, vectored estimates of interactions between sources of electromagnetic activity that significantly improves this situation. In vitro voltage potential recordings were used to estimate pseudo-electromagnetic energy flow vector fields, current and energy source densities and energy dissipation in reconstruction planes at depth into the neural tissue parallel to the recording plane of the microelectrode array. The properties of the reconstructed near-field estimate allowed both the utilization of super-resolution techniques to increase the imaging resolution beyond that of the microelectrode array, and facilitated a novel approach to estimating causal relationships between activity in neocortical subregions. The holographic nature of the reconstruction method allowed significantly better estimation of the fine spatiotemporal detail of neuronal population activity, compared with interpolation alone, beyond the spatial resolution of the electrode arrays used. Pseudo-energy flow vector mapping was possible with high temporal precision, allowing a near-realtime estimate of causal interaction dynamics. Basic near-field electromagnetic holography provides a powerful means to increase spatial resolution from electrode array data with careful choice of spatial filters and distance to reconstruction plane. More detailed approaches may provide the ability to volumetrically reconstruct activity patterns on neuronal tissue, but the ability to extract vectored data with the method presented already permits the study of dynamic causal interactions

  1. Absorbed dose estimates to structures of the brain and head using a high-resolution voxel-based head phantom

    International Nuclear Information System (INIS)

    Evans, Jeffrey F.; Blue, Thomas E.; Gupta, Nilendu

    2001-01-01

    The purpose of this article is to demonstrate the viability of using a high-resolution 3-D head phantom in Monte Carlo N-Particle (MCNP) for boron neutron capture therapy (BNCT) structure dosimetry. This work describes a high-resolution voxel-based model of a human head and its use for calculating absorbed doses to the structures of the brain. The Zubal head phantom is a 3-D model of a human head that can be displayed and manipulated on a computer. Several changes were made to the original head phantom, which now contains over 29 critical structures of the brain and head. The modified phantom is an 85 x 109 x 120 lattice of voxels, where each voxel is 2.2 x 2.2 x 1.4 mm3. This model was translated into MCNP lattice format. As a proof-of-principle study, two MCNP absorbed dose calculations were made (left and right lateral irradiations) using a uniformly distributed neutron disk source with a 1/E energy spectrum. Additionally, the results of these two calculations were combined to estimate the absorbed doses from a bilateral irradiation. Radiobiologically equivalent (RBE) doses were calculated for all structures and were normalized to 12.8 Gy-Eq. For a left lateral irradiation, the left motor cortex receives the limiting RBE dose. For a bilateral irradiation, the insula cortices receive the limiting dose. Among the nonencephalic structures, the parotid glands receive RBE doses that are within 15% of the limiting dose

  2. Full-Coverage High-Resolution Daily PM2.5 Estimation using MAIAC AOD in the Yangtze River Delta of China

    Science.gov (United States)

    Xiao, Qingyang; Wang, Yujie; Chang, Howard H.; Meng, Xia; Geng, Guannan; Lyapustin, Alexei Ivanovich; Liu, Yang

    2017-01-01

    Satellite aerosol optical depth (AOD) has been used to assess population exposure to fine particulate matter (PM2.5). The emerging high-resolution satellite aerosol product, Multi-Angle Implementation of Atmospheric Correction (MAIAC), provides a valuable opportunity to characterize local-scale PM2.5 at 1-km resolution. However, non-random missing AOD due to cloud or snow cover or high surface reflectance makes this task challenging. Previous studies filled the data gap by spatially interpolating neighboring PM2.5 measurements or predictions. This strategy ignores the effect of cloud cover on aerosol loadings and has been shown to exhibit poor performance when monitoring stations are sparse or when there is seasonal large-scale missingness. Using the Yangtze River Delta of China as an example, we present a Multiple Imputation (MI) method that combines the MAIAC high-resolution satellite retrievals with chemical transport model (CTM) simulations to fill missing AOD. A two-stage statistical model driven by gap-filled AOD, meteorology and land use information was then fitted to estimate daily ground PM2.5 concentrations in 2013 and 2014 at 1 km resolution with complete coverage in space and time. The daily MI models have an average R2 of 0.77, with an inter-quartile range of 0.71 to 0.82 across days. The overall MI model 10-fold cross-validation R2 (root mean square error) was 0.81 (25 µg/m3) and 0.73 (18 µg/m3) for the years 2013 and 2014, respectively. Predictions with only observational AOD or only imputed AOD showed similar accuracy. Compared with previous gap-filling methods, the MI method presented in this study performed better, with higher coverage, higher accuracy, and the ability to fill missing PM2.5 predictions without ground PM2.5 measurements. This method can provide reliable PM2.5 predictions with complete coverage that can reduce bias in exposure assessment in air pollution and health studies.
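
    The gap-filling idea can be sketched as below: for each day, relate observed MAIAC AOD to CTM-simulated AOD over cells where both exist, then predict AOD where the retrieval is missing. A single deterministic regression is shown for brevity; a full multiple-imputation scheme would repeat the step with sampled coefficients. All arrays are synthetic placeholders.

```python
# Hedged sketch of the gap-filling idea: regress observed MAIAC AOD on
# CTM-simulated AOD where both exist, then predict AOD in the satellite gaps.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_cells = 1000
ctm_aod = rng.gamma(2.0, 0.2, n_cells)                    # CTM-simulated AOD (placeholder)
maiac_aod = 0.8 * ctm_aod + rng.normal(0, 0.05, n_cells)  # synthetic "observed" AOD
missing = rng.random(n_cells) < 0.4                       # ~40% gaps for illustration
maiac_aod[missing] = np.nan

obs = ~np.isnan(maiac_aod)
reg = LinearRegression().fit(ctm_aod[obs].reshape(-1, 1), maiac_aod[obs])
filled = maiac_aod.copy()
filled[missing] = reg.predict(ctm_aod[missing].reshape(-1, 1))  # gap-filled AOD field
```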

  3. High resolution solar observations

    International Nuclear Information System (INIS)

    Title, A.

    1985-01-01

    Currently there is a world-wide effort to develop optical technology required for large diffraction limited telescopes that must operate with high optical fluxes. These developments can be used to significantly improve high resolution solar telescopes both on the ground and in space. When looking at the problem of high resolution observations it is essential to keep in mind that a diffraction limited telescope is an interferometer. Even a 30 cm aperture telescope, which is small for high resolution observations, is a big interferometer. Meter class and above diffraction limited telescopes can be expected to be very unforgiving of inattention to details. Unfortunately, even when an earth based telescope has perfect optics there are still problems with the quality of its optical path. The optical path includes not only the interior of the telescope, but also the immediate interface between the telescope and the atmosphere, and finally the atmosphere itself

  4. High speed, High resolution terahertz spectrometers

    International Nuclear Information System (INIS)

    Kim, Youngchan; Yee, Dae Su; Yi, Miwoo; Ahn, Jaewook

    2008-01-01

    A variety of sources and methods have been developed for terahertz spectroscopy during almost two decades. Terahertz time domain spectroscopy (THz TDS) has attracted particular attention as a basic measurement method in the fields of THz science and technology. Recently, asynchronous optical sampling (AOS) THz TDS has been demonstrated, featuring rapid data acquisition and a high spectral resolution. Also, terahertz frequency comb spectroscopy (TFCS) possesses attractive features for high precision terahertz spectroscopy. In this presentation, we report on these two types of terahertz spectrometer. Our high speed, high resolution terahertz spectrometer is demonstrated using two mode-locked femtosecond lasers with slightly different repetition frequencies, without a mechanical delay stage. The repetition frequencies of the two femtosecond lasers are stabilized by use of two phase-locked loops sharing the same reference oscillator. The time resolution of our terahertz spectrometer is measured using the cross correlation method to be 270 fs. AOS THz TDS is presented in Fig. 1, which shows a time domain waveform rapidly acquired on a 10 ns time window. The inset shows a zoom into the signal with a 100 ps time window. The spectrum obtained by the fast Fourier Transformation (FFT) of the time domain waveform has a frequency resolution of 100 MHz. The dependence of the signal-to-noise ratio (SNR) on the measurement time is also investigated

  5. Applicability of geostatistical methods and optimization of data for assessing hydraulic and geological conditions as a basis for remediation measures in the Ronneburg ore mining district

    International Nuclear Information System (INIS)

    Post, C.

    2001-01-01

    The remediation of the former Wismut mines in Thuringia has been planned and prepared since 1990. The objects of remediation are mines, tailing ponds and waste rock piles. Since more than 40 years of mining have had a great effect on the exploited aquifer, special emphasis is given to groundwater recharge, so that mine flooding is one of the conceivable remedial options. Controlled flooding helps to minimize the expanded oxidation zone, which holds an immense pollutant potential, while at the same time the flooding reduces the quantity of acid mine water that has to be treated. One of the main tasks of modelling the flooding progress is to determine and prognosticate the water outlet locations. Due to the inadequacy of the database from the production period, the limited accuracy of the available data and the inherent uncertainty of approximations used in numerical modelling, a stochastic approach is pursued. The flooding predictions, i.e. modelling of the hydrodynamical and hydrochemical conditions during and after completion of flooding, predominantly depend on the spatial distribution of the hydraulic conductivity. In order to get a better understanding of the spatial heterogeneity of the Palaeozoic fractured rock aquifer, selected geostatistical interpolation methods are tested to find the best approach for describing the hydrogeological parameters in space. This work deals in detail with two selected geostatistical interpolation methods (ordinary and indicator kriging) and discusses their applicability and limitations, including their application to the presented case. Another important target is the specification of the database and the improvement of consistency with statistical standards. The main emphasis lies on the spatial distribution of the measured hydraulic conductivity coefficient, its estimation at non-measured places and the influence of its spatial variability on modelling results. This topic is followed by the calculation of the estimation
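
    As a reference point for the first of the two methods discussed, a minimal ordinary kriging sketch with an assumed exponential variogram is given below; the sill, range and sample data are placeholders and do not represent the Ronneburg site.

```python
# Minimal ordinary kriging sketch for log hydraulic conductivity with an
# assumed exponential variogram; sill, range and the sample data are
# illustrative placeholders, not the Ronneburg site parameters.
import numpy as np

def exp_variogram(h, sill=1.0, rng_=50.0, nugget=0.0):
    return nugget + sill * (1.0 - np.exp(-h / rng_))

def ordinary_kriging(xy, z, x0, **vparams):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vparams)
    A[n, n] = 0.0                                     # Lagrange-multiplier block
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - x0, axis=1), **vparams)
    w = np.linalg.solve(A, b)                         # kriging weights + multiplier
    est = w[:n] @ z                                   # kriged estimate at x0
    var = b @ w                                       # kriging variance (incl. multiplier)
    return est, var

xy = np.random.rand(30, 2) * 1000.0                   # measurement locations (m), placeholder
logk = np.random.randn(30) - 6.0                      # log10 hydraulic conductivity, placeholder
est, var = ordinary_kriging(xy, logk, np.array([500.0, 500.0]), sill=0.8, rng_=200.0)
```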

  6. First measurements with new high-resolution gadolinium-GEM neutron detectors

    CERN Document Server

    Pfeiffer, Dorothea; Birch, Jens; Etxegarai, Maddi; Hall-Wilton, Richard; Höglund, Carina; Hultman, Lars; Llamas-Jansa, Isabel; Oliveri, Eraldo; Oksanen, Esko; Robinson, Linda; Ropelewski, Leszek; Schmidt, Susann; Streli, Christina; Thuiner, Patrik

    2016-05-17

    European Spallation Source instruments like the macromolecular diffractometer, NMX, require an excellent neutron detection efficiency, high-rate capabilities, time resolution, and an unprecedented spatial resolution in the order of a few hundred micrometers over a wide angular range of the incoming neutrons. For these instruments solid converters in combination with Micro Pattern Gaseous Detectors (MPGDs) are a promising option. A GEM detector with gadolinium converter was tested on a cold neutron beam at the IFE research reactor in Norway. The µTPC analysis, proven to improve the spatial resolution in the case of 10B converters, is extended to gadolinium based detectors. For the first time, a Gd-GEM was successfully operated to detect neutrons with an estimated efficiency of 10% at a wavelength of 2 Å and a position resolution better than 350 µm.

  7. High-Resolution Sonars: What Resolution Do We Need for Target Recognition?

    Directory of Open Access Journals (Sweden)

    Pailhas Yan

    2010-01-01

    Full Text Available Target recognition in sonar imagery has long been an active research area in the maritime domain, especially in the mine countermeasure context. Recently it has received even more attention as new sensors with increased resolution have been developed, new threats to critical maritime assets have appeared, and a new paradigm for target recognition based on autonomous platforms has emerged. With the recent introduction of Synthetic Aperture Sonar systems and high-frequency sonars, sonar resolution has dramatically increased and noise levels have decreased. Sonar images are distance images, but at high resolution they tend to appear visually as optical images. Traditionally, algorithms have been developed specifically for imaging sonars because of their limited resolution and high noise levels. With high-resolution sonars, algorithms developed in the image processing field for natural images become applicable. However, the lack of large datasets has hampered the development of such algorithms. Here we present a fast and realistic sonar simulator enabling the development and evaluation of such algorithms. We develop a classifier and then analyse its performance using our simulated synthetic sonar images. Finally, we discuss the sensor resolution requirements to achieve effective classification of various targets and demonstrate that with high-resolution sonars target highlight analysis is the key to target recognition.

  8. Rainfall Erosivity Database on the European Scale (REDES): A product of a high temporal resolution rainfall data collection in Europe

    Science.gov (United States)

    Panagos, Panos; Ballabio, Cristiano; Borrelli, Pasquale; Meusburger, Katrin; Alewell, Christine

    2016-04-01

    The erosive force of rainfall is expressed as rainfall erosivity. Rainfall erosivity considers the rainfall amount and intensity, and is most commonly expressed as the R-factor in the (R)USLE model. The R-factor is calculated from a series of single storm events by multiplying the total storm kinetic energy by the measured maximum 30-minute rainfall intensity. This estimation requires high temporal resolution (e.g. 30 minutes) rainfall data for sufficiently long time periods (i.e. 20 years), which are not readily available at the European scale. The European Commission's Joint Research Centre (JRC), in collaboration with national/regional meteorological services and environmental institutions, made an extensive collection of high resolution rainfall data in the 28 Member States of the European Union plus Switzerland in order to estimate rainfall erosivity in Europe. This resulted in the Rainfall Erosivity Database on the European Scale (REDES), which included 1,541 rainfall stations in 2014 and has been updated with 134 additional stations in 2015. The interpolation of those point R-factor values with a Gaussian Process Regression (GPR) model has resulted in the first Rainfall Erosivity map of Europe (Science of the Total Environment, 511, 801-815). The intra-annual variability of rainfall erosivity is crucial for modelling soil erosion on a monthly and seasonal basis. The monthly dimension of rainfall erosivity was added in 2015 as an advancement of REDES and the respective mean annual R-factor map. Almost 19,000 monthly R-factor values from REDES contributed to the seasonal and monthly assessments of rainfall erosivity in Europe. According to the first results, more than 50% of the total rainfall erosivity in Europe takes place in the period from June to September. The spatial patterns of rainfall erosivity show significant differences between Northern and Southern Europe, as summer is the most erosive period in Central and Northern Europe and autumn in the
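
    A single-event erosivity contribution of the kind summed into the R-factor can be sketched as below; the Brown-and-Foster-type kinetic energy relation and the example 30-minute depths are assumptions for illustration, and the (R)USLE implementation details vary between datasets.

```python
# Hedged sketch of a single-event R-factor contribution from 30-minute rainfall
# depths: event kinetic energy multiplied by the maximum 30-minute intensity.
# Coefficients and the example event are illustrative placeholders.
import numpy as np

depth_30min = np.array([1.2, 4.8, 9.5, 6.1, 2.0, 0.4])    # mm per 30-min interval
intensity = depth_30min * 2.0                              # mm/h for each interval

# unit kinetic energy (MJ ha-1 mm-1) per interval, then total event energy
e_unit = 0.29 * (1.0 - 0.72 * np.exp(-0.05 * intensity))
event_energy = np.sum(e_unit * depth_30min)                # MJ ha-1

i30_max = intensity.max()                                  # max 30-min intensity (mm/h)
event_erosivity = event_energy * i30_max                   # MJ mm ha-1 h-1
print(round(event_erosivity, 1))
```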

  9. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains Central Facility

    Energy Technology Data Exchange (ETDEWEB)

    McFarlane, Sally A.; Gaustad, Krista L.; Mlawer, Eli J.; Long, Charles N.; Delamere, Jennifer

    2011-09-01

    We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM) facility at the Southern Great Plains (SGP) site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs), four different surface types (snow-covered, green vegetation, partial vegetation, non-vegetated) can be identified. A normalized difference vegetation index (NDVI) is used to distinguish between vegetated and non-vegetated surfaces, and a scaled NDVI index is used to estimate the percentage of green vegetation in partially vegetated surfaces. Based on libraries of spectral albedo measurements, a piecewise continuous function is developed to estimate the high spectral resolution surface albedo for each surface type given the MFR albedo values as input. For partially vegetated surfaces, the albedo is estimated as a linear combination of the green vegetation and non-vegetated surface albedo values. The estimated albedo values are evaluated through comparison to high spectral resolution albedo measurements taken during several Intensive Observational Periods (IOPs) and through comparison of the integrated spectral albedo values to observed broadband albedo measurements. The estimated spectral albedo values agree well with observations for the visible wavelengths constrained by the MFR measurements, but have larger biases and variability at longer wavelengths. Additional MFR channels at 1100 nm and/or 1600 nm would help constrain the high resolution spectral albedo in the near infrared region.
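
    The classification and mixing logic described here can be sketched roughly as follows; the NDVI thresholds, the placeholder wavelength grid and the library albedo spectra are illustrative assumptions, not the ARM SGP values.

```python
# Hedged sketch of the surface-type logic: an NDVI threshold separates vegetated
# from non-vegetated surfaces, a scaled NDVI gives the green-vegetation fraction,
# and partially vegetated albedo is a linear mix of library spectra.
import numpy as np

def classify_and_mix(nir, red, albedo_veg, albedo_bare,
                     ndvi_full=0.7, ndvi_bare=0.2):
    ndvi = (nir - red) / (nir + red)
    if ndvi <= ndvi_bare:
        return "non-vegetated", albedo_bare
    if ndvi >= ndvi_full:
        return "green vegetation", albedo_veg
    frac = (ndvi - ndvi_bare) / (ndvi_full - ndvi_bare)    # scaled NDVI -> green fraction
    return "partial vegetation", frac * albedo_veg + (1 - frac) * albedo_bare

wavelengths = np.linspace(400, 2500, 5)                    # nm, coarse placeholder grid
alb_veg = np.array([0.05, 0.08, 0.45, 0.30, 0.15])         # library green-vegetation albedo
alb_bare = np.array([0.10, 0.18, 0.25, 0.30, 0.32])        # library bare-soil albedo
surface, albedo = classify_and_mix(nir=0.35, red=0.18,
                                   albedo_veg=alb_veg, albedo_bare=alb_bare)
print(surface, albedo.round(3))
```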

  10. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains central facility

    Science.gov (United States)

    McFarlane, S. A.; Gaustad, K. L.; Mlawer, E. J.; Long, C. N.; Delamere, J.

    2011-09-01

    We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM) facility at the Southern Great Plains (SGP) site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs), four different surface types (snow-covered, green vegetation, partial vegetation, non-vegetated) can be identified. A normalized difference vegetation index (NDVI) is used to distinguish between vegetated and non-vegetated surfaces, and a scaled NDVI index is used to estimate the percentage of green vegetation in partially vegetated surfaces. Based on libraries of spectral albedo measurements, a piecewise continuous function is developed to estimate the high spectral resolution surface albedo for each surface type given the MFR albedo values as input. For partially vegetated surfaces, the albedo is estimated as a linear combination of the green vegetation and non-vegetated surface albedo values. The estimated albedo values are evaluated through comparison to high spectral resolution albedo measurements taken during several Intensive Observational Periods (IOPs) and through comparison of the integrated spectral albedo values to observed broadband albedo measurements. The estimated spectral albedo values agree well with observations for the visible wavelengths constrained by the MFR measurements, but have larger biases and variability at longer wavelengths. Additional MFR channels at 1100 nm and/or 1600 nm would help constrain the high resolution spectral albedo in the near infrared region.

  11. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains central facility

    Directory of Open Access Journals (Sweden)

    J. Delamere

    2011-09-01

    Full Text Available We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM) facility at the Southern Great Plains (SGP) site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs), four different surface types (snow-covered, green vegetation, partial vegetation, non-vegetated) can be identified. A normalized difference vegetation index (NDVI) is used to distinguish between vegetated and non-vegetated surfaces, and a scaled NDVI index is used to estimate the percentage of green vegetation in partially vegetated surfaces. Based on libraries of spectral albedo measurements, a piecewise continuous function is developed to estimate the high spectral resolution surface albedo for each surface type given the MFR albedo values as input. For partially vegetated surfaces, the albedo is estimated as a linear combination of the green vegetation and non-vegetated surface albedo values. The estimated albedo values are evaluated through comparison to high spectral resolution albedo measurements taken during several Intensive Observational Periods (IOPs) and through comparison of the integrated spectral albedo values to observed broadband albedo measurements. The estimated spectral albedo values agree well with observations for the visible wavelengths constrained by the MFR measurements, but have larger biases and variability at longer wavelengths. Additional MFR channels at 1100 nm and/or 1600 nm would help constrain the high resolution spectral albedo in the near infrared region.

  12. A high resolution global wind atlas - improving estimation of world wind resources

    DEFF Research Database (Denmark)

    Badger, Jake; Ejsing Jørgensen, Hans

    2011-01-01

    to population centres, electrical transmission grids, terrain types, and protected land areas are important parts of the resource assessment downstream of the generation of wind climate statistics. Related to these issues of integration are the temporal characteristics and spatial correlation of the wind...... resources. These aspects will also be addressed by the Global Wind Atlas. The Global Wind Atlas, through a transparent methodology, will provide a unified, high resolution, and public domain dataset of wind energy resources for the whole world. The wind atlas data will be the most appropriate wind resource...

  13. A high time and spatial resolution MRPC designed for muon tomography

    Science.gov (United States)

    Shi, L.; Wang, Y.; Huang, X.; Wang, X.; Zhu, W.; Li, Y.; Cheng, J.

    2014-12-01

    A prototype cosmic muon scattering tomography system has been set up at Tsinghua University in Beijing. Multi-gap Resistive Plate Chambers (MRPCs) are used in the system to obtain the muon tracks. Compared with other detectors, the MRPC can provide not only the track but also the Time of Flight (ToF) between two detectors, which can be used to estimate the energy of the particles. To get more accurate tracks and a higher efficiency of the tomography system, a new type of MRPC with high time and two-dimensional spatial resolution has been developed. A series of experiments has been carried out to measure the efficiency, time resolution and spatial resolution. The results show that the efficiency can reach 95%, and the time resolution is around 65 ps. The cluster size is around 4 and the spatial resolution can reach 200 μm.

  14. ESTIMATING GROSS PRIMARY PRODUCTION IN CROPLAND WITH HIGH SPATIAL AND TEMPORAL SCALE REMOTE SENSING DATA

    Directory of Open Access Journals (Sweden)

    S. Lin

    2018-04-01

    Full Text Available Satellite remote sensing data provide spatially continuous and temporally repetitive observations of land surfaces, and they have become increasingly important for monitoring vegetation photosynthetic dynamics over large regions. But remote sensing data have their limitations in spatial and temporal scale: for example, higher spatial resolution data such as Landsat data have 30-m spatial resolution but a 16-day revisit period, while high temporal resolution data such as geostationary data have a 30-minute imaging period but lower spatial resolution (> 1 km). The objective of this study is to investigate whether combining high spatial and temporal resolution remote sensing data can improve the gross primary production (GPP) estimation accuracy in cropland. For this analysis we used three years (from 2010 to 2012) of Landsat-based NDVI data, the MOD13 vegetation index product and Geostationary Operational Environmental Satellite (GOES) geostationary data as input parameters to estimate GPP in a small cropland region of Nebraska, US. We then validated the remote sensing based GPP with in-situ carbon flux measurements. Results showed that: 1) the overall correlation between the GOES visible band and in-situ measured photosynthetically active radiation (PAR) is about 50 % (R2 = 0.52), and the European Centre for Medium-Range Weather Forecasts ERA-Interim reanalysis data can explain 64 % of the PAR variance (R2 = 0.64); 2) estimating GPP with Landsat 30-m spatial resolution data and ERA daily meteorology data has the highest accuracy (R2 = 0.85, RMSE < 3 gC/m2/day), performing better than using the MODIS 1-km NDVI/EVI product as input; 3) using daily meteorology data as input for GPP estimation with high spatial resolution data is more relevant than 8-day and 16-day input. Generally speaking, using high spatial resolution and high frequency satellite based remote sensing data can improve GPP estimation accuracy in cropland.

  15. A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data

    KAUST Repository

    Liang, Faming

    2013-03-01

    The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
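
    The resampling idea in record 15 can be illustrated with a toy Gaussian geostatistical model: at each iteration a small subsample is drawn, the mean log-likelihood gradient is evaluated on that subsample only, and the covariance parameters are nudged with a decaying step size. The sketch below is a schematic illustration under an exponential covariance with a finite-difference gradient, not the authors' algorithm; `coords` is assumed to be an (n, 2) array and `z` a length-n array:

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_cov(coords, sigma2, phi, nugget=1e-6):
    """Exponential covariance matrix for a set of 2-D coordinates."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sigma2 * np.exp(-d / phi) + nugget * np.eye(len(coords))

def mean_loglik(theta, coords, z):
    """Mean Gaussian log-likelihood (up to a constant) of observations z."""
    sigma2, phi = np.exp(theta)                     # log-scale parameters stay positive
    c = exp_cov(coords, sigma2, phi)
    _, logdet = np.linalg.slogdet(c)
    alpha = np.linalg.solve(c, z)
    return -0.5 * (logdet + z @ alpha) / len(z)

def stochastic_approximation(coords, z, theta0, iters=200, m=100, a=0.5):
    """Update covariance parameters using one small subsample per iteration."""
    theta = np.array(theta0, dtype=float)
    for t in range(1, iters + 1):
        idx = rng.choice(len(z), size=m, replace=False)   # only m points inverted here
        grad = np.zeros_like(theta)
        for j in range(len(theta)):                       # crude finite-difference gradient
            e = np.zeros_like(theta); e[j] = 1e-3
            grad[j] = (mean_loglik(theta + e, coords[idx], z[idx])
                       - mean_loglik(theta - e, coords[idx], z[idx])) / 2e-3
        theta += (a / t ** 0.7) * grad                    # decaying gain, ascent step
    return np.exp(theta)                                  # back to (sigma2, phi)
```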

  16. The structure of the ISM in the Zone of Avoidance by high-resolution multi-wavelength observations

    Science.gov (United States)

    Tóth, L. V.; Doi, Y.; Pinter, S.; Kovács, T.; Zahorecz, S.; Bagoly, Z.; Balázs, L. G.; Horvath, I.; Racz, I. I.; Onishi, T.

    2018-05-01

    We estimate the column density of the Galactic foreground interstellar medium (GFISM) in the direction of extragalactic sources. All-sky AKARI FIS infrared sky survey data might be used to trace the GFISM with a resolution of 2 arcminutes. The AKARI-based GFISM hydrogen column density estimates are compared with similar quantities based on HI 21 cm measurements of various resolutions and with Planck results. High-spatial-resolution observations of the GFISM may be important for recalculating the physical parameters of gamma-ray burst (GRB) host galaxies using the updated foreground parameters.

  17. A subspace approach to high-resolution spectroscopic imaging.

    Science.gov (United States)

    Lam, Fan; Liang, Zhi-Pei

    2014-04-01

    The aim is to accelerate spectroscopic imaging by using sparse sampling of (k,t)-space and subspace (or low-rank) modeling to enable high-resolution metabolic imaging with good signal-to-noise ratio. The proposed method, called SPectroscopic Imaging by exploiting spatiospectral CorrElation, exploits a unique property known as partial separability of spectroscopic signals. This property indicates that high-dimensional spectroscopic signals reside in a very low-dimensional subspace and enables special data acquisition and image reconstruction strategies to be used to obtain high-resolution spatiospectral distributions with good signal-to-noise ratio. More specifically, a hybrid chemical shift imaging/echo-planar spectroscopic imaging pulse sequence is proposed for sparse sampling of (k,t)-space, and a low-rank model-based algorithm is proposed for subspace estimation and image reconstruction from sparse data with the capability to incorporate prior information and field inhomogeneity correction. The performance of the proposed method has been evaluated using both computer simulations and phantom studies, which produced very encouraging results. For two-dimensional spectroscopic imaging experiments on a metabolite phantom, a factor of 10 acceleration was achieved with a minimal loss in signal-to-noise ratio compared to the long chemical shift imaging experiments and with a significant gain in signal-to-noise ratio compared to the accelerated echo-planar spectroscopic imaging experiments. The proposed method, SPectroscopic Imaging by exploiting spatiospectral CorrElation, is able to significantly accelerate spectroscopic imaging experiments, making high-resolution metabolic imaging possible. Copyright © 2014 Wiley Periodicals, Inc.

  18. High Resolution Reconstruction of the Ionosphere for SAR Applications

    Science.gov (United States)

    Minkwitz, David; Gerzen, Tatjana; Hoque, Mainul

    2014-05-01

    Because of the ionosphere's strong impact on radio signal propagation, high-resolution and highly accurate reconstructions of the ionosphere's electron density distribution are in demand for a large number of applications, e.g. to contribute to the mitigation of ionospheric effects on Synthetic Aperture Radar (SAR) measurements. As a new generation of remote sensing satellites, the TanDEM-L radar mission is planned to improve the understanding and modelling of global environmental processes and ecosystem change. TanDEM-L will operate in L-band with a wavelength of approximately 24 cm, enabling a stronger penetration capability compared to X-band (3 cm) or C-band (5 cm). However, with the lower frequency of the TanDEM-L signals the influence of the ionosphere will increase. In particular, small-scale irregularities of the ionosphere might lead to electron density variations within the synthetic aperture length of the TanDEM-L satellite, which in turn might result in blurring and azimuth pixel shifts. Hence the quality of the radar image worsens if the ionospheric effects are not mitigated. The Helmholtz Alliance project "Remote Sensing and Earth System Dynamics" (EDA) aims at preparing the HGF centres and the science community for the utilisation and integration of the TanDEM-L products into the study of the Earth system. One significant point thereby is to cope with the mentioned ionospheric effects. Therefore, different strategies towards achieving this objective are pursued: mitigation of the ionospheric effects based on the radar data itself, mitigation based on external information such as global Total Electron Content (TEC) maps or reconstructions of the ionosphere, and the combination of external information and radar data. In this presentation we describe the geostatistical approach chosen to analyse the behaviour of the ionosphere and to provide a high-resolution 3D electron density reconstruction. As a first step the horizontal structure of

  19. High-resolution noise substitution to measure overfitting and validate resolution in 3D structure determination by single particle electron cryomicroscopy

    International Nuclear Information System (INIS)

    Chen, Shaoxia; McMullan, Greg; Faruqi, Abdul R.; Murshudov, Garib N.; Short, Judith M.; Scheres, Sjors H.W.; Henderson, Richard

    2013-01-01

    Three-dimensional (3D) structure determination by single particle electron cryomicroscopy (cryoEM) involves the calculation of an initial 3D model, followed by extensive iterative improvement of the orientation determination of the individual particle images and the resulting 3D map. Because there is much more noise than signal at high resolution in the images, this creates the possibility of noise reinforcement in the 3D map, which can give a false impression of the resolution attained. The balance between signal and noise in the final map at its limiting resolution depends on the image processing procedure and is not easily predicted. There is a growing awareness in the cryoEM community of how to avoid such over-fitting and over-estimation of resolution. Equally, there has been a reluctance to use the two principal methods of avoidance because they give lower resolution estimates, which some people believe are too pessimistic. Here we describe a simple test that is compatible with any image processing protocol. The test allows measurement of the amount of signal and the amount of noise from overfitting that is present in the final 3D map. We have applied the method to two different sets of cryoEM images of the enzyme beta-galactosidase using several image processing packages. Our procedure involves substituting the Fourier components of the initial particle image stack beyond a chosen resolution by either the Fourier components from an adjacent area of background, or by simple randomisation of the phases of the particle structure factors. This substituted noise thus has the same spectral power distribution as the original data. Comparison of the Fourier Shell Correlation (FSC) plots from the 3D map obtained using the experimental data with that from the same data with high-resolution noise (HR-noise) substituted allows an unambiguous measurement of the amount of overfitting and an accompanying resolution assessment. A simple formula can be used to calculate an

  20. High-resolution noise substitution to measure overfitting and validate resolution in 3D structure determination by single particle electron cryomicroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Shaoxia; McMullan, Greg; Faruqi, Abdul R.; Murshudov, Garib N.; Short, Judith M.; Scheres, Sjors H.W.; Henderson, Richard, E-mail: rh15@mrc-lmb.cam.ac.uk

    2013-12-15

    Three-dimensional (3D) structure determination by single particle electron cryomicroscopy (cryoEM) involves the calculation of an initial 3D model, followed by extensive iterative improvement of the orientation determination of the individual particle images and the resulting 3D map. Because there is much more noise than signal at high resolution in the images, this creates the possibility of noise reinforcement in the 3D map, which can give a false impression of the resolution attained. The balance between signal and noise in the final map at its limiting resolution depends on the image processing procedure and is not easily predicted. There is a growing awareness in the cryoEM community of how to avoid such over-fitting and over-estimation of resolution. Equally, there has been a reluctance to use the two principal methods of avoidance because they give lower resolution estimates, which some people believe are too pessimistic. Here we describe a simple test that is compatible with any image processing protocol. The test allows measurement of the amount of signal and the amount of noise from overfitting that is present in the final 3D map. We have applied the method to two different sets of cryoEM images of the enzyme beta-galactosidase using several image processing packages. Our procedure involves substituting the Fourier components of the initial particle image stack beyond a chosen resolution by either the Fourier components from an adjacent area of background, or by simple randomisation of the phases of the particle structure factors. This substituted noise thus has the same spectral power distribution as the original data. Comparison of the Fourier Shell Correlation (FSC) plots from the 3D map obtained using the experimental data with that from the same data with high-resolution noise (HR-noise) substituted allows an unambiguous measurement of the amount of overfitting and an accompanying resolution assessment. A simple formula can be used to calculate an
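
    The noise-substitution test described in records 19 and 20 can be sketched in a few lines: beyond a chosen resolution shell the Fourier phases are randomised while amplitudes are kept, the reconstruction is repeated, and the resulting FSC curve is compared with the one from the original data. The fragment below only illustrates the phase-randomisation and FSC steps on 3-D volumes (the paper applies the substitution to the particle image stack); it is not the authors' implementation, and the shell settings are arbitrary:

```python
import numpy as np

def randomize_phases_beyond(vol, freq_cutoff, voxel_size=1.0):
    """Replace Fourier phases beyond freq_cutoff (in 1/length units) with random
    phases, keeping amplitudes (and hence the spectral power) unchanged.
    Assumes a cubic volume."""
    ft = np.fft.fftn(vol)
    freqs = np.fft.fftfreq(vol.shape[0], d=voxel_size)
    fx, fy, fz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
    shell = np.sqrt(fx**2 + fy**2 + fz**2)
    rand_phase = np.exp(2j * np.pi * np.random.rand(*vol.shape))
    ft = np.where(shell > freq_cutoff, np.abs(ft) * rand_phase, ft)
    return np.real(np.fft.ifftn(ft))

def fsc(vol_a, vol_b, n_shells=50):
    """Fourier Shell Correlation between two half-maps, shell by shell."""
    fa, fb = np.fft.fftn(vol_a), np.fft.fftn(vol_b)
    freqs = np.fft.fftfreq(vol_a.shape[0])
    fx, fy, fz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
    shell = np.sqrt(fx**2 + fy**2 + fz**2)
    edges = np.linspace(0.0, 0.5, n_shells + 1)
    curve = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (shell >= lo) & (shell < hi)
        num = np.sum(fa[m] * np.conj(fb[m]))
        den = np.sqrt(np.sum(np.abs(fa[m])**2) * np.sum(np.abs(fb[m])**2))
        curve.append(np.real(num / den) if den > 0 else 0.0)
    return np.array(curve)
```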

  1. Effects of satellite image spatial aggregation and resolution on estimates of forest land area

    Science.gov (United States)

    M.D. Nelson; R.E. McRoberts; G.R. Holden; M.E. Bauer

    2009-01-01

    Satellite imagery is being used increasingly in association with national forest inventories (NFIs) to produce maps and enhance estimates of forest attributes. We simulated several image spatial resolutions within sparsely and heavily forested study areas to assess resolution effects on estimates of forest land area, independent of other sensor characteristics. We...

  2. A new omni-directional multi-camera system for high resolution surveillance

    Science.gov (United States)

    Cogal, Omer; Akin, Abdulkadir; Seyid, Kerem; Popovic, Vladan; Schmid, Alexandre; Ott, Beat; Wellig, Peter; Leblebici, Yusuf

    2014-05-01

    Omni-directional high-resolution surveillance has a wide application range in defense and security fields. Early systems used for this purpose are based on a parabolic mirror or fisheye lens, where distortion due to the nature of the optical elements cannot be avoided. Moreover, in such systems the image resolution is limited to that of a single image sensor. Recently, the Panoptic camera approach, which mimics the eyes of flying insects using multiple imagers, has been presented. This approach features a novel solution for constructing a spherically arranged wide-FOV plenoptic imaging system where the omni-directional image quality is limited by low-end sensors. In this paper, an overview of current Panoptic camera designs is provided. New results for a very-high-resolution visible spectrum imaging and recording system inspired by the Panoptic approach are presented. The GigaEye-1 system, with 44 single cameras and 22 FPGAs, is capable of recording omni-directional video in a 360°×100° FOV at 9.5 fps with a resolution over (17,700×4,650) pixels (82.3 MP). Real-time video capturing capability is also verified at 30 fps for a resolution over (9,000×2,400) pixels (21.6 MP). The next-generation system with significantly higher resolution and real-time processing capacity, called GigaEye-2, is currently under development. The important capacity of GigaEye-1 opens the door to various post-processing techniques in the surveillance domain, such as large-perimeter object tracking, very-high-resolution depth map estimation and high-dynamic-range imaging, which are beyond standard stitching and panorama generation methods.

  3. Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution

    International Nuclear Information System (INIS)

    Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.

    2004-01-01

    For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, the uncertainty associated with tonnages and grades was either ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail in reporting the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) probably is the most widely used one for mineral resource/reserve estimation, mainly because of its robustness and its facility in uncertainty assessment by using the kriging variance. It also is known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher and poorer grade zones are being evaluated. Alternatively, stochastic simulations are used to build local or global uncertainty about a geological attribute while respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill-hole data set from a large Brazilian coal deposit.
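
    Record 3 relies on ordinary kriging and its kriging variance as the baseline uncertainty measure. A minimal ordinary-kriging sketch for a single prediction location, assuming an isotropic spherical variogram whose sill, range and nugget are placeholders rather than values from the study:

```python
import numpy as np

def spherical_gamma(h, sill=1.0, rng_=500.0, nugget=0.0):
    """Spherical variogram model gamma(h); sill, range and nugget are placeholders."""
    h = np.asarray(h, dtype=float)
    inside = nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h >= rng_, nugget + sill, np.where(h == 0.0, 0.0, inside))

def ordinary_kriging_point(coords, values, target):
    """Ordinary kriging estimate and kriging variance at one target location."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    a = np.ones((n + 1, n + 1))
    a[:n, :n] = spherical_gamma(d)
    a[n, n] = 0.0                       # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.linalg.norm(coords - target, axis=1))
    sol = np.linalg.solve(a, b)
    weights, mu = sol[:n], sol[n]
    estimate = weights @ values
    variance = weights @ b[:n] + mu     # ordinary-kriging variance
    return estimate, variance

# Toy usage: three drill holes and one unsampled block centre
pts = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 300.0]])
grades = np.array([1.2, 0.8, 1.0])
print(ordinary_kriging_point(pts, grades, np.array([150.0, 150.0])))
```

    The sequential simulation approaches mentioned in the abstract go beyond this by generating many equally probable realizations, so that uncertainty can be read off the spread of simulated tonnages rather than from a single kriging variance.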

  4. Development Of High-Resolution Mechanical Spectroscopy, HRMS: Status And Perspectives. HRMS Coupled With A Laser Dilatometer

    Directory of Open Access Journals (Sweden)

    Magalas L.B.

    2015-09-01

    Full Text Available Recent achievements in the development of low-frequency high-resolution mechanical spectroscopy (HRMS) are briefly reported. It is demonstrated that extremely low values of the loss angle, ϕ (tanϕ ≈ 1×10−5), can be measured as a function of frequency, and the precision in estimation of the dynamic modulus is better than 1×10−5 in arbitrary units. Three conditions must be fulfilled to obtain high resolution in subresonant and resonant mechanical loss measurements: (1) noise in stress and elastic strain signals must be lower than 70 dB, (2) the high quality of stress and strain signals must be tested both in the frequency- and time-domains, and (3) the estimation of the mechanical loss and modulus must be verified by at least two different computing methods operating in the frequency- and time-domains. It is concluded that phase measurements in the subresonant domain are no longer determined by precision in estimation of the loss angle. Recent developments in high-resolution resonant mechanical loss measurements stem from the application of advanced nonparametric and parametric computing methods and algorithms to estimate the logarithmic decrement and the elastic modulus from exponentially damped free decaying oscillations embedded in experimental noise.
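
    The last sentence of record 4 refers to estimating the logarithmic decrement from exponentially damped free decays. A simple time-domain illustration of that quantity, fitting the per-cycle amplitude envelope of a noisy damped sinusoid; the signal parameters below are synthetic and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic free decay: exp(-delta*f*t)*sin(2*pi*f*t) plus measurement noise
f, delta_true, fs, dur = 2.0, 0.02, 2000.0, 10.0
t = np.arange(0.0, dur, 1.0 / fs)
x = np.exp(-delta_true * f * t) * np.sin(2 * np.pi * f * t) + 1e-3 * rng.standard_normal(t.size)

# Take the maximum of |x| over each oscillation period as the cycle amplitude,
# then fit log-amplitude against cycle index; the slope is minus the decrement.
samples_per_cycle = int(fs / f)
n_cycles = len(x) // samples_per_cycle
amps = np.array([np.abs(x[k * samples_per_cycle:(k + 1) * samples_per_cycle]).max()
                 for k in range(n_cycles)])
slope, _ = np.polyfit(np.arange(n_cycles), np.log(amps), 1)
delta_est = -slope   # logarithmic decrement per cycle

print(f"true decrement {delta_true:.4f}, estimated {delta_est:.4f}")
```

    The parametric and nonparametric methods referred to in the abstract are considerably more elaborate than this envelope fit, but the quantity being estimated is the same.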

  5. Projections onto Convex Sets Super-Resolution Reconstruction Based on Point Spread Function Estimation of Low-Resolution Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Chong Fan

    2017-02-01

    Full Text Available To address the inaccuracy in estimating the point spread function (PSF) of the ideal original image in traditional projection onto convex sets (POCS) super-resolution (SR) reconstruction, this paper presents an improved POCS SR algorithm based on PSF estimation from low-resolution (LR) remote sensing images. The proposed algorithm can improve the spatial resolution of the image and benefit visual interpretation of agricultural crops. The PSF of the high-resolution (HR) image is unknown in reality. Therefore, analysis of the relationship between the PSF of the HR image and the PSF of the LR image is important in order to estimate the PSF of the HR image from multiple LR images. In this study, the linear relationship between the PSFs of the HR and LR images is proven. In addition, a novel slant knife-edge method is employed, which can improve the accuracy of the PSF estimation of LR images. Finally, the proposed method is applied to reconstruct airborne digital sensor 40 (ADS40) three-line array images and the overlapped areas of two adjacent GF-2 images by embedding the estimated PSF of the HR image into the original POCS SR algorithm. Experimental results show that the proposed method yields higher-quality reconstructed images than the blind SR method and the bicubic interpolation method.
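
    POCS super-resolution, as used in record 5, iteratively projects the current high-resolution estimate onto consistency sets defined by each low-resolution observation and the assumed PSF. The sketch below illustrates one such projection pass with a Gaussian PSF stand-in and SciPy helpers; the grid sizes, thresholds and the PSF itself are placeholders and this is not the paper's implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def pocs_step(hr, lr, scale=2, psf_sigma=1.0, delta=2.0):
    """One POCS pass: require the blurred, downsampled HR estimate to match
    each LR pixel within +/- delta, correcting the HR grid where it does not."""
    simulated = gaussian_filter(hr, psf_sigma)[::scale, ::scale]   # PSF + decimation model
    residual = lr - simulated
    correction = np.zeros_like(lr)
    correction[residual > delta] = residual[residual > delta] - delta
    correction[residual < -delta] = residual[residual < -delta] + delta
    # Spread each LR-pixel correction back over the HR grid (simple back-projection)
    hr_corr = zoom(correction, scale, order=1)
    return hr + gaussian_filter(hr_corr, psf_sigma)

def pocs_super_resolve(lr, scale=2, iters=20):
    hr = zoom(lr, scale, order=3)          # bicubic initial estimate
    for _ in range(iters):
        hr = pocs_step(hr, lr, scale=scale)
        hr = np.clip(hr, 0.0, 255.0)       # amplitude-constraint set
    return hr
```

    The point of the paper is precisely that the quality of this loop depends on how well `psf_sigma` (more generally, the PSF shape) matches the true HR-image PSF, which is why it is estimated from the LR images rather than assumed.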

  6. Berkeley High-Resolution Ball

    International Nuclear Information System (INIS)

    Diamond, R.M.

    1984-10-01

    Criteria for a high-resolution γ-ray system are discussed. Desirable properties are high resolution, a good response function, and moderate solid angle so as to achieve not only double- but also triple-coincidences with good statistics. The Berkeley High-Resolution Ball involved the first use of bismuth germanate (BGO) for anti-Compton shields for Ge detectors. The resulting compact shield permitted rather close packing of 21 detectors around a target. In addition, a small central BGO ball gives the total γ-ray energy and multiplicity, as well as the angular pattern of the γ rays. The 21-detector array is nearly complete, and the central ball has been designed, but not yet constructed. First results taken with 9 detector modules are shown for the nucleus 156Er. The complex decay scheme indicates a transition from collective rotation (prolate shape) to single-particle states (possibly oblate) near spin 30 ħ, and has other interesting features

  7. Overview and technical and practical aspects for use of geostatistics in hazardous-, toxic-, and radioactive-waste-site investigations

    International Nuclear Information System (INIS)

    Bossong, C.R.; Karlinger, M.R.; Troutman, B.M.; Vecchia, A.V.

    1999-01-01

    Technical and practical aspects of applying geostatistics are developed for individuals involved in investigations at hazardous-, toxic-, and radioactive-waste sites. Important geostatistical concepts, such as variograms and ordinary, universal, and indicator kriging, are described in general terms for introductory purposes and in more detail for practical applications. Variogram modeling using measured ground-water elevation data is described in detail to illustrate principles of stationarity, anisotropy, transformations, and cross validation. Several examples of kriging applications are described using ground-water-level elevations, bedrock elevations, and ground-water-quality data. A review of contemporary literature and selected public-domain software associated with geostatistics also is provided, as is a discussion of alternative methods for spatial modeling, including inverse distance weighting, triangulation, splines, trend-surface analysis, and simulation
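
    Record 7 walks through variogram modelling as the basis for kriging. A compact sketch of how an experimental (isotropic) semivariogram is computed from scattered observations; the bin settings and the synthetic ground-water data are arbitrary illustrative choices:

```python
import numpy as np

def experimental_variogram(coords, values, n_bins=15, max_dist=None):
    """Isotropic experimental semivariogram from scattered point data.

    Returns bin-centre distances and the average semivariance
    0.5 * mean[(z_i - z_j)^2] for the pairs falling in each distance bin.
    """
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)            # all unique pairs
    h = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    if max_dist is None:
        max_dist = h.max() / 2.0                         # common rule of thumb
    edges = np.linspace(0.0, max_dist, n_bins + 1)
    centres, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (h >= lo) & (h < hi)
        if m.any():
            centres.append(0.5 * (lo + hi))
            gamma.append(sq[m].mean())
    return np.array(centres), np.array(gamma)

# Toy usage with synthetic ground-water elevations on a gentle gradient
rng = np.random.default_rng(2)
pts = rng.uniform(0, 1000, size=(80, 2))
heads = 50.0 + 0.01 * pts[:, 0] + rng.normal(0, 0.5, 80)
print(experimental_variogram(pts, heads)[1][:5])
```

    A model variogram (spherical, exponential, etc.) would then be fitted to these binned points before kriging, which is the step the report illustrates with ground-water elevation data.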

  8. Sparse Representation Denoising for Radar High Resolution Range Profiling

    Directory of Open Access Journals (Sweden)

    Min Li

    2014-01-01

    Full Text Available Radar high resolution range profiles have attracted considerable attention in radar automatic target recognition. In practice, the radar return is usually contaminated by noise, which results in profile distortion and degradation of recognition performance. To deal with this problem, a novel denoising method based on sparse representation is proposed in this paper to remove additive white Gaussian noise. The return is sparsely represented in a redundant Fourier dictionary, and the denoising problem is cast as a sparse representation model. The noise level of the return, which is crucial to denoising performance but often unknown, is estimated by applying a subspace method to the sliding-subsequence correlation matrix. The sliding-window process enables noise-level estimation from only one observation sequence, which not only guarantees estimation efficiency but also avoids the influence of the profile's time-shift sensitivity. Experimental results show that the proposed method can effectively improve the signal-to-noise ratio of the return, leading to a high-quality profile.
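
    A minimal illustration of the sparse-representation denoising idea in record 8: represent the noisy return in an overcomplete Fourier dictionary with orthogonal matching pursuit and keep only the selected atoms. The dictionary size, sparsity level and toy signal are placeholders, and this is not the authors' exact algorithm (in particular, their noise-level estimator is not reproduced here):

```python
import numpy as np

def fourier_dictionary(n, oversample=4):
    """Overcomplete (redundant) Fourier dictionary with unit-norm columns."""
    k = np.arange(n)[:, None]
    freqs = np.arange(n * oversample)[None, :] / (n * oversample)
    d = np.exp(2j * np.pi * k * freqs)
    return d / np.linalg.norm(d, axis=0)

def omp_denoise(y, dictionary, max_atoms=20, resid_tol=None):
    """Orthogonal matching pursuit: pick atoms greedily, stop at the noise floor."""
    residual, support = y.astype(complex), []
    for _ in range(max_atoms):
        if resid_tol is not None and np.linalg.norm(residual) <= resid_tol:
            break
        scores = np.abs(dictionary.conj().T @ residual)
        support.append(int(np.argmax(scores)))
        sub = dictionary[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)   # re-fit on the current support
        residual = y - sub @ coef
    if not support:
        return np.zeros_like(y)
    return sub @ coef                                     # denoised profile estimate

# Toy usage: a two-scatterer return plus complex noise
n = 128
rng = np.random.default_rng(3)
clean = np.exp(2j * np.pi * 0.11 * np.arange(n)) + 0.6 * np.exp(2j * np.pi * 0.27 * np.arange(n))
noisy = clean + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
denoised = omp_denoise(noisy, fourier_dictionary(n), max_atoms=5)
print(np.linalg.norm(noisy - clean), np.linalg.norm(denoised - clean))
```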

  9. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    Science.gov (United States)

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater-level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted, global polynomial, local polynomial, tension spline, ordinary Kriging, simple Kriging and universal Kriging interpolation) were used to interpolate the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R2) were applied to evaluate the accuracy of the different methods. The results show that the simple Kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effect increased from 2001 to 2013, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than at its top and bottom. Owing to changes in land use, the groundwater level also shows temporal variation: the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the decrease in farmland area and the development of water-saving irrigation reduce the quantity of water used by agriculture, so the decline rate of the groundwater level in agricultural areas is not significant.
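
    The model-selection step in record 9 is essentially cross-validation of competing interpolators. A small leave-one-out harness in that spirit, comparing inverse-distance weighting with different powers on synthetic well data; the predictor interface, data and powers are illustrative only, and a kriging predictor could be dropped into the same loop:

```python
import numpy as np

def idw(coords, values, target, power=2.0):
    """Inverse-distance-weighted prediction at one target point."""
    d = np.linalg.norm(coords - target, axis=1)
    if np.any(d == 0):
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * values) / np.sum(w)

def loocv_rmse(coords, values, predictor):
    """Leave-one-out cross-validation RMSE for any point predictor."""
    errors = []
    for k in range(len(values)):
        mask = np.arange(len(values)) != k
        pred = predictor(coords[mask], values[mask], coords[k])
        errors.append(pred - values[k])
    return float(np.sqrt(np.mean(np.square(errors))))

# Compare candidate interpolators on synthetic water-level data from 30 'wells'
rng = np.random.default_rng(4)
wells = rng.uniform(0, 10_000, size=(30, 2))
levels = 35.0 - 0.001 * wells[:, 0] + rng.normal(0, 0.3, 30)
for p in (1.0, 2.0, 3.0):
    predictor = lambda c, v, t, p=p: idw(c, v, t, power=p)
    print(f"IDW power {p}: RMSE = {loocv_rmse(wells, levels, predictor):.3f}")
```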

  10. Geographical distribution of the annual mean radon concentrations in primary schools of Southern Serbia – application of geostatistical methods

    International Nuclear Information System (INIS)

    Bossew, P.; Žunić, Z.S.; Stojanovska, Z.; Tollefsen, T.; Carpentieri, C.; Veselinović, N.; Komatina, S.; Vaupotič, J.; Simović, R.D.; Antignani, S.; Bochicchio, F.

    2014-01-01

    Between 2008 and 2011 a survey of radon (222Rn) was performed in schools of several districts of Southern Serbia. Some results have been published previously (Žunić et al., 2010; Carpentieri et al., 2011; Žunić et al., 2013). This article concentrates on the geographical distribution of the measured Rn concentrations. Applying geostatistical methods, we generate “school radon maps” of expected concentrations and of estimated probabilities that a concentration threshold is exceeded. The resulting maps show a clearly structured spatial pattern which appears related to the geological background. In particular, in areas with vulcanite and granitoid rocks, elevated radon (Rn) concentrations can be expected. The “school radon map” can therefore be considered a proxy for a map of the geogenic radon potential, and allows identification of radon-prone areas, i.e. areas in which higher Rn concentrations can be expected for natural reasons. It must be stressed that the “radon hazard”, or potential risk, estimated this way has to be distinguished from the actual radon risk, which is a function of exposure. This in turn may require (depending on the target variable which is supposed to measure risk) considering demographic and sociological reality, i.e. population density, distribution of building styles and living habits. -- Highlights: • A map of Rn concentrations in primary schools of Southern Serbia. • Application of geostatistical methods. • Correlation with geology found. • Can serve as proxy to identify radon-prone areas

  11. Comparing the applicability of some geostatistical methods to predict the spatial distribution of topsoil Calcium Carbonate in part of farmland of Zanjan Province

    Science.gov (United States)

    Sarmadian, Fereydoon; Keshavarzi, Ali

    2010-05-01

    Most soils in Iran are located in arid and semi-arid regions and have a high pH (more than 7) and a high calcium carbonate content, which leads to their calcification. In calcareous soils, plant growth and production are difficult. A large part of this problem is related to the high pH and high concentration of calcium ions, which cause fixation and unavailability of pH-dependent elements, especially phosphorus and some micronutrients such as Fe, Zn, Mn and Cu. Predicting soil calcium carbonate in non-sampled areas and mapping its variability are therefore very important for the sustainable management of soil fertility. This research was carried out to evaluate and analyse the spatial variability of topsoil calcium carbonate as an aspect of soil fertility and plant nutrition, to compare geostatistical methods such as kriging and co-kriging, and to map topsoil calcium carbonate. For the geostatistical analysis, stratified random sampling was used and soil samples from 0 to 15 cm depth were collected with an auger at 23 locations. In the co-kriging method, salinity data were used as the auxiliary variable. To compare and evaluate the geostatistical methods, cross-validation with the RMSE statistic was used. The results showed that co-kriging has the highest correlation coefficient and the lowest RMSE, and is therefore more accurate than kriging for predicting the calcium carbonate content in non-sampled areas.

  12. Enhancing time resolution by stabilized inverse filter and Q estimated on instantaneous spectra

    OpenAIRE

    Corrales, Álvaro; Cabrera, Francisco; Montes, Luis

    2014-01-01

    Physical phenomena, such as attenuation of high frequency components and velocity dispersion, deteriorate seismic images. To enhance seismic resolution, Q filtering is usually applied, where the accurate estimation of Q is the core of this approach. The Matching Pursuit (MP) approach is an instantaneous spectral analysis method that overcomes windowing problems caused by decomposing a seismic trace, providing a frequency spectrum for each time sample of the trace. By changing variables, the s...

  13. High-resolution space-time characterization of convective rain cells: implications on spatial aggregation and temporal sampling operated by coarser resolution instruments

    Science.gov (United States)

    Marra, Francesco; Morin, Efrat

    2017-04-01

    Forecasting the occurrence of flash floods and debris flows is fundamental to saving lives and protecting infrastructure and property. These natural hazards are generated by high-intensity convective storms, on space-time scales that cannot be properly monitored by conventional instrumentation. Consequently, a number of early-warning systems are nowadays based on remote sensing precipitation observations, e.g. from weather radars or satellites, that have proved effective in a wide range of situations. However, the uncertainty affecting rainfall estimates represents an important issue undermining the operational use of early-warning systems. The uncertainty related to remote sensing estimates results from (a) an instrumental component, intrinsic to the measurement operation, and (b) a discretization component, caused by the discretization of the continuous rainfall process. Improved understanding of these sources of uncertainty will provide crucial information to modelers and decision makers. This study aims at advancing knowledge on the (b) discretization component. To do so, we take advantage of an extremely-high-resolution X-Band weather radar (60 m, 1 min) recently installed in the Eastern Mediterranean. The instrument monitors a semiarid to arid transition area also covered by an accurate C-Band weather radar and by a relatively sparse rain gauge network (1 gauge per 450 km2). Radar quantitative precipitation estimation includes corrections reducing the errors due to ground echoes, orographic beam blockage and attenuation of the signal in heavy rain. Intense, convection-rich flooding events that recently occurred in the area serve as study cases. We (i) describe with very high detail the spatiotemporal characteristics of the convective cores, and (ii) quantify the uncertainty due to spatial aggregation (spatial discretization) and temporal sampling (temporal discretization) operated by coarser resolution remote sensing instruments. We show that instantaneous rain intensity

  14. Geostatistical analysis and kriging of Hexachlorocyclohexane residues in topsoil from Tianjin, China

    International Nuclear Information System (INIS)

    Li, B.G.; Cao, J.; Liu, W.X.; Shen, W.R.; Wang, X.J.; Tao, S.

    2006-01-01

    A previously published data set of HCH isomer concentrations in topsoil samples from Tianjin, China, was subjected to geospatial analysis. Semivariograms were calculated and modeled using geostatistical techniques. Parameters of semivariogram models were analyzed and compared for four HCH isomers. Two-dimensional ordinary block kriging was applied to HCH isomers data set for mapping purposes. Dot maps and gray-scaled raster maps of HCH concentrations were presented based on kriging results. The appropriateness of the kriging procedure for mapping purposes was evaluated based on the kriging errors and kriging variances. It was found that ordinary block kriging can be applied to interpolate HCH concentrations in Tianjin topsoil with acceptable accuracy for mapping purposes. - Geostatistical analysis and kriging were applied to HCH concentrations in topsoil of Tianjin, China for mapping purposes

  15. APPLICATION OF CONVOLUTIONAL NEURAL NETWORK IN CLASSIFICATION OF HIGH RESOLUTION AGRICULTURAL REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    C. Yao

    2017-09-01

    Full Text Available With the rapid development of Precision Agriculture (PA) promoted by high-resolution remote sensing, crop classification of high-resolution remote sensing images has become significant for the management and assessment of agriculture. Because features and their surroundings are complex and fragmented at high resolution, the accuracy of traditional classification methods has not been able to meet the requirements of agricultural applications. This paper therefore proposes a classification method for high-resolution agricultural remote sensing images based on convolutional neural networks (CNN). For training, a large number of samples were produced from panchromatic images of China's GF-1 high-resolution satellite. In the experiment, after training and testing the CNN with the MATLAB deep learning toolbox and gradually tuning the parameters during training, the crop classification reached a correct rate of 99.66 %. By improving the accuracy of image classification and recognition, the application of CNNs provides a reference for the use of remote sensing in PA.

  16. Lattice Boltzmann Simulations of Fluid Flow in Continental Carbonate Reservoir Rocks and in Upscaled Rock Models Generated with Multiple-Point Geostatistics

    Directory of Open Access Journals (Sweden)

    J. Soete

    2017-01-01

    Full Text Available Microcomputed tomography (μCT) and Lattice Boltzmann Method (LBM) simulations were applied to continental carbonates to quantify fluid flow. Fluid flow characteristics in these complex carbonates with multiscale pore networks are unique and the applied method allows studying their heterogeneity and anisotropy. 3D pore network models were introduced to single-phase flow simulations in Palabos, a software tool for particle-based modelling of classic computational fluid dynamics. In addition, permeability simulations were also performed on rock models generated with multiple-point geostatistics (MPS). This allowed assessing the applicability of MPS in upscaling high-resolution porosity patterns into large rock models that exceed the volume limitations of the μCT. Porosity and tortuosity control fluid flow in these porous media. Micro- and mesopores influence flow properties at larger scales in continental carbonates. Upscaling with MPS is therefore necessary to overcome volume-resolution problems of CT scanning equipment. The presented LBM-MPS workflow is applicable to other lithologies, comprising different pore types, shapes, and pore networks altogether. The lack of straightforward porosity-permeability relationships in complex carbonates highlights the necessity for a 3D approach. 3D fluid flow studies provide the best understanding of flow through porous media, which is of crucial importance in reservoir modelling.

  17. Resolving mass flux at high spatial and temporal resolution using GRACE intersatellite measurements

    DEFF Research Database (Denmark)

    Rowlands, D. D.; Luthcke, S. B.; Klosko, S. M.

    2005-01-01

    We present an approach similar in concept to altimetric methods that recovers submonthly mass flux at a high spatial resolution, rather than the estimation of static monthly parameters. Through an analysis of the GRACE data residuals, we show that the fundamental temporal and spatial resolution of the GRACE data is 10 days and 400 km. Using 4° × 4° blocks at 10-day intervals, we estimate the mass of surplus or deficit water over a 52° × 60° grid centered on the Amazon basin for July 2003. We demonstrate that the recovered signals are coherent and correlate well with the expected hydrological signal.

  18. Characterization of a deep radiological contamination: integration of geostatistical processing and historical data - 59062

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; De Moura, Patrick

    2012-01-01

    The problem of site characterization is quite complex, especially for deep radiological contamination. This article illustrates the added value of the geo-statistical processing on a real application case dealing with grounds of facilities partially dismantled at the end of the 1950's in Fontenay-aux-Roses CEA Center (France). 12 years ago, a first exploratory drill-hole confirmed the presence of a deep radiological contamination (more than 4 m deep). More recently, 8 additional drill-holes failed to delineate the contamination extension. The integration of the former topography and other geological data led to the realization of 10 additional drill holes. This final stage significantly improved the characterization of the radiological contamination, which impacted the remediation project and the initially estimated volumes. (authors)

  19. Seasonal monitoring and estimation of regional aerosol distribution over Po valley, northern Italy, using a high-resolution MAIAC product

    Science.gov (United States)

    Arvani, Barbara; Pierce, R. Bradley; Lyapustin, Alexei I.; Wang, Yujie; Ghermandi, Grazia; Teggi, Sergio

    2016-09-01

    In this work, the new 1 km-resolved Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm is employed to characterize seasonal PM10 - AOD correlations over northern Italy. The accuracy of the new dataset is assessed compared to the widely used Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5.1 Aerosol Optical Depth (AOD) data, retrieved at 0.55 μm with spatial resolution of 10 km (MYD04_L2). We focused on evaluating the ability of these two products to characterize both temporal and spatial distributions of aerosols within urban and suburban areas. Ground PM10 measurements were obtained from 73 of the Italian Regional Agency for Environmental Protection (ARPA) monitoring stations, spread across northern Italy, during a three-year period from 2010 to 2012. The Po Valley area (northern Italy) was chosen as the study domain because of its severe urban air pollution, resulting from it having the highest population and industrial manufacturing density in the country, being located in a valley where two surrounding mountain chains favor the stagnation of pollutants. We found that the global correlations between the bin-averaged PM10 and AOD are R2 = 0.83 and R2 = 0.44 for MYD04_L2 and for MAIAC, respectively, suggesting a greater sensitivity of the high-resolution product to small-scale deviations. However, the introduction of Relative Humidity (RH) and Planetary Boundary Layer (PBL) depth corrections allowed for a significant improvement to the bin-averaged PM - AOD correlation, which led to a similar performance: R2 = 0.96 for MODIS and R2 = 0.95 for MAIAC. Furthermore, the introduction of the PBL information in the corrected AOD values was found to be crucial in order to capture the clear seasonal cycle shown by measured PM10 values. The study allowed us to define four seasonal linear correlations that estimate PM10 concentrations satisfactorily from the remotely sensed MAIAC AOD retrieval. Overall, the results show that the high
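
    Record 19 improves the PM10 - AOD correlation by normalising AOD with relative humidity and planetary boundary layer depth before fitting seasonal linear relations. A schematic version of that correction and fit; the hygroscopic-growth form, its exponent and the synthetic data are generic placeholders, not the coefficients derived in the study:

```python
import numpy as np

def corrected_aod(aod, rh_percent, pbl_km, growth_exponent=1.0):
    """Normalise columnar AOD by PBL depth and a hygroscopic growth factor f(RH).

    AOD_corr = AOD / (PBL * f(RH)), with f(RH) = (1 - RH/100)^(-g) as a generic
    growth law; the exponent g is a placeholder.
    """
    f_rh = (1.0 - np.asarray(rh_percent) / 100.0) ** (-growth_exponent)
    return np.asarray(aod) / (np.asarray(pbl_km) * f_rh)

def fit_seasonal_pm10(aod, rh, pbl, pm10):
    """Least-squares linear fit PM10 = a * AOD_corr + b for one season."""
    x = corrected_aod(aod, rh, pbl)
    a, b = np.polyfit(x, pm10, 1)
    r2 = np.corrcoef(x, pm10)[0, 1] ** 2
    return a, b, r2

# Toy winter-like sample: high RH, shallow PBL
rng = np.random.default_rng(5)
aod = rng.uniform(0.05, 0.6, 200)
rh = rng.uniform(60, 95, 200)
pbl = rng.uniform(0.3, 1.0, 200)
pm10 = 80.0 * corrected_aod(aod, rh, pbl) + rng.normal(0, 5, 200)
print(fit_seasonal_pm10(aod, rh, pbl, pm10))
```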

  20. Effect of CT image size and resolution on the accuracy of rock property estimates

    Science.gov (United States)

    Bazaikin, Y.; Gurevich, B.; Iglauer, S.; Khachkova, T.; Kolyukhin, D.; Lebedev, M.; Lisitsa, V.; Reshetova, G.

    2017-05-01

    In order to study the effect of the micro-CT scan resolution and size on the accuracy of upscaled digital rock property estimation of core samples Bentheimer sandstone images with the resolution varying from 0.9 μm to 24 μm are used. We statistically show that the correlation length of the pore-to-matrix distribution can be reliably determined for the images with the resolution finer than 9 voxels per correlation length and the representative volume for this property is about 153 correlation length. Similar resolution values for the statistically representative volume are also valid for the estimation of the total porosity, specific surface area, mean curvature, and topology of the pore space. Only the total porosity and the number of isolated pores are stably recovered, whereas geometry and the topological measures of the pore space are strongly affected by the resolution change. We also simulate fluid flow in the pore space and estimate permeability and tortuosity of the sample. The results demonstrate that the representative volume for the transport property calculation should be greater than 50 correlation lengths of pore-to-matrix distribution. On the other hand, permeability estimation based on the statistical analysis of equivalent realizations shows some weak influence of the resolution on the transport properties. The reason for this might be that the characteristic scale of the particular physical processes may affect the result stronger than the model (image) scale.

  1. Geostatistical analysis of the flood risk perception queries in the village of Navaluenga (Central Spain)

    Science.gov (United States)

    Guardiola-Albert, Carolina; Díez-Herrero, Andrés; Amérigo, María; García, Juan Antonio; María Bodoque, José; Fernández-Naranjo, Nuria

    2017-04-01

    Flash floods provoke a high average mortality as they are usually unexpected events which evolve rapidly and affect relatively small areas. The short time available for minimizing risks requires preparedness and response actions to be put into practice. Therefore, it is necessary to develop emergency response plans to evacuate and rescue people in the context of a flash-flood hazard. In this framework, risk management has to integrate the social dimension of flash-flooding and its spatial distribution by understanding the characteristics of local communities in order to enhance community resilience during a flash flood. In this regard, the flash-flood social risk perception of the village of Navaluenga (Central Spain) has recently been assessed, as well as the level of awareness of civil protection and emergency management strategies (Bodoque et al., 2016). This was done by interviewing 254 adults, representing roughly 12% of the population census. The present study goes further in the analysis of the resulting questionnaires, incorporating the spatial coordinates of respondents' homes in order to characterize the spatial distribution, and possible geographical interpretation, of flood risk perception. We apply geostatistical methods to analyze the spatial relations of social risk perception and level of awareness with distance to the rivers (Alberche and Chorrerón) or to the flood-prone areas (50-year, 100-year and 500-year flood plains). We want to discover spatial patterns, if any, using correlation functions (variograms). The results of the geostatistical analyses can help either to confirm the logical pattern (i.e., less awareness further from the rivers or for higher return periods of flooding) or to reveal departures from what is expected. It may also be possible to identify hot spots, cold spots, and spatial outliers. The interpretation of these spatial patterns can give valuable information to define strategies to improve the awareness regarding preparedness and

  2. High-Resolution PET Detector. Final report

    International Nuclear Information System (INIS)

    Karp, Joel

    2014-01-01

    The objective of this project was to develop an understanding of the limits of performance for a high-resolution PET detector using an approach based on continuous scintillation crystals rather than pixelated crystals. The overall goal was to design a high-resolution detector, which requires both high spatial resolution and high sensitivity for 511 keV gammas. Continuous scintillation detectors (Anger cameras) have been used extensively for both single-photon and PET scanners; however, these instruments were based on NaI(Tl) scintillators using relatively large, individual photomultipliers. In this project we investigated the potential of this type of detector technology to achieve higher spatial resolution through the use of improved scintillator materials and photo-sensors, and modification of the detector surface to optimize the light response function. We achieved an average spatial resolution of 3 mm for a 25-mm-thick continuous LYSO detector using a maximum-likelihood position algorithm and shallow slots cut into the entrance surface.

  3. Machine vision-based high-resolution weed mapping and patch-sprayer performance simulation

    NARCIS (Netherlands)

    Tang, L.; Tian, L.F.; Steward, B.L.

    1999-01-01

    An experimental machine vision-based patch-sprayer was developed. This sprayer was primarily designed to do real-time weed density estimation and variable herbicide application rate control. However, the sprayer also had the capability to do high-resolution weed mapping if proper mapping techniques

  4. Peculiar velocity effects in high-resolution microwave background experiments

    International Nuclear Information System (INIS)

    Challinor, Anthony; Leeuwen, Floor van

    2002-01-01

    We investigate the impact of peculiar velocity effects due to the motion of the solar system relative to the cosmic microwave background (CMB) on high resolution CMB experiments. It is well known that on the largest angular scales the combined effects of Doppler shifts and aberration are important; the lowest Legendre multipoles of total intensity receive power from the large CMB monopole in transforming from the CMB frame. On small angular scales aberration dominates and is shown here to lead to significant distortions of the total intensity and polarization multipoles in transforming from the rest frame of the CMB to the frame of the solar system. We provide convenient analytic results for the distortions as series expansions in the relative velocity of the two frames, but at the highest resolutions a numerical quadrature is required. Although many of the high resolution multipoles themselves are severely distorted by the frame transformations, we show that their statistical properties distort by only an insignificant amount. Therefore, the cosmological parameter estimation is insensitive to the transformation from the CMB frame (where theoretical predictions are calculated) to the rest frame of the experiment

  5. High-heat tank safety issue resolution program plan

    International Nuclear Information System (INIS)

    Wang, O.S.

    1993-12-01

    The purpose of this program plan is to provide a guide for selecting corrective actions that will mitigate and/or remediate the high-heat waste tank safety issue for single-shell tank (SST) 241-C-106. This program plan also outlines the logic for selecting approaches and tasks to mitigate and resolve the high-heat safety issue. The identified safety issue for high-heat tank 241-C-106 involves the potential release of nuclear waste to the environment as the result of heat-induced structural damage to the tank's concrete, if forced cooling is interrupted for extended periods. Currently, forced ventilation with added water to promote thermal conductivity and evaporation cooling is used to cool the waste. At this time, the only viable solution identified to resolve this safety issue is the removal of heat generating waste in the tank. This solution is being aggressively pursued as the permanent solution to this safety issue and also to support the present waste retrieval plan. Tank 241-C-106 has been selected as the first SST for retrieval. The program plan has three parts. The first part establishes program objectives and defines safety issues, drivers, and resolution criteria and strategy. The second part evaluates the high-heat safety issue and its mitigation and remediation methods and alternatives according to resolution logic. The third part identifies major tasks and alternatives for mitigation and resolution of the safety issue. Selected tasks and best-estimate schedules are also summarized in the program plan

  6. Forecasting Interest Rates Using Geostatistical Techniques

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2015-11-01

    Full Text Available Geostatistical spatial models are widely used in many applied fields to forecast data observed on continuous three-dimensional surfaces. We propose to extend their use to finance and, in particular, to the forecasting of yield curves. We present the results of an empirical application where we apply the proposed method to forecast Euro Zero Rates (2003–2014) using the Ordinary Kriging method based on the anisotropic variogram. Furthermore, a comparison with other recent methods for forecasting yield curves is proposed. The results show that the model is characterized by good levels of prediction accuracy and is competitive with the other forecasting models considered.

  7. Kalman-filtered compressive sensing for high resolution estimation of anthropogenic greenhouse gas emissions from sparse measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep; Lee, Jina; Lefantzi, Sophia; Yadav, Vineet; Michalak, Anna M.; van Bloemen Waanders, Bart Gustaaf; McKenna, Sean Andrew

    2013-09-01

    The estimation of fossil-fuel CO2 emissions (ffCO2) from limited ground-based and satellite measurements of CO2 concentrations will form a key component of the monitoring of treaties aimed at the abatement of greenhouse gas emissions. The limited nature of the measured data leads to a severely-underdetermined estimation problem. If the estimation is performed at fine spatial resolutions, it can also be computationally expensive. In order to enable such estimations, advances are needed in the spatial representation of ffCO2 emissions, scalable inversion algorithms and the identification of observables to measure. To that end, we investigate parsimonious spatial parameterizations of ffCO2 emissions which can be used in atmospheric inversions. We devise and test three random field models, based on wavelets, Gaussian kernels and covariance structures derived from easily-observed proxies of human activity. In doing so, we constructed a novel inversion algorithm, based on compressive sensing and sparse reconstruction, to perform the estimation. We also address scalable ensemble Kalman filters as an inversion mechanism and quantify the impact of Gaussian assumptions inherent in them. We find that the assumption does not impact the estimates of mean ffCO2 source strengths appreciably, but a comparison with Markov chain Monte Carlo estimates show significant differences in the variance of the source strengths. Finally, we study if the very different spatial natures of biogenic and ffCO2 emissions can be used to estimate them, in a disaggregated fashion, solely from CO2 concentration measurements, without extra information from products of incomplete combustion e.g., CO. We find that this is possible during the winter months, though the errors can be as large as 50%.
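
    The sparse-reconstruction inversion in record 7 can be illustrated with a generic iterative soft-thresholding (ISTA) solver for y = H w + noise, where w would be the coefficients of a parsimonious emission parameterization (e.g., a wavelet basis) and H would combine the basis with an atmospheric transport operator. Everything below is a toy stand-in with random matrices, not the report's algorithm:

```python
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista(h, y, lam=0.05, iters=500):
    """Iterative soft-thresholding for min_w 0.5*||H w - y||^2 + lam*||w||_1."""
    step = 1.0 / np.linalg.norm(h, 2) ** 2      # 1 / Lipschitz constant of the gradient
    w = np.zeros(h.shape[1])
    for _ in range(iters):
        grad = h.T @ (h @ w - y)
        w = soft_threshold(w - step * grad, lam * step)
    return w

# Toy problem: 80 'observations' of a 400-coefficient field with only 10 active coefficients
rng = np.random.default_rng(6)
h = rng.standard_normal((80, 400)) / np.sqrt(80)
w_true = np.zeros(400)
w_true[rng.choice(400, 10, replace=False)] = rng.normal(0, 1, 10)
y = h @ w_true + 0.01 * rng.standard_normal(80)
w_hat = ista(h, y)
print("recovered support size:", np.count_nonzero(np.abs(w_hat) > 1e-3))
```

    The severely underdetermined setting described in the abstract (far more unknown emission coefficients than concentration measurements) is exactly the regime in which such L1-regularized reconstructions are useful, provided the chosen basis really does make the emission field sparse.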

  8. An Ultra-high Resolution Synthetic Precipitation Data for Ungauged Sites

    Science.gov (United States)

    Kim, Hong-Joong; Choi, Kyung-Min; Oh, Jai-Ho

    2018-05-01

    Despite the enormous damage caused by record heavy rainfall, the amount of precipitation in areas without observation points cannot be known precisely. One way to overcome this difficulty is to estimate meteorological data at ungauged sites. In this study, we have used observation data over the city of Seoul to calculate high-resolution (250-meter) synthetic precipitation over a 10-year (2005-2014) period. Furthermore, three cases are analyzed by evaluating the rainfall intensity and performing statistical analysis over the 10-year period. For the case in which typhoon "Meari" passed to the west coast during 28-30 June 2011, the Pearson correlation coefficient was 0.93 for seven validation points, which implies that the temporal correlation between the observed and synthetic precipitation was very good. The time series of synthetic precipitation over this period almost completely matches the observed rainfall. On June 28-29, 2011, continuous strong precipitation of 10 to 30 mm h-1 was estimated correctly. In addition, the synthetic precipitation closely follows the observed precipitation for all three cases. Statistical analysis of the 10 years of data reveals a very high correlation coefficient between synthetic precipitation and observed rainfall (0.86). Thus, the synthetic precipitation data show good agreement with the observations. Therefore, the 250-m resolution synthetic precipitation calculated in this study is useful as basic data in weather applications, such as urban flood detection.

  9. Joint estimation and contention-resolution protocol for wireless random access

    DEFF Research Database (Denmark)

    Stefanovic, Cedomir; Trillingsgaard, Kasper Fløe; Kiilerich Pratas, Nuno

    2013-01-01

    We propose a contention-based random-access protocol, designed for wireless networks where the number of users is not a priori known. The protocol operates in rounds divided into equal-duration slots, performing at the same time estimation of the number of users and resolution of their transmissions... The protocol relies on successive interference cancellation which, coupled with the use of the optimized access probabilities, enables throughputs that are substantially higher than the traditional slotted ALOHA-like protocols. The key feature of the proposed protocol is that the round durations are not a priori set...

  10. Bayesian geostatistical analysis and prediction of Rhodesian human African trypanosomiasis.

    Directory of Open Access Journals (Sweden)

    Nicola A Wardrop

    2010-12-01

    Full Text Available The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters, and undertake predictive mapping for risk of high HAT prevalence in the future. Using a spatial analysis in which a generalised linear geostatistical model is used in a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters, we are able to demonstrate a more rigorous analytical approach, potentially resulting in more accurate parameter and significance estimates and increased predictive accuracy, thereby allowing an assessment of the validity of the livestock movement hypothesis given more robust parameter estimation and appropriate assessment of covariate effects. The analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Predictive mapping indicates an increased risk of high HAT prevalence in the future

  11. Comparing a Multivariate Global Ocean State Estimate With High-Resolution in Situ Data: An Anticyclonic Intrathermocline Eddy Near the Canary Islands

    Directory of Open Access Journals (Sweden)

    Bàrbara Barceló-Llull

    2018-03-01

    Full Text Available The provision of high-resolution in situ oceanographic data is key for the ongoing verification, validation and assessment of operational products, such as those provided by the Copernicus Marine Core Service (CMEMS). Here we analyze the ability of ARMOR3D—a multivariate global ocean state estimate that is available from CMEMS—to reconstruct a mesoscale anticyclonic intrathermocline eddy that was previously sampled with high-resolution independent in situ observations. ARMOR3D is constructed by merging remote sensing observations with in situ vertical profiles of temperature and salinity obtained primarily from the Argo network. In situ data from CTDs and an Acoustic Doppler Current Profiler were obtained during an oceanographic cruise near the Canary Islands (Atlantic Ocean). The analysis of the ARMOR3D product using the in situ data is done over (i) a high-resolution meridional transect crossing the eddy center and (ii) a three-dimensional grid centered on the eddy center. An evaluation of the hydrographic eddy signature and derived dynamical variables, namely geostrophic velocity, vertical vorticity and quasi-geostrophic (QG) vertical velocity, demonstrates that the ARMOR3D product is able to reproduce the vertical hydrographic structure of the independently sampled eddy below the seasonal pycnocline, with the caveat that the flow is surface intensified and the seasonal pycnocline remains flat. Maps of ARMOR3D density show the signature of the eddy, and agreement with the elliptical eddy shape seen in the in situ data. The major eddy axes are oriented NW-SE in both data sets. The estimated radius for the in situ eddy is ~46 km; the ARMOR3D radius is significantly larger at ~92 km and is considered an overestimation that is inherited from an across-track altimetry sampling issue. The ARMOR3D geostrophic flow is underestimated by a factor of 2, with maxima of 0.11 (−0.19) m s−1 at the surface, which implies an underestimation of the local

  12. SACRA - global data sets of satellite-derived crop calendars for agricultural simulations: an estimation of a high-resolution crop calendar using satellite-sensed NDVI

    Science.gov (United States)

    Kotsuki, S.; Tanaka, K.

    2015-01-01

    To date, many studies have performed numerical estimations of food production and agricultural water demand to understand the present and future supply-demand relationship. A crop calendar (CC) is an essential input for estimating food production and agricultural water demand accurately in such numerical estimations. A CC defines the date or month when farmers plant and harvest in cropland. This study aims to develop a new global data set of a satellite-derived crop calendar for agricultural simulations (SACRA) and to reveal the advantages and disadvantages of the satellite-derived CC compared to other global products. We estimate the global CC at a spatial resolution of 5 min (≈10 km) using satellite-sensed NDVI data, which correspond well to vegetation growth and death on the land surface. We first demonstrate that SACRA shows a spatial pattern in planting date similar to that of a census-based product. Moreover, SACRA reflects the variety of CCs within the same administrative unit, since it uses high-resolution satellite data. However, a disadvantage is that the mixture of several crops in a grid cell is not considered in SACRA. We also show that the cultivation period of SACRA clearly corresponds to the time series of NDVI. Therefore, the accuracy of SACRA depends on the accuracy of the NDVI used for the CC estimation. Although SACRA shows a different CC from a census-based product in some regions, using the two products together is useful for taking the uncertainty of the CC into consideration. An advantage of SACRA compared to the census-based products is that SACRA provides not only planting/harvesting dates but also a peak date from the time series of NDVI data.
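
    A simple way to see how a crop calendar can be read off an NDVI time series is to locate threshold crossings on either side of the seasonal peak, with green-up marking planting and senescence marking harvest. The sketch below does this for a synthetic 8-day NDVI series; the threshold choice and curve are assumptions for illustration and are not the SACRA algorithm itself.

```python
# Sketch: pick planting and harvest dates from an NDVI time series by threshold
# crossings around the seasonal peak. Series and threshold are synthetic.
import numpy as np

doy = np.arange(1, 366, 8)                            # 8-day composites (day of year)
ndvi = 0.2 + 0.6 * np.exp(-((doy - 200) / 45.0)**2)   # synthetic seasonal curve

peak = np.argmax(ndvi)
thresh = ndvi.min() + 0.2 * (ndvi.max() - ndvi.min()) # 20% amplitude threshold (assumed)

before = np.where(ndvi[:peak] < thresh)[0]            # sub-threshold dates before the peak
after = np.where(ndvi[peak:] < thresh)[0]             # sub-threshold dates after the peak
planting = doy[before[-1] + 1] if before.size else doy[0]
harvest = doy[peak + after[0]] if after.size else doy[-1]
print(planting, harvest)
```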

  13. Aspects of pulmonary histiocytosis X on high resolution computed tomography

    International Nuclear Information System (INIS)

    Costa, N.S.S.; Castro Lessa Angela, M.T. de; Angelo Junior, J.R.L.; Silva, F.M.D.; Kavakama, J.; Carvalho, C.R.R. de; Cerri, G.G.

    1995-01-01

    Pulmonary histiocytosis X is a disease that occurs in young adults and presents with nodules and cysts, mainly in the upper lobes, with consequent pulmonary fibrosis. These pulmonary changes are virtually pathognomonic findings on high resolution computed tomography, which allows estimation of the extent of lung involvement and distinguishes histiocytosis X from other disorders that also produce nodules and cysts. (author). 10 refs, 2 tabs, 6 figs

  14. Soil Moisture Estimation in South-Eastern New Mexico Using High Resolution Synthetic Aperture Radar (SAR) Data

    Directory of Open Access Journals (Sweden)

    A.K.M. Azad Hossain

    2016-01-01

    Soil moisture monitoring and characterization of the spatial and temporal variability of this hydrologic parameter at scales from small catchments to large river basins continues to receive much attention, reflecting its critical role in subsurface-land surface-atmospheric interactions and its importance to drought analysis, irrigation planning, crop yield forecasting, flood protection, and forest fire prevention. Synthetic Aperture Radar (SAR) data acquired at different spatial resolutions have been successfully used to estimate soil moisture in different semi-arid areas of the world for many years. This research investigated the potential of linear multiple regression and Artificial Neural Network (ANN) based models that combine different geophysical variables with Radarsat-1 fine-mode SAR imagery and concurrently collected soil moisture measurements to estimate surface soil moisture in Nash Draw, NM. An artificial neural network based model with vegetation density, soil type, and elevation data as inputs, in addition to radar backscatter values, was found suitable for estimating surface soil moisture in this area with reasonable accuracy. This model was applied to a time series of SAR data acquired in 2006 to produce soil moisture data covering a normal wet season in the study site.
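
    The kind of ANN model described above can be prototyped with a small multilayer perceptron that regresses in situ soil moisture on radar backscatter plus the ancillary layers (vegetation density, soil type, elevation). The sketch below uses scikit-learn with random placeholder data; the architecture and variables are assumptions, not those of the published model.

```python
# Hedged sketch of an ANN soil-moisture regressor: backscatter plus ancillary
# layers -> volumetric soil moisture. Data are random placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 300
backscatter = rng.normal(-12, 3, n)            # sigma0 in dB
veg_density = rng.uniform(0, 1, n)
soil_class = rng.integers(0, 4, n)             # coded soil type
elevation = rng.uniform(900, 1100, n)          # m
X = np.column_stack([backscatter, veg_density, soil_class, elevation])
sm = 0.05 + 0.01 * (backscatter + 20) + 0.05 * veg_density + rng.normal(0, 0.01, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20, 10),
                                   max_iter=5000, random_state=0))
model.fit(X, sm)
print(model.predict(X[:3]))                    # predicted soil moisture (m3/m3)
```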

  15. A high-resolution and observationally constrained OMI NO2 satellite retrieval

    International Nuclear Information System (INIS)

    Goldberg, Daniel L.; Lamsal, Lok N.; Loughner, Christopher P.

    2017-01-01

    This work presents a new high-resolution NO2 dataset derived from the NASA Ozone Monitoring Instrument (OMI) NO2 version 3.0 retrieval that can be used to estimate surface-level concentrations. The standard NASA product uses NO2 vertical profile shape factors from a 1.25° × 1° (~110 km × 110 km) resolution Global Model Initiative (GMI) model simulation to calculate air mass factors, a critical value used to determine observed tropospheric NO2 vertical columns. To better estimate vertical profile shape factors, we use a high-resolution (1.33 km × 1.33 km) Community Multi-scale Air Quality (CMAQ) model simulation constrained by in situ aircraft observations to recalculate tropospheric air mass factors and tropospheric NO2 vertical columns during summertime in the eastern US. In this new product, OMI NO2 tropospheric columns increase by up to 160% in city centers and decrease by 20–50% in the rural areas outside of urban areas when compared to the operational NASA product. Our new product shows much better agreement with the Pandora NO2 and Airborne Compact Atmospheric Mapper (ACAM) NO2 spectrometer measurements acquired during the DISCOVER-AQ Maryland field campaign. Furthermore, the correlation between our satellite product and EPA NO2 monitors in urban areas has improved dramatically: r² = 0.60 in the new product vs. r² = 0.39 in the operational product, signifying that this new product is a better indicator of surface concentrations than the operational product. Our work emphasizes the need to use both high-resolution and high-fidelity models in order to recalculate satellite data in areas with large spatial heterogeneities in NOx emissions. Although the current work is focused on the eastern US, the methodology developed in this work can be applied to other world regions to produce high-quality region-specific NO2 satellite retrievals.
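
    The column recalculation described above rests on the air mass factor (AMF) being a scattering-weight-weighted sum of the NO2 profile shape: the slant column is retrieval-independent, so VCD_new = VCD_old × AMF_old / AMF_new. The sketch below illustrates the arithmetic with invented shape factors and scattering weights; it is not the operational retrieval.

```python
# Simplified AMF recalculation: swap a coarse-model profile shape for a
# fine-model shape and rescale the vertical column. All numbers are made up.
import numpy as np

layers = np.arange(10)                         # model layers, surface upward
shape_coarse = np.exp(-layers / 4.0)           # coarse-model partial-column shape
shape_fine = np.exp(-layers / 1.5)             # fine-model shape (more NO2 near surface)
shape_coarse /= shape_coarse.sum()
shape_fine /= shape_fine.sum()

scat_w = 0.3 + 0.07 * layers                   # scattering weights (lower sensitivity near surface)

amf_coarse = np.sum(scat_w * shape_coarse)
amf_fine = np.sum(scat_w * shape_fine)

vcd_coarse = 5.0e15                            # molecules cm-2 from the coarse retrieval
scd = vcd_coarse * amf_coarse                  # slant column is retrieval-independent
vcd_fine = scd / amf_fine                      # larger column when NO2 sits near the surface
print(amf_coarse, amf_fine, vcd_fine)
```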

  16. Estimating reaction rate constants: comparison between traditional curve fitting and curve resolution

    NARCIS (Netherlands)

    Bijlsma, S.; Boelens, H. F. M.; Hoefsloot, H. C. J.; Smilde, A. K.

    2000-01-01

    A traditional curve fitting (TCF) algorithm is compared with a classical curve resolution (CCR) approach for estimating reaction rate constants from spectral data obtained over the course of a chemical reaction. In the TCF algorithm, reaction rate constants are estimated from the absorbance versus time data
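
    For the traditional curve-fitting side of the comparison, a rate constant can be estimated by nonlinear least squares on absorbance-versus-time data, assuming (pseudo-)first-order kinetics. The sketch below uses scipy.optimize.curve_fit on synthetic data; the kinetic form and numbers are illustrative assumptions.

```python
# Sketch of traditional curve fitting for a (pseudo-)first-order reaction:
# A(t) = A_inf + (A_0 - A_inf) * exp(-k t), with k estimated by least squares.
import numpy as np
from scipy.optimize import curve_fit

def absorbance(t, a_inf, a0, k):
    return a_inf + (a0 - a_inf) * np.exp(-k * t)

t = np.linspace(0, 300, 60)                              # seconds
obs = absorbance(t, 0.05, 0.80, 0.02)
obs = obs + np.random.default_rng(2).normal(0, 0.005, t.size)

popt, pcov = curve_fit(absorbance, t, obs, p0=[0.0, 1.0, 0.01])
k_hat, k_err = popt[2], np.sqrt(pcov[2, 2])
print(f"k = {k_hat:.4f} +/- {k_err:.4f} s^-1")
```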

  17. HIGH-RESOLUTION IMAGING OF THE ATLBS REGIONS: THE RADIO SOURCE COUNTS

    Energy Technology Data Exchange (ETDEWEB)

    Thorat, K.; Subrahmanyan, R.; Saripalli, L.; Ekers, R. D., E-mail: kshitij@rri.res.in [Raman Research Institute, C. V. Raman Avenue, Sadashivanagar, Bangalore 560080 (India)

    2013-01-01

    The Australia Telescope Low-brightness Survey (ATLBS) regions have been mosaic imaged at a radio frequency of 1.4 GHz with 6″ angular resolution and 72 μJy beam⁻¹ rms noise. The images (centered at R.A. 00h35m00s, decl. −67°00′00″ and R.A. 00h59m17s, decl. −67°00′00″, J2000 epoch) cover a sky area of 8.42 deg² and have no artifacts or imaging errors above the image thermal noise. Multi-resolution radio and optical r-band images (made using the 4 m CTIO Blanco telescope) were used to recognize multi-component sources and prepare a source list; the detection threshold was 0.38 mJy in a low-resolution radio image made with a beam FWHM of 50″. Radio source counts in the flux density range 0.4-8.7 mJy are estimated, with corrections applied for noise bias, effective area correction, and resolution bias. The resolution bias is mitigated using low-resolution radio images, while effects of source confusion are removed by using high-resolution images for identifying blended sources. Below 1 mJy the ATLBS counts are systematically lower than the previous estimates. Showing no evidence for an upturn down to 0.4 mJy, they do not require any changes in the radio source population down to the limit of the survey. The work suggests that automated image analysis for counts may be dependent on the ability of the imaging to reproduce connecting emission with low surface brightness and on the ability of the algorithm to recognize sources, which may require that source finding algorithms effectively work with multi-resolution and multi-wavelength data. The work underscores the importance of using source lists, as opposed to component lists, and correcting for the noise bias in order to precisely estimate counts close to the image noise and determine the upturn at sub-mJy flux density.

  18. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    Science.gov (United States)

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume and multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest, and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m³ ha⁻¹. The total growing stock of the forest was found to be 2,024,652.88 m³. The AGWB ranged from 143 to 421 Mg ha⁻¹. Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha⁻¹) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha⁻¹, respectively. DRR was found to be the least accurate method with an RMSE of 67.17 Mg ha⁻¹. The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping.
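
    The k-NN step that performed best in this study can be sketched as follows: each pixel's AGWB is predicted from the spectrally nearest inventory plots. The example below uses scikit-learn's KNeighborsRegressor with Euclidean distance on synthetic plot data; a Mahalanobis variant would pass metric="mahalanobis" with the inverse covariance of the predictors. Band choices and values are placeholders.

```python
# Minimal k-NN imputation sketch for biomass mapping: plot-level AGWB is
# predicted for each pixel from its k spectrally nearest plots. Synthetic data.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(3)
n_plots = 36
bands = rng.uniform(0, 0.5, size=(n_plots, 4))               # e.g. red, NIR, SWIR, NDVI (assumed)
agwb = 150 + 500 * bands[:, 3] + rng.normal(0, 30, n_plots)   # Mg ha-1, synthetic

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(bands, agwb)

pixels = rng.uniform(0, 0.5, size=(1000, 4))                  # image pixels to map
agwb_map = knn.predict(pixels)
print(agwb_map.mean())
```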

  19. High resolution sequence stratigraphy in China

    International Nuclear Information System (INIS)

    Zhang Shangfeng; Zhang Changmin; Yin Yanshi; Yin Taiju

    2008-01-01

    Since high resolution sequence stratigraphy was introduced into China by DENG Hong-wen in 1995, it has passed through two development stages in China: an initial stage of theoretical research followed by the joint development of theory and application, and a stage of theoretical maturity and widespread application that it is now entering. Practice has proved that high resolution sequence stratigraphy plays an increasingly important role in the exploration and development of oil and gas in Chinese continental oil-bearing basins, and the research field has spread to the exploration of coal, uranium and other stratified deposits. However, the theory of high resolution sequence stratigraphy still has some shortcomings and should be improved in many respects. The authors point out that high resolution sequence stratigraphy should be characterized quantitatively and modelled with computer techniques. (authors)

  20. EMODnet High Resolution Seabed Mapping - further developing a high resolution digital bathymetry for European seas

    Science.gov (United States)

    Schaap, D.; Schmitt, T.

    2017-12-01

    Access to marine data is a key issue for the EU Marine Strategy Framework Directive and the EU Marine Knowledge 2020 agenda and includes the European Marine Observation and Data Network (EMODnet) initiative. EMODnet aims at assembling European marine data, data products and metadata from diverse sources in a uniform way. The EMODnet Bathymetry project has developed Digital Terrain Models (DTM) for the European seas. These have been produced from survey and aggregated data sets that are indexed with metadata by adopting the SeaDataNet Catalogue services. SeaDataNet is a network of major oceanographic data centres around the European seas that manage, operate and further develop a pan-European infrastructure for marine and ocean data management. The latest EMODnet Bathymetry DTM release has a grid resolution of 1/8 arcminute and covers all European sea regions. Use has been made of circa 7800 gathered survey datasets and composite DTMs. Catalogues and the EMODnet DTM are published at the dedicated EMODnet Bathymetry portal, including a versatile DTM viewing and downloading service. At the end of December 2016 the Bathymetry project was succeeded by EMODnet High Resolution Seabed Mapping (HRSM). This continues the gathering of bathymetric in situ data sets, with extra effort for near-coastal waters and coastal zones. In addition, Satellite Derived Bathymetry data are included to fill gaps in coverage of the coastal zones. The extra data and composite DTMs will increase the coverage of the European seas and their coastlines, and provide input for producing an EMODnet DTM with a common resolution of 1/16 arcminute. The Bathymetry Viewing and Download service will be upgraded to provide a multi-resolution map and to include 3D viewing. The higher resolution DTMs will also be used to determine best estimates of the European coastline for a range of tidal levels (HAT, MHW, MSL, Chart Datum, LAT), thereby making use of a tidal model for Europe. Extra challenges will be 'moving to the

  1. Bayesian reconstruction of photon interaction sequences for high-resolution PET detectors

    Energy Technology Data Exchange (ETDEWEB)

    Pratx, Guillem; Levin, Craig S [Molecular Imaging Program at Stanford, Department of Radiology, Stanford, CA (United States)], E-mail: cslevin@stanford.edu

    2009-09-07

    Realizing the full potential of high-resolution positron emission tomography (PET) systems involves accurately positioning events in which the annihilation photon deposits all its energy across multiple detector elements. Reconstructing the complete sequence of interactions of each photon provides a reliable way to select the earliest interaction because it ensures that all the interactions are consistent with one another. Bayesian estimation forms a natural framework to maximize the consistency of the sequence with the measurements while taking into account the physics of γ-ray transport. An inherently statistical method, it accounts for the uncertainty in the measured energy and position of each interaction. An algorithm based on maximum a posteriori (MAP) estimation was evaluated using computer simulations. For a high-resolution PET system based on cadmium zinc telluride detectors, 93.8% of the recorded coincidences involved at least one photon multiple-interactions event (PMIE). The MAP estimate of the first interaction was accurate for 85.2% of the single photons. This represents a two-fold reduction in the number of mispositioned events compared to minimum pair distance, a simpler yet efficient positioning method. The point-spread function of the system presented lower tails and a higher peak value when MAP was used. This translated into improved image quality, which we quantified by studying contrast and spatial resolution gains.

  2. Development of AMS high resolution injector system

    International Nuclear Information System (INIS)

    Bao Yiwen; Guan Xialing; Hu Yueming

    2008-01-01

    The Beijing HI-13 tandem accelerator AMS high resolution injector system was developed. The high resolution energy achromatic system consists of an electrostatic analyzer and a magnetic analyzer, whose mass resolution can reach 600 and whose transmission is better than 80%. (authors)

  3. Resolution enhancement of low quality videos using a high-resolution frame

    NARCIS (Netherlands)

    Pham, T.Q.; Van Vliet, L.J.; Schutte, K.

    2006-01-01

    This paper proposes an example-based Super-Resolution (SR) algorithm of compressed videos in the Discrete Cosine Transform (DCT) domain. Input to the system is a Low-Resolution (LR) compressed video together with a High-Resolution (HR) still image of similar content. Using a training set of

  4. A super-resolution approach for uncertainty estimation of PIV measurements

    NARCIS (Netherlands)

    Sciacchitano, A.; Wieneke, B.; Scarano, F.

    2012-01-01

    A super-resolution approach is proposed for the a posteriori uncertainty estimation of PIV measurements. The measured velocity field is employed to determine the displacement of individual particle images. A disparity set is built from the residual distance between paired particle images of

  5. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior
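
    The record above is truncated, but the idea it describes (MCMC-based inference of geostatistical model parameters) can be illustrated with a toy random-walk Metropolis sampler for the sill and range of an exponential covariance. This is a deliberately simple stand-in, not the DREAM algorithm, and all data and tuning values are invented.

```python
# Toy random-walk Metropolis sampler for (sill, range) of an exponential
# covariance given observations of a Gaussian random field. Synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n = 60
xy = rng.uniform(0, 100, size=(n, 2))
d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)

def cov(sill, rng_par):
    return sill * np.exp(-d / rng_par) + 1e-8 * np.eye(n)

z = np.linalg.cholesky(cov(1.0, 25.0)) @ rng.standard_normal(n)  # "observed" field

def log_post(theta):
    sill, rng_par = theta
    if sill <= 0 or rng_par <= 0:
        return -np.inf
    C = cov(sill, rng_par)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + z @ np.linalg.solve(C, z))           # flat priors assumed

theta = np.array([0.5, 10.0])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, [0.05, 1.0])                    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
print(np.mean(samples[2500:], axis=0))                           # posterior means of (sill, range)
```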

  6. High resolution, high speed ultrahigh vacuum microscopy

    International Nuclear Information System (INIS)

    Poppa, Helmut

    2004-01-01

    The history and future of transmission electron microscopy (TEM) is discussed as it refers to the eventual development of instruments and techniques applicable to the real time in situ investigation of surface processes with high resolution. To reach this objective, it was necessary to transform conventional high resolution instruments so that an ultrahigh vacuum (UHV) environment at the sample site was created, that access to the sample by various in situ sample modification procedures was provided, and that in situ sample exchanges with other integrated surface analytical systems became possible. Furthermore, high resolution image acquisition systems had to be developed to take advantage of the high speed imaging capabilities of projection imaging microscopes. These changes to conventional electron microscopy and its uses were slowly realized in a few international laboratories over a period of almost 40 years by a relatively small number of researchers crucially interested in advancing the state of the art of electron microscopy and its applications to diverse areas of interest; often concentrating on the nucleation, growth, and properties of thin films on well defined material surfaces. A part of this review is dedicated to the recognition of the major contributions to surface and thin film science by these pioneers. Finally, some of the important current developments in aberration corrected electron optics and eventual adaptations to in situ UHV microscopy are discussed. As a result of all the path breaking developments that have led to today's highly sophisticated UHV-TEM systems, integrated fundamental studies are now possible that combine many traditional surface science approaches. Combined investigations to date have involved in situ and ex situ surface microscopies such as scanning tunneling microscopy/atomic force microscopy, scanning Auger microscopy, and photoemission electron microscopy, and area-integrating techniques such as x-ray photoelectron

  7. Evaluating the influence of spatial resolution of Landsat predictors on the accuracy of biomass models for large-area estimation across the eastern USA

    Science.gov (United States)

    Deo, Ram K.; Domke, Grant M.; Russell, Matthew B.; Woodall, Christopher W.; Andersen, Hans-Erik

    2018-05-01

    Aboveground biomass (AGB) estimates for regional-scale forest planning have become cost-effective with the free access to satellite data from sensors such as Landsat and MODIS. However, the accuracy of AGB predictions based on passive optical data depends on the spatial resolution and spatial extent of the target area, as fine resolution (small pixel) data are associated with smaller coverage and longer repeat cycles compared to coarse resolution data. This study evaluated various spatial resolutions of Landsat-derived predictors on the accuracy of regional AGB models at three different sites in the eastern USA: Maine, Pennsylvania-New Jersey, and South Carolina. We combined national forest inventory data with Landsat-derived predictors at spatial resolutions ranging from 30 to 1000 m to understand the optimal spatial resolution of optical data for large-area (regional) AGB estimation. Ten generic models were developed using the data collected in 2014, 2015 and 2016, and the predictions were evaluated (i) at the county level against the estimates of the USFS Forest Inventory and Analysis Program, which relied on the EVALIDator tool and national forest inventory data from the 2009–2013 cycle, and (ii) within a large number of strips (~1 km wide) predicted via LiDAR metrics at 30 m spatial resolution. The county-level estimates by the EVALIDator and Landsat models were highly related (R² > 0.66), although the R² varied significantly across sites and resolutions of predictors. The mean and standard deviation of county-level estimates followed increasing and decreasing trends, respectively, with models of coarser resolution. The Landsat-based total AGB estimates were larger than the LiDAR-based total estimates within the strips; however, the mean of AGB predictions by LiDAR was mostly within one standard deviation of the mean predictions obtained from the Landsat-based model at any of the resolutions. We conclude that satellite data at resolutions up to 1000 m provide

  8. High-resolution temperature-based optimization for hyperthermia treatment planning

    International Nuclear Information System (INIS)

    Kok, H P; Haaren, P M A van; Kamer, J B Van de; Wiersma, J; Dijk, J D P Van; Crezee, J

    2005-01-01

    In regional hyperthermia, optimization techniques are valuable in order to obtain amplitude/phase settings for the applicators to achieve maximal tumour heating without toxicity to normal tissue. We implemented a temperature-based optimization technique and maximized tumour temperature with constraints on normal tissue temperature to prevent hot spots. E-field distributions are the primary input for the optimization method. Due to computer limitations we are restricted to a resolution of 1 × 1 × 1 cm³ for E-field calculations, too low for reliable treatment planning. A major problem is the fact that hot spots at low-resolution (LR) do not always correspond to hot spots at high-resolution (HR), and vice versa. Thus, HR temperature-based optimization is necessary for adequate treatment planning and satisfactory results cannot be obtained with LR strategies. To obtain HR power density (PD) distributions from LR E-field calculations, a quasi-static zooming technique has been developed earlier at the UMC Utrecht. However, quasi-static zooming does not preserve phase information and therefore it does not provide the HR E-field information required for direct HR optimization. We combined quasi-static zooming with the optimization method to obtain a millimetre resolution temperature-based optimization strategy. First we performed a LR (1 cm) optimization and used the obtained settings to calculate the HR (2 mm) PD and corresponding HR temperature distribution. Next, we performed a HR optimization using an estimation of the new HR temperature distribution based on previous calculations. This estimation is based on the assumption that the HR and LR temperature distributions, though strongly different, respond in a similar way to amplitude/phase steering. To verify the newly obtained settings, we calculate the corresponding HR temperature distribution. This method was applied to several clinical situations and found to work very well. Deviations of this estimation method for

  9. The Application of Artificial Neural Networks to Ore Reserve Estimation at Choghart Iron Ore Deposit

    Directory of Open Access Journals (Sweden)

    Seyyed Ali Nezamolhosseini

    2017-01-01

    Geo-statistical methods for reserve estimation are difficult to use when stationarity conditions are not satisfied. Artificial Neural Networks (ANNs) provide an alternative to geo-statistical techniques while considerably reducing the processing time required for development and application. In this paper the ANNs were applied to the Choghart iron ore deposit in Yazd province of Iran. Initially, an optimum Multi Layer Perceptron (MLP) was constructed to estimate the Fe grade within the orebody using the whole ore data of the deposit. Sensitivity analyses were carried out over the number of hidden layers and neurons, different types of activation functions, and learning rules. The optimal architecture for iron grade estimation was 3-20-10-1. In order to improve the network performance, the deposit was divided into four homogeneous zones. Subsequently, all sensitivity analyses were carried out on each zone. Finally, a different optimum network was trained and Fe was estimated separately for each zone. Comparison of the correlation coefficient (R) and mean squared error (MSE) showed that the ANNs trained on the four homogeneous zones performed far better than the nets applied to the overall orebody. Therefore, these optimized neural networks were used to estimate the distribution of iron grades and the iron resource in the Choghart deposit. As a result of applying ANNs, the tonnage of ore for the Choghart deposit is estimated at approximately 135.8 million tonnes with an average Fe grade of 56.14 percent. Results of reserve estimation using ANNs showed good agreement with the geo-statistical methods applied to this orebody in another work.
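
    The sensitivity analysis over network architectures mentioned above can be prototyped by cross-validating a few hidden-layer configurations of a coordinates-to-grade regressor. The sketch below uses scikit-learn with random placeholder drillhole data; the architectures, covariates and grade model are assumptions, not the Choghart dataset.

```python
# Hedged sketch of an architecture sensitivity analysis: easting/northing/depth
# -> Fe grade, comparing cross-validated MSE for several hidden-layer sizes.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(9)
n = 500
X = rng.uniform(0, 1000, size=(n, 3))                          # easting, northing, depth (m)
fe = 56 + 5 * np.sin(X[:, 0] / 200) - 0.004 * X[:, 2] + rng.normal(0, 1.5, n)

for hidden in [(10,), (20,), (20, 10), (30, 15)]:
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=hidden,
                                       max_iter=3000, random_state=0))
    mse = -cross_val_score(model, X, fe, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(hidden, round(mse, 3))
```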

  10. The role of geostatistics in medical geology

    Science.gov (United States)

    Goovaerts, Pierre

    2014-05-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences, to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies have, however, assessed the risks associated with exposure to low levels of arsenic in drinking water in the United States. In the Michigan thumb region, arsenopyrite (up to 7% As by weight) has been identified in the bedrock of the Marshall Sandstone aquifer, one of the region's most productive aquifers. Epidemiologic studies have suggested a possible association between exposure to inorganic arsenic and prostate cancer mortality, including a study of populations residing in Utah. The information available for the present ecological study (i.e. analysis of

  11. A high resolution solar atlas for fluorescence calculations

    Science.gov (United States)

    Hearn, M. F.; Ohlmacher, J. T.; Schleicher, D. G.

    1983-01-01

    The characteristics required of a solar atlas to be used for studying the fluorescence process in comets are examined. Several sources of low resolution data were combined to provide an absolutely calibrated spectrum from 2250 Å to 7000 Å. Three different sources of high resolution data were also used to cover this same spectral range. The low resolution data were then used to put each high resolution spectrum on an absolute scale. The three high resolution spectra were then combined in their overlap regions to produce a single, absolutely calibrated high resolution spectrum over the entire spectral range.

  12. Eulerian and Lagrangian statistics from high resolution numerical simulations of weakly compressible turbulence

    NARCIS (Netherlands)

    Benzi, R.; Biferale, L.; Fisher, R.T.; Lamb, D.Q.; Toschi, F.

    2009-01-01

    We report a detailed study of Eulerian and Lagrangian statistics from high resolution Direct Numerical Simulations of isotropic weakly compressible turbulence. The Reynolds number at the Taylor microscale is estimated to be around 600. Eulerian and Lagrangian statistics are evaluated over a huge data

  13. Applicability of geostatistical procedures for the evaluation of hydrogeological parameters of a fractured aquifer in the Ronneburg mine district; Anwendbarkeit geostatistischer Verfahren zur Beurteilung hydrogeologischer Parameter eines heterogenen Kluftaquifers im Ronneburger Bergbaurevier

    Energy Technology Data Exchange (ETDEWEB)

    Grasshoff, C.; Schetelig, K. [RWTH Aachen, Lehrstuhl fuer Ingenieurgeologie und Hydrogeologie (Germany); Tomschi, H. [Harress Pickel Consult GmbH, Huerth (Germany)

    1998-12-31

    The following paper demonstrates how a geostatistical approach can help interpolate hydrogeological parameters over an area. The basic elements developed by G. Matheron in the sixties are presented, together with the preconditions and assumptions under which the estimation provides the best results. The variogram, as the most important tool in geostatistics, offers the opportunity to describe the spatial correlation behaviour of a regionalized variable. Several kriging procedures are briefly introduced which, under varying circumstances, allow estimation of non-measured values from the theoretical variogram model. In the Ronneburg mine district, coefficients of hydraulic conductivity were obtained from 108 screened drill-holes. These were interpolated with ordinary kriging over the whole investigation area. An error calculation was performed to assess the accuracy of the estimation. A short outlook points out some difficulties in handling geostatistical procedures and makes suggestions for further investigations. (orig.)

  14. A feasibility study of PETiPIX: an ultra high resolution small animal PET scanner

    Science.gov (United States)

    Li, K.; Safavi-Naeini, M.; Franklin, D. R.; Petasecca, M.; Guatelli, S.; Rosenfeld, A. B.; Hutton, B. F.; Lerch, M. L. F.

    2013-12-01

    PETiPIX is an ultra high spatial resolution positron emission tomography (PET) scanner designed for imaging mice brains. Four Timepix pixellated silicon detector modules are placed in an edge-on configuration to form a scanner with a field of view (FoV) 15 mm in diameter. Each detector module consists of 256 × 256 pixels with dimensions of 55 × 55 × 300 μm³. Monte Carlo simulations using the GEANT4 Application for Tomographic Emission (GATE) were performed to evaluate the feasibility of the PETiPIX design, including estimation of system sensitivity, angular dependence and spatial resolution (point source, hot and cold phantom studies), and evaluation of potential detector shield designs. Initial experimental work also established that scattered photons and recoil electrons could be detected using a single edge-on Timepix detector with a positron source. Simulation results estimate a spatial resolution of 0.26 mm full width at half maximum (FWHM) at the centre of the FoV and an overall spatial resolution of 0.29 mm FWHM, with a sensitivity of 0.01%, and indicate that a 1.5 mm thick tungsten shield parallel to the detectors will absorb the majority of non-coplanar annihilation photons, significantly reducing the rate of randoms. Results from the simulated phantom studies demonstrate that PETiPIX is a promising design for studies demanding high resolution images of mice brains.

  15. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    Science.gov (United States)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains limited. The integration of geographic information systems (GIS) and geostatistics promises a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid in tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that the variance associated with soil moisture was higher for vegetated plots than for non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.
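
    The geostatistical step behind such soil moisture mapping starts from an empirical semivariogram: half the mean squared difference between samples, binned by separation distance. The sketch below computes one for a synthetic 10 m grid of soil moisture values; the field and bin widths are placeholders.

```python
# Empirical semivariogram sketch for gridded soil-moisture samples. Synthetic field.
import numpy as np

rng = np.random.default_rng(5)
nx = ny = 10
xs, ys = np.meshgrid(np.arange(nx) * 10.0, np.arange(ny) * 10.0)   # 10 m grid
coords = np.column_stack([xs.ravel(), ys.ravel()])
sm = 0.25 + 0.02 * np.sin(coords[:, 0] / 30) + rng.normal(0, 0.01, coords.shape[0])

d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
gamma = 0.5 * (sm[:, None] - sm[None, :])**2                        # semivariance per pair

bins = np.arange(5, 100, 10)
iu = np.triu_indices_from(d, k=1)                                   # unique pairs only
dist, semi = d[iu], gamma[iu]
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (dist >= lo) & (dist < hi)
    if sel.any():
        print(f"lag {lo:3.0f}-{hi:3.0f} m: gamma = {semi[sel].mean():.5f}")
```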

  16. Bayesian Geostatistical Modeling of Malaria Indicator Survey Data in Angola

    Science.gov (United States)

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

    The 2006–2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was better able to capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5 years of age, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person were 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  17. High-resolution SPECT for small-animal imaging

    International Nuclear Information System (INIS)

    Qi Yujin

    2006-01-01

    This article presents a brief overview of the development of high-resolution SPECT for small-animal imaging. A pinhole collimator has been used for high-resolution animal SPECT to provide better spatial resolution and detection efficiency in comparison with a parallel-hole collimator. The theory of imaging characteristics of the pinhole collimator is presented and the designs of the pinhole aperture are discussed. The detector technologies used for the development of small-animal SPECT and the recent advances are presented. The evolving trend of small-animal SPECT is toward a multi-pinhole and a multi-detector system to obtain a high resolution and also a high detection efficiency. (authors)

  18. Motion robust high resolution 3D free-breathing pulmonary MRI using dynamic 3D image self-navigator.

    Science.gov (United States)

    Jiang, Wenwen; Ong, Frank; Johnson, Kevin M; Nagle, Scott K; Hope, Thomas A; Lustig, Michael; Larson, Peder E Z

    2018-06-01

    To achieve motion robust high resolution 3D free-breathing pulmonary MRI utilizing a novel dynamic 3D image navigator derived directly from imaging data. Five-minute free-breathing scans were acquired with a 3D ultrashort echo time (UTE) sequence with 1.25 mm isotropic resolution. From this data, dynamic 3D self-navigating images were reconstructed under locally low rank (LLR) constraints and used for motion compensation with one of two methods: a soft-gating technique to penalize the respiratory motion induced data inconsistency, and a respiratory motion-resolved technique to provide images of all respiratory motion states. Respiratory motion estimation derived from the proposed dynamic 3D self-navigator of 7.5 mm isotropic reconstruction resolution and a temporal resolution of 300 ms was successful for estimating complex respiratory motion patterns. This estimation improved image quality compared to respiratory belt and DC-based navigators. Respiratory motion compensation with soft-gating and respiratory motion-resolved techniques provided good image quality from highly undersampled data in volunteers and clinical patients. An optimized 3D UTE sequence combined with the proposed reconstruction methods can provide high-resolution motion robust pulmonary MRI. Feasibility was shown in patients who had irregular breathing patterns in which our approach could depict clinically relevant pulmonary pathologies. Magn Reson Med 79:2954-2967, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  19. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    International Nuclear Information System (INIS)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Khoshdel, Hossein

    2014-01-01

    Facies models try to explain facies architectures which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions and the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo)statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. A sequential indicator simulation (SIS) and a truncated Gaussian simulation (TGS) represented the two-point geostatistical methods, and a single normal equation simulation (SNESIM) was selected as the MPS representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TI) in which all possible facies architectures that are present in the area are included. The TI model was founded on the data acquired from modern occurrences. These analogues delivered vital information about the possible channel geometries and facies classes that are typically present in such environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow. In this workflow, seismic-derived attributes were implemented as the input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data

  20. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  1. A hard x-ray spectrometer for high angular resolution observations of cosmic sources

    International Nuclear Information System (INIS)

    Hailey, C.J.; Ziock, K.P.; Harrison, F.; Kahn, S.M.; Liedahl, D.; Lubin, P.M.; Seiffert, M.

    1988-01-01

    LAXRIS (large area x-ray imaging spectrometer) is an experimental, balloon-borne, hard x-ray telescope that consists of a coaligned array of x-ray imaging spectrometer modules capable of obtaining high angular resolution (1-3 arcminutes) with moderate energy resolution in the 20- to 300-keV region. Each spectrometer module consists of a CsI(Na) crystal coupled to a position-sensitive phototube with a crossed-wire, resistive readout. Imaging is provided by a coded aperture mask with a 4-m focal length. The high angular resolution is coupled with a rather large area (approximately 800 cm²) to provide good sensitivity. Results are presented on performance and overall design. Sensitivity estimates are derived from a Monte-Carlo code developed to model the LAXRIS response to background encountered at balloon altitudes. We discuss a variety of observations made feasible by high angular resolution. For instance, spatially resolving the nonthermal x-ray emission from clusters of galaxies is suggested as an ideal program for LAXRIS. 15 refs., 5 figs

  2. Assessing TCE source bioremediation by geostatistical analysis of a flux fence.

    Science.gov (United States)

    Cai, Zuansi; Wilson, Ryan D; Lerner, David N

    2012-01-01

    Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial concentration and flow field variability are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data of concentration and hydraulic conductivity, so accounting for the spatial variability of both datasets. The magnitude and uncertainty of mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment on a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation has elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%), and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE from the source zone to the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones. © 2012, The Author(s). Ground Water © 2012, National Ground Water Association.
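
    The mass discharge metric itself is a sum of concentration times Darcy flux times cell area over the transect, and the uncertainty quantification can be mimicked by Monte Carlo perturbation of the concentration and conductivity fields (a crude stand-in for the conditional geostatistical simulation used in the study). All numbers in the sketch below are placeholders.

```python
# Sketch of a transect mass-discharge estimate with Monte Carlo uncertainty:
# M_d = sum_i C_i * q_i * A_i over the control-plane cells. Placeholder values.
import numpy as np

rng = np.random.default_rng(6)
n_cells = 40                                   # multilevel sampler cells on the transect
area = 0.5 * 0.5                               # m2 per cell
grad = 0.005                                   # hydraulic gradient (assumed uniform)

logK = rng.normal(-4.0, 0.5, n_cells)          # log10 hydraulic conductivity (m/s)
conc = rng.lognormal(np.log(5.0), 1.0, n_cells)  # TCE concentration (g m-3)

def mass_discharge(logK, conc):
    q = (10.0**logK) * grad                    # Darcy flux (m/s)
    return np.sum(conc * q * area) * 86400     # g/day across the transect

realizations = [mass_discharge(logK + rng.normal(0, 0.2, n_cells),
                               conc * rng.lognormal(0, 0.3, n_cells))
                for _ in range(2000)]
print(np.percentile(realizations, [5, 50, 95]))  # pessimistic / probable / optimistic
```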

  3. Geostatistical analysis of potentiometric data in Wolfcamp aquifer of the Palo Duro Basin, Texas

    International Nuclear Information System (INIS)

    Harper, W.V.; Furr, J.M.

    1986-04-01

    This report details a geostatistical analysis of potentiometric data from the Wolfcamp aquifer in the Palo Duro Basin, Texas. Such an analysis is a part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Wolfcamp data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data
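
    The analysis summarized above (a linear trend with a superimposed spherical semivariogram) can be sketched as a two-step procedure: fit a planar trend to the heads by least squares, then krige the residuals with a spherical covariance and add the trend back. The example below uses synthetic data and assumed variogram parameters, and uses simple kriging of the residuals for brevity rather than the study's exact procedure.

```python
# Trend-plus-variogram sketch: planar trend + simple kriging of residuals with a
# spherical model. Data and parameters are synthetic, not the Wolfcamp values.
import numpy as np

rng = np.random.default_rng(7)
n = 40
xy = rng.uniform(0, 100, (n, 2))                              # km
head = 1000 - 2.0 * xy[:, 0] - 1.0 * xy[:, 1] + rng.normal(0, 5, n)

# 1) linear (planar) trend by ordinary least squares
A = np.column_stack([np.ones(n), xy])
beta, *_ = np.linalg.lstsq(A, head, rcond=None)
resid = head - A @ beta

# 2) spherical semivariogram / covariance for the residuals (assumed parameters)
def spherical(h, sill=25.0, a=40.0):
    h = np.minimum(h, a)
    return sill * (1.5 * h / a - 0.5 * (h / a)**3)

def cov_from_gamma(h, sill=25.0):
    return sill - spherical(h, sill)

# 3) simple kriging of residuals at a target point, then add the trend back
target = np.array([50.0, 50.0])
D = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
C = cov_from_gamma(D) + 1e-6 * np.eye(n)
c0 = cov_from_gamma(np.linalg.norm(xy - target, axis=1))
w = np.linalg.solve(C, c0)
estimate = np.array([1.0, *target]) @ beta + w @ resid
variance = cov_from_gamma(0.0) - w @ c0                        # kriging variance
print(estimate, variance)
```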

  4. The performance of the new enhanced-resolution satellite passive microwave dataset applied for snow water equivalent estimation

    Science.gov (United States)

    Pan, J.; Durand, M. T.; Jiang, L.; Liu, D.

    2017-12-01

    The newly processed NASA MEaSUREs Calibrated Enhanced-Resolution Brightness Temperature (CETB) product, reconstructed using the antenna measurement response function (MRF), is considered to have significantly improved fine-resolution measurements, with better georegistration for time-series observations and an equivalent field of view (FOV) for frequencies with the same nominal spatial resolution. We anticipate its potential for global snow observation and therefore aim to test its performance for characterizing snow properties, especially snow water equivalent (SWE), over large areas. In this research, two candidate SWE algorithms will be tested in China for the years 2005 to 2010 using the reprocessed TB from the Advanced Microwave Scanning Radiometer for EOS (AMSR-E), with the results evaluated using daily snow depth measurements at over 700 national synoptic stations. One of the algorithms is the SWE retrieval algorithm used for the FengYun (FY)-3 Microwave Radiation Imager. This algorithm uses the multi-channel TB to calculate SWE for three major snow regions in China, with the coefficients adapted for different land cover types. The second algorithm is the newly established Bayesian Algorithm for SWE Estimation with Passive Microwave measurements (BASE-PM). This algorithm uses a physically based snow radiative transfer model to find the histogram of the most likely snow properties that match the multi-frequency TB from 10.65 to 90 GHz. It provides a rough estimation of snow depth and grain size at the same time and showed a 30 mm SWE RMS error using the ground radiometer measurements at Sodankyla. This study will be the first attempt to test it spatially with satellite data. The use of this algorithm benefits from the high resolution and the spatial consistency between frequencies embedded in the new dataset. This research will answer three questions. First, to what extent can CETB increase the heterogeneity in the mapped SWE? Second, will

  5. Two methods to estimate the position resolution for straw chambers with strip readout

    International Nuclear Information System (INIS)

    Golutvin, I.A.; Movchan, S.A.; Peshekhonov, V.D.; Preda, T.

    1992-01-01

    The centroid and charge-ratio methods are presented to estimate the position resolution of straw chambers with strip readout. For straw chambers 10 mm in diameter, the highest position resolution was obtained for a strip pitch of 5 mm. With the centroid method and a perpendicular X-ray beam, the position resolution was ≅120 μm for a signal-to-noise ratio of 60-65. The charge-ratio method demonstrated ≅10% better position resolution at the edges of the strips. 6 refs.; 5 figs.
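
    The two estimators compared above can be written down in a few lines: the centroid method takes the charge-weighted mean of strip positions, while the charge-ratio method interpolates between the two highest neighbouring strips. The sketch below uses a Gaussian approximation of the induced-charge profile and added noise; the pitch, noise level and the linear charge-ratio interpolation are simplifying assumptions.

```python
# Toy comparison of centroid and charge-ratio position estimators for a strip readout.
import numpy as np

pitch = 5.0                                       # mm (strip pitch)
strip_centers = np.arange(-10, 11, 1) * pitch     # 21 strips (mm)

def induced_charge(x_true, sigma=3.0, q_tot=1000.0):
    # Gaussian approximation of the induced-charge distribution over strips, plus noise
    q = q_tot * np.exp(-0.5 * ((strip_centers - x_true) / sigma)**2)
    return q + np.random.default_rng(8).normal(0, 15, q.size)

q = induced_charge(x_true=1.7)

# Centroid method: charge-weighted mean over strips above a simple noise threshold
sel = q > 50.0
x_centroid = np.sum(strip_centers[sel] * q[sel]) / np.sum(q[sel])

# Charge-ratio method: linear interpolation between the two highest adjacent strips
i = np.argmax(q)
j = i + 1 if q[i + 1] > q[i - 1] else i - 1
ratio = q[j] / (q[i] + q[j])
x_ratio = strip_centers[i] + ratio * (strip_centers[j] - strip_centers[i])
print(x_centroid, x_ratio)
```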

  6. High resolution time integration for SN radiation transport

    International Nuclear Information System (INIS)

    Thoreson, Greg; McClarren, Ryan G.; Chang, Jae H.

    2009-01-01

    First-order, second-order, and high resolution time discretization schemes are implemented and studied for the discrete ordinates (SN) equations. The high resolution method achieves a rate of convergence better than first order, but also suppresses the artificial oscillations introduced by second-order schemes in hyperbolic partial differential equations. The high resolution method achieves these properties by nonlinearly adapting the time stencil to use a first-order method in regions where oscillations could be created. We employ a quasi-linear solution scheme to solve the nonlinear equations that arise from the high resolution method. All three methods were compared for accuracy and convergence rates. For non-absorbing problems, both the second-order and high resolution methods converged to the same solution as the first-order method, with better convergence rates. The high resolution method is more accurate than first order and matches or exceeds the second-order method

  7. Impact of Spatial Resolution on Wind Field Derived Estimates of Air Pressure Depression in the Hurricane Eye

    Directory of Open Access Journals (Sweden)

    Linwood Jones

    2010-03-01

    Measurements of the near surface horizontal wind field in a hurricane with spatial resolution of order 1–10 km are possible using airborne microwave radiometer imagers. An assessment is made of the information content of the measured winds as a function of the spatial resolution of the imager. An existing algorithm is used which estimates the maximum surface air pressure depression in the hurricane eye from the maximum wind speed. High resolution numerical model wind fields from Hurricane Frances (2004) are convolved with various HIRAD antenna spatial filters to observe the impact of the antenna design on the central pressure depression in the eye that can be deduced from it.

  8. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    Science.gov (United States)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and the prediction uncertainties, based on one or more measurements, while honoring the measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate these issues of spatial nonstationarity and to alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.

  9. High tracking resolution detectors. Final Technical Report

    International Nuclear Information System (INIS)

    Vasile, Stefan; Li, Zheng

    2010-01-01

    High-resolution tracking detectors based on Active Pixel Sensors (APS) have been valuable tools in Nuclear Physics and High-Energy Physics research, and have contributed to major discoveries. Their integration time, radiation length and readout rate are limiting factors for the planned luminosity upgrades in nuclear and high-energy physics collider-based experiments. The goal of this program was to demonstrate and develop high-gain, high-resolution tracking detector arrays with faster readout and shorter radiation length than APS arrays. These arrays may operate as direct charged particle detectors or as readouts of high resolution scintillating fiber arrays. During this program, we developed large, high-resolution CMOS pixel sensor arrays with integrated readout and pixel-level reset. Their intrinsic gain and high immunity to surface and moisture damage will allow these detectors to operate with minimal packaging/passivation requirements and will result in a radiation length superior to APS. In Phase I, we designed and fabricated arrays with calorimetric output capable of sub-pixel resolution and sub-microsecond readout rates. The technical effort was dedicated to detector and readout structure development and performance verification, as well as to radiation damage and damage-annealing studies.

  10. High-resolution 3D seismic reflection imaging across active faults and its impact on seismic hazard estimation in the Tokyo metropolitan area

    Science.gov (United States)

    Ishiyama, Tatsuya; Sato, Hiroshi; Abe, Susumu; Kawasaki, Shinji; Kato, Naoko

    2016-10-01

    We collected and interpreted high-resolution 3D seismic reflection data across a hypothesized fault scarp, along the largest active fault that could generate hazardous earthquakes in the Tokyo metropolitan area. The processed and interpreted 3D seismic cube, linked with nearby borehole stratigraphy, suggests that a monocline that deforms lower Pleistocene units is unconformably overlain by middle Pleistocene conglomerates. Judging from structural patterns and vertical separation on the lower-middle Pleistocene units and the ground surface, the hypothesized scarp was interpreted as a terrace riser rather than as a manifestation of late Pleistocene structural growth resulting from repeated fault activity. Devastating earthquake scenarios had been predicted along the fault in question based on its proximity to the metropolitan area; however, our new results lead to a significant decrease in estimated fault length and, consequently, in the estimated magnitude of future earthquakes associated with its reactivation. This suggests a greatly reduced seismic hazard in the Tokyo metropolitan area from earthquakes generated by active intraplate crustal faults.

  11. Ultra high resolution tomography

    Energy Technology Data Exchange (ETDEWEB)

    Haddad, W.S.

    1994-11-15

    Recent work and results on ultra-high-resolution three-dimensional imaging with soft x-rays will be presented. This work is aimed at determining the microscopic three-dimensional structure of biological and material specimens. Three-dimensional reconstructed images of a microscopic test object will be presented; the reconstruction has a resolution on the order of 1000 Å in all three dimensions. Preliminary work with biological samples will also be shown, and the experimental and numerical methods used will be discussed.

  12. A global and high-resolution assessment of the green, blue and grey water footprint of wheat

    NARCIS (Netherlands)

    Mekonnen, Mesfin; Hoekstra, Arjen Ysbert

    2010-01-01

    The aim of this study is to estimate the green, blue and grey water footprint of wheat in a spatially-explicit way, both from a production and consumption perspective. The assessment is global and improves upon earlier research by taking a high-resolution approach, estimating the water footprint of

  13. Regional soil erosion assessment based on a sample survey and geostatistics

    Science.gov (United States)

    Yin, Shuiqing; Zhu, Zhengyuan; Wang, Li; Liu, Baoyuan; Xie, Yun; Wang, Guannan; Li, Yishan

    2018-03-01

    worsened the estimation when used as the covariates for the interpolation of soil loss. Due to the unavailability of a 1 : 10 000 topography map for the entire area in this study, the model assisted by the land use, R, and K factors, with a resolution of 250 m, was used to generate the regional assessment of the soil erosion for Shaanxi Province. It demonstrated that 54.3 % of total land in Shaanxi Province had annual soil loss equal to or greater than 5 t ha-1 yr-1. High (20-40 t ha-1 yr-1), severe (40-80 t ha-1 yr-1), and extreme (> 80 t ha-1 yr-1) erosion occupied 14.0 % of the total land. The dry land and irrigated land, forest, shrubland, and grassland in Shaanxi Province had mean soil loss rates of 21.77, 3.51, 10.00, and 7.27 t ha-1 yr-1, respectively. Annual soil loss was about 207.3 Mt in Shaanxi Province, with 68.9 % of soil loss originating from the farmlands and grasslands in Yan'an and Yulin districts in the northern Loess Plateau region and Ankang and Hanzhong districts in the southern Qingba mountainous region. This methodology provides a more accurate regional soil erosion assessment and can help policymakers to take effective measures to mitigate soil erosion risks.

  14. Geostatistical and adjoint sensitivity techniques applied to a conceptual model of ground-water flow in the Paradox Basin, Utah

    International Nuclear Information System (INIS)

    Metcalfe, D.E.; Campbell, J.E.; RamaRao, B.S.; Harper, W.V.; Battelle Project Management Div., Columbus, OH)

    1985-01-01

    Sensitivity and uncertainty analysis are important components of performance assessment activities for potential high-level radioactive waste repositories. The application of geostatistical and adjoint sensitivity techniques to aid in the calibration of an existing conceptual model of ground-water flow is demonstrated for the Leadville Limestone in the Paradox Basin, Utah. The geostatistical method called kriging is used to statistically analyze the measured potentiometric data for the Leadville. This analysis consists of identifying anomalous data and data trends and characterizing the correlation structure between data points. Adjoint sensitivity analysis is then performed to aid in the calibration of a conceptual model of ground-water flow to the Leadville measured potentiometric data. Sensitivity derivatives of the fit between the modeled Leadville potentiometric surface and the measured potentiometric data to model parameters and boundary conditions are calculated by the adjoint method. These sensitivity derivatives are used to determine which model parameter and boundary condition values should be modified to most efficiently improve the fit of modeled to measured potentiometric conditions.

  15. Estimation and quantification of mangrove forest extent by using different spatial resolution satellite data for the sandspit area of Karachi coast

    International Nuclear Information System (INIS)

    Saeed, U.; Daud, A.; Ashraf, S.; Mahmood, A.

    2006-01-01

    Mangrove forest is an integral part of the inter-tidal zone of the coastal environment, extending throughout the tropics and subtropics of the world. In Pakistan, remote-sensing data have been used extensively over the last thirty years to estimate the area of mangrove forests. Previous studies relied on medium-resolution satellite data, which revealed discrepancies in recognizing subtle variations of land-cover features in the satellite imagery. The current study examines the classification techniques employed for area estimation using high- and medium-resolution satellite imagery. To study the effects of spatial resolution on classification results, data from three different satellites were used: QuickBird, TERRA and Landsat. The thematic map derived from QuickBird data comprised the maximum number of land-cover classes, with a definite zone of mangroves extending from regeneration to mature canopies. The total estimated mangrove extent was 370 ha, with 57.45, 125.9, 180.89, and 5.35 ha of tall, medium, small, and newly recruited mangrove plants, respectively. Mangrove area estimates from thematic maps derived using TERRA and Landsat satellite data showed a gradual increase in mangrove extent from 390.95 ha to 417.92 ha. This increase in area indicates that some land-cover classes may have been misclassified and hence added to the area under mangrove forest. This study also showed that high-resolution satellite data can be used to identify different height zones of mangrove forests, along with an accurate delineation of classes such as salt bushes and algae, which could not be classified otherwise. (author)

  16. Equivalent physical models and formulation of equivalent source layer in high-resolution EEG imaging

    International Nuclear Information System (INIS)

    Yao Dezhong; He Bin

    2003-01-01

    In high-resolution EEG imaging, both equivalent dipole layer (EDL) and equivalent charge layer (ECL) assumed to be located just above the cortical surface have been proposed as high-resolution imaging modalities or as intermediate steps to estimate the epicortical potential. Presented here are the equivalent physical models of these two equivalent source layers (ESL) which show that the strength of EDL is proportional to the surface potential of the layer when the outside of the layer is filled with an insulator, and that the strength of ECL is the normal current of the layer when the outside is filled with a perfect conductor. Based on these equivalent physical models, closed solutions of ECL and EDL corresponding to a dipole enclosed by a spherical layer are given. These results provide the theoretical basis of ESL applications in high-resolution EEG mapping

  17. Equivalent physical models and formulation of equivalent source layer in high-resolution EEG imaging

    Energy Technology Data Exchange (ETDEWEB)

    Yao Dezhong [School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu City, 610054, Sichuan Province (China); He Bin [The University of Illinois at Chicago, IL (United States)

    2003-11-07

    In high-resolution EEG imaging, both equivalent dipole layer (EDL) and equivalent charge layer (ECL) assumed to be located just above the cortical surface have been proposed as high-resolution imaging modalities or as intermediate steps to estimate the epicortical potential. Presented here are the equivalent physical models of these two equivalent source layers (ESL) which show that the strength of EDL is proportional to the surface potential of the layer when the outside of the layer is filled with an insulator, and that the strength of ECL is the normal current of the layer when the outside is filled with a perfect conductor. Based on these equivalent physical models, closed solutions of ECL and EDL corresponding to a dipole enclosed by a spherical layer are given. These results provide the theoretical basis of ESL applications in high-resolution EEG mapping.

  18. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    Science.gov (United States)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece, are used in the proposed framework, via regression kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation-well network that contributes significant information to water managers and authorities. Observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be eliminated, using estimation uncertainty and statistical error metrics, without affecting the assessment of the groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.

  19. The fusion of satellite and UAV data: simulation of high spatial resolution band

    Science.gov (United States)

    Jenerowicz, Agnieszka; Siok, Katarzyna; Woroszkiewicz, Malgorzata; Orych, Agata

    2017-10-01

    Remote sensing techniques used in precision agriculture and farming that apply imagery data obtained with sensors mounted on UAV platforms have become more popular in the last few years due to the availability of low-cost UAV platforms and low-cost sensors. Data obtained from low altitudes with low-cost sensors can be characterised by high spatial and radiometric resolution but rather low spectral resolution; therefore, the application of imagery data obtained with such technology is quite limited and can be used only for basic land cover classification. To enrich the spectral resolution of imagery data acquired with low-cost sensors from low altitudes, the authors propose the fusion of RGB data obtained with a UAV platform with multispectral satellite imagery. The fusion is based on the pansharpening process, which aims to integrate the spatial details of the high-resolution panchromatic image with the spectral information of lower-resolution multispectral or hyperspectral imagery to obtain multispectral or hyperspectral images with high spatial resolution. The key to pansharpening is to properly estimate the missing spatial details of the multispectral images while preserving their spectral properties. In the research, the authors present the fusion of RGB images (with high spatial resolution) obtained with sensors mounted on low-cost UAV platforms and multispectral satellite imagery from satellite sensors, i.e. Landsat 8 OLI. To perform the fusion of UAV data with satellite imagery, the panchromatic band was simulated from the RGB data as a linear combination of the spectral channels. Next, the Gram-Schmidt pansharpening method was applied to the simulated band and the multispectral satellite images. As a result of the fusion, the authors obtained several multispectral images with very high spatial resolution and then analysed the spatial and spectral accuracies of the processed images.
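    As a rough illustration of the two processing steps named in the abstract, the sketch below simulates a panchromatic band as a linear combination of the RGB channels and then applies a simple component-substitution fusion in place of the Gram-Schmidt step; the channel weights, the nearest-neighbour upsampling and the per-band gain are assumptions for demonstration only.

```python
import numpy as np

def simulate_pan(rgb, weights=(0.299, 0.587, 0.114)):
    """Simulate a high-resolution panchromatic band as a linear
    combination of the RGB channels (weights are an assumption)."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(rgb, w / w.sum(), axes=([2], [0]))

def component_substitution_pansharpen(ms_lowres, pan_highres, scale):
    """Simple component-substitution fusion: upsample the multispectral
    bands, then add the spatial detail (pan minus a synthesized intensity)
    scaled by a per-band gain. This stands in for the Gram-Schmidt step."""
    # Nearest-neighbour upsampling of the multispectral cube (bands last)
    ms_up = np.repeat(np.repeat(ms_lowres, scale, axis=0), scale, axis=1)
    intensity = ms_up.mean(axis=2)              # synthesized low-detail intensity
    detail = pan_highres - intensity            # missing spatial detail
    sharpened = np.empty_like(ms_up, dtype=float)
    for b in range(ms_up.shape[2]):
        band = ms_up[:, :, b]
        # gain proportional to the band's covariance with the intensity
        gain = np.cov(band.ravel(), intensity.ravel())[0, 1] / intensity.var()
        sharpened[:, :, b] = band + gain * detail
    return sharpened

# Toy example: an 8x8 RGB frame from a UAV and a 4x4 multispectral image
rgb = np.random.rand(8, 8, 3)
ms = np.random.rand(4, 4, 6)
pan = simulate_pan(rgb)
fused = component_substitution_pansharpen(ms, pan, scale=2)
print(fused.shape)   # (8, 8, 6): all bands at the RGB spatial resolution
```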

  20. A high resolution portable spectroscopy system

    International Nuclear Information System (INIS)

    Kulkarni, C.P.; Vaidya, P.P.; Paulson, M.; Bhatnagar, P.V.; Pande, S.S.; Padmini, S.

    2003-01-01

    Full text: This paper describes the system details of a High Resolution Portable Spectroscopy System (HRPSS) developed at Electronics Division, BARC. The system can be used for laboratory-class, high-resolution nuclear spectroscopy applications. The HRPSS consists of a specially designed compact NIM bin, with built-in power supplies, accommodating a low-power, high-resolution MCA and an on-board embedded computer for spectrum building and communication. A NIM-based spectroscopy amplifier and an HV module for detector bias are integrated (plug-in) in the bin. The system communicates with a host PC via a serial link. Along with a laptop PC and a portable HPGe detector, the HRPSS offers laboratory-class performance for portable applications

  1. Sub-spatial resolution position estimation for optical fibre sensing applications

    DEFF Research Database (Denmark)

    Zibar, Darko; Werzinger, Stefan; Schmauss, Bernhard

    2017-01-01

    Methods from the machine learning community are employed for estimating the position of fibre Bragg gratings in an array. Using conventional methods for position estimation, based on the inverse discrete Fourier transform (IDFT), it is required that the two-point spatial resolution is less than gratings...... of reflection coefficients and the positions is performed. From a practical point of view, we can demonstrate a reduction of the interrogator's bandwidth by a factor of 2. The technique is demonstrated for incoherent optical frequency domain reflectometry (IOFDR). However, the approach is applicable to any...

  2. Development of high-resolution x-ray CT system using parallel beam geometry

    Energy Technology Data Exchange (ETDEWEB)

    Yoneyama, Akio, E-mail: akio.yoneyama.bu@hitachi.com; Baba, Rika [Central Research Laboratory, Hitachi Ltd., Hatoyama, Saitama (Japan); Hyodo, Kazuyuki [Institute of Materials Science, High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan); Takeda, Tohoru [School of Allied Health Sciences, Kitasato University, Sagamihara, Kanagawa (Japan); Nakano, Haruhisa; Maki, Koutaro [Department of Orthodontics, School of Dentistry Showa University, Ota-ku, Tokyo (Japan); Sumitani, Kazushi; Hirai, Yasuharu [Kyushu Synchrotron Light Research Center, Tosu, Saga (Japan)

    2016-01-28

    For fine three-dimensional observations of large biomedical and organic material samples, we developed a high-resolution X-ray CT system. The system consists of a sample positioner, a 5-μm scintillator, microscopy lenses, and a water-cooled sCMOS detector. Parallel beam geometry was adopted to attain a field of view of a few mm square. A fine three-dimensional image of birch branch was obtained using a 9-keV X-ray at BL16XU of SPring-8 in Japan. The spatial resolution estimated from the line profile of a sectional image was about 3 μm.

  3. High-Resolution Mapping of Anthropogenic Heat in China from 1992 to 2010

    Directory of Open Access Journals (Sweden)

    Wangming Yang

    2014-04-01

    Full Text Available Anthropogenic heat generated by human activity contributes to urban and regional climate warming. Due to the resolution and accuracy of existing anthropogenic heat data, it is difficult to analyze and simulate the corresponding effects. This study exploited a new method to estimate anthropogenic heat at high spatial and temporal resolution based on long-term energy consumption data and US Air Force Defense Meteorological Satellite Program-Operational Linescan System (DMSP-OLS) data from 1992 to 2010 across China. Our results showed that, throughout the entire study period, there are apparent increasing trends in anthropogenic heat in three major metropolitan regions, i.e., the Beijing-Tianjin region, the Yangtze River delta and the Pearl River delta. The annual mean anthropogenic heat fluxes for Beijing, Shanghai and Guangzhou in 2010 were 17, 19 and 7.8 Wm−2, respectively. Comparisons with previous studies indicate that DMSP-OLS data provide a better spatial proxy for estimating anthropogenic heat than population density, and our analysis shows better performance at large scales for the estimation of anthropogenic heat.
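    A hedged sketch of the kind of allocation the abstract describes: a regional energy-consumption total is spread over pixels in proportion to DMSP-OLS night-light intensity and converted to an annual-mean heat flux. The totals, the stand-in DN grid and the strictly proportional allocation rule are illustrative assumptions rather than the study's calibration.

```python
import numpy as np

# Assumed inputs: annual energy consumption for one region (joules) and a
# DMSP-OLS night-time lights grid clipped to that region (digital numbers).
annual_energy_j = 3.0e18                 # illustrative value, not measured data
dn = np.random.randint(0, 64, size=(200, 200)).astype(float)  # stand-in DN grid
pixel_area_m2 = 1000.0 * 1000.0          # ~1 km grid cells
seconds_per_year = 365.25 * 24 * 3600

# Allocate the regional total to pixels in proportion to light intensity,
# then convert each pixel's share to an annual-mean heat flux in W m-2.
weights = dn / dn.sum()
pixel_energy_j = annual_energy_j * weights
ahf_w_m2 = pixel_energy_j / (pixel_area_m2 * seconds_per_year)

print(float(ahf_w_m2.max()), float(ahf_w_m2.mean()))
```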

  4. Estimation of improved resolution soil moisture in vegetated areas using passive AMSR-E data

    Science.gov (United States)

    Moradizadeh, Mina; Saradjian, Mohammad R.

    2018-03-01

    Microwave remote sensing provides a unique capability for soil parameter retrievals. Therefore, various soil parameter estimation models have been developed using brightness temperature (BT) measured by passive microwave sensors. Due to the low resolution of satellite microwave radiometer data, the main goal of this study is to develop a downscaling approach to improve the spatial resolution of soil moisture estimates with the use of higher resolution visible/infrared sensor data. Accordingly, after the soil parameters have been obtained using the Simultaneous Land Parameters Retrieval Model algorithm, the downscaling method was applied to the soil moisture estimates, which were validated against in situ soil moisture data. Advanced Microwave Scanning Radiometer-EOS BT data in the Soil Moisture Experiment 2003 region in the south and north of Oklahoma have been used to this end. Results illustrated that the soil moisture variability is effectively captured at 5 km spatial scales without a significant degradation of the accuracy.

  5. Improving axial resolution in confocal microscopy with new high refractive index mounting media.

    Science.gov (United States)

    Fouquet, Coralie; Gilles, Jean-François; Heck, Nicolas; Dos Santos, Marc; Schwartzmann, Richard; Cannaya, Vidjeacoumary; Morel, Marie-Pierre; Davidson, Robert Stephen; Trembleau, Alain; Bolte, Susanne

    2015-01-01

    Resolution, high signal intensity and elevated signal to noise ratio (SNR) are key issues for biologists who aim at studying the localisation of biological structures at the cellular and subcellular levels using confocal microscopy. The resolution required to separate sub-cellular biological structures is often near to the resolving power of the microscope. When optimally used, confocal microscopes may reach resolutions of 180 nm laterally and 500 nm axially, however, axial resolution in depth is often impaired by spherical aberration that may occur due to refractive index mismatches. Spherical aberration results in broadening of the point-spread function (PSF), a decrease in peak signal intensity when imaging in depth and a focal shift that leads to the distortion of the image along the z-axis and thus in a scaling error. In this study, we use the novel mounting medium CFM3 (Citifluor Ltd., UK) with a refractive index of 1.518 to minimize the effects of spherical aberration. This mounting medium is compatible with most common fluorochromes and fluorescent proteins. We compare its performance with established mounting media, harbouring refractive indices below 1.500, by estimating lateral and axial resolution with sub-resolution fluorescent beads. We show furthermore that the use of the high refractive index media renders the tissue transparent and improves considerably the axial resolution and imaging depth in immuno-labelled or fluorescent protein labelled fixed mouse brain tissue. We thus propose to use those novel high refractive index mounting media, whenever optimal axial resolution is required.

  6. Improving axial resolution in confocal microscopy with new high refractive index mounting media.

    Directory of Open Access Journals (Sweden)

    Coralie Fouquet

    Full Text Available Resolution, high signal intensity and elevated signal to noise ratio (SNR) are key issues for biologists who aim at studying the localisation of biological structures at the cellular and subcellular levels using confocal microscopy. The resolution required to separate sub-cellular biological structures is often near to the resolving power of the microscope. When optimally used, confocal microscopes may reach resolutions of 180 nm laterally and 500 nm axially, however, axial resolution in depth is often impaired by spherical aberration that may occur due to refractive index mismatches. Spherical aberration results in broadening of the point-spread function (PSF), a decrease in peak signal intensity when imaging in depth and a focal shift that leads to the distortion of the image along the z-axis and thus in a scaling error. In this study, we use the novel mounting medium CFM3 (Citifluor Ltd., UK) with a refractive index of 1.518 to minimize the effects of spherical aberration. This mounting medium is compatible with most common fluorochromes and fluorescent proteins. We compare its performance with established mounting media, harbouring refractive indices below 1.500, by estimating lateral and axial resolution with sub-resolution fluorescent beads. We show furthermore that the use of the high refractive index media renders the tissue transparent and improves considerably the axial resolution and imaging depth in immuno-labelled or fluorescent protein labelled fixed mouse brain tissue. We thus propose to use those novel high refractive index mounting media, whenever optimal axial resolution is required.

  7. High Resolution Elevation Contours

    Data.gov (United States)

    Minnesota Department of Natural Resources — This dataset contains contours generated from high resolution data sources such as LiDAR. Generally speaking, these data have a contour interval of 2 feet or less.

  8. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ying-Xu [Department of Chemistry, University of Bergen, PO Box 7803, N-5020 Bergen (Norway); Mjøs, Svein Are, E-mail: svein.mjos@kj.uib.no [Department of Chemistry, University of Bergen, PO Box 7803, N-5020 Bergen (Norway); David, Fabrice P.A. [Bioinformatics and Biostatistics Core Facility, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL) and Swiss Institute of Bioinformatics (SIB), Lausanne (Switzerland); Schmid, Adrien W. [Proteomics Core Facility, Ecole Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne (Switzerland)

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. - Highlights: • A flexible strategy for analyzing MS and LC-MS data of lipid molecules is proposed. • Isotope distribution spectra of theoretically possible compounds were generated. • High resolution MS and LC-MS data were resolved by least squares spectral resolution. • The method proposed compounds that are likely to occur in the analyzed samples. • The proposed compounds matched results from manual interpretation of fragment spectra.
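    The core quantification step can be pictured as a constrained least squares fit of the observed spectrum against theoretical isotope-distribution templates. The sketch below uses synthetic templates and non-negative least squares as a stand-in; the m/z grid, peak shapes and species masses are placeholders, not part of the published algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# Theoretical isotope distributions (columns) for three hypothetical lipid
# species, each defined on a common m/z grid; values are placeholders.
mz = np.arange(760.0, 770.0, 0.5)

def isotope_template(mono_mz, abundances=(1.0, 0.55, 0.18, 0.04)):
    """Gaussian peaks at the monoisotopic mass and +1, +2, +3 isotopologues."""
    t = np.zeros_like(mz)
    for k, a in enumerate(abundances):
        t += a * np.exp(-0.5 * ((mz - (mono_mz + k * 1.003)) / 0.2) ** 2)
    return t / t.sum()

library = np.column_stack([isotope_template(m) for m in (760.5, 762.5, 764.5)])

# Synthetic observed spectrum: a mixture of the three species plus noise.
true_amounts = np.array([10.0, 3.0, 0.0])
observed = library @ true_amounts + 0.01 * np.random.rand(len(mz))

# Least squares spectral resolution with a non-negativity constraint:
# the fitted amounts quantify each candidate species in the mixture.
amounts, residual = nnls(library, observed)
print(amounts, residual)
```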

  9. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data

    International Nuclear Information System (INIS)

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P.A.; Schmid, Adrien W.

    2016-01-01

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. - Highlights: • A flexible strategy for analyzing MS and LC-MS data of lipid molecules is proposed. • Isotope distribution spectra of theoretically possible compounds were generated. • High resolution MS and LC-MS data were resolved by least squares spectral resolution. • The method proposed compounds that are likely to occur in the analyzed samples. • The proposed compounds matched results from manual interpretation of fragment spectra.

  10. Ultra-high resolution coded wavefront sensor

    KAUST Repository

    Wang, Congli

    2017-06-08

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  11. Feasibility of high temporal resolution breast DCE-MRI using compressed sensing theory.

    Science.gov (United States)

    Wang, Haoyu; Miao, Yanwei; Zhou, Kun; Yu, Yanming; Bao, Shanglian; He, Qiang; Dai, Yongming; Xuan, Stephanie Y; Tarabishy, Bisher; Ye, Yongquan; Hu, Jiani

    2010-09-01

    To investigate the feasibility of high temporal resolution breast DCE-MRI using compressed sensing theory, two experiments were designed to evaluate the reference image based compressed sensing (RICS) technique in DCE-MRI of the breast. The first experiment examined the capability of RICS to faithfully reconstruct uptake curves using undersampled data sets extracted from fully sampled clinical breast DCE-MRI data. An average approach and an approach using motion estimation and motion compensation (ME/MC) were implemented to obtain reference images and to evaluate their efficacy in reducing motion related effects. The second experiment, an in vitro phantom study, tested the feasibility of RICS for improving temporal resolution without degrading the spatial resolution. For the uptake-curve reconstruction experiment, there was a high correlation between uptake curves reconstructed from fully sampled data by Fourier transform and from undersampled data by RICS, indicating high similarity between them. The mean Pearson correlation coefficients for RICS with the ME/MC approach and RICS with the average approach were 0.977 +/- 0.023 and 0.953 +/- 0.031, respectively. The comparisons of final reconstruction results between RICS with the average approach and RICS with the ME/MC approach suggested that the latter was superior to the former in reducing motion related effects. For the in vitro experiment, compared to the fully sampled method, RICS improved the temporal resolution by an acceleration factor of 10 without degrading the spatial resolution. The preliminary study demonstrates the feasibility of RICS for faithfully reconstructing uptake curves and improving temporal resolution of breast DCE-MRI without degrading the spatial resolution.

  12. Limitations to depth resolution in high-energy, heavy-ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Elliman, R.G.; Palmer, G.R.; Ophel, T.R.; Timmers, H.

    1998-01-01

    The depth resolution of heavy-ion elastic recoil detection analysis was examined for Al and Co thin films ranging in thickness from 100 to 400 nm. Measurements were performed with 154 MeV Au ions as the incident beam, and recoils were detected using a gas ionisation detector. Energy spectra were extracted for the Al and Co recoils and the depth resolution determined as a function of film thickness from the width of the high- and low-energy edges. These results were compared with theoretical estimates calculated using the computer program DEPTH. (authors)

  13. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    Science.gov (United States)

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours and the authorities therefore require support to develop an appropriate health prevention strategy. In this paper, we use geostatistical tools to elaborate a definition that accounts for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by using external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained by applying common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to an extent that is deemed satisfactory. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative definition of radon-prone areas than the one previously made by lognormal kriging. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Geostatistical Spatio-Temporal Model of Crime in El Salvador: Structural and Predictive Analysis

    Directory of Open Access Journals (Sweden)

    Welman Rosa Alvarado

    2011-07-01

    Full Text Available Today, studying geospatial and spatio-temporal phenomena requires statistical tools that enable the analysis of dependency in space, in time and in their interactions. The science that studies this kind of problem is geostatistics, whose goal is to predict spatial phenomena. It is considered the basis for modeling phenomena that involve interactions between space and time. In the past 10 years, geostatistics has seen great development in areas such as geology, soil science, remote sensing, epidemiology, agriculture, ecology and economics. In this research, geostatistics is applied to build a predictive map of crime in El Salvador; the joint variability of space and time is studied to generate crime scenarios: crime hot spots are determined and crime-vulnerable groups are identified, in order to improve policy decisions and inform decision makers about insecurity in the country.

  15. Novel Geometrical Concept of a High Performance Brain PET Scanner Principle, Design and Performance Estimates

    CERN Document Server

    Séguinot, Jacques; Chesi, Enrico Guido; Joram, C; Mathot, S; Weilhammer, P; Chamizo-Llatas, M; Correia, J G; Ribeiro da Silva, M; Garibaldi, F; De Leo, R; Nappi, E; Corsi, F; Dragone, A; Schoenahl, F; Zaidi, H

    2006-01-01

    We present the principle, a possible implementation and performance estimates of a novel geometrical concept for a high resolution positron emission tomograph. The concept, which can for example be implemented in a brain PET device, promises to lead to an essentially parallax-free 3D image reconstruction with excellent spatial resolution and contrast, uniform over the complete field of view. The key components are matrices of long, axially oriented scintillator crystals which are read out at both extremities by segmented Hybrid Photon Detectors. We discuss the relevant design considerations for a 3D axial PET camera module, motivate parameter and material choices, and estimate its performance in terms of spatial and energy resolution. We support these estimates by Monte Carlo simulations and in some cases by first experimental results. From the performance of a camera module, we extrapolate to the reconstruction resolution of a 3D axial PET scanner in a semi-analytical way and compare it to an existing state...

  16. High resolution data acquisition

    Science.gov (United States)

    Thornton, Glenn W.; Fuller, Kenneth R.

    1993-01-01

    A high resolution event interval timing system measures short time intervals such as occur in high energy physics or laser ranging. Timing is provided from a clock (38) pulse train (37) and analog circuitry (44) for generating a triangular wave (46) synchronously with the pulse train (37). The triangular wave (46) has an amplitude and slope functionally related to the time elapsed during each clock pulse in the train. A converter (18, 32) forms a first digital value of the amplitude and slope of the triangle wave at the start of the event interval and a second digital value of the amplitude and slope of the triangle wave at the end of the event interval. A counter (26) counts the clock pulse train (37) during the interval to form a gross event interval time. A computer (52) then combines the gross event interval time and the first and second digital values to output a high resolution value for the event interval.
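    A simplified illustration of how a gross clock count and the two digitized triangle-wave samples might be combined into one high-resolution interval; the mapping from amplitude and slope sign to a sub-cycle phase fraction below is an assumption for demonstration, not the instrument's actual calibration.

```python
# Illustrative reconstruction of an event interval from a coarse clock count
# plus sub-cycle fractions derived from the triangular-wave samples.

CLOCK_PERIOD_NS = 10.0        # e.g. a 100 MHz clock
V_MIN, V_MAX = -1.0, 1.0      # assumed triangular-wave excursion

def phase_fraction(amplitude: float, slope_sign: int) -> float:
    """Map a sampled triangle amplitude and slope sign to a fraction (0..1)
    of the clock period elapsed within the current cycle (assumed mapping)."""
    x = (amplitude - V_MIN) / (V_MAX - V_MIN)       # 0..1 along one ramp
    # rising half covers the first half-period, falling half the second
    return 0.5 * x if slope_sign > 0 else 0.5 + 0.5 * (1.0 - x)

def event_interval_ns(gross_counts: int,
                      start_amp: float, start_slope: int,
                      stop_amp: float, stop_slope: int) -> float:
    """Combine the gross counter value with start/stop phase fractions."""
    frac = phase_fraction(stop_amp, stop_slope) - phase_fraction(start_amp, start_slope)
    return (gross_counts + frac) * CLOCK_PERIOD_NS

# Example: 1234 whole clock cycles plus fine corrections from the two samples
print(event_interval_ns(1234, start_amp=0.2, start_slope=+1,
                        stop_amp=-0.6, stop_slope=-1))
```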

  17. High resolution time integration for Sn radiation transport

    International Nuclear Information System (INIS)

    Thoreson, Greg; McClarren, Ryan G.; Chang, Jae H.

    2008-01-01

    First order, second order and high resolution time discretization schemes are implemented and studied for the Sn equations. The high resolution method achieves a rate of convergence better than first order, but also suppresses the artificial oscillations introduced by second order schemes in hyperbolic differential equations. All three methods were compared for accuracy and convergence rates. For non-absorbing problems, both the second order and high resolution methods converged to the same solution as the first order method, with better convergence rates. High resolution is more accurate than first order and matches or exceeds the second order method. (authors)
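    To make the idea of a high-resolution (limited) scheme concrete, here is a generic sketch for 1-D linear advection rather than the Sn transport equations themselves: a minmod-limited MUSCL update that retains better-than-first-order accuracy in smooth regions while suppressing the oscillations an unlimited second-order scheme would produce. The grid, CFL number and test pulse are illustrative.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, otherwise the smaller slope."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def advect_high_resolution(u, c, nsteps):
    """High-resolution (MUSCL-type) update of du/dt + a du/dx = 0 on a
    periodic unit-spaced grid; c is the CFL number (0 < c <= 1)."""
    u = u.astype(float).copy()
    for _ in range(nsteps):
        # limited slope in each cell
        slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
        # limited second-order interface values for flow in the +x direction
        u_face = u + 0.5 * (1.0 - c) * slope
        flux = c * u_face                       # flux through interface i+1/2
        u = u - (flux - np.roll(flux, 1))       # flux difference across each cell
    return u

# Square pulse advected 40 steps: edges stay sharp without over/undershoot
u0 = np.zeros(100)
u0[40:60] = 1.0
u1 = advect_high_resolution(u0, c=0.5, nsteps=40)
print(float(u1.min()), float(u1.max()))   # stays within [0, 1]
```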

  18. Improved accuracy of quantitative parameter estimates in dynamic contrast-enhanced CT study with low temporal resolution

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sun Mo, E-mail: Sunmo.Kim@rmp.uhn.on.ca [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9 (Canada); Haider, Masoom A. [Department of Medical Imaging, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5, Canada and Department of Medical Imaging, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Jaffray, David A. [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9, Canada and Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Yeung, Ivan W. T. [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9 (Canada); Department of Medical Physics, Stronach Regional Cancer Centre, Southlake Regional Health Centre, Newmarket, Ontario L3Y 2P9 (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9 (Canada)

    2016-01-15

    Purpose: A previously proposed method to reduce radiation dose to patient in dynamic contrast-enhanced (DCE) CT is enhanced by principal component analysis (PCA) filtering which improves the signal-to-noise ratio (SNR) of time-concentration curves in the DCE-CT study. The efficacy of the combined method to maintain the accuracy of kinetic parameter estimates at low temporal resolution is investigated with pixel-by-pixel kinetic analysis of DCE-CT data. Methods: The method is based on DCE-CT scanning performed with low temporal resolution to reduce the radiation dose to the patient. The arterial input function (AIF) with high temporal resolution can be generated with a coarsely sampled AIF through a previously published method of AIF estimation. To increase the SNR of time-concentration curves (tissue curves), first, a region-of-interest is segmented into squares composed of 3 × 3 pixels in size. Subsequently, the PCA filtering combined with a fraction of residual information criterion is applied to all the segmented squares for further improvement of their SNRs. The proposed method was applied to each DCE-CT data set of a cohort of 14 patients at varying levels of down-sampling. The kinetic analyses using the modified Tofts’ model and singular value decomposition method, then, were carried out for each of the down-sampling schemes between the intervals from 2 to 15 s. The results were compared with analyses done with the measured data in high temporal resolution (i.e., original scanning frequency) as the reference. Results: The patients’ AIFs were estimated to high accuracy based on the 11 orthonormal bases of arterial impulse responses established in the previous paper. In addition, noise in the images was effectively reduced by using five principal components of the tissue curves for filtering. Kinetic analyses using the proposed method showed superior results compared to those with down-sampling alone; they were able to maintain the accuracy in the
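    The PCA filtering step described above can be pictured as projecting the curves of a 3 × 3 segment onto their leading principal components and reconstructing; the sketch below keeps five components, as in the abstract, but the data shapes, enhancement curve and noise level are illustrative assumptions.

```python
import numpy as np

def pca_filter_curves(curves, n_components=5):
    """Denoise a set of time-concentration curves (n_curves x n_time) by
    projecting them onto their first principal components and reconstructing."""
    mean = curves.mean(axis=0, keepdims=True)
    centered = curves - mean
    # SVD of the centered data; rows of Vt are the principal directions
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    k = min(n_components, len(s))
    return mean + (U[:, :k] * s[:k]) @ Vt[:k, :]

# Illustrative use: 9 pixels of a 3x3 segment, 30 time points each, sharing
# one enhancement curve corrupted by independent noise.
t = np.linspace(0.0, 5.0, 30)
clean = 1.0 - np.exp(-1.5 * t)
curves = clean[None, :] + 0.1 * np.random.randn(9, 30)
filtered = pca_filter_curves(curves, n_components=5)
# filtering should usually bring the curves closer to the shared signal
print(np.abs(filtered - clean).mean() < np.abs(curves - clean).mean())
```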

  19. Estimation of Diurnal Cycle of Land Surface Temperature at High Temporal and Spatial Resolution from Clear-Sky MODIS Data

    Directory of Open Access Journals (Sweden)

    Si-Bo Duan

    2014-04-01

    Full Text Available The diurnal cycle of land surface temperature (LST is an important element of the climate system. Geostationary satellites can provide the diurnal cycle of LST with low spatial resolution and incomplete global coverage, which limits its applications in some studies. In this study, we propose a method to estimate the diurnal cycle of LST at high temporal and spatial resolution from clear-sky MODIS data. This method was evaluated using the MSG-SEVIRI-derived LSTs. The results indicate that this method fits the diurnal cycle of LST well, with root mean square error (RMSE values less than 1 K for most pixels. Because MODIS provides at most four observations per day at a given location, this method was further evaluated using only four MSG-SEVIRI-derived LSTs corresponding to the MODIS overpass times (10:30, 13:30, 22:30, and 01:30 local solar time. The results show that the RMSE values using only four MSG-SEVIRI-derived LSTs are approximately two times larger than those using all LSTs. The spatial distribution of the modeled LSTs at the MODIS pixel scale is presented from 07:00 to 05:00 local solar time of the next day with an increment of 2 hours. The diurnal cycle of the modeled LSTs describes the temporal evolution of the LSTs at the MODIS pixel scale.

  20. Structure of high-resolution NMR spectra

    CERN Document Server

    Corio, PL

    2012-01-01

    Structure of High-Resolution NMR Spectra provides the principles, theories, and mathematical and physical concepts of high-resolution nuclear magnetic resonance spectra.The book presents the elementary theory of magnetic resonance; the quantum mechanical theory of angular momentum; the general theory of steady state spectra; and multiple quantum transitions, double resonance and spin echo experiments.Physicists, chemists, and researchers will find the book a valuable reference text.

  1. Estimation of excess mortality due to long-term exposure to PM2.5 in Japan using a high-resolution model for present and future scenarios

    Science.gov (United States)

    Goto, Daisuke; Ueda, Kayo; Ng, Chris Fook Sheng; Takami, Akinori; Ariga, Toshinori; Matsuhashi, Keisuke; Nakajima, Teruyuki

    2016-09-01

    Particulate matter with a diameter of less than 2.5 μm, known as PM2.5, can affect human health, especially in elderly people. Because of the imminent aging of society in the near future in most developed countries, the human health impacts of PM2.5 must be evaluated. In this study, we used a global-to-regional atmospheric transport model to simulate PM2.5 in Japan with a high-resolution stretched grid system (∼10 km for the high-resolution model, HRM) for the present (the year 2000) and the future (the year 2030, as prescribed by Representative Concentration Pathway 4.5, RCP4.5). We also used the same model with a low-resolution uniform grid system (∼100 km for the low-resolution model, LRM). These calculations were conducted by nudging meteorological fields obtained from an atmosphere-ocean coupled model and providing the emission inventories used in the coupled model. After correcting for bias, we calculated the excess mortality due to long-term exposure to PM2.5 among the elderly (over 65 years old) at different minimum PM2.5 concentration (MINPM) levels to account for uncertainty, using the simulated PM2.5 distributions in a concentration-response function. As a result, we estimated the excess mortality for all of Japan to be 31,300 (95% confidence interval: 20,700 to 42,600) people in 2000 and 28,600 (95% confidence interval: 19,000 to 38,700) people in 2030 using the HRM with a MINPM of 5.8 μg/m3. In contrast, the LRM resulted in underestimates of approximately 30% (for PM2.5 concentrations in 2000 and 2030), approximately 60% (excess mortality in 2000) and approximately 90% (excess mortality in 2030) compared to the HRM results. We also found that the uncertainty in the MINPM value, especially for low PM2.5 concentrations in the future (2030), can cause large variability in the estimates, ranging from 0 (MINPM of 15 μg/m3 in both HRM and LRM) to 95,000 (MINPM of 0 μg/m3 in HRM) people.
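    One common way to express such a concentration-response calculation is a log-linear relative-risk function applied above the minimum concentration, converted to an attributable fraction and multiplied by baseline deaths. The sketch below uses that generic form with placeholder coefficients and populations; it is not the study's actual function or data.

```python
import numpy as np

def excess_mortality(pm25, population, baseline_rate, beta=0.01, min_pm=5.8):
    """Attributable deaths per grid cell from long-term PM2.5 exposure,
    using a log-linear relative-risk function above a minimum concentration.
    beta, min_pm and all inputs here are illustrative placeholders."""
    delta = np.maximum(pm25 - min_pm, 0.0)
    rr = np.exp(beta * delta)                  # relative risk
    af = (rr - 1.0) / rr                       # attributable fraction
    return af * baseline_rate * population     # excess deaths per cell

# Toy grid: PM2.5 (ug/m3), elderly population and baseline mortality per cell
pm25 = np.array([8.0, 15.0, 25.0])
elderly_pop = np.array([50_000, 120_000, 300_000])
baseline_rate = 0.02                           # deaths per person per year (assumed)
print(excess_mortality(pm25, elderly_pop, baseline_rate).sum())
```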

  2. Geostatistical evaluation of travel time uncertainties

    International Nuclear Information System (INIS)

    Devary, J.L.

    1983-08-01

    Data on potentiometric head and hydraulic conductivity, gathered from the Wolfcamp Formation of the Permian System, have exhibited tremendous spatial variability as a result of heterogeneities in the media and the presence of petroleum and natural gas deposits. Geostatistical data analysis and error propagation techniques (kriging and conditional simulation) were applied to determine the effect of potentiometric head uncertainties on radionuclide travel paths and travel times through the Wolfcamp Formation. Block-average kriging was utilized to remove measurement error from potentiometric head data. The travel time calculations have been enhanced by the use of an inverse technique to determine the relative hydraulic conductivity along travel paths. In this way, the spatial variability of the hydraulic conductivity corresponding to streamline convergence and divergence may be included in the analysis. 22 references, 11 figures, 1 table

  3. Ordinal Regression Based Subpixel Shift Estimation for Video Super-Resolution

    Directory of Open Access Journals (Sweden)

    Petrovic Nemanja

    2007-01-01

    Full Text Available We present a supervised learning-based approach for subpixel motion estimation which is then used to perform video super-resolution. The novelty of this work is the formulation of the problem of subpixel motion estimation in a ranking framework. The ranking formulation is a variant of the classification and regression formulations, in which the ordering present in the class labels, namely the shift between patches, is explicitly taken into account. Finally, we demonstrate the applicability of our approach to super-resolving synthetically generated images with global subpixel shifts and enhancing real video frames by accounting for both local integer and subpixel shifts.

  4. High-resolution multi-slice PET

    International Nuclear Information System (INIS)

    Yasillo, N.J.; Chintu Chen; Ordonez, C.E.; Kapp, O.H.; Sosnowski, J.; Beck, R.N.

    1992-01-01

    This report evaluates progress in testing the feasibility and initiating the design of a high-resolution multi-slice PET system. The following specific areas were evaluated: detector development and testing; electronics configuration and design; mechanical design; and system simulation. The design and construction of a multiple-slice, high-resolution positron tomograph will provide substantial improvements in the accuracy and reproducibility of measurements of the distribution of activity concentrations in the brain. The range of functional brain research and our understanding of local brain function will be greatly extended when the development of this instrumentation is completed

  5. High resolution NMR spectroscopy of synthetic polymers in bulk

    International Nuclear Information System (INIS)

    Komorski, R.A.

    1986-01-01

    The contents of this book are: Overview of high-resolution NMR of solid polymers; High-resolution NMR of glassy amorphous polymers; Carbon-13 solid-state NMR of semicrystalline polymers; Conformational analysis of polymers of solid-state NMR; High-resolution NMR studies of oriented polymers; High-resolution solid-state NMR of protons in polymers; and Deuterium NMR of solid polymers. This work brings together the various approaches for high-resolution NMR studies of bulk polymers into one volume. Heavy emphasis is, of course, given to 13C NMR studies both above and below Tg. Standard high-power pulse and wide-line techniques are not covered

  6. High resolution integral holography using Fourier ptychographic approach.

    Science.gov (United States)

    Li, Zhaohui; Zhang, Jianqi; Wang, Xiaorui; Liu, Delian

    2014-12-29

    An innovative approach is proposed for calculating high resolution computer generated integral holograms by using the Fourier Ptychographic (FP) algorithm. The approach initializes a high resolution complex hologram with a random guess, and then stitches together low resolution multi-view images, synthesized from the elemental images captured by integral imaging (II), to recover the high resolution hologram through an iterative retrieval with FP constraints. This paper begins with an analysis of the principle of hologram synthesis from multi-projections, followed by an accurate determination of the constraints required in the Fourier ptychographic integral-holography (FPIH). Next, the procedure of the approach is described in detail. Finally, optical reconstructions are performed and the results are demonstrated. Theoretical analysis and experiments show that our proposed approach can reconstruct 3D scenes with high resolution.

  7. Towards high resolution mapping of 3-D mesoscale dynamics from observations

    Directory of Open Access Journals (Sweden)

    B. Buongiorno Nardelli

    2012-10-01

    Full Text Available The MyOcean R&D project MESCLA (MEsoSCaLe dynamical Analysis through combined model, satellite and in situ data) was devoted to the high resolution 3-D retrieval of tracer and velocity fields in the oceans, based on the combination of in situ and satellite observations and quasi-geostrophic dynamical models. The retrieval techniques were also tested and compared with the output of a primitive equation model, with particular attention to the accuracy of the vertical velocity field as estimated through the Q vector formulation of the omega equation. The project focused on a test case, covering the region where the Gulf Stream separates from the US East Coast. This work demonstrated that innovative methods for the high resolution mapping of 3-D mesoscale dynamics from observations can be used to build the next generations of operational observation-based products.

  8. Test of a high resolution drift chamber prototype

    International Nuclear Information System (INIS)

    Comminchau, V.; Deutschmann, M.; Draheim, K.J.; Fritze, P.; Hangarter, K.; Hawelka, P.; Herten, U.; Tonutti, M.; Anderhub, H.; Fehlmann, J.; Hofer, H.; Klein, M.; Paradiso, J.A.; Schreiber, J.; Viertel, G.

    1984-06-01

    The performance of a drift chamber prototype for a colliding beam vertex detector in a test beam at DESY is described. At one (two) atmosphere gas pressure a spatial resolution of 40 μm (30 μm) per wire for one cm drift length was achieved with a 100 MHz Flash-ADC system. An excellent double track resolution of better than 300 μm over the full drift length of 5 cm can be estimated. (orig.)

  9. Energetics of small scale turbulence in the lower stratosphere from high resolution radar measurements

    Directory of Open Access Journals (Sweden)

    J. Dole

    2001-08-01

    Full Text Available Very high resolution radar measurements were performed in the troposphere and lower stratosphere by means of the PROUST radar. The PROUST radar operates in the UHF band (961 MHz) and is located in St. Santin, France (44°39’ N, 2°12’ E). A field campaign involving high resolution balloon measurements and the PROUST radar was conducted during April 1998. Under the classical hypothesis that refractive index inhomogeneities at half radar wavelength lie within the inertial subrange, assumed to be isotropic, kinetic energy and temperature variance dissipation rates were estimated independently in the lower stratosphere. The dissipation rate of temperature variance is proportional to the dissipation rate of available potential energy. We therefore estimate the ratio of dissipation rates of potential to kinetic energy. This ratio is a key parameter of atmospheric turbulence which, in locally homogeneous and stationary conditions, is simply related to the flux Richardson number, Rf. Key words: Meteorology and atmospheric dynamics (turbulence); Radio science (remote sensing)

  10. High-spatial resolution and high-spectral resolution detector for use in the measurement of solar flare hard x rays

    International Nuclear Information System (INIS)

    Desai, U.D.; Orwig, L.E.

    1988-01-01

    In the areas of high spatial resolution, the evaluation of a hard X-ray detector with 65 micron spatial resolution for operation in the energy range from 30 to 400 keV is proposed. The basic detector is a thick large-area scintillator faceplate, composed of a matrix of high-density scintillating glass fibers, attached to a proximity type image intensifier tube with a resistive-anode digital readout system. Such a detector, combined with a coded-aperture mask, would be ideal for use as a modest-sized hard X-ray imaging instrument up to X-ray energies as high as several hundred keV. As an integral part of this study it was also proposed that several techniques be critically evaluated for X-ray image coding which could be used with this detector. In the area of high spectral resolution, it is proposed to evaluate two different types of detectors for use as X-ray spectrometers for solar flares: planar silicon detectors and high-purity germanium detectors (HPGe). Instruments utilizing these high-spatial-resolution detectors for hard X-ray imaging measurements from 30 to 400 keV and high-spectral-resolution detectors for measurements over a similar energy range would be ideally suited for making crucial solar flare observations during the upcoming maximum in the solar cycle

  11. High-resolution mapping of motor vehicle carbon dioxide emissions

    Science.gov (United States)

    McDonald, Brian C.; McBride, Zoe C.; Martin, Elliot W.; Harley, Robert A.

    2014-05-01

    A fuel-based inventory for vehicle emissions is presented for carbon dioxide (CO2) and mapped at various spatial resolutions (10 km, 4 km, 1 km, and 500 m) using fuel sales and traffic count data. The mapping is done separately for gasoline-powered vehicles and heavy-duty diesel trucks. Emission estimates from this study are compared with the Emissions Database for Global Atmospheric Research (EDGAR) and VULCAN. All three inventories agree at the national level within 5%. EDGAR uses road density as a surrogate to apportion vehicle emissions, which leads to 20-80% overestimates of on-road CO2 emissions in the largest U.S. cities. High-resolution emission maps are presented for Los Angeles, New York City, San Francisco-San Jose, Houston, and Dallas-Fort Worth. Sharp emission gradients that exist near major highways are not apparent when emissions are mapped at 10 km resolution. High CO2 emission fluxes over highways become apparent at grid resolutions of 1 km and finer. Temporal variations in vehicle emissions are characterized using extensive day- and time-specific traffic count data and are described over diurnal, day of week, and seasonal time scales. Clear differences are observed when comparing light- and heavy-duty vehicle traffic patterns and comparing urban and rural areas. Decadal emission trends were analyzed from 2000 to 2007 when traffic volumes were increasing and a more recent period (2007-2010) when traffic volumes declined due to recession. We found large nonuniform changes in on-road CO2 emissions over a period of 5 years, highlighting the importance of timely updates to motor vehicle emission inventories.
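    A schematic version of the fuel-based mapping idea: regional fuel sales are converted to CO2 with an emission factor and apportioned to grid cells in proportion to vehicle activity, separately for gasoline vehicles and diesel trucks. The sales figures, emission factors and stand-in traffic grids are illustrative assumptions, not the inventory's actual inputs.

```python
import numpy as np

# Illustrative regional fuel sales (litres/year) and CO2 emission factors (kg/litre)
fuel_sales = {"gasoline": 2.0e9, "diesel": 6.0e8}
emission_factor = {"gasoline": 2.3, "diesel": 2.7}     # assumed values

# Stand-in traffic-activity grids (vehicle-km per cell) for each vehicle class
light_duty_vkt = np.random.rand(50, 50)
heavy_duty_vkt = np.random.rand(50, 50)

def apportion(total_kg, activity_grid):
    """Spread a regional emission total over grid cells in proportion
    to the vehicle activity observed in each cell."""
    return total_kg * activity_grid / activity_grid.sum()

co2_gasoline = apportion(fuel_sales["gasoline"] * emission_factor["gasoline"],
                         light_duty_vkt)
co2_diesel = apportion(fuel_sales["diesel"] * emission_factor["diesel"],
                       heavy_duty_vkt)
co2_total = co2_gasoline + co2_diesel       # kg CO2 per grid cell per year
print(float(co2_total.sum()))
```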

  12. Spatial and temporal distribution of soil-transmitted helminth infection in sub-Saharan Africa: a systematic review and geostatistical meta-analysis.

    Science.gov (United States)

    Karagiannis-Voules, Dimitrios-Alexios; Biedermann, Patricia; Ekpo, Uwem F; Garba, Amadou; Langer, Erika; Mathieu, Els; Midzi, Nicholas; Mwinzi, Pauline; Polderman, Anton M; Raso, Giovanna; Sacko, Moussa; Talla, Idrissa; Tchuenté, Louis-Albert Tchuem; Touré, Seydou; Winkler, Mirko S; Utzinger, Jürg; Vounatsou, Penelope

    2015-01-01

    Interest is growing in predictive risk mapping for neglected tropical diseases (NTDs), particularly to scale up preventive chemotherapy, surveillance, and elimination efforts. Soil-transmitted helminths (hookworm, Ascaris lumbricoides, and Trichuris trichiura) are the most widespread NTDs, but broad geographical analyses are scarce. We aimed to predict the spatial and temporal distribution of soil-transmitted helminth infections, including the number of infected people and treatment needs, across sub-Saharan Africa. We systematically searched PubMed, Web of Knowledge, and African Journal Online from inception to Dec 31, 2013, without language restrictions, to identify georeferenced surveys. We extracted data from household surveys on sources of drinking water, sanitation, and women's level of education. Bayesian geostatistical models were used to align the data in space and estimate the risk of infection with hookworm, A lumbricoides, and T trichiura over a grid of roughly 1 million pixels at a spatial resolution of 5 × 5 km. We calculated anthelmintic treatment needs on the basis of WHO guidelines (treatment of all school-aged children once per year where prevalence in this population is 20-50% or twice per year if prevalence is greater than 50%). We identified 459 relevant survey reports that referenced 6040 unique locations. We estimate that the prevalence of hookworm, A lumbricoides, and T trichiura among school-aged children from 2000 onwards was 16·5%, 6·6%, and 4·4%, respectively. These estimates are between 52% and 74% lower than those in surveys done before 2000, and have become similar to values for entire communities. We estimated that 126 million doses of anthelmintic treatments are required per year. Patterns of soil-transmitted helminth infection in sub-Saharan Africa have changed and the prevalence of infection has declined substantially in this millennium, probably due to socioeconomic development and large-scale deworming programmes. The global control strategy
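    The treatment-needs rule quoted from the WHO guidelines translates directly into a per-pixel dose calculation; the sketch below applies that rule to placeholder prevalence and school-aged population values (the thresholds come from the abstract, everything else is illustrative).

```python
import numpy as np

def annual_doses(prevalence, school_age_population):
    """Annual anthelmintic doses per pixel following the WHO thresholds:
    0 rounds below 20% prevalence, 1 round at 20-50%, 2 rounds above 50%."""
    rounds = np.select(
        [prevalence >= 0.5, prevalence >= 0.2],
        [2, 1],
        default=0,
    )
    return rounds * school_age_population

# Toy pixels: predicted prevalence and school-aged population per 5x5 km cell
prev = np.array([0.05, 0.25, 0.45, 0.60])
pop = np.array([4_000, 12_000, 8_000, 6_000])
print(int(annual_doses(prev, pop).sum()))   # total doses required per year
```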

  13. Estimating structure quality trends in the Protein Data Bank by equivalent resolution.

    Science.gov (United States)

    Bagaria, Anurag; Jaravine, Victor; Güntert, Peter

    2013-10-01

    The quality of protein structures obtained by different experimental and ab-initio calculation methods varies considerably. The methods have been evolving over time through improvements in both experimental designs and computational techniques, and since the primary aim of these developments is the procurement of reliable and high-quality data, better techniques have resulted on average in an evolution toward higher quality structures in the Protein Data Bank (PDB). Each method leaves a specific quantitative and qualitative "trace" in the PDB entry. Certain information relevant to one method (e.g. dynamics for NMR) may be lacking for another method. Furthermore, some standard measures of quality for one method cannot be calculated for other experimental methods, e.g. crystal resolution or NMR bundle RMSD. Consequently, structures are classified in the PDB by the method used. Here we introduce a method to estimate a measure of equivalent X-ray resolution (e-resolution), expressed in units of Å, to assess the quality of any type of monomeric, single-chain protein structure, irrespective of the experimental structure determination method. We show and compare the trends in the quality of structures in the Protein Data Bank over the last two decades for five different experimental techniques, excluding theoretical structure predictions. We observed that as new methods are introduced, they undergo a rapid method-development evolution: within several years the e-resolution score becomes similar for structures obtained from the five methods, and they improve from initially poor performance to acceptable quality, comparable with previously established methods, whose performance is essentially stable. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. High resolution photoelectron spectroscopy

    International Nuclear Information System (INIS)

    Arko, A.J.

    1988-01-01

    Photoelectron Spectroscopy (PES) covers a very broad range of measurements, disciplines, and interests. As the next generation light source, the FEL will result in improvements over the undulator that are larger than the undulator's improvements over bending magnets. The combination of high flux and high inherent resolution will result in several orders of magnitude gain in signal to noise over measurements using synchrotron-based undulators. The latter still require monochromators. Their resolution is invariably strongly energy-dependent, so that in the regions of interest for many experiments (hν > 100 eV) they will not have a resolving power much over 1000. In order to study some of the interesting phenomena in actinides (heavy fermions, e.g.) one would need resolving powers of 10⁴ to 10⁵. These values are only reachable with the FEL

  15. Modeling the potential area of occupancy at fine resolution may reduce uncertainty in species range estimates

    DEFF Research Database (Denmark)

    Jiménez-Alfaro, Borja; Draper, David; Nogues, David Bravo

    2012-01-01

    and maximum entropy modeling to assess whether different sampling (expert versus systematic surveys) may affect AOO estimates based on habitat suitability maps, and the differences between such measurements and traditional coarse-grid methods. Fine-scale models performed robustly and were not influenced...... by survey protocols, providing similar habitat suitability outputs with high spatial agreement. Model-based estimates of potential AOO were significantly smaller than AOO measures obtained from coarse-scale grids, even if the first were obtained from conservative thresholds based on the Minimal Predicted...... permit comparable measures among species. We conclude that estimates of AOO based on fine-resolution distribution models are more robust tools for risk assessment than traditional systems, allowing a better understanding of species ranges at habitat level....

  16. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    Science.gov (United States)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
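
    A much-simplified sketch of the spatially correlated Monte Carlo idea described above is given below. It performs an unconditional Gaussian simulation of a rock property on a one-dimensional transect using an exponential variogram/covariance model and a Cholesky factor; it is not the authors' sequential Gaussian simulation workflow, and the range, sill and strength statistics are assumptions chosen only for illustration.

```python
# Unconditional Gaussian simulation with an exponential covariance model.
# Variogram range/sill and UCS statistics are assumed, not from the study.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 500.0, 200)             # transect coordinates (m)
sill, vrange = 1.0, 80.0                     # variogram sill and range (assumed)

# Exponential covariance C(h) = sill * exp(-3h / range), small nugget for stability
h = np.abs(x[:, None] - x[None, :])
C = sill * np.exp(-3.0 * h / vrange) + 1e-10 * np.eye(len(x))
L = np.linalg.cholesky(C)

# Monte Carlo realizations of uniaxial compressive strength (lognormal marginal)
n_real = 100
z = L @ rng.standard_normal((len(x), n_real))   # spatially correlated N(0,1) fields
ucs = np.exp(np.log(60.0) + 0.3 * z)            # back-transform to MPa (assumed stats)
print("Mean UCS of first realizations (MPa):", ucs.mean(axis=0)[:5])
```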

  17. High-resolution noise substitution to measure overfitting and validate resolution in 3D structure determination by single particle electron cryomicroscopy.

    Science.gov (United States)

    Chen, Shaoxia; McMullan, Greg; Faruqi, Abdul R; Murshudov, Garib N; Short, Judith M; Scheres, Sjors H W; Henderson, Richard

    2013-12-01

    Three-dimensional (3D) structure determination by single particle electron cryomicroscopy (cryoEM) involves the calculation of an initial 3D model, followed by extensive iterative improvement of the orientation determination of the individual particle images and the resulting 3D map. Because there is much more noise than signal at high resolution in the images, this creates the possibility of noise reinforcement in the 3D map, which can give a false impression of the resolution attained. The balance between signal and noise in the final map at its limiting resolution depends on the image processing procedure and is not easily predicted. There is a growing awareness in the cryoEM community of how to avoid such over-fitting and over-estimation of resolution. Equally, there has been a reluctance to use the two principal methods of avoidance because they give lower resolution estimates, which some people believe are too pessimistic. Here we describe a simple test that is compatible with any image processing protocol. The test allows measurement of the amount of signal and the amount of noise from overfitting that is present in the final 3D map. We have applied the method to two different sets of cryoEM images of the enzyme beta-galactosidase using several image processing packages. Our procedure involves substituting the Fourier components of the initial particle image stack beyond a chosen resolution by either the Fourier components from an adjacent area of background, or by simple randomisation of the phases of the particle structure factors. This substituted noise thus has the same spectral power distribution as the original data. Comparison of the Fourier Shell Correlation (FSC) plots from the 3D map obtained using the experimental data with that from the same data with high-resolution noise (HR-noise) substituted allows an unambiguous measurement of the amount of overfitting and an accompanying resolution assessment. A simple formula can be used to calculate an
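
    The phase-randomisation variant of the substitution described above can be sketched as follows: Fourier components beyond a chosen resolution keep their amplitudes but receive random phases, so the substituted "noise" retains the original spectral power distribution. The cutoff fraction and the test image are placeholders, and taking the real part of the inverse transform is a simplification (Hermitian symmetry is not enforced).

```python
# Hedged sketch of high-resolution phase randomisation for a 2-D image.
import numpy as np

def randomise_high_res_phases(img: np.ndarray, cutoff_frac: float, rng=None) -> np.ndarray:
    """Randomise phases of spatial frequencies above cutoff_frac * Nyquist."""
    if rng is None:
        rng = np.random.default_rng()
    F = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    yy, xx = np.meshgrid(np.arange(ny) - ny // 2, np.arange(nx) - nx // 2, indexing="ij")
    radius = np.sqrt((yy / (ny / 2)) ** 2 + (xx / (nx / 2)) ** 2)   # 1.0 = Nyquist
    high = radius > cutoff_frac
    random_phase = np.exp(2j * np.pi * rng.random(F.shape))
    F[high] = np.abs(F[high]) * random_phase[high]                  # keep amplitude, scramble phase
    # Taking the real part is a simplification; conjugate symmetry is not enforced.
    return np.fft.ifft2(np.fft.ifftshift(F)).real

noisy = randomise_high_res_phases(np.random.rand(64, 64), cutoff_frac=0.5)
```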

  18. Enhancing interferometer phase estimation, sensing sensitivity, and resolution using robust entangled states

    Science.gov (United States)

    Smith, James F.

    2017-11-01

    With the goal of designing interferometers and interferometer sensors, e.g., LADARs, with enhanced sensitivity, resolution, and phase estimation, states using quantum entanglement are discussed. These states include N00N states, plain M and M states (PMMSs), and linear combinations of M and M states (LCMMS). Closed-form expressions are provided for the optimal detection operators; the visibility, a measure of the state's robustness to loss and noise; a resolution measure; and the phase estimate error. The optimal resolution for the maximum visibility and minimum phase error is found. For the visibility, comparisons between PMMSs, LCMMS, and N00N states are provided. For the minimum phase error, comparisons between LCMMS, PMMSs, N00N states, separate photon states (SPSs), the shot noise limit (SNL), and the Heisenberg limit (HL) are provided. A representative collection of computational results illustrating the superiority of LCMMS when compared to PMMSs and N00N states is given. It is found that for a resolution 12 times the classical result, LCMMS has visibility 11 times that of N00N states and 4 times that of PMMSs. For the same case, the minimum phase error for LCMMS is 10.7 times smaller than that of PMMSs and 29.7 times smaller than that of N00N states.

  19. High Resolution Satellite Remote Sensing of the 2013-2014 Eruption of Sinabung Volcano, Sumatra, Indonesia

    Science.gov (United States)

    Wessels, R. L.; Griswold, J. P.

    2014-12-01

    Satellite remote sensing provided timely observations of the volcanic unrest and several months-long eruption at Sinabung Volcano, Indonesia. Visible to thermal optical and synthetic aperture radar (SAR) systems provided frequent observations of Sinabung. High resolution image data with spatial resolutions from 0.5 to 1.5 m offered detailed measurements of early summit deformation and subsequent lava dome and lava flow extrusion. The high resolution data were captured by commercial satellites such as WorldView-1 and -2 visible to near-infrared (VNIR) sensors and by CosmoSkyMed, Radarsat-2, and TerraSar-X SAR systems. Less frequent 90 to 100 m spatial resolution night-time thermal infrared (TIR) observations were provided by ASTER and Landsat-8. The combination of data from multiple sensors allowed us to construct a more complete timeline of volcanic activity than was available via only ground-based observations. This satellite observation timeline documents estimates of lava volume and effusion rates and major explosive and lava collapse events. Frequent, repeat volume estimates suggest at least three high effusion rate pulses of up to 20 m³/s occurred during the first three months of lava effusion with an average effusion rate of 6 m³/s from January 2014 to August 2014. Many of these rates and events show some correlation to variations in the Real-time Seismic-Amplitude Measurement (RSAM) documented by the Indonesian Center for Volcanology and Geologic Hazard Mitigation (CVGHM).

  20. Resolution of a High Performance Cavity Beam Position Monitor System

    International Nuclear Information System (INIS)

    Walston, S.; Chung, C.; Fitsos, P.; Gronberg, J.; Ross, M.; Khainovski, O.; Kolomensky, Y.; Loscutoff, P.; Slater, M.; Thomson, M.; Ward, D.; Boogert, S.; Vogel, V.; Meller, R.; Lyapin, A.; Malton, S.; Miller, D.; Frisch, J.; Hinton, S.; May, J.; McCormick, D.; Smith, S.; Smith, T.; White, G.; Orimoto, T.; Hayano, H.; Honda, Y.; Terunuma, N.; Urakawa, J.

    2005-01-01

    International Linear Collider (ILC) interaction region beam sizes and component position stability requirements will be as small as a few nanometers. It is important to the ILC design effort to demonstrate that these tolerances can be achieved - ideally using beam-based stability measurements. It has been estimated that RF cavity beam position monitors (BPMs) could provide position measurement resolutions of less than one nanometer and could form the basis of the desired beam-based stability measurement. We have developed a high resolution RF cavity BPM system. A triplet of these BPMs has been installed in the extraction line of the KEK Accelerator Test Facility (ATF) for testing with its ultra-low emittance beam. A metrology system for the three BPMs was recently installed. This system employed optical encoders to measure each BPM's position and orientation relative to a zero-coefficient of thermal expansion carbon fiber frame and has demonstrated that the three BPMs behave as a rigid-body to less than 5 nm. To date, we have demonstrated a BPM resolution of less than 20 nm over a dynamic range of +/- 20 microns

  1. Geostatistics: a common link between medical geography, mathematical geology, and medical geology.

    Science.gov (United States)

    Goovaerts, P

    2014-08-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviours, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentration across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level.

  2. Geostatistical prediction of microbial water quality throughout a stream network using meteorology, land cover, and spatiotemporal autocorrelation.

    Science.gov (United States)

    Holcomb, David Andrew; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R

    2018-06-11

    Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modelled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was >90%, <10%, or between 10% and 90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
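
    The impairment classification rule described above reduces to a simple threshold test on the predicted exceedance probability; the probabilities below are hypothetical, whereas in the study they come from the geostatistical predictions.

```python
# Toy illustration of the classification rule: "impaired" if the probability of
# exceeding the standard is >90%, "unimpaired" if <10%, otherwise "unassessed".
def impairment_status(p_exceed: float) -> str:
    if p_exceed > 0.90:
        return "impaired"
    if p_exceed < 0.10:
        return "unimpaired"
    return "unassessed"

for p in (0.95, 0.40, 0.05):          # hypothetical exceedance probabilities
    print(p, impairment_status(p))
```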

  3. Definition of radon prone areas in Friuli Venezia Giulia region, Italy, using geostatistical tools.

    Science.gov (United States)

    Cafaro, C; Bossew, P; Giovani, C; Garavaglia, M

    2014-12-01

    Studying the geographical distribution of indoor radon concentration using geostatistical interpolation methods has become common for predicting and estimating the risk to the population. Here we analyse the case of Friuli Venezia Giulia (FVG), the north-easternmost region of Italy. The mean value and standard deviation are, respectively, 153 Bq/m³ and 183 Bq/m³. The geometric mean value is 100 Bq/m³. Spatial datasets of indoor radon concentrations are usually affected by clustering and apparent non-stationarity issues, which can eventually yield arguable results. The clustering of the present dataset seems to be non-preferential, therefore the areal estimations are not expected to be affected. Conversely, nothing can be said on the non-stationarity issues and their effects. After discussing the correlation of geology with indoor radon concentration, it appears that these issues are created by the same geologic features influencing the mean and median values, and cannot be eliminated via a map-based approach. To tackle these problems, in this work we deal with multiple definitions of RPA, but only in quaternary areas of FVG, using extensive simulation techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Short Communication. Using high resolution UAV imagery to estimate tree variables in Pinus pinea plantation in Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Guerra Hernandez, J.; Gonzalez-Ferreiro, E.; Sarmento, A.; Silva, J.; Nunes, A.; Correia, A.C.; Fontes, L.; Tomé, M.; Diaz-Varela, D.

    2016-07-01

    Aim of the study: The study aims to analyse the potential use of low‑cost unmanned aerial vehicle (UAV) imagery for the estimation of Pinus pinea L. variables at the individual tree level (position, tree height and crown diameter). Area of study: This study was conducted under the PINEA project focused on 16 ha of umbrella pine afforestation (Portugal) subjected to different treatments. Material and methods: The workflow involved: a) image acquisition with consumer‑grade cameras on board a UAV; b) orthomosaic and digital surface model (DSM) generation using structure-from-motion (SfM) image reconstruction; and c) automatic individual tree segmentation by using a mixed pixel‑ and region‑based algorithm. Main results: The results of individual tree segmentation (position, height and crown diameter) were validated using field measurements from 3 inventory plots in the study area. All the trees of the plots were correctly detected. The RMSE values for the predicted heights and crown widths were 0.45 m and 0.63 m, respectively. Research highlights: The results demonstrate that tree variables can be automatically extracted from high resolution imagery. We highlight the use of UAV systems as a fast, reliable and cost‑effective technique for small scale applications. (Author)

  5. Short Communication. Using high resolution UAV imagery to estimate tree variables in Pinus pinea plantation in Portugal

    International Nuclear Information System (INIS)

    Guerra Hernandez, J.; Gonzalez-Ferreiro, E.; Sarmento, A.; Silva, J.; Nunes, A.; Correia, A.C.; Fontes, L.; Tomé, M.; Diaz-Varela, D.

    2016-01-01

    Aim of the study: The study aims to analyse the potential use of low‑cost unmanned aerial vehicle (UAV) imagery for the estimation of Pinus pinea L. variables at the individual tree level (position, tree height and crown diameter). Area of study: This study was conducted under the PINEA project focused on 16 ha of umbrella pine afforestation (Portugal) subjected to different treatments. Material and methods: The workflow involved: a) image acquisition with consumer‑grade cameras on board a UAV; b) orthomosaic and digital surface model (DSM) generation using structure-from-motion (SfM) image reconstruction; and c) automatic individual tree segmentation by using a mixed pixel‑ and region‑based algorithm. Main results: The results of individual tree segmentation (position, height and crown diameter) were validated using field measurements from 3 inventory plots in the study area. All the trees of the plots were correctly detected. The RMSE values for the predicted heights and crown widths were 0.45 m and 0.63 m, respectively. Research highlights: The results demonstrate that tree variables can be automatically extracted from high resolution imagery. We highlight the use of UAV systems as a fast, reliable and cost‑effective technique for small scale applications. (Author)

  6. Age estimates for the late quaternary high sea-stands

    Science.gov (United States)

    Smart, Peter L.; Richards, David A.

    A database of more than 300 published alpha-counted uranium-series ages has been compiled for coral reef terraces formed by Late Pleistocene high sea-stands. The database was screened to eliminate unreliable age estimates (e.g. low ²³⁰Th/²³²Th ratios or calcite contents above 5%) and those without quoted errors, and a distributed error frequency curve was produced. This curve can be considered as a finite mixture model comprising k component normal distributions each with a weighting α. By using an expectation maximising algorithm, the mean and standard deviation of the component distributions, each corresponding to a high sea level event, were estimated. Eight high sea-stands with mean and standard deviations of 129.0 ± 33.0, 123.0 ± 13.0, 102.5 ± 2.0, 81.5 ± 5.0, 61.5 ± 6.0, 50.0 ± 1.0, 40.5 ± 5.0 and 33.0 ± 2.5 ka were resolved. The standard deviations are generally larger than the values quoted for individual age estimates. Whilst this may be due to diagenetic effects, especially for the older corals, it is argued that in many cases geological evidence clearly indicates that the high stands are multiple events, often not resolvable at sites with low rates of uplift. The uranium-series dated coral-reef terrace chronology shows good agreement with independent chronologies derived for Antarctic ice cores, although the resolution for the latter is better. Agreement with orbitally-tuned deep-sea core records is also good, but it is argued that Isotope Stage 5e is not a single event, as recorded in the cores, but a multiple event spanning some 12 ka. The much earlier age for Isotope Stage 5e given by Winograd et al. (1988) is not supported by the coral reef data, but further mass-spectrometric uranium-series dating is needed to permit better chronological resolution.
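
    The finite-mixture step described above, fitting a k-component normal mixture to the screened ages with an expectation-maximisation algorithm, can be sketched with scikit-learn's GaussianMixture; this library choice and the synthetic ages are assumptions made only for illustration, not the original implementation.

```python
# Hedged sketch: EM fit of a 3-component normal mixture to synthetic coral ages.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
ages_ka = rng.normal(loc=[125.0, 103.0, 82.0],      # synthetic high-stand ages (ka)
                     scale=[5.0, 2.0, 4.0],
                     size=(100, 3)).ravel()

gmm = GaussianMixture(n_components=3, random_state=0).fit(ages_ka.reshape(-1, 1))
for mean, var, w in zip(gmm.means_.ravel(), gmm.covariances_.ravel(), gmm.weights_):
    print(f"high stand at {mean:.1f} ± {np.sqrt(var):.1f} ka (weight {w:.2f})")
```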

  7. Optimal design of sampling and mapping schemes in the radiometric exploration of Chipilapa, El Salvador (Geo-statistics)

    International Nuclear Information System (INIS)

    Balcazar G, M.; Flores R, J.H.

    1992-01-01

    As part of the radiometric surface exploration carried out in the Chipilapa geothermal field, El Salvador, geo-statistical parameters were derived from the variogram calculated from the field data. The maximum correlation distance of the 'radon' samples along the different observation directions (N-S, E-W, NW-SE, NE-SW) was 121 m, which defines the monitoring grid for future prospecting in the same area. From this, the spacing of the field samples was optimized (minimum cost) by means of geo-statistical techniques, without losing the detection of the anomaly. (Author)

  8. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    Science.gov (United States)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

    The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
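
    The training-image selection idea described above can be sketched as a greedy procedure: keep a snapshot only if it is sufficiently dissimilar from every snapshot already retained. The distance metric used here (mean absolute pixel difference) and the threshold are placeholders for the study's dissimilarity measure.

```python
# Greedy selection of mutually dissimilar training images (illustrative only).
import numpy as np

def select_training_images(images, threshold):
    """Return a subset of 2-D arrays that are pairwise more dissimilar than threshold."""
    selected = []
    for img in images:
        if all(np.mean(np.abs(img - s)) > threshold for s in selected):
            selected.append(img)
    return selected

rng = np.random.default_rng(0)
snaps = [rng.random((32, 32)) for _ in range(20)]      # stand-ins for overhead snapshots
print(len(select_training_images(snaps, threshold=0.3)), "training images retained")
```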

  9. Calculation of the time resolution of the J-PET tomograph using kernel density estimation

    Science.gov (United States)

    Raczyński, L.; Wiślicki, W.; Krzemień, W.; Kowalski, P.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B.; Jasińska, B.; Kamińska, D.; Korcyl, G.; Kozik, T.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Pawlik-Niedźwiecka, M.; Niedźwiecki, S.; Pałka, M.; Rudy, Z.; Rundel, O.; Sharma, N. G.; Silarski, M.; Smyrski, J.; Strzelecki, A.; Wieczorek, A.; Zgardzińska, B.; Zieliński, M.; Moskal, P.

    2017-06-01

    In this paper we estimate the time resolution of the J-PET scanner built from plastic scintillators. We incorporate the method of signal processing using the Tikhonov regularization framework and the kernel density estimation method. We obtain simple, closed-form analytical formulae for time resolution. The proposed method is validated using signals registered by means of the single detection unit of the J-PET tomograph built from a 30 cm long plastic scintillator strip. It is shown that the experimental and theoretical results obtained for the J-PET scanner equipped with vacuum tube photomultipliers are consistent.

  10. High Resolution Depth-Resolved Imaging From Multi-Focal Images for Medical Ultrasound

    DEFF Research Database (Denmark)

    Diamantis, Konstantinos; Dalgarno, Paul A.; Greenaway, Alan H.

    2015-01-01

    An ultrasound imaging technique providing subdiffraction limit axial resolution for point sources is proposed. It is based on simultaneously acquired multi-focal images of the same object, and on the image metric of sharpness. The sharpness is extracted from image data and presents higher values...... calibration curves combined with the use of a maximum-likelihood algorithm is then able to estimate, with high precision, the depth location of any emitter from each single image. Estimated values are compared with the ground truth demonstrating that an accuracy of 28.6 µm (0.13λ) is achieved for a 4 mm depth...

  11. High-resolution regional climate model evaluation using variable-resolution CESM over California

    Science.gov (United States)

    Huang, X.; Rhoades, A.; Ullrich, P. A.; Zarzycki, C. M.

    2015-12-01

    Understanding the effect of climate change at regional scales remains a topic of intensive research. Though computational constraints remain a problem, high horizontal resolution is needed to represent topographic forcing, which is a significant driver of local climate variability. Although regional climate models (RCMs) have traditionally been used at these scales, variable-resolution global climate models (VRGCMs) have recently arisen as an alternative for studying regional weather and climate, allowing two-way interaction between these domains without the need for nudging. In this study, the recently developed variable-resolution option within the Community Earth System Model (CESM) is assessed for long-term regional climate modeling over California. Our variable-resolution simulations focus on relatively high resolutions for climate assessment, namely 28 km and 14 km regional resolution, which are much more typical for dynamically downscaled studies. For comparison with the more widely used RCM method, the Weather Research and Forecasting (WRF) model is used for simulations at 27 km and 9 km. All simulations use the AMIP (Atmospheric Model Intercomparison Project) protocols. The time period is from 1979-01-01 to 2005-12-31 (UTC), and year 1979 was discarded as spin-up time. The mean climatology across California's diverse climate zones, including temperature and precipitation, is analyzed and contrasted with the WRF model (as a traditional RCM), regional reanalysis, gridded observational datasets and uniform high-resolution CESM at 0.25 degree with the finite volume (FV) dynamical core. The results show that variable-resolution CESM is competitive in representing regional climatology on both annual and seasonal time scales. This assessment adds value to the use of VRGCMs for projecting climate change over the coming century and improves our understanding of both past and future regional climate related to fine

  12. Section on High Resolution Optical Imaging (HROI)

    Data.gov (United States)

    Federal Laboratory Consortium — The Section on High Resolution Optical Imaging (HROI) develops novel technologies for studying biological processes at unprecedented speed and resolution. Research...

  13. High resolution SAW elastography for ex-vivo porcine skin specimen

    Science.gov (United States)

    Zhou, Kanheng; Feng, Kairui; Wang, Mingkai; Jamera, Tanatswa; Li, Chunhui; Huang, Zhihong

    2018-02-01

    Surface acoustic wave (SAW) elastography has been proven to be a non-invasive, non-destructive method for accurately characterizing tissue elastic properties. The current SAW elastography technique tracks the generated surface acoustic wave impulse point by point at locations that are a few millimeters apart, so the reconstructed elastogram has low lateral resolution. To improve the lateral resolution of current SAW elastography, a new method was proposed in this research. An M-B scan mode, high spatial resolution phase-sensitive optical coherence tomography (PhS-OCT) system was employed to track the ultrasonically induced SAW impulse. An ex-vivo porcine skin specimen was tested using this proposed method. A 2D fast Fourier transform based algorithm was applied to process the acquired data to estimate the surface acoustic wave dispersion curve and its corresponding penetration depth. The ex-vivo porcine skin elastogram was then established by relating the surface acoustic wave dispersion curve to its corresponding penetration depth. The result from the proposed method shows higher lateral resolution than that from the current SAW elastography technique, and the approximated skin elastogram could also distinguish the different layers in the skin specimen, i.e. epidermis, dermis and fat layers. This proposed SAW elastography technique may have large potential for wide clinical use in skin disease diagnosis and treatment monitoring.
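
    A hedged sketch of a 2D-FFT (frequency-wavenumber) estimate of a surface-wave phase velocity is shown below: from a space-time field u(x, t), the dominant wavenumber at a given frequency yields c = 2*pi*f / k. The synthetic plane wave, sampling intervals and wave speed are all assumptions for illustration, not the study's data or exact processing chain.

```python
# Frequency-wavenumber estimate of phase velocity from a synthetic SAW field.
import numpy as np

dx, dt = 5e-5, 1e-6                 # spatial (m) and temporal (s) sampling (assumed)
x = np.arange(100) * dx
t = np.arange(200) * dt
c_true, f0 = 20.0, 2e4              # assumed phase velocity (m/s) and frequency (Hz)
u = np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / c_true))   # plane wave u(x, t)

U = np.fft.fft2(u)                                    # axes: (wavenumber, frequency)
k = np.fft.fftfreq(len(x), d=dx) * 2 * np.pi          # rad/m
f = np.fft.fftfreq(len(t), d=dt)                      # Hz

fi = np.argmin(np.abs(f - f0))                        # frequency bin of interest
ki = np.argmax(np.abs(U[:, fi]))                      # dominant wavenumber at that frequency
c_est = 2 * np.pi * f[fi] / abs(k[ki])
print(f"estimated phase velocity: {c_est:.1f} m/s")
```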

  14. High resolution remote sensing for reducing uncertainties in urban forest carbon offset life cycle assessments.

    Science.gov (United States)

    Tigges, Jan; Lakes, Tobia

    2017-10-04

    Urban forests reduce greenhouse gas emissions by storing and sequestering considerable amounts of carbon. However, few studies have considered the local scale of urban forests to effectively evaluate their potential long-term carbon offset. The lack of precise, consistent and up-to-date forest details is challenging for long-term prognoses. Therefore, this review aims to identify uncertainties in urban forest carbon offset assessment and discuss the extent to which such uncertainties can be reduced by recent progress in high resolution remote sensing. We do this by performing an extensive literature review and a case study combining remote sensing and life cycle assessment of urban forest carbon offset in Berlin, Germany. Recent progress in high resolution remote sensing and methods is adequate for delivering more precise details on the urban tree canopy, individual tree metrics, species, and age structures compared to conventional land use/cover class approaches. These area-wide consistent details can update life cycle inventories for more precise future prognoses. Additional improvements in classification accuracy can be achieved by a higher number of features derived from remote sensing data of increasing resolution, but first studies on this subject indicated that a smart selection of features already provides sufficient data that avoids redundancies and enables more efficient data processing. Our case study from Berlin could use remotely sensed individual tree species as a consistent inventory for a life cycle assessment. However, a lack of growth, mortality and planting data forced us to make assumptions, therefore creating uncertainty in the long-term prognoses. Regarding temporal changes and reliable long-term estimates, more attention is required to detect changes of gradual growth, pruning and abrupt changes in tree planting and mortality. As such, precise long-term urban ecological monitoring using high resolution remote sensing should be intensified

  15. Seismic forecast using geostatistics

    International Nuclear Information System (INIS)

    Grecu, Valeriu; Mateiciuc, Doru

    2007-01-01

    The main idea of this research direction is the construction of a new type of mathematical function, defined as a correlation between a computed statistical quantity and another physical quantity. This type of function, called a 'position function', was adopted by the authors of this study in the field of seismology in the hope of solving - at least partially - the difficult problem of seismic forecast. The geostatistical method of analysis focuses on the process of energy accumulation in a given seismic area, completing this analysis with a so-called loading function. This function - in fact a temporal function - describes the process of energy accumulation during a seismic cycle in a given seismic area. It was possible to discover a law of evolution of the seismic cycles that was materialized in a so-called characteristic function. This special function helps to forecast the magnitude and the occurrence time of the largest earthquake in the analysed area. Since 2000, the authors have moved to a new stage of testing: real-time analysis, in order to verify the quality of the method. Five forecasts of large earthquakes were made. (authors)

  16. High angular resolution at LBT

    Science.gov (United States)

    Conrad, A.; Arcidiacono, C.; Bertero, M.; Boccacci, P.; Davies, A. G.; Defrere, D.; de Kleer, K.; De Pater, I.; Hinz, P.; Hofmann, K. H.; La Camera, A.; Leisenring, J.; Kürster, M.; Rathbun, J. A.; Schertl, D.; Skemer, A.; Skrutskie, M.; Spencer, J. R.; Veillet, C.; Weigelt, G.; Woodward, C. E.

    2015-12-01

    High angular resolution from ground-based observatories stands as a key technology for advancing planetary science. In the window between the angular resolution achievable with 8-10 meter class telescopes and that of the 23-to-40 meter giants of the future, LBT provides a glimpse of what the next generation of instruments with higher angular resolution will deliver. We present the first resolved images of an Io eruption site taken from the ground, images of Io's Loki Patera taken with Fizeau imaging at the 22.8 meter LBT [Conrad, et al., AJ, 2015]. We will also present preliminary analysis of two data sets acquired during the 2015 opposition: L-band fringes at Kurdalagon and an occultation of Loki and Pele by Europa (see figure). The light curves from this occultation will yield an order of magnitude improvement in spatial resolution along the path of ingress and egress. We will conclude by providing an overview of the overall benefit of recent and future advances in angular resolution for planetary science.

  17. High-resolution transmission electron microscopy and energetics of flattened carbon nanoshells

    International Nuclear Information System (INIS)

    Bourgeois, L.N.; Bursill, L.A.

    1998-01-01

    When examined under a high-resolution transmission electron microscope, carbon soot produced alongside buckytubes in an arc-discharge is found to contain a small percentage of flattened carbon shells. These objects are shown to be small graphite flakes which eliminated their dangling bonds by terminating their edges with highly curved junctions. Ideal models for these structures are presented, and their energy estimated. The calculations show that the establishment of highly curved junctions is energetically favourable for a graphite flake in an inert atmosphere. Flattened shells also appear more stable than their 'inflated' counterparts (fullerene 'onions' and buckytubes) when the shell dimensions obey specific criteria. (authors)

  18. A method for generating high resolution satellite image time series

    Science.gov (United States)

    Guo, Tao

    2014-10-01

    There is an increasing demand for satellite remote sensing data with both high spatial and temporal resolution in many applications. But it is still a challenge to simultaneously improve spatial resolution and temporal frequency due to the technical limits of current satellite observation systems. To this end, much R&D effort has been ongoing for years and has led to some success in roughly two directions: on the one hand, super-resolution, pan-sharpening and similar methods can effectively enhance spatial resolution and generate good visual effects, but they hardly preserve spectral signatures and therefore offer limited analytical value; on the other hand, time interpolation is a straightforward way to increase temporal frequency, but it adds little informative content. In this paper we present a novel method to simulate high resolution time series data by combining low resolution time series data with only a very small number of high resolution images. Our method starts with a pair of high and low resolution data sets, and a spatial registration is performed by introducing an LDA model to map high and low resolution pixels to one another. Afterwards, temporal change information is captured through a comparison of the low resolution time series data, projected onto the high resolution data plane, and assigned to each high resolution pixel according to predefined temporal change patterns for each type of ground object. Finally the simulated high resolution data are generated. A preliminary experiment shows that our method can simulate high resolution data with reasonable accuracy. The contribution of our method is to enable timely monitoring of temporal changes through analysis of a time sequence of low resolution images only, so that usage of costly high resolution data can be reduced as much as possible; it presents a highly effective way to build an economically operational monitoring solution for agriculture, forest, land use investigation

  19. High-resolution Monthly Satellite Precipitation Product over the Conterminous United States

    Science.gov (United States)

    Hashemi, H.; Fayne, J.; Knight, R. J.; Lakshmi, V.

    2017-12-01

    We present a data set that enhances the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) monthly product 3B43 in its accuracy and spatial resolution. For this, we developed a correction function to improve the accuracy of TRMM 3B43 (spatial resolution of 25 km) by estimating and removing the bias in the satellite data using a ground-based precipitation data set. We observed a strong relationship between the bias and land surface elevation; TRMM 3B43 tends to underestimate the ground-based product at elevations above 1500 m above mean sea level (m.amsl) over the conterminous United States. A relationship was developed between satellite bias and elevation. We then resampled TRMM 3B43 to the Digital Elevation Model (DEM) data set at a spatial resolution of 30 arc second (~1 km on the ground). The produced high-resolution satellite-based data set was corrected using the developed correction function based on the bias-elevation relationship. Assuming that each rain gauge represents an area of 1 km2, we verified our product against 9,200 rain gauges across the conterminous United States. The new product was compared with the gauges, which have 50, 60, 70, 80, 90, and 100% temporal coverage within the TRMM period of 1998 to 2015. Comparisons between the high-resolution corrected satellite-based data and gauges showed an excellent agreement. The new product captured more detail in the changes in precipitation over the mountainous region than the original TRMM 3B43.
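
    The elevation-dependent bias removal described above can be sketched as follows. A simple linear bias-versus-elevation fit is assumed here purely for illustration (the study's actual correction function is not reproduced), and all arrays are synthetic stand-ins for gauge and DEM data.

```python
# Hedged sketch: fit satellite-minus-gauge bias against elevation, then remove it.
import numpy as np

rng = np.random.default_rng(0)
elev_gauge = rng.uniform(0, 3500, 500)                     # gauge elevations (m), synthetic
# Synthetic bias: underestimation (negative) above 1500 m, plus noise (mm/day)
bias_true = np.where(elev_gauge > 1500, -0.02 * (elev_gauge - 1500) / 100, 0.0)
observed_bias = bias_true + rng.normal(0, 0.05, elev_gauge.size)

coeff = np.polyfit(elev_gauge, observed_bias, deg=1)        # linear bias ~ elevation fit

def correct(precip_sat, elev_dem):
    """Remove the fitted elevation-dependent bias from resampled satellite precipitation."""
    return precip_sat - np.polyval(coeff, elev_dem)

print(correct(np.array([3.0, 3.0]), np.array([500.0, 3000.0])))   # mm/day, illustrative
```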

  20. 2D Unitary ESPRIT Based Super-Resolution Channel Estimation for Millimeter-Wave Massive MIMO with Hybrid Precoding

    KAUST Repository

    Liao, Anwen

    2017-11-01

    Millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) with hybrid precoding is a promising technique for future 5G wireless communications. Due to the large number of antennas but a much smaller number of radio frequency (RF) chains, estimating the high-dimensional mmWave massive MIMO channel incurs a large pilot overhead. To overcome this challenge, this paper proposes a super-resolution channel estimation scheme based on the two-dimensional (2D) unitary ESPRIT algorithm. By exploiting the angular sparsity of mmWave channels, the continuously distributed angles of arrival/departure (AoAs/AoDs) can be jointly estimated with high accuracy. Specifically, by designing the uplink training signals at both base station (BS) and mobile station (MS), we first use low pilot overhead to estimate a low-dimensional effective channel, which has the same shift-invariance of array response as the high-dimensional mmWave MIMO channel to be estimated. From the low-dimensional effective channel, the super-resolution estimates of AoAs and AoDs can be jointly obtained by exploiting the 2D unitary ESPRIT channel estimation algorithm. Furthermore, the associated path gains can be acquired based on the least squares (LS) criterion. Finally, we can reconstruct the high-dimensional mmWave MIMO channel according to the obtained AoAs, AoDs, and path gains. Simulation results have confirmed that the proposed scheme is superior to conventional schemes with a much lower pilot overhead.
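
    The shift-invariance principle exploited by ESPRIT can be illustrated with a minimal one-dimensional sketch (standard ESPRIT for a uniform linear array, not the paper's 2D unitary variant with hybrid precoding). The array size, number of paths, noise level and angles below are all assumptions for illustration.

```python
# 1-D ESPRIT sketch: AoAs recovered from the rotation between shifted sub-arrays.
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 16, 2, 200                          # antennas, paths, snapshots (assumed)
theta_true = np.deg2rad([-10.0, 25.0])        # true angles of arrival

n = np.arange(N)[:, None]
A = np.exp(1j * np.pi * n * np.sin(theta_true)[None, :])            # ULA steering matrix
S = (rng.standard_normal((K, T)) + 1j * rng.standard_normal((K, T))) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal((N, T)) + 1j * rng.standard_normal((N, T)))
X = A @ S + noise

R = X @ X.conj().T / T                         # sample covariance
eigvals, eigvecs = np.linalg.eigh(R)
Es = eigvecs[:, -K:]                           # signal subspace (K dominant eigenvectors)
Psi = np.linalg.pinv(Es[:-1]) @ Es[1:]         # solve the shift-invariance equation
phases = np.angle(np.linalg.eigvals(Psi))      # equal to pi * sin(theta)
theta_est = np.arcsin(phases / np.pi)
print("estimated AoAs (deg):", np.sort(np.rad2deg(theta_est)))
```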

  1. 2D Unitary ESPRIT Based Super-Resolution Channel Estimation for Millimeter-Wave Massive MIMO with Hybrid Precoding

    KAUST Repository

    Liao, Anwen; Gao, Zhen; Wu, Yongpeng; Wang, Hua; Alouini, Mohamed-Slim

    2017-01-01

    Millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) with hybrid precoding is a promising technique for future 5G wireless communications. Due to the large number of antennas but a much smaller number of radio frequency (RF) chains, estimating the high-dimensional mmWave massive MIMO channel incurs a large pilot overhead. To overcome this challenge, this paper proposes a super-resolution channel estimation scheme based on the two-dimensional (2D) unitary ESPRIT algorithm. By exploiting the angular sparsity of mmWave channels, the continuously distributed angles of arrival/departure (AoAs/AoDs) can be jointly estimated with high accuracy. Specifically, by designing the uplink training signals at both base station (BS) and mobile station (MS), we first use low pilot overhead to estimate a low-dimensional effective channel, which has the same shift-invariance of array response as the high-dimensional mmWave MIMO channel to be estimated. From the low-dimensional effective channel, the super-resolution estimates of AoAs and AoDs can be jointly obtained by exploiting the 2D unitary ESPRIT channel estimation algorithm. Furthermore, the associated path gains can be acquired based on the least squares (LS) criterion. Finally, we can reconstruct the high-dimensional mmWave MIMO channel according to the obtained AoAs, AoDs, and path gains. Simulation results have confirmed that the proposed scheme is superior to conventional schemes with a much lower pilot overhead.

  2. High resolution remote sensing of water surface patterns

    Science.gov (United States)

    Woodget, A.; Visser, F.; Maddock, I.; Carbonneau, P.

    2012-12-01

    The assessment of in-stream habitat availability within fluvial environments in the UK traditionally includes the mapping of patterns which appear on the surface of the water, known as 'surface flow types' (SFTs). The UK's River Habitat Survey identifies ten key SFTs, including categories such as rippled flow, upwelling, broken standing waves and smooth flow. SFTs result from the interaction between the underlying channel morphology, water depth and velocity and reflect the local flow hydraulics. It has been shown that SFTs can be both biologically and hydraulically distinct. SFT mapping is usually conducted from the river banks where estimates of spatial coverage are made by eye. This approach is affected by user subjectivity and inaccuracies in the spatial extent of mapped units. Remote sensing and specifically the recent developments in unmanned aerial systems (UAS) may now offer an alternative approach for SFT mapping, with the capability for rapid and repeatable collection of very high resolution imagery from low altitudes, under bespoke flight conditions. This PhD research is aimed at investigating the mapping of SFTs using high resolution optical imagery (less than 10cm) collected from a helicopter-based UAS flown at low altitudes (less than 100m). This paper presents the initial findings from a series of structured experiments on the River Arrow, a small lowland river in Warwickshire, UK. These experiments investigate the potential for mapping SFTs from still and video imagery of different spatial resolutions collected at different flying altitudes and from different viewing angles (i.e. vertical and oblique). Imagery is processed using 3D mosaicking software to create orthophotos and digital elevation models (DEM). The types of image analysis which are tested include a simple, manual visual assessment undertaken in a GIS environment, based on the high resolution optical imagery. In addition, an object-based image analysis approach which makes use of the

  3. High resolution Cerenkov light imaging of induced positron distribution in proton therapy

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi, E-mail: s-yama@met.nagoya-u.ac.jp; Fujii, Kento; Morishita, Yuki; Okumura, Satoshi; Komori, Masataka [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine, Aichi 461-8673 (Japan); Toshito, Toshiyuki [Department of Proton Therapy Physics, Nagoya Proton Therapy Center, Nagoya City West Medical Center, Aichi 462-8508 (Japan)

    2014-11-01

    Purpose: In proton therapy, imaging of the positron distribution produced by fragmentation during or soon after proton irradiation is a useful method to monitor the proton range. Although positron emission tomography (PET) is typically used for this imaging, its spatial resolution is limited. Cerenkov light imaging is a new molecular imaging technology that detects the visible photons that are produced from high-speed electrons using a high sensitivity optical camera. Because its inherent spatial resolution is much higher than PET, the authors can measure more precise information of the proton-induced positron distribution with Cerenkov light imaging technology. For this purpose, they conducted Cerenkov light imaging of induced positron distribution in proton therapy. Methods: First, the authors evaluated the spatial resolution of our Cerenkov light imaging system with a ²²Na point source for the actual imaging setup. Then the transparent acrylic phantoms (100 × 100 × 100 mm³) were irradiated with two different proton energies using a spot scanning proton therapy system. Cerenkov light imaging of each phantom was conducted using a high sensitivity electron multiplied charge coupled device (EM-CCD) camera. Results: The Cerenkov light’s spatial resolution for the setup was 0.76 ± 0.6 mm FWHM. They obtained high resolution Cerenkov light images of the positron distributions in the phantoms for two different proton energies and made fused images of the reference images and the Cerenkov light images. The depths of the positron distribution in the phantoms from the Cerenkov light images were almost identical to the simulation results. The decay curves derived from the region-of-interests (ROIs) set on the Cerenkov light images revealed that Cerenkov light images can be used for estimating the half-life of the radionuclide components of positrons. Conclusions: High resolution Cerenkov light imaging of proton-induced positron distribution was possible. The

  4. High resolution Cerenkov light imaging of induced positron distribution in proton therapy

    International Nuclear Information System (INIS)

    Yamamoto, Seiichi; Fujii, Kento; Morishita, Yuki; Okumura, Satoshi; Komori, Masataka; Toshito, Toshiyuki

    2014-01-01

    Purpose: In proton therapy, imaging of the positron distribution produced by fragmentation during or soon after proton irradiation is a useful method to monitor the proton range. Although positron emission tomography (PET) is typically used for this imaging, its spatial resolution is limited. Cerenkov light imaging is a new molecular imaging technology that detects the visible photons that are produced from high-speed electrons using a high sensitivity optical camera. Because its inherent spatial resolution is much higher than PET, the authors can measure more precise information of the proton-induced positron distribution with Cerenkov light imaging technology. For this purpose, they conducted Cerenkov light imaging of induced positron distribution in proton therapy. Methods: First, the authors evaluated the spatial resolution of our Cerenkov light imaging system with a ²²Na point source for the actual imaging setup. Then the transparent acrylic phantoms (100 × 100 × 100 mm³) were irradiated with two different proton energies using a spot scanning proton therapy system. Cerenkov light imaging of each phantom was conducted using a high sensitivity electron multiplied charge coupled device (EM-CCD) camera. Results: The Cerenkov light’s spatial resolution for the setup was 0.76 ± 0.6 mm FWHM. They obtained high resolution Cerenkov light images of the positron distributions in the phantoms for two different proton energies and made fused images of the reference images and the Cerenkov light images. The depths of the positron distribution in the phantoms from the Cerenkov light images were almost identical to the simulation results. The decay curves derived from the region-of-interests (ROIs) set on the Cerenkov light images revealed that Cerenkov light images can be used for estimating the half-life of the radionuclide components of positrons. Conclusions: High resolution Cerenkov light imaging of proton-induced positron distribution was possible. The authors

  5. Development of temperature profile sensor at high temporal and spatial resolution

    International Nuclear Information System (INIS)

    Takiguchi, Hiroki; Furuya, Masahiro; Arai, Takahiro

    2017-01-01

    In order to quantify thermo-physical flow fields for industrial applications such as nuclear and chemical reactors, high temporal and spatial resolution measurements of temperature, pressure, phase velocity, viscosity and so on are required to validate computational fluid dynamics (CFD) and subchannel analyses. The paper proposes a novel temperature profile sensor, which can acquire the temperature distribution in water at high temporal (a millisecond) and spatial (millimeter) resolution. The devised sensor acquires the electric conductance between transmitter and receiver wires, which is a function of temperature. The sensor comprises a wire-mesh structure for multipoint, simultaneous temperature measurement in water, so that three-dimensional temperature distributions can be detected at flexible resolutions. As a demonstration of the principle, the temperature profile in water was estimated from a pre-determined calibration line relating time-averaged impedance to temperature. The 16×16 grid sensor visualized the fast, multi-dimensional mixing process of a hot water jet into a cold water pool. (author)

  6. Global-Scale Associations of Vegetation Phenology with Rainfall and Temperature at a High Spatio-Temporal Resolution

    Directory of Open Access Journals (Sweden)

    Nicholas Clinton

    2014-08-01

    Phenology response to climatic variables is a vital indicator for understanding changes in biosphere processes as related to possible climate change. We investigated global phenology relationships to precipitation and land surface temperature (LST) at high spatial and temporal resolution for calendar years 2008–2011. We used cross-correlation between MODIS Enhanced Vegetation Index (EVI), MODIS LST and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) gridded rainfall to map phenology relationships at 1-km spatial resolution and weekly temporal resolution. We show these data to be rich in spatiotemporal information, illustrating distinct phenology patterns as a result of complex overlapping gradients of climate, ecosystem and land use/land cover. The data are consistent with broad-scale, coarse-resolution modeled ecosystem limitations to moisture, temperature and irradiance. We suggest that high-resolution phenology data are useful as both an input and complement to land use/land cover classifiers and for understanding climate change vulnerability in natural and anthropogenic landscapes.
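
    The lagged cross-correlation used to relate a vegetation index to a climate driver can be sketched as below; the weekly series are synthetic and the lag search is purely illustrative of the kind of pixel-wise computation described, not the study's exact procedure.

```python
# Lagged Pearson correlation between an EVI series and a climate driver series.
import numpy as np

def lagged_corr(evi, driver, max_lag=8):
    """Correlation of EVI with the driver shifted by 0..max_lag steps (driver leads)."""
    out = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            out[lag] = float(np.corrcoef(evi, driver)[0, 1])
        else:
            out[lag] = float(np.corrcoef(evi[lag:], driver[:-lag])[0, 1])
    return out

t = np.arange(4 * 52)                                    # four years of weekly steps
rain = np.sin(2 * np.pi * t / 52)                        # synthetic seasonal rainfall
evi = np.roll(rain, 3) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
best = max(lagged_corr(evi, rain).items(), key=lambda kv: kv[1])
print("best lag (weeks), correlation:", best)            # expected near lag = 3
```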

  7. A multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration.

    Science.gov (United States)

    Goovaerts, P; Albuquerque, Teresa; Antunes, Margarida

    2016-11-01

    This paper describes a multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration, with an application to an abandoned sedimentary gold mining region in Portugal. The main challenge was the existence of only a dozen gold measurements confined to the grounds of the old gold mines, which precluded the application of traditional interpolation techniques, such as cokriging. The analysis could, however, capitalize on 376 stream sediment samples that were analyzed for twenty-two elements. Gold (Au) was first predicted at all 376 locations using linear regression (R² = 0.798) and four metals (Fe, As, Sn and W), which are known to be mostly associated with the local gold's paragenesis. One hundred realizations of the spatial distribution of gold content were generated using sequential indicator simulation and a soft indicator coding of regression estimates, to supplement the hard indicator coding of gold measurements. Each simulated map then underwent a local cluster analysis to identify significant aggregates of low or high values. The one hundred classified maps were processed to derive the most likely classification of each simulated node and the associated probability of occurrence. Examining the distribution of the hot-spots and cold-spots reveals a clear enrichment in Au along the Erges River downstream from the old sedimentary mineralization.
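
    The first step described above, ordinary least-squares regression of gold on the four pathfinder elements followed by prediction at all stream-sediment locations, can be sketched as below. The data are synthetic stand-ins, and the indicator simulation and local cluster analysis stages are not reproduced.

```python
# Hedged sketch: regress Au on Fe, As, Sn, W at the measured sites, predict everywhere.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X_all = rng.lognormal(mean=1.0, sigma=0.5, size=(376, 4))      # Fe, As, Sn, W (synthetic)
true_coef = np.array([0.2, 0.5, 0.1, 0.3])                     # synthetic relationship
idx_au = rng.choice(376, size=12, replace=False)               # the dozen Au measurements
y_au = X_all[idx_au] @ true_coef + rng.normal(0, 0.1, 12)

model = LinearRegression().fit(X_all[idx_au], y_au)
au_pred = model.predict(X_all)                                 # Au predicted at all 376 sites
print("R^2 on the measured subset:", round(model.score(X_all[idx_au], y_au), 3))
```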

  8. Image-based motion compensation for high-resolution extremities cone-beam CT

    Science.gov (United States)

    Sisniega, A.; Stayman, J. W.; Cao, Q.; Yorkston, J.; Siewerdsen, J. H.; Zbijewski, W.

    2016-03-01

    Purpose: Cone-beam CT (CBCT) of the extremities provides high spatial resolution, but its quantitative accuracy may be challenged by involuntary sub-mm patient motion that cannot be eliminated with simple means of external immobilization. We investigate a two-step iterative motion compensation based on a multi-component metric of image sharpness. Methods: Motion is considered with respect to locally rigid motion within a particular region of interest, and the method supports application to multiple locally rigid regions. Motion is estimated by maximizing a cost function with three components: a gradient metric encouraging image sharpness, an entropy term that favors high contrast and penalizes streaks, and a penalty term encouraging smooth motion. Motion compensation involved initial coarse estimation of gross motion followed by estimation of fine-scale displacements using high resolution reconstructions. The method was evaluated in simulations with synthetic motion (1-4 mm) applied to a wrist volume obtained on a CMOS-based CBCT testbench. Structural similarity index (SSIM) quantified the agreement between motion-compensated and static data. The algorithm was also tested on a motion contaminated patient scan from dedicated extremities CBCT. Results: Excellent correction was achieved for the investigated range of displacements, indicated by good visual agreement with the static data. 10-15% improvement in SSIM was attained for 2-4 mm motions. The compensation was robust against increasing motion (4% decrease in SSIM across the investigated range, compared to 14% with no compensation). Consistent performance was achieved across a range of noise levels. Significant mitigation of artifacts was shown in patient data. Conclusion: The results indicate feasibility of image-based motion correction in extremities CBCT without the need for a priori motion models, external trackers, or fiducials.
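
    A loose sketch of a three-term image-based objective of the kind described above is given below: a gradient sharpness term, an intensity-histogram entropy term, and a smoothness penalty on the motion trajectory. The weights, metric definitions and data shapes are placeholders and do not reproduce the paper's exact cost function.

```python
# Illustrative three-component objective to be maximized over motion parameters.
import numpy as np

def objective(volume: np.ndarray, motion: np.ndarray, w_entropy=0.5, w_smooth=0.1) -> float:
    # Gradient magnitude term: larger gradients -> sharper reconstruction
    gy, gx = np.gradient(volume.astype(float))
    sharpness = np.sqrt(gx ** 2 + gy ** 2).mean()

    # Histogram entropy of intensities (penalized here, favoring high contrast)
    hist, _ = np.histogram(volume, bins=64, density=True)
    p = hist[hist > 0]
    entropy = -(p * np.log(p)).sum()

    # Penalty on successive motion parameters (finite differences -> smooth trajectory)
    smooth = np.sum(np.diff(motion, axis=0) ** 2)

    return sharpness - w_entropy * entropy - w_smooth * smooth

vol = np.random.default_rng(0).random((32, 32))     # stand-in for a reconstructed slice
motion = np.zeros((10, 3))                          # e.g. 10 time points, 3 rigid parameters
print(objective(vol, motion))
```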

  9. Resolution enhancement of low-quality videos using a high-resolution frame

    Science.gov (United States)

    Pham, Tuan Q.; van Vliet, Lucas J.; Schutte, Klamer

    2006-01-01

    This paper proposes an example-based Super-Resolution (SR) algorithm for compressed videos in the Discrete Cosine Transform (DCT) domain. Input to the system is a Low-Resolution (LR) compressed video together with a High-Resolution (HR) still image of similar content. Using a training set of corresponding LR-HR pairs of image patches from the HR still image, high-frequency details are transferred from the HR source to the LR video. The DCT-domain algorithm is much faster than example-based SR in the spatial domain [6] because of a reduction in search dimensionality, which is a direct result of the compact and uncorrelated DCT representation. Fast searching techniques like tree-structure vector quantization [16] and coherence search [1] are also key to the improved efficiency. Preliminary results on an MJPEG sequence show promising results for the DCT-domain SR synthesis approach.

  10. A cloud mask methodology for high resolution remote sensing data combining information from high and medium resolution optical sensors

    Science.gov (United States)

    Sedano, Fernando; Kempeneers, Pieter; Strobl, Peter; Kucera, Jan; Vogt, Peter; Seebach, Lucia; San-Miguel-Ayanz, Jesús

    2011-09-01

    This study presents a novel cloud masking approach for high-resolution remote sensing images in the context of land cover mapping. Unlike traditional methods, the approach does not rely on thermal bands, and it is applicable to images from most high-resolution earth observation remote sensing sensors. The methodology couples pixel-based seed identification and object-based region growing. The seed identification stage relies on pixel-value comparison between the high-resolution images and cloud-free composites at lower spatial resolution from almost simultaneously acquired dates. The methodology was tested taking SPOT4-HRVIR, SPOT5-HRG and IRS-LISS III as high-resolution images and cloud-free MODIS composites as reference images. The selected scenes included a wide range of cloud types and surface features. The resulting cloud masks were evaluated through visual comparison. They were also compared with ad-hoc independently generated cloud masks and with the automatic cloud cover assessment algorithm (ACCA). In general, the results showed an agreement in detected clouds higher than 95% for clouds larger than 50 ha. The approach produced consistent results, identifying and mapping clouds of different types and sizes over various land surfaces including natural vegetation, agricultural land, built-up areas, water bodies and snow.
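
    The following is a hedged sketch of the seed-plus-growing idea (not the published algorithm): seeds come from a brightness comparison against a resampled cloud-free composite, and the seed regions are then grown within a looser candidate mask. Thresholds and band choices are assumptions.

```python
# Hedged sketch: pixel-based seed identification followed by constrained region growing.
import numpy as np
from scipy import ndimage

def cloud_mask(hr_band, composite_resampled, seed_offset=0.15, grow_offset=0.05, n_iter=50):
    """hr_band and composite_resampled: co-registered reflectance arrays of equal shape."""
    seeds = hr_band > composite_resampled + seed_offset       # confident cloud pixels
    candidates = hr_band > composite_resampled + grow_offset  # looser mask used for growing
    mask = seeds.copy()
    for _ in range(n_iter):                                   # grow seeds, staying inside candidates
        grown = ndimage.binary_dilation(mask) & candidates
        if np.array_equal(grown, mask):
            break
        mask = grown
    return mask

hr = np.random.default_rng(1).random((64, 64))                # stand-in high-resolution band
composite = np.full_like(hr, 0.4)                             # stand-in resampled MODIS composite
print(cloud_mask(hr, composite).sum(), "pixels flagged as cloud")
```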

  11. Parameterization of water vapor using high-resolution GPS data and empirical models

    Science.gov (United States)

    Ningombam, Shantikumar S.; Jade, Sridevi; Shrungeshwara, T. S.

    2018-03-01

    The present work evaluates eleven existing empirical models for estimating Precipitable Water Vapor (PWV) over a high-altitude (4500 m amsl), cold-desert environment. These models have been tested extensively and used globally to estimate PWV for low-altitude sites (below 1000 m amsl). The moist parameters used in the models are water vapor scale height (Hc), dew point temperature (Td) and water vapor pressure (Es0), derived from surface air temperature and relative humidity measured at high temporal resolution by an automated weather station. The performance of these models is examined statistically against observed high-resolution GPS PWV (GPSPWV) data over the region (2005-2012). The correlation coefficient (R) between observed GPSPWV and model PWV is 0.98 for daily data and varies diurnally from 0.93 to 0.97. The parameterization of the moisture parameters was studied in depth (from 2 h to monthly time scales) using GPSPWV, Td, and Es0. The slope of the linear relationship between GPSPWV and Td varies from 0.073 to 0.106 °C⁻¹ (R: 0.83 to 0.97), while that between GPSPWV and Es0 varies from 1.688 to 2.209 (R: 0.95 to 0.99) at daily, monthly and diurnal time scales. In addition, the moist parameters for the cold-desert, high-altitude environment are examined in depth at various time scales during 2005-2012.
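
    As a small illustration of the linear parameterizations mentioned above (with synthetic data, a Magnus-type vapor pressure formula, and invented coefficients rather than the study's values), the fits could look like this:

```python
# Hedged sketch: linear fits of PWV against dew point temperature (Td) and vapor pressure (Es0).
import numpy as np

rng = np.random.default_rng(1)
td = rng.uniform(-25, 5, 365)                    # daily dew point, deg C (illustrative)
es0 = 6.112 * np.exp(17.67 * td / (td + 243.5))  # Magnus-type saturation vapor pressure, hPa
pwv = 0.19 * es0 + rng.normal(0, 0.2, 365)       # synthetic PWV, mm

slope_td, intercept_td = np.polyfit(td, pwv, 1)  # mm per deg C
slope_es, intercept_es = np.polyfit(es0, pwv, 1) # mm per hPa
r_td = np.corrcoef(td, pwv)[0, 1]
r_es = np.corrcoef(es0, pwv)[0, 1]
print(f"PWV-Td slope {slope_td:.3f} mm/degC (R={r_td:.2f}); "
      f"PWV-Es0 slope {slope_es:.3f} mm/hPa (R={r_es:.2f})")
```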

  12. High-Resolution Mass Spectrometers

    Science.gov (United States)

    Marshall, Alan G.; Hendrickson, Christopher L.

    2008-07-01

    Over the past decade, mass spectrometry has been revolutionized by access to instruments of increasingly high mass-resolving power. For small molecules up to ˜400 Da (e.g., drugs, metabolites, and various natural organic mixtures ranging from foods to petroleum), it is possible to determine elemental compositions (CcHhNnOoSsPp…) of thousands of chemical components simultaneously from accurate mass measurements (the same can be done up to 1000 Da if additional information is included). At higher mass, it becomes possible to identify proteins (including posttranslational modifications) from proteolytic peptides, as well as lipids, glycoconjugates, and other biological components. At even higher mass (˜100,000 Da or higher), it is possible to characterize posttranslational modifications of intact proteins and to map the binding surfaces of large biomolecule complexes. Here we review the principles and techniques of the highest-resolution analytical mass spectrometers (time-of-flight and Fourier transform ion cyclotron resonance and orbitrap mass analyzers) and describe some representative high-resolution applications.

  13. Sediment delivery estimates in water quality models altered by resolution and source of topographic data.

    Science.gov (United States)

    Beeson, Peter C; Sadeghi, Ali M; Lang, Megan W; Tomer, Mark D; Daughtry, Craig S T

    2014-01-01

    Moderate-resolution (30-m) digital elevation models (DEMs) are normally used to estimate slope for the parameterization of non-point source, process-based water quality models. These models, such as the Soil and Water Assessment Tool (SWAT), use the Universal Soil Loss Equation (USLE) and Modified USLE to estimate sediment loss. The slope length and steepness (LS) factor, a critical parameter in the USLE, significantly affects sediment loss estimates. Depending on slope range, a twofold difference in slope estimation potentially results in as little as a 50% change or as much as a 250% change in the LS factor and subsequent sediment estimation. Recently, the availability of much finer-resolution (∼3 m) DEMs derived from Light Detection and Ranging (LiDAR) data has increased. However, the use of these data may not always be appropriate because slope values derived from fine spatial resolution DEMs are usually significantly higher than slopes derived from coarser DEMs. This increased slope results in considerable variability in modeled sediment output. This paper addresses the implications of parameterizing models using slope values calculated from DEMs with different spatial resolutions (90, 30, 10, and 3 m) and sources. Overall, we observed over a 2.5-fold increase in slope when using a 3-m instead of a 90-m DEM, which increased modeled soil loss using the USLE calculation by 130%. Care should be taken when using LiDAR-derived DEMs to parameterize water quality models because doing so can result in significantly higher slopes, which considerably alter modeled sediment loss. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
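
    The sensitivity described above can be made concrete with the common Wischmeier-Smith form of the LS factor; the sketch below is illustrative only, and the piecewise length exponent is one of several conventions in use.

```python
# Hedged worked example: how a change in DEM-derived slope propagates into the USLE LS factor.
import numpy as np

def ls_factor(slope_deg, slope_length_m=22.13):
    theta = np.radians(slope_deg)
    s = np.sin(theta)
    slope_pct = np.tan(theta) * 100
    # length exponent m: a commonly used piecewise choice based on slope percent
    m = np.where(slope_pct >= 5, 0.5,
                 np.where(slope_pct >= 3, 0.4,
                          np.where(slope_pct >= 1, 0.3, 0.2)))
    return (slope_length_m / 22.13) ** m * (65.41 * s ** 2 + 4.56 * s + 0.065)

for slope in (2.0, 5.0):
    # e.g. a slope from a coarse DEM versus a ~2.5x steeper slope from a fine LiDAR DEM
    coarse, fine = float(ls_factor(slope)), float(ls_factor(slope * 2.5))
    print(f"slope {slope:.1f} vs {slope * 2.5:.1f} deg -> LS {coarse:.2f} vs {fine:.2f}")
```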

  14. USGS High Resolution Orthoimagery Collection - Historical - National Geospatial Data Asset (NGDA) High Resolution Orthoimagery

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS high resolution orthorectified images from The National Map combine the image characteristics of an aerial photograph with the geometric qualities of a map. An...

  15. High throughput screening of ligand binding to macromolecules using high resolution powder diffraction

    Science.gov (United States)

    Von Dreele, Robert B.; D'Amico, Kevin

    2006-10-31

    A process is provided for the high throughput screening of binding of ligands to macromolecules using high resolution powder diffraction data including producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent, producing a second sample slurry of a selected polycrystalline macromolecule material, one or more ligands and the solvent, obtaining a high resolution powder diffraction pattern on each of said first sample slurry and the second sample slurry, and comparing the high resolution powder diffraction pattern of the first sample slurry and the high resolution powder diffraction pattern of the second sample slurry, whereby a difference in the high resolution powder diffraction patterns of the first sample slurry and the second sample slurry provides a positive indication for the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.

  16. Bayesian approach to peak deconvolution and library search for high resolution gas chromatography - Mass spectrometry.

    Science.gov (United States)

    Barcaru, A; Mol, H G J; Tienstra, M; Vivó-Truyols, G

    2017-08-29

    A novel probabilistic Bayesian strategy is proposed to resolve highly coeluting peaks in high-resolution GC-MS (Orbitrap) data. As opposed to a deterministic approach, we propose to solve the problem probabilistically, using a complete pipeline. First, the retention time(s) for a (probabilistic) number of compounds are estimated for each mass channel. The statistical dependency between m/z channels was implied by including penalties in the model objective function. Second, the Bayesian Information Criterion (BIC) is used as Occam's razor for the probabilistic assessment of the number of components. Third, a probabilistic set of resolved spectra and their associated retention times is estimated. Finally, a probabilistic library search is proposed, computing the spectral match against a high-resolution library. More specifically, a correlative measure was used that included the uncertainties in the least-squares fitting, as well as the probabilities of the different proposals for the number of compounds in the mixture. The method was tested on simulated high-resolution data, as well as on a set of pesticides injected in a GC-Orbitrap with high coelution. The proposed pipeline was able to detect accurately the retention times and the spectra of the peaks. In our case, with an extremely high degree of coelution, 5 of the 7 compounds present in the selected region of interest were correctly assessed. Finally, the comparison with classical deconvolution methods (i.e., MCR and AMDIS) indicates a better performance of the proposed algorithm in terms of the number of correctly resolved compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
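
    A toy illustration of the BIC step (not the published pipeline) is sketched below: Gaussian peaks are fitted to a single synthetic channel for several candidate component counts, and the count with the lowest BIC is retained. Peak shapes, initial guesses and the noise level are assumptions.

```python
# Hedged sketch: BIC as Occam's razor for the number of coeluting components in one channel.
import numpy as np
from scipy.optimize import curve_fit

def peaks(t, *params):                         # params: (height, center, width) per component
    y = np.zeros_like(t)
    for h, c, w in zip(params[0::3], params[1::3], params[2::3]):
        y += h * np.exp(-0.5 * ((t - c) / w) ** 2)
    return y

def bic(t, signal, n_peaks):
    p0 = []
    for i in range(n_peaks):                   # crude initial guesses spread over the window
        p0 += [signal.max(), t[0] + (i + 1) * (t[-1] - t[0]) / (n_peaks + 1), 0.05]
    try:
        popt, _ = curve_fit(peaks, t, signal, p0=p0, maxfev=20000)
    except RuntimeError:                       # a fit that fails to converge is simply discarded
        return np.inf
    rss = np.sum((signal - peaks(t, *popt)) ** 2)
    n, k = len(t), 3 * n_peaks
    return n * np.log(rss / n) + k * np.log(n)

t = np.linspace(0, 1, 300)
truth = peaks(t, 1.0, 0.45, 0.04, 0.7, 0.55, 0.04)           # two strongly overlapping peaks
signal = truth + np.random.default_rng(2).normal(0, 0.02, t.size)
print({k: round(bic(t, signal, k), 1) for k in (1, 2, 3)})   # lowest BIC should pick 2 components
```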

  17. Concept of dual-resolution light field imaging using an organic photoelectric conversion film for high-resolution light field photography.

    Science.gov (United States)

    Sugimura, Daisuke; Kobayashi, Suguru; Hamamoto, Takayuki

    2017-11-01

    Light field imaging is an emerging technique that is employed to realize various applications such as multi-viewpoint imaging, focal-point changing, and depth estimation. In this paper, we propose a concept of a dual-resolution light field imaging system to synthesize super-resolved multi-viewpoint images. The key novelty of this study is the use of an organic photoelectric conversion film (OPCF), which is a device that converts spectral information of incoming light within a certain wavelength range into an electrical signal (pixel value), for light field imaging. In our imaging system, we place an OPCF with green spectral sensitivity onto the micro-lens array of a conventional light field camera. The OPCF allows us to acquire the green spectral information only at the center viewpoint, but with the full resolution of the image sensor. In contrast, the optical system of the light field camera in our imaging system captures the other spectral information (red and blue) at multiple viewpoints (sub-aperture images) but with low resolution. Thus, our dual-resolution light field imaging system enables us to simultaneously capture information about the target scene at a high spatial resolution as well as the direction information of the incoming light. By exploiting these advantages of our imaging system, our proposed method enables the synthesis of full-resolution multi-viewpoint images. We perform experiments using synthetic images, and the results demonstrate that our method outperforms previous methods.

  18. Texton-based super-resolution for achieving high spatiotemporal resolution in hybrid camera system

    Science.gov (United States)

    Kamimura, Kenji; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2010-05-01

    Many super-resolution methods have been proposed to enhance the spatial resolution of images by using iteration and multiple input images. In a previous paper, we proposed an example-based super-resolution method that enhances an image through pixel-based texton substitution to reduce the computational cost. In that method, however, we only considered the enhancement of a texture image. In this study, we modified this texton substitution method for a hybrid camera to reduce the required bandwidth of a high-resolution video camera. We applied our algorithm to pairs of high- and low-spatiotemporal-resolution videos, which were synthesized to simulate a hybrid camera. The results showed that the fine detail of the low-resolution video could be reproduced better than with bicubic interpolation, and that the required bandwidth of the video camera could be reduced to about one-fifth. It was also shown that the peak signal-to-noise ratios (PSNRs) of the images improved by about 6 dB in a trained frame and by 1.0-1.5 dB in a test frame, as determined by comparison with the image processed using bicubic interpolation, and the average PSNRs were higher than those obtained with Freeman's well-known patch-based super-resolution method. Compared with Freeman's patch-based super-resolution method, the computational time of our method was reduced to about one-tenth.

  19. Application of geostatistical simulation to compile seismotectonic provinces based on earthquake databases (case study: Iran)

    Science.gov (United States)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-04-01

    This article is devoted to the application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, with Iran chosen as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanisms, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning bands simulation (TBSIM), was applied to generate synthetic data to improve incomplete earthquake catalogues. The synthetic data were then added to the traditional information to study the seismicity homogeneity and to classify the areas according to tectonic and seismic properties in order to update the seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues were homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) a time normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography was carried out in each subregion to study spatial regressions (e.g., the west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees, with the maximum range identified in the azimuth of 135 ± 10); (iv) the TBSIM algorithm was then applied, producing 68,800 synthetic events according to the spatial regressions found in several directions; (v) the simulated events (i.e., magnitudes) were classified by intensity in ArcGIS, and homogeneous seismic zones were determined. Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes: very high, high, moderate, and low

  20. Immersion Gratings for Infrared High-resolution Spectroscopy

    Science.gov (United States)

    Sarugaku, Yuki; Ikeda, Yuji; Kobayashi, Naoto; Kaji, Sayumi; Sukegawa, Takashi; Sugiyama, Shigeru; Nakagawa, Takao; Arasaki, Takayuki; Kondo, Sohei; Nakanishi, Kenshi; Yasui, Chikako; Kawakita, Hideyo

    2016-10-01

    High-resolution spectroscopy in the infrared wavelength range is essential for observations of minor isotopologues, such as HDO for water, and prebiotic organic molecules like hydrocarbons and P-bearing molecules, because numerous vibrational molecular bands (including those of non-polar molecules) are located in this wavelength range. High spectral resolution enables us to detect weak lines without spectral line confusion. This technique has been widely used in planetary sciences, e.g., for cometary comae (H2O, CO, and organic molecules), the Martian atmosphere (CH4, CO2, H2O and HDO), and the upper atmosphere of gas giants (H3+ and organic molecules such as C2H6). Spectrographs with higher resolution (and higher sensitivity) still have the potential to provide plenty of findings. However, because the size of a spectrograph scales with its spectral resolution, this is difficult to realize. An immersion grating (IG), which is a diffraction grating wherein the diffraction surface is immersed in a material with a high refractive index (n > 2), provides n times higher spectral resolution compared to a reflective grating of the same size. Because an IG reduces the size of a spectrograph to 1/n of that of a spectrograph with the same spectral resolution using a conventional reflective grating, it is widely acknowledged as a key optical device for realizing compact spectrographs with high spectral resolution. Recently, we succeeded in fabricating a CdZnTe immersion grating with the theoretically predicted diffraction efficiency by a machining process using an ultrahigh-precision five-axis processing machine developed by Canon Inc. Using the same technique, we completed a practical germanium (Ge) immersion grating with both a reflection coating on the grating surface and an AR coating on the entrance surface. It is noteworthy that the wide wavelength range from 2 to 20 μm can be covered by the two immersion gratings. In this paper, we present the performances and the applications of the immersion

  1. High resolution tomographic instrument development

    International Nuclear Information System (INIS)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational

  2. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-08-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  3. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  4. Energetics of small scale turbulence in the lower stratosphere from high resolution radar measurements

    Directory of Open Access Journals (Sweden)

    J. Dole

    Very high resolution radar measurements were performed in the troposphere and lower stratosphere by means of the PROUST radar. The PROUST radar operates in the UHF band (961 MHz) and is located in St. Santin, France (44°39’ N, 2°12’ E). A field campaign involving high resolution balloon measurements and the PROUST radar was conducted during April 1998. Under the classical hypothesis that refractive index inhomogeneities at half radar wavelength lie within the inertial subrange, assumed to be isotropic, kinetic energy and temperature variance dissipation rates were estimated independently in the lower stratosphere. The dissipation rate of temperature variance is proportional to the dissipation rate of available potential energy. We therefore estimate the ratio of dissipation rates of potential to kinetic energy. This ratio is a key parameter of atmospheric turbulence which, in locally homogeneous and stationary conditions, is simply related to the flux Richardson number, Rf.

    Key words. Meteorology and atmospheric dynamics (turbulence) – Radio science (remote sensing)

  5. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    International Nuclear Information System (INIS)

    Wahid, Ali; Salim, Ahmed Mohamed Ahmed; Yusoff, Wan Ismail Wan; Gaafar, Gamal Ragab

    2016-01-01

    Geostatistics, or the statistical approach, is based on the study of temporal and spatial trends and uses spatial relationships to model known information of variable(s) at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique, which facilitates the generation of a best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, a tectonostratigraphic chart with a (short-term) sea-level curve, and the regional tectonics of the study area to establish the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built to estimate parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify the spatial relationships in the data, which help to establish the depositional history of the North West (NW) Bonaparte Basin.

  6. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    Energy Technology Data Exchange (ETDEWEB)

    Wahid, Ali, E-mail: ali.wahid@live.com; Salim, Ahmed Mohamed Ahmed, E-mail: mohamed.salim@petronas.com.my; Yusoff, Wan Ismail Wan, E-mail: wanismail-wanyusoff@petronas.com.my [Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 32610 Tronoh, Perak (Malaysia); Gaafar, Gamal Ragab, E-mail: gaafargr@gmail.com [Petroleum Engineering Division, PETRONAS Carigali Sdn Bhd, Kuala Lumpur (Malaysia)

    2016-02-01

    Geostatistics, or the statistical approach, is based on the study of temporal and spatial trends and uses spatial relationships to model known information of variable(s) at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique, which facilitates the generation of a best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, a tectonostratigraphic chart with a (short-term) sea-level curve, and the regional tectonics of the study area to establish the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built to estimate parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify the spatial relationships in the data, which help to establish the depositional history of the North West (NW) Bonaparte Basin.
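
    For readers unfamiliar with the kriging step, the following minimal ordinary-kriging sketch (with a spherical variogram model and entirely synthetic well data, not the study's software or parameters) shows how an unbiased estimate and its kriging variance are obtained at one unsampled location:

```python
# Hedged sketch of ordinary kriging with a spherical semivariogram (synthetic data).
import numpy as np

def spherical_gamma(h, nugget=0.0, sill=1.0, range_=500.0):
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / range_ - 0.5 * (h / range_) ** 3)
    return np.where(h < range_, g, sill)

def ordinary_kriging(coords, values, target, **vario):
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_gamma(d, **vario)       # semivariances between samples
    A[-1, -1] = 0.0                               # unbiasedness (Lagrange) row/column
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.linalg.norm(coords - target, axis=1), **vario)
    w = np.linalg.solve(A, b)                     # kriging weights plus Lagrange multiplier
    return w[:n] @ values, b @ w                  # estimate and kriging variance

coords = np.array([[0.0, 0.0], [100.0, 30.0], [60.0, 200.0], [250.0, 120.0]])
values = np.array([2.1, 2.6, 1.8, 3.0])           # e.g. a porosity-like property at four wells
print(ordinary_kriging(coords, values, np.array([120.0, 80.0]), sill=0.4, range_=300.0))
```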

  7. Estimation and modeling of forest attributes across large spatial scales using BiomeBGC, high-resolution imagery, LiDAR data, and inventory data

    Science.gov (United States)

    Golinkoff, Jordan Seth

    The accurate estimation of forest attributes at many different spatial scales is a critical problem. Forest landowners may be interested in estimating timber volume, forest biomass, and forest structure to determine their forest's condition and value. Counties and states may be interested to learn about their forests to develop sustainable management plans and policies related to forests, wildlife, and climate change. Countries and consortiums of countries need information about their forests to set global and national targets to deal with issues of climate change and deforestation as well as to set national targets and understand the state of their forest at a given point in time. This dissertation approaches these questions from two perspectives. The first perspective uses the process model Biome-BGC paired with inventory and remote sensing data to make inferences about a current forest state given known climate and site variables. Using a model of this type, future climate data can be used to make predictions about future forest states as well. An example of this work applied to a forest in northern California is presented. The second perspective of estimating forest attributes uses high resolution aerial imagery paired with light detection and ranging (LiDAR) remote sensing data to develop statistical estimates of forest structure. Two approaches within this perspective are presented: a pixel based approach and an object based approach. Both approaches can serve as the platform on which models (either empirical growth and yield models or process models) can be run to generate inferences about future forest state and current forest biogeochemical cycling.

  8. Transplantation of epiphytic bioaccumulators (Tillandsia capillaris) for high spatial resolution biomonitoring of trace elements and point sources deconvolution in a complex mining/smelting urban context

    Science.gov (United States)

    Goix, Sylvaine; Resongles, Eléonore; Point, David; Oliva, Priscia; Duprey, Jean Louis; de la Galvez, Erika; Ugarte, Lincy; Huayta, Carlos; Prunier, Jonathan; Zouiten, Cyril; Gardon, Jacques

    2013-12-01

    Monitoring atmospheric trace element (TE) levels and tracing their source origin is essential for exposure assessment and human health studies. Epiphytic Tillandsia capillaris plants were used as bioaccumulators of TE in a complex polymetallic mining/smelting urban context (Oruro, Bolivia). Specimens collected from a pristine reference site were transplanted at a high spatial resolution (~1 sample/km²) throughout the urban area. About twenty-seven elements were measured after a 4-month exposure, also providing new information values for the reference material BCR482. A statistical power analysis of this biomonitoring mapping approach against classical aerosol surveys performed on the same site showed the better aptitude of T. capillaris to detect geographical trends and to deconvolute multiple contamination sources using geostatistical principal component analysis. Transplanted specimens in the vicinity of the mining and smelting areas were characterized by extreme TE accumulation (Sn > Ag > Sb > Pb > Cd > As > W > Cu > Zn). Three contamination sources were identified: mining (Ag, Pb, Sb), smelting (As, Sn) and road traffic (Zn) emissions, confirming the results of a previous aerosol survey.

  9. Downscaling GRACE Remote Sensing Datasets to High-Resolution Groundwater Storage Change Maps of California’s Central Valley

    Directory of Open Access Journals (Sweden)

    Michelle E. Miro

    2018-01-01

    NASA’s Gravity Recovery and Climate Experiment (GRACE) has already proven to be a powerful data source for regional groundwater assessments in many areas around the world. However, the applicability of GRACE data products to more localized studies and their utility to water management authorities have been constrained by their limited spatial resolution (~200,000 km²). Researchers have begun to address these shortcomings with data assimilation approaches that integrate GRACE-derived total water storage estimates into complex regional models, producing higher-resolution estimates of hydrologic variables (~2500 km²). Here we take those approaches one step further by developing an empirically based model capable of downscaling GRACE data to a high-resolution (~16 km²) dataset of groundwater storage changes over a portion of California’s Central Valley. The model utilizes an artificial neural network to generate a series of high-resolution maps of groundwater storage change from 2002 to 2010 using GRACE estimates of variations in total water storage and a series of widely available hydrologic variables (PRISM precipitation and temperature data, digital elevation model (DEM)-derived slope, and Natural Resources Conservation Service (NRCS) soil type). The neural network downscaling model is able to accurately reproduce local groundwater behavior, with acceptable Nash-Sutcliffe efficiency (NSE) values for calibration and validation (ranging from 0.2445 to 0.9577 and 0.0391 to 0.7511, respectively). Ultimately, the model generates maps of local groundwater storage change at a 100-fold higher resolution than GRACE gridded data products without the use of computationally intensive physical models. The model’s simulated maps have the potential for application to local groundwater management initiatives in the region.
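
    A stripped-down illustration of the idea (synthetic data, an off-the-shelf scikit-learn network rather than the authors' model, and invented covariate relationships) might look like the following:

```python
# Hedged sketch: map coarse GRACE total-water-storage anomalies plus local covariates
# to fine-scale groundwater storage change with a small neural network (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
grace_tws = rng.normal(0, 5, n)              # coarse-cell TWS anomaly assigned to each fine cell
precip = rng.gamma(2.0, 20.0, n)             # PRISM-like precipitation covariate
temp = rng.normal(18, 6, n)
slope = rng.uniform(0, 15, n)
soil = rng.integers(0, 4, n)                 # NRCS-like soil class, integer-coded
X = np.column_stack([grace_tws, precip, temp, slope, soil])
y = 0.6 * grace_tws + 0.02 * precip - 0.1 * slope + rng.normal(0, 1, n)   # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0).fit(X_tr, y_tr)

# Nash-Sutcliffe efficiency on held-out cells (equivalent to R^2 against the mean benchmark)
pred = model.predict(X_te)
nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
print(f"NSE = {nse:.2f}")
```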

  10. Evaluation of a high resolution silicon PET insert module

    Energy Technology Data Exchange (ETDEWEB)

    Grkovski, Milan, E-mail: milan.grkovski@ijs.si [Jožef Stefan Institute, Ljubljana (Slovenia); Memorial Sloan Kettering Cancer Center, New York, NY (United States); Brzezinski, Karol [IFIC/CSIC, Valencia (Spain); Cindro, Vladimir [Jožef Stefan Institute, Ljubljana (Slovenia); Clinthorne, Neal H. [University of Michigan, Ann Arbor, MI (United States); Kagan, Harris [Ohio State University, Columbus, OH (United States); Lacasta, Carlos [IFIC/CSIC, Valencia (Spain); Mikuž, Marko [Jožef Stefan Institute, Ljubljana (Slovenia); Solaz, Carles [IFIC/CSIC, Valencia (Spain); Studen, Andrej [Jožef Stefan Institute, Ljubljana (Slovenia); Weilhammer, Peter [Ohio State University, Columbus, OH (United States); Žontar, Dejan [Jožef Stefan Institute, Ljubljana (Slovenia)

    2015-07-11

    Conventional PET systems can be augmented with additional detectors placed in close proximity to the region of interest. We developed a high resolution PET insert module to evaluate the added benefit of such a combination. The insert module consists of two back-to-back 1 mm thick silicon sensors, each segmented into 1040 1-mm² pads arranged in a 40 by 26 array. A set of 16 VATAGP7.1 ASICs and a custom assembled data acquisition board were used to read out the signal from the insert module. Data were acquired in slice (2D) geometry with a Jaszczak phantom (rod diameters of 1.2–4.8 mm) filled with ¹⁸F-FDG and the images were reconstructed with the ML-EM method. Both data with full and limited angular coverage from the insert module were considered and three types of coincidence events were combined. The ratio of high-resolution data that substantially improves the quality of the reconstructed image for the region near the surface of the insert module was estimated to be about 4%. Results from our previous studies suggest that such a ratio could be achieved at a moderate technological expense by using an equivalent of two insert modules (an effective sensor thickness of 4 mm).

  11. Coded aperture subreflector array for high resolution radar imaging

    Science.gov (United States)

    Lynch, Jonathan J.; Herrault, Florian; Kona, Keerti; Virbila, Gabriel; McGuire, Chuck; Wetzel, Mike; Fung, Helen; Prophet, Eric

    2017-05-01

    HRL Laboratories has been developing a new approach for high resolution radar imaging on stationary platforms. High angular resolution is achieved by operating at 235 GHz and using a scalable tile phased array architecture that has the potential to realize thousands of elements at an affordable cost. HRL utilizes aperture coding techniques to minimize the size and complexity of the RF electronics needed for beamforming, and wafer level fabrication and integration allow tiles containing 1024 elements to be manufactured with reasonable costs. This paper describes the results of an initial feasibility study for HRL's Coded Aperture Subreflector Array (CASA) approach for a 1024 element micromachined antenna array with integrated single-bit phase shifters. Two candidate electronic device technologies were evaluated over the 170 - 260 GHz range, GaN HEMT transistors and GaAs Schottky diodes. Array structures utilizing silicon micromachining and die bonding were evaluated for etch and alignment accuracy. Finally, the overall array efficiency was estimated to be about 37% (not including spillover losses) using full wave array simulations and measured device performance, which is a reasonable value at 235 GHz. Based on the measured data we selected GaN HEMT devices operated passively with 0V drain bias due to their extremely low DC power dissipation.

  12. Depth geological model building: application to the 3D high resolution 'ANDRA' seismic block

    International Nuclear Information System (INIS)

    Mari, J.L.; Yven, B.

    2012-01-01

    Document available in extended abstract form only. 3D seismic blocks and logging data, mainly acoustic and density logs, are often used for geological model building in time. The geological model must then be converted from time to depth. A geostatistical approach to the time-to-depth conversion of seismic horizons is used in many geo-modelling projects. From a geostatistical point of view, the time-to-depth conversion of seismic horizons is a classical estimation problem involving one or more secondary variables. The Bayesian approach [1] provides an excellent estimator which is more general than traditional kriging with external drift(s) and fits the needs of time-to-depth conversion of seismic horizons very well. The time-to-depth conversion of the selected seismic horizons is used to compute a time-to-depth conversion model at the time sampling rate (1 ms). The 3D depth conversion model allows the computation of an interval velocity block, which is compared with the acoustic impedance block to estimate a density block as a quality control (QC). Unrealistic density values are edited, and the interval velocity block as well as the depth conversion model is updated. The proposed procedure has been applied to a 3D data set. The dataset comes from a high-resolution 3D seismic survey recorded in France at the boundary of the Meuse and Haute-Marne departments in the vicinity of the Andra Center (National radioactive waste management Agency). The 3D design is a cross spread. The active spread is composed of 12 receiver lines with 120 stations each. The source lines are perpendicular to the receiver lines. The receiver and source line spacings are respectively 80 m and 120 m. The receiver and source point spacings are 20 m. The source is a Vibroseis source generating a signal in the 14-140 Hz frequency bandwidth. The bin size is 10 × 10 m². The nominal fold is 60. A conventional seismic sequence was applied to the data set. It includes amplitude recovery, deconvolution and wave

  13. High resolution Neutron and Synchrotron Powder Diffraction

    International Nuclear Information System (INIS)

    Hewat, A.W.

    1986-01-01

    The use of high-resolution powder diffraction has grown rapidly in the past years, with the development of Rietveld (1967) methods of data analysis and new high-resolution diffractometers and multidetectors. The number of publications in this area has increased from a handful per year until 1973 to 150 per year in 1984, with a ten-year total of over 1000. These papers cover a wide area of solid-state chemistry, physics and materials science, and have been grouped under 20 subject headings, ranging from catalysts to zeolites, and from battery electrode materials to pre-stressed superconducting wires. In 1985 two new high-resolution diffractometers are being commissioned, one at the SNS laboratory near Oxford, and one at the ILL in Grenoble. In different ways these machines represent perhaps the ultimate that can be achieved with neutrons and will permit refinement of complex structures with about 250 parameters and unit cell volumes of about 2500 Å³. The new European Synchrotron Facility will complement the Grenoble neutron diffractometers, and extend the role of high-resolution powder diffraction to the direct solution of crystal structures, pioneered in Sweden.

  14. Using High-Resolution Satellite Aerosol Optical Depth To Estimate Daily PM2.5 Geographical Distribution in Mexico City.

    Science.gov (United States)

    Just, Allan C; Wright, Robert O; Schwartz, Joel; Coull, Brent A; Baccarelli, Andrea A; Tellez-Rojo, Martha María; Moody, Emily; Wang, Yujie; Lyapustin, Alexei; Kloog, Itai

    2015-07-21

    Recent advances in estimating fine particle (PM2.5) ambient concentrations use daily satellite measurements of aerosol optical depth (AOD) for spatially and temporally resolved exposure estimates. Mexico City is a dense megacity that differs from other previously modeled regions in several ways: it has bright land surfaces, a distinctive climatological cycle, and an elevated semi-enclosed air basin with a unique planetary boundary layer dynamic. We extend our previous satellite methodology to the Mexico City area, a region with higher PM2.5 than most U.S. and European urban areas. Using a novel 1 km resolution AOD product from the MODIS instrument, we constructed daily predictions across the greater Mexico City area for 2004-2014. We calibrated the association of AOD to PM2.5 daily using municipal ground monitors, land use, and meteorological features. Predictions used spatial and temporal smoothing to estimate AOD when satellite data were missing. Our model performed well, resulting in an out-of-sample cross-validation R² of 0.724. The cross-validated root-mean-squared prediction error (RMSPE) of the model was 5.55 μg/m³. This novel model reconstructs long- and short-term spatially resolved exposure to PM2.5 for epidemiological studies in Mexico City.
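
    The day-varying calibration can be illustrated with a toy per-day linear fit on synthetic monitor data (the actual study uses a more elaborate calibration with land-use and meteorological terms); everything below is a stand-in:

```python
# Hedged sketch: AOD-PM2.5 calibration whose slope and intercept vary by day (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
days = np.repeat(pd.date_range("2014-01-01", periods=60), 24)   # 60 days x 24 monitors
aod = rng.gamma(2.0, 0.15, days.size)
day_slope = np.repeat(rng.uniform(20, 60, 60), 24)              # daily-varying AOD-PM2.5 relation
pm25 = 5 + day_slope * aod + rng.normal(0, 3, days.size)
df = pd.DataFrame({"day": days, "aod": aod, "pm25": pm25})

def fit_day(g):
    slope, intercept = np.polyfit(g["aod"], g["pm25"], 1)
    return pd.Series({"slope": slope, "intercept": intercept})

calib = df.groupby("day").apply(fit_day)                        # one calibration per day
df = df.join(calib, on="day")
df["pm25_pred"] = df["intercept"] + df["slope"] * df["aod"]     # apply the daily calibration
r2 = np.corrcoef(df["pm25"], df["pm25_pred"])[0, 1] ** 2
print(f"in-sample R^2 = {r2:.3f}")
```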

  15. High resolution (transformers.

    Science.gov (United States)

    Garcia-Souto, Jose A; Lamela-Rivera, Horacio

    2006-10-16

    A novel fiber-optic interferometric sensor is presented for vibration measurements and analysis. In this approach, it is shown applied to the vibrations of electrical structures within power transformers. A main feature of the sensor is that an unambiguous optical phase measurement is performed using the direct detection of the interferometer output, without external modulation, for a more compact and stable implementation. High resolution of the interferometric measurement is obtained with this technique (transformers are also highlighted.

  16. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    Science.gov (United States)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
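
    A compact sketch of this idea is given below: an evolutionary loop tunes the radii and orientation of an elliptical search neighborhood against a leave-one-out error criterion. An inverse-distance estimator stands in for kriging to keep the example short, and the data, mutation scales and population sizes are arbitrary.

```python
# Hedged sketch: evolutionary tuning of anisotropic search-neighborhood parameters.
import numpy as np

rng = np.random.default_rng(5)
pts = rng.uniform(0, 100, (80, 2))                       # synthetic sample locations
vals = np.sin(pts[:, 0] / 15.0) + 0.3 * pts[:, 1] / 100 + rng.normal(0, 0.05, 80)

def loo_error(params):
    """Leave-one-out error of a local estimator restricted to an elliptical neighborhood."""
    a, b, ang = params
    c, s = np.cos(ang), np.sin(ang)
    R = np.array([[c, s], [-s, c]])                      # rotate into the ellipse axes
    errs = []
    for i in range(len(pts)):
        d = (pts - pts[i]) @ R.T
        r = np.sqrt((d[:, 0] / a) ** 2 + (d[:, 1] / b) ** 2)
        nbr = np.where((r > 1e-9) & (r <= 1.0))[0]       # samples inside the search ellipse
        if nbr.size == 0:
            errs.append(1.0)                             # empty neighborhoods are penalized
            continue
        w = 1.0 / r[nbr] ** 2                            # inverse-distance stand-in for kriging
        errs.append((vals[i] - np.sum(w * vals[nbr]) / w.sum()) ** 2)
    return np.mean(errs)

# crude (mu + lambda) evolutionary loop: keep the best candidates, mutate them, repeat
pop = rng.uniform([5.0, 5.0, 0.0], [50.0, 50.0, np.pi], size=(20, 3))
for _ in range(30):
    scores = np.array([loo_error(p) for p in pop])
    parents = pop[np.argsort(scores)[:5]]
    children = parents[rng.integers(0, 5, 15)] + rng.normal(0, [2.0, 2.0, 0.1], (15, 3))
    pop = np.vstack([parents, np.clip(children, [1.0, 1.0, 0.0], [80.0, 80.0, np.pi])])
scores = np.array([loo_error(p) for p in pop])
print("best (major radius, minor radius, angle):", pop[scores.argmin()])
```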

  17. Estimation of Leakage Potential of Selected Sites in Interstate and Tri-State Canals Using Geostatistical Analysis of Selected Capacitively Coupled Resistivity Profiles, Western Nebraska, 2004

    Science.gov (United States)

    Vrabel, Joseph; Teeple, Andrew; Kress, Wade H.

    2009-01-01

    With increasing demands for reliable water supplies and availability estimates, groundwater-flow models are often developed to enhance understanding of surface-water and groundwater systems. Specific hydraulic variables must be known or calibrated for the groundwater-flow model to accurately simulate current or future conditions. Surface geophysical surveys, along with selected test-hole information, can provide an integrated framework for quantifying hydrogeologic conditions within a defined area. In 2004, the U.S. Geological Survey, in cooperation with the North Platte Natural Resources District, performed a surface geophysical survey using a capacitively coupled resistivity technique to map the lithology within the top 8 meters of the near-surface for 110 kilometers of the Interstate and Tri-State Canals in western Nebraska and eastern Wyoming. Assuming that leakage between the surface-water and groundwater systems is affected primarily by the sediment directly underlying the canal bed, leakage potential was estimated from the simple vertical mean of inverse-model resistivity values over depth levels whose layer thickness increases geometrically with depth, which biased the mean-resistivity values towards the surface. This method generally produced reliable results, but an improved analysis method was needed to account for situations where confining units, composed of less permeable material, underlie units with greater permeability. In this report, prepared by the U.S. Geological Survey in cooperation with the North Platte Natural Resources District, the authors use geostatistical analysis to develop the minimum-unadjusted method to compute a relative leakage potential based on the minimum resistivity value in a vertical column of the resistivity model. The minimum-unadjusted method considers the effects of homogeneous confining units. The minimum-adjusted method is also developed to incorporate the effect of local lithologic heterogeneity on water
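
    A toy comparison of the two indicators (values invented for illustration, not taken from the survey) shows why a minimum-based method flags thin confining units that a vertical mean can smear out:

```python
# Hedged sketch: mean-based versus minimum-based leakage-potential indicators per column.
import numpy as np

# Six depth levels; their thickness grows with depth, so a simple mean over levels
# is weighted toward the shallow subsurface, as noted in the abstract above.
profiles = np.array([
    [60., 55., 50., 45., 40., 38.],   # uniformly sandy column -> higher leakage potential
    [60., 55.,  8., 45., 40., 38.],   # same column with one thin clay-rich (low-resistivity) layer
])
mean_resistivity = profiles.mean(axis=1)      # the original mean-based indicator
min_resistivity = profiles.min(axis=1)        # a "minimum-unadjusted" style indicator
print(mean_resistivity)   # ~48.0 vs ~41.0 -> the mean barely separates the two columns
print(min_resistivity)    # 38.0 vs 8.0     -> the minimum clearly flags the confining unit
```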

  18. High-resolution wavefront control of high-power laser systems

    International Nuclear Information System (INIS)

    Brase, J.; Brown, C.; Carrano, C.; Kartz, M.; Olivier, S.; Pennington, D.; Silva, D.

    1999-01-01

    Nearly every new large-scale laser system application at LLNL has requirements for beam control which exceed the current level of available technology. For applications such as inertial confinement fusion, laser isotope separation, and laser machining, the ability to transport significant power to a target while maintaining good beam quality is critical. There are many ways that laser wavefront quality can be degraded. Thermal effects due to the interaction of high-power laser or pump light with the internal optical components or with the ambient gas are common causes of wavefront degradation. For many years, adaptive optics based on thin deformable glass mirrors with piezoelectric or electrostrictive actuators have been used to remove the low-order wavefront errors from high-power laser systems. These adaptive optics systems have successfully improved laser beam quality, but have also generally revealed additional high-spatial-frequency errors, both because the low-order errors have been reduced and because deformable mirrors have often introduced some high-spatial-frequency components due to manufacturing errors. Many current and emerging laser applications fall into the high-resolution category, where there is an increased need for the correction of high-spatial-frequency aberrations, which requires correctors with thousands of degrees of freedom. The largest deformable mirrors currently available have less than one thousand degrees of freedom, at a cost of approximately $1M. A deformable mirror capable of meeting these high spatial resolution requirements would be cost prohibitive. Therefore a new approach using a different wavefront control technology is needed. One new wavefront control approach is the use of liquid-crystal (LC) spatial light modulator (SLM) technology for controlling the phase of linearly polarized light. Current LC SLM technology provides high-spatial-resolution wavefront control, with hundreds of thousands of degrees of freedom, more

  19. Integration of dynamical data in a geostatistical model of reservoir; Integration des donnees dynamiques dans un modele geostatistique de reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Costa Reis, L.

    2001-01-01

    In this thesis we have developed a methodology for the integrated characterization of heterogeneous reservoirs, from geologic modeling to history matching. This methodology is applied to the reservoir PBR, situated in the Campos Basin, offshore Brazil, which has been producing since June 1979. This work is an extension of two other theses concerning geologic and geostatistical modeling of the reservoir PBR from well data and seismic information. We extended the geostatistical litho-type model to the whole reservoir by using a particular approach of the non-stationary truncated Gaussian simulation method. This approach facilitated the application of the gradual deformation method to history matching. The main stages of the methodology for dynamic data integration in a geostatistical reservoir model are presented. We constructed a reservoir model, and the initial difficulties in the history matching led us to modify some choices in the geological, geostatistical and flow models. These difficulties show the importance of dynamic data integration in reservoir modeling. The petrophysical property assignment within the litho-types was done by using well test data. We used an inversion procedure to evaluate the petrophysical parameters of the litho-types. Up-scaling is a necessary stage to reduce the flow simulation time. We compared several up-scaling methods and show that the passage from the fine geostatistical model to the coarse flow model should be done very carefully. The choice of the fitting parameter depends on the objective of the study. In the case of the reservoir PBR, where water is injected in order to improve the oil recovery, the water rate of the producing wells is directly related to the reservoir heterogeneity. Thus, the water rate was chosen as the fitting parameter. We obtained significant improvements in the history matching of the reservoir PBR. First, by using a method we have proposed, called patchwork. This method allows us to build a coherent

  20. High resolution optical DNA mapping

    Science.gov (United States)

    Baday, Murat

    Many types of diseases, including cancer and autism, are associated with copy-number variations in the genome. Most of these variations could not be identified with existing sequencing and optical DNA mapping methods. We have developed a multi-color super-resolution technique, with potential for high throughput and low cost, which allows us to recognize more of these variations. Our technique has achieved a 10-fold improvement in the resolution of optical DNA mapping. Using a 180 kb BAC clone as a model system, we resolved dense patterns from 108 fluorescent labels of two different colors representing two different sequence motifs. Overall, a detailed DNA map with 100 bp resolution was achieved, which has the potential to reveal detailed information about genetic variance and to facilitate medical diagnosis of genetic disease.

  1. High-Resolution Electronics: Spontaneous Patterning of High-Resolution Electronics via Parallel Vacuum Ultraviolet (Adv. Mater. 31/2016).

    Science.gov (United States)

    Liu, Xuying; Kanehara, Masayuki; Liu, Chuan; Sakamoto, Kenji; Yasuda, Takeshi; Takeya, Jun; Minari, Takeo

    2016-08-01

    On page 6568, T. Minari and co-workers describe spontaneous patterning based on the parallel vacuum ultraviolet (PVUV) technique, enabling the homogeneous integration of complex, high-resolution electronic circuits, even on large-scale, flexible, transparent substrates. Irradiation of PVUV onto the hydrophobic polymer surface precisely renders the selected surface into highly wettable regions with sharply defined boundaries, which spontaneously guides a metal nanoparticle ink into a series of circuit lines and gaps with widths down to 1 μm. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. High resolution UV spectroscopy and laser-focused nanofabrication

    NARCIS (Netherlands)

    Myszkiewicz, G.

    2005-01-01

    This thesis combines two techniques that at first glance appear different: High Resolution Laser Induced Fluorescence (LIF) spectroscopy of small aromatic molecules and Laser Focusing of atoms for Nanofabrication. The thesis starts with an introduction to the high-resolution LIF technique for small aromatic

  3. High-resolution spectrometer at PEP

    International Nuclear Information System (INIS)

    Weiss, J.M.; HRS Collaboration.

    1982-01-01

    A description is presented of the High Resolution Spectrometer experiment (PEP-12) now running at PEP. The advanced capabilities of the detector are demonstrated with first physics results expected in the coming months

  4. High-resolution structure of the native histone octamer

    International Nuclear Information System (INIS)

    Wood, Christopher M.; Nicholson, James M.; Lambert, Stanley J.; Chantalat, Laurent; Reynolds, Colin D.; Baldwin, John P.

    2005-01-01

    The high-resolution (1.90 Å) model of the native histone octamer allows structural comparisons to be made with the nucleosome-core particle, along with an identification of a likely core-histone binding site. Crystals of native histone octamers (H2A–H2B)–(H4–H3)–(H3′–H4′)–(H2B′–H2A′) from chick erythrocytes in 2 M KCl, 1.35 M potassium phosphate pH 6.9 diffract X-rays to 1.90 Å resolution, yielding a structure with an Rwork value of 18.7% and an Rfree of 22.2%. The crystal space group is P65, the asymmetric unit of which contains one complete octamer. This high-resolution model of the histone-core octamer allows further insight into intermolecular interactions, including water molecules, that dock the histone dimers to the tetramer in the nucleosome-core particle and have relevance to nucleosome remodelling. The three key areas analysed are the H2A′–H3–H4 molecular cluster (also H2A–H3′–H4′), the H4–H2B′ interaction (also H4′–H2B) and the H2A′–H4 β-sheet interaction (also H2A–H4′). The latter of these three regions is important to nucleosome remodelling by RNA polymerase II, as it is shown to be a likely core-histone binding site, and its disruption creates an instability in the nucleosome-core particle. A majority of the water molecules in the high-resolution octamer have positions that correlate to similar positions in the high-resolution nucleosome-core particle structure, suggesting that the high-resolution octamer model can be used for comparative studies with the high-resolution nucleosome-core particle.

  5. Geostatistical and stratigraphic analysis of deltaic reservoirs from the Reconcavo Basin, Brazil; Analise estratigrafica e geoestatistica de reservatorios deltaicos da Bacia do Reconcavo (BA)

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Carlos Moreira

    1997-07-01

    This study presents the characterization of the external geometry of deltaic oil reservoirs, including the description of their areal distribution using geostatistical tools such as variography and kriging. A high-resolution stratigraphic study was developed over a 25 km² area, using data from 276 closely-spaced wells of an oil-producing field from the Reconcavo Basin, northeastern Brazil. The studied succession records the progressive lacustrine transgression of a deltaic environment. Core data and stratigraphic cross sections suggest that the oil reservoirs are mostly amalgamated delta-front lobes and, subordinately, crevasse deposits. Some important geometrical elements were recognized by the detailed variographic analysis developed for each stratigraphic unit (zone). The average width of the groups of deltaic lobes of one zone was measured from the variographic feature informally known as the hole effect. This procedure was not possible for the other zones due to the intense lateral amalgamation of sandstones, indicated by many nested variographic structures. Kriged net-sand maps for the main zones suggest a NNW-SSE orientation for the deltaic lobes, as well as their common amalgamation and compensation arrangements. High-resolution stratigraphic analyses should include a more regional characterization of the depositional system that comprises the studied succession. On the other hand, geostatistical studies should be developed only after the recognition of the depositional processes acting in the study area and the geological meaning of the variable to be treated, including its spatial variability scales as a function of sand body thickness, orientation and amalgamation. (author)
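    The variographic workflow described above (directional experimental semivariograms inspected for nested structures and a hole effect) can be illustrated with a short, self-contained sketch. The snippet below is a minimal Python/NumPy illustration on synthetic well data; the azimuth, angular tolerance, lag spacing and the synthetic net-sand values are assumptions for demonstration only and do not reproduce the study's data or software.

```python
import numpy as np

def directional_semivariogram(xy, values, azimuth_deg, lag, n_lags, ang_tol_deg=22.5):
    """Experimental semivariogram along one azimuth (0 deg = North, clockwise)."""
    n = len(values)
    az = np.deg2rad(azimuth_deg)
    direction = np.array([np.sin(az), np.cos(az)])      # unit vector (East, North)
    gamma = np.zeros(n_lags)
    counts = np.zeros(n_lags, dtype=int)
    for i in range(n - 1):
        d = xy[i + 1:] - xy[i]                          # separation vectors to remaining points
        dist = np.hypot(d[:, 0], d[:, 1])
        cos_ang = (d @ direction) / np.where(dist > 0, dist, np.nan)
        in_cone = np.abs(cos_ang) >= np.cos(np.deg2rad(ang_tol_deg))
        lag_idx = (dist / lag).astype(int)
        ok = in_cone & (lag_idx < n_lags) & (dist > 0)
        sq_diff = 0.5 * (values[i + 1:] - values[i]) ** 2
        np.add.at(gamma, lag_idx[ok], sq_diff[ok])      # accumulate pair contributions per lag bin
        np.add.at(counts, lag_idx[ok], 1)
    gamma = np.where(counts > 0, gamma / np.maximum(counts, 1), np.nan)
    lags = (np.arange(n_lags) + 0.5) * lag
    return lags, gamma, counts

# Synthetic "net sand" values at irregular well locations (illustrative only).
rng = np.random.default_rng(0)
xy = rng.uniform(0, 5000, size=(276, 2))                # metres
trend = np.sin(xy[:, 1] / 800.0)                        # periodic N-S component -> hole effect
net_sand = 10 + 4 * trend + rng.normal(0, 1.5, size=276)

lags, gam, npairs = directional_semivariogram(xy, net_sand, azimuth_deg=0.0,
                                              lag=250.0, n_lags=12)
for h, g, m in zip(lags, gam, npairs):
    print(f"lag {h:6.0f} m  gamma {g:6.2f}  pairs {m}")
```

    A fitted theoretical model (possibly including a hole-effect component) would then feed the kriging step; in practice a geostatistics package can replace this hand-rolled estimator.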

  6. Requirements on high resolution detectors

    Energy Technology Data Exchange (ETDEWEB)

    Koch, A. [European Synchrotron Radiation Facility, Grenoble (France)]

    1997-02-01

    For a number of microtomography applications, X-ray detectors with a spatial resolution of 1 μm are required. This high spatial resolution will influence and degrade other parameters of secondary importance like detective quantum efficiency (DQE), dynamic range, linearity and frame rate. This note summarizes the most important arguments for and against the detector systems that could be considered. This article discusses the mutual dependencies between the various figures which characterize a detector, and tries to give some ideas on how to proceed in order to improve present technology.

  7. Aggregation-cokriging for highly multivariate spatial data

    KAUST Repository

    Furrer, R.; Genton, M. G.

    2011-01-01

    Best linear unbiased prediction of spatially correlated multivariate random processes, often called cokriging in geostatistics, requires the solution of a large linear system based on the covariance and cross-covariance matrix of the observations. For many problems of practical interest, it is impossible to solve the linear system with direct methods. We propose an efficient linear unbiased predictor based on a linear aggregation of the covariables. The primary variable together with this single meta-covariable is used to perform cokriging. We discuss the optimality of the approach under different covariance structures, and use it to create reanalysis type high-resolution historical temperature fields. © 2011 Biometrika Trust.
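    The core idea of the approach — collapsing many covariables into a single meta-covariable and then running a standard two-variable cokriging — can be sketched compactly. The Python/NumPy snippet below is an illustrative simplification: it aggregates the secondary variables with equal weights (the paper derives optimal aggregation weights, which are not reproduced here) and assumes exponential covariance and cross-covariance models; all locations, values and model parameters are placeholders.

```python
import numpy as np

def exp_cov(h, sill, rng_):
    """Isotropic exponential covariance model (an assumption for this sketch)."""
    return sill * np.exp(-h / rng_)

def dists(a, b):
    return np.hypot(a[:, None, 0] - b[None, :, 0], a[:, None, 1] - b[None, :, 1])

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(30, 2))        # common sampling locations
primary = rng.normal(10, 2, size=30)          # primary variable
covariables = rng.normal(0, 1, size=(30, 5))  # five secondary variables

# 1) Linear aggregation of the covariables into a single meta-covariable.
#    Equal weights are used here for simplicity; the paper derives optimal weights.
meta = covariables.mean(axis=1)

# 2) Ordinary cokriging of the primary variable at one target location,
#    using the meta-covariable as the single secondary variable.
c11 = lambda h: exp_cov(h, sill=4.0, rng_=30.0)   # primary auto-covariance (assumed)
c22 = lambda h: exp_cov(h, sill=1.0, rng_=30.0)   # meta-covariable auto-covariance
c12 = lambda h: exp_cov(h, sill=1.2, rng_=30.0)   # cross-covariance

target = np.array([[50.0, 50.0]])
n = len(xy)
h = dists(xy, xy)
A = np.zeros((2 * n + 2, 2 * n + 2))
A[:n, :n], A[:n, n:2 * n] = c11(h), c12(h)
A[n:2 * n, :n], A[n:2 * n, n:2 * n] = c12(h), c22(h)
A[:n, 2 * n] = A[2 * n, :n] = 1.0                 # sum of primary weights = 1
A[n:2 * n, 2 * n + 1] = A[2 * n + 1, n:2 * n] = 1.0   # sum of secondary weights = 0
h0 = dists(xy, target).ravel()
b = np.concatenate([c11(h0), c12(h0), [1.0, 0.0]])

sol = np.linalg.solve(A, b)
lam, mu = sol[:n], sol[n:2 * n]
print("cokriging estimate at target:", round(lam @ primary + mu @ meta, 3))
```

    The point of the aggregation is the size of the linear system: with p covariables the full cokriging matrix grows with (p + 1)·n, while the meta-covariable keeps it at 2·n regardless of p.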

  8. Aggregation-cokriging for highly multivariate spatial data

    KAUST Repository

    Furrer, R.

    2011-08-26

    Best linear unbiased prediction of spatially correlated multivariate random processes, often called cokriging in geostatistics, requires the solution of a large linear system based on the covariance and cross-covariance matrix of the observations. For many problems of practical interest, it is impossible to solve the linear system with direct methods. We propose an efficient linear unbiased predictor based on a linear aggregation of the covariables. The primary variable together with this single meta-covariable is used to perform cokriging. We discuss the optimality of the approach under different covariance structures, and use it to create reanalysis type high-resolution historical temperature fields. © 2011 Biometrika Trust.

  9. Novel geometrical concept of a high-performance brain PET scanner. Principle, design and performance estimates

    International Nuclear Information System (INIS)

    Seguinot, J.; Braem, A.; Chesi, E.

    2006-01-01

    We present the principle, a possible implementation and performance estimates of a novel geometrical concept for a high-resolution positron emission tomograph. The concept, which can for example be implemented in a brain PET device, promises to lead to an essentially parallax-free 3D image reconstruction with excellent spatial resolution and contrast, uniform over the complete field of view. The key components are matrices of long, axially oriented scintillator crystals which are read out at both extremities by segmented Hybrid Photon Detectors. We discuss the relevant design considerations for a 3D axial PET camera module, motivate parameter and material choices, and estimate its performance in terms of spatial and energy resolution. We support these estimates by Monte Carlo simulations and in some cases by first experimental results. From the performance of a camera module, we extrapolate to the reconstruction resolution of a 3D axial PET scanner in a semi-analytical way and compare it to an existing state-of-the-art brain PET device. We finally describe a dedicated data acquisition system capable of fully exploiting the advantages of the proposed concept.

  10. High-resolution clean-sc

    NARCIS (Netherlands)

    Sijtsma, P.; Snellen, M.

    2016-01-01

    In this paper a high-resolution extension of CLEAN-SC is proposed: HR-CLEAN-SC. Where CLEAN-SC uses peak sources in “dirty maps” to define so-called source components, HR-CLEAN-SC takes advantage of the fact that source components can likewise be derived from points at some distance from the peak,

  11. Regional soil erosion assessment based on a sample survey and geostatistics

    Directory of Open Access Journals (Sweden)

    S. Yin

    2018-03-01

    (SRTM) digital elevation model (DEM) data worsened the estimation when used as the covariates for the interpolation of soil loss. Due to the unavailability of a 1 : 10 000 topography map for the entire area in this study, the model assisted by the land use, R, and K factors, with a resolution of 250 m, was used to generate the regional assessment of the soil erosion for Shaanxi Province. It demonstrated that 54.3 % of the total land in Shaanxi Province had annual soil loss equal to or greater than 5 t ha⁻¹ yr⁻¹. High (20–40 t ha⁻¹ yr⁻¹), severe (40–80 t ha⁻¹ yr⁻¹), and extreme (> 80 t ha⁻¹ yr⁻¹) erosion occupied 14.0 % of the total land. The dry land and irrigated land, forest, shrubland, and grassland in Shaanxi Province had mean soil loss rates of 21.77, 3.51, 10.00, and 7.27 t ha⁻¹ yr⁻¹, respectively. Annual soil loss was about 207.3 Mt in Shaanxi Province, with 68.9 % of soil loss originating from the farmlands and grasslands in Yan'an and Yulin districts in the northern Loess Plateau region and Ankang and Hanzhong districts in the southern Qingba mountainous region. This methodology provides a more accurate regional soil erosion assessment and can help policymakers to take effective measures to mitigate soil erosion risks.

  12. Planning for shallow high resolution seismic surveys

    CSIR Research Space (South Africa)

    Fourie, CJS

    2008-11-01

    of the input wave. This information can be used in conjunction with this spreadsheet to aid the geophysicist in designing shallow high resolution seismic surveys to achieve maximum resolution and penetration. This Excel spreadsheet is available free from...

  13. High-heat tank safety issue resolution program plan. Revision 2

    International Nuclear Information System (INIS)

    Wang, O.S.

    1994-12-01

    The purpose of this program plan is to provide a guide for selecting corrective actions that will mitigate and/or remediate the high-heat waste tank safety issue for single-shell tank 241-C-106. The heat source of approximately 110,000 Btu/hr is the radioactive decay of the stored waste material (primarily ⁹⁰Sr) inadvertently transferred into the tank in the late 1960s. Currently, forced ventilation, with added water to promote thermal conductivity and evaporative cooling, is used for heat removal. The method is very effective and economical. At this time, the only viable solution identified to permanently resolve this safety issue is the removal of heat-generating waste in the tank. This solution is being aggressively pursued as the only remediation method for this safety issue, and tank 241-C-106 has been selected as the first single-shell tank for retrieval. The current cooling method and other alternatives are addressed in this program as means to mitigate this safety issue before retrieval. This program plan has three parts. The first part establishes program objectives and defines the safety issue, drivers, and resolution criteria and strategy. The second part evaluates the high-heat safety issue and its mitigation and remediation methods and other alternatives according to resolution logic. The third part identifies major tasks and alternatives for mitigation and resolution of the safety issue. A table of best-estimate schedules for the key tasks is also included in this program plan.

  14. Gamma-ray spectrometer system with high efficiency and high resolution

    International Nuclear Information System (INIS)

    Moss, C.E.; Bernard, W.; Dowdy, E.J.; Garcia, C.; Lucas, M.C.; Pratt, J.C.

    1983-01-01

    Our gamma-ray spectrometer system, designed for field use, offers high efficiency and high resolution for safeguards applications. The system consists of three 40% high-purity germanium detectors and a LeCroy 3500 data acquisition system that calculates a composite spectrum for the three detectors. The LeCroy 3500 mainframe can be operated remotely from the detector array, with control exercised through modems and the telephone system. System performance with a mixed source of ¹²⁵Sb, ¹⁵⁴Eu, and ¹⁵⁵Eu confirms the expected efficiency of 120%, with the overall resolution showing little degradation over that of the worst detector.

  15. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of Earth-observation systems able to provide military and government users with metric images from space. This leadership has allowed Alcatel to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  16. High-resolution X-ray diffraction studies of multilayers

    DEFF Research Database (Denmark)

    Christensen, Finn Erland; Hornstrup, Allan; Schnopper, H. W.

    1988-01-01

    High-resolution X-ray diffraction studies of the perfection of state-of-the-art multilayers are presented. Data were obtained using a triple-axis perfect-crystal X-ray diffractometer. Measurements reveal large-scale figure errors in the substrate. A high-resolution triple-axis setup is required...

  17. MODELING AND SIMULATION OF HIGH RESOLUTION OPTICAL REMOTE SENSING SATELLITE GEOMETRIC CHAIN

    Directory of Open Access Journals (Sweden)

    Z. Xia

    2018-04-01

    High-resolution satellites with long focal lengths and large apertures have been widely used in recent years for georeferencing of the observed scene. A consistent end-to-end model of the high-resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three-line-array camera, the platform (including attitude and position information), the time system and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for the geolocation accuracy is put forward by introducing a new index, the angle between the camera and the star tracker. The model is validated rigorously by a geolocation accuracy simulation following the test method used for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.

  18. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    Science.gov (United States)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  19. Geostatistical Characteristic of Space -Time Variation in Underground Water Selected Quality Parameters in Klodzko Water Intake Area (SW Part of Poland)

    Science.gov (United States)

    Namysłowska-Wilczyńska, Barbara

    2016-04-01

    These data were subjected to spatial analyses using statistical and geostatistical methods. The basic statistics of the investigated quality parameters, including histograms of their distributions, scatter diagrams between these parameters and the correlation coefficients r, are presented in this article. The directional semivariogram function and the ordinary (block) kriging procedure were used to build the 3D geostatistical model. The geostatistical parameters of the theoretical models of the directional semivariograms of the studied water quality parameters, calculated along the time interval and along the well depth (taking into account the terrain elevation), were used in the ordinary (block) kriging estimation. The results of the estimation, i.e. block diagrams, made it possible to determine the levels of increased values Z* of the studied underground water quality parameters. The analysis of the variability in the selected quality parameters of underground water in the Kłodzko water intake area was enriched by referring to the results of geostatistical studies carried out for underground water quality parameters and for treated water in the Kłodzko water supply system (iron Fe, manganese Mn and ammonium ion NH4+ contents), discussed in earlier works. Spatial and temporal variation in the latter parameters was analysed on the basis of data from 2007–2011 and 2008–2011. Generally, the behaviour of the underground water quality parameters has been found to vary in space and time. The spatial analyses of the variation in the quality parameters in the Kłodzko underground water intake area have revealed some regularities (trends) in the variation in water quality.
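    Ordinary (block) kriging, as used above, differs from point kriging only in that the right-hand-side covariances are averaged over a discretization of the block. A minimal Python/NumPy sketch of that step follows; the nugget-plus-spherical covariance model, its parameters and the synthetic data are assumptions for illustration and do not correspond to the Kłodzko data.

```python
import numpy as np

def spherical_cov(h, sill=1.0, rng_=40.0, nugget=0.1):
    """Covariance form of a nugget + spherical model (assumed, not fitted)."""
    c = np.where(h < rng_, sill * (1 - 1.5 * h / rng_ + 0.5 * (h / rng_) ** 3), 0.0)
    return np.where(h == 0.0, nugget + sill, c)

def ordinary_block_kriging(xy, z, block_center, block_size, n_disc=4):
    """Estimate the block mean by ordinary kriging with block-averaged covariances."""
    n = len(z)
    # Discretize the square block into n_disc x n_disc points.
    offs = ((np.arange(n_disc) + 0.5) / n_disc - 0.5) * block_size
    gx, gy = np.meshgrid(offs, offs)
    disc = block_center + np.column_stack([gx.ravel(), gy.ravel()])
    # Ordinary kriging system in covariance form: [C 1; 1' 0] [w; mu] = [c_bar; 1].
    d_data = np.hypot(xy[:, None, 0] - xy[None, :, 0], xy[:, None, 1] - xy[None, :, 1])
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_cov(d_data)
    A[n, n] = 0.0
    d_blk = np.hypot(xy[:, None, 0] - disc[None, :, 0], xy[:, None, 1] - disc[None, :, 1])
    b = np.ones(n + 1)
    b[:n] = spherical_cov(d_blk).mean(axis=1)     # point-to-block average covariances
    weights = np.linalg.solve(A, b)[:n]
    return weights @ z

rng = np.random.default_rng(2)
xy = rng.uniform(0, 200, size=(40, 2))            # e.g. (time, depth) coordinates, illustrative
z = rng.lognormal(mean=0.0, sigma=0.5, size=40)   # synthetic water quality values
print("block estimate:", round(ordinary_block_kriging(xy, z, np.array([100.0, 100.0]), 20.0), 3))
```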

  20. Isotope specific resolution recovery image reconstruction in high resolution PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kotasidis, Fotis A. [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, CH-1211 Geneva, Switzerland and Wolfson Molecular Imaging Centre, MAHSC, University of Manchester, M20 3LJ, Manchester (United Kingdom); Angelis, Georgios I. [Faculty of Health Sciences, Brain and Mind Research Institute, University of Sydney, NSW 2006, Sydney (Australia); Anton-Rodriguez, Jose; Matthews, Julian C. [Wolfson Molecular Imaging Centre, MAHSC, University of Manchester, Manchester M20 3LJ (United Kingdom); Reader, Andrew J. [Montreal Neurological Institute, McGill University, Montreal QC H3A 2B4, Canada and Department of Biomedical Engineering, Division of Imaging Sciences and Biomedical Engineering, King' s College London, St. Thomas’ Hospital, London SE1 7EH (United Kingdom); Zaidi, Habib [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, CH-1211 Geneva (Switzerland); Geneva Neuroscience Centre, Geneva University, CH-1205 Geneva (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, PO Box 30 001, Groningen 9700 RB (Netherlands)

    2014-05-15

    Purpose: Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. Methods: In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. Results: The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Conclusions: Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution

  1. Isotope specific resolution recovery image reconstruction in high resolution PET imaging

    International Nuclear Information System (INIS)

    Kotasidis, Fotis A.; Angelis, Georgios I.; Anton-Rodriguez, Jose; Matthews, Julian C.; Reader, Andrew J.; Zaidi, Habib

    2014-01-01

    Purpose: Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. Methods: In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. Results: The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Conclusions: Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution

  2. Isotope specific resolution recovery image reconstruction in high resolution PET imaging.

    Science.gov (United States)

    Kotasidis, Fotis A; Angelis, Georgios I; Anton-Rodriguez, Jose; Matthews, Julian C; Reader, Andrew J; Zaidi, Habib

    2014-05-01

    Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution recovery image reconstruction. The

  3. STAMMEX high resolution gridded daily precipitation dataset over Germany: a new potential for regional precipitation climate research

    Science.gov (United States)

    Zolina, Olga; Simmer, Clemens; Kapala, Alice; Mächel, Hermann; Gulev, Sergey; Groisman, Pavel

    2014-05-01

    We present new high-resolution daily precipitation grids developed at the Meteorological Institute of the University of Bonn and the German Weather Service (DWD) under the STAMMEX project (Spatial and Temporal Scales and Mechanisms of Extreme Precipitation Events over Central Europe). Daily precipitation grids have been developed from the daily-observing precipitation network of DWD, which runs one of the world's densest rain gauge networks, comprising more than 7500 stations. Several quality-controlled daily gridded products with homogenized sampling were developed covering the periods 1931-onwards (with 0.5 degree resolution), 1951-onwards (0.25 degree and 0.5 degree), and 1971-2000 (0.1 degree). Different methods were tested to select the best gridding methodology that minimizes errors of integral grid estimates over hilly terrain. Besides daily precipitation values with uncertainty estimates (which include standard estimates of the kriging uncertainty as well as error estimates derived by a bootstrapping algorithm), the STAMMEX data sets include a variety of statistics that characterize temporal and spatial dynamics of the precipitation distribution (quantiles, extremes, wet/dry spells, etc.). Comparisons with existing continental-scale daily precipitation grids (e.g., CRU, ECA E-OBS, GCOS), which include considerably fewer observations than those used in STAMMEX, demonstrate the added value of high-resolution grids for extreme rainfall analyses. These data exhibit spatial variability patterns and trends in precipitation extremes which are missed or incorrectly reproduced over Central Europe by coarser-resolution grids based on sparser networks. The STAMMEX dataset can be used for high-quality climate diagnostics of precipitation variability, as a reference for reanalyses and remotely-sensed precipitation products (including the upcoming Global Precipitation Mission products), and as input for regional climate and operational weather forecast models. We will present

  4. A 2D eye gaze estimation system with low-resolution webcam images

    Directory of Open Access Journals (Sweden)

    Kim Jin

    2011-01-01

    In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose, one for eyeball detection with a stable approximate pupil-center and the other for detecting the direction of eye movements. The eyeball is detected using the deformable angular integral search by minimum intensity (DAISMI) algorithm. The deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for making stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Right and left eye estimates are evaluated separately. DAISMI finds the stable approximate pupil-center location by calculating the mass-center of the eyeball border vertices, which is then used for the initial deformable template alignment. DTBGE starts running with this initial alignment and updates the template alignment with the resulting eye movements and eyeball size frame by frame. The horizontal and vertical deviations of the eye movements, relative to the eyeball size, are treated as directly proportional to the deviation of cursor movements for a given screen size and resolution. The core advantage of the system is that it does not employ the real pupil-center as a reference point for gaze estimation, which makes it more robust against corneal reflection. Visual angle accuracy is used for the evaluation and benchmarking of the system. The effectiveness of the proposed system is presented and experimental results are shown.
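    The proportional mapping from eye-movement deviation (normalized by eyeball size) to cursor displacement that the abstract describes can be written down in a few lines. The following Python sketch is a generic illustration of that idea, not the DAISMI/DTBGE implementation; the gain factor and screen dimensions are placeholders.

```python
# Hypothetical illustration of the proportional gaze-to-cursor mapping described above.

def gaze_to_cursor(dx_px, dy_px, eyeball_diameter_px,
                   screen_w=1920, screen_h=1080, gain=3.0):
    """Map the pupil-center deviation from the template center (in image pixels),
    normalized by the detected eyeball size, to a cursor position on screen."""
    # Deviation as a fraction of the eyeball diameter, clipped to [-0.5, 0.5].
    nx = max(-0.5, min(0.5, dx_px / eyeball_diameter_px))
    ny = max(-0.5, min(0.5, dy_px / eyeball_diameter_px))
    # Proportional mapping around the screen center, with an adjustable gain.
    cx = screen_w / 2 + gain * nx * screen_w
    cy = screen_h / 2 + gain * ny * screen_h
    # Keep the cursor on screen.
    return (int(max(0, min(screen_w - 1, cx))),
            int(max(0, min(screen_h - 1, cy))))

print(gaze_to_cursor(dx_px=4.0, dy_px=-2.0, eyeball_diameter_px=60.0))
```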

  5. Scalable Algorithms for Large High-Resolution Terrain Data

    DEFF Research Database (Denmark)

    Mølhave, Thomas; Agarwal, Pankaj K.; Arge, Lars Allan

    2010-01-01

    In this paper we demonstrate that the technology required to perform typical GIS computations on very large high-resolution terrain models has matured enough to be ready for use by practitioners. We also demonstrate the impact that high-resolution data has on common problems. To our knowledge, so...

  6. High resolution NMR imaging using a high field yokeless permanent magnet.

    Science.gov (United States)

    Kose, Katsumi; Haishi, Tomoyuki

    2011-01-01

    We measured the homogeneity and stability of the magnetic field of a high field (about 1.04 tesla) yokeless permanent magnet with 40-mm gap for high resolution nuclear magnetic resonance (NMR) imaging. Homogeneity was evaluated using a 3-dimensional (3D) lattice phantom and 3D spin-echo imaging sequences. In the central sphere (20-mm diameter), peak-to-peak magnetic field inhomogeneity was about 60 ppm, and the root-mean-square was 8 ppm. We measured room temperature, magnet temperature, and NMR frequency of the magnet simultaneously every minute for about 68 hours with and without the thermal insulator of the magnet. A simple mathematical model described the magnet's thermal property. Based on magnet performance, we performed high resolution (up to (20 µm)²) imaging with internal NMR lock sequences of several biological samples. Our results demonstrated the usefulness of the high field small yokeless permanent magnet for high resolution NMR imaging.

  7. High resolution NMR imaging using a high field yokeless permanent magnet

    International Nuclear Information System (INIS)

    Kose, Katsumi; Haishi, Tomoyuki

    2011-01-01

    We measured the homogeneity and stability of the magnetic field of a high field (about 1.04 tesla) yokeless permanent magnet with 40-mm gap for high resolution nuclear magnetic resonance (NMR) imaging. Homogeneity was evaluated using a 3-dimensional (3D) lattice phantom and 3D spin-echo imaging sequences. In the central sphere (20-mm diameter), peak-to-peak magnetic field inhomogeneity was about 60 ppm, and the root-mean-square was 8 ppm. We measured room temperature, magnet temperature, and NMR frequency of the magnet simultaneously every minute for about 68 hours with and without the thermal insulator of the magnet. A simple mathematical model described the magnet's thermal property. Based on magnet performance, we performed high resolution (up to (20 μm)²) imaging with internal NMR lock sequences of several biological samples. Our results demonstrated the usefulness of the high field small yokeless permanent magnet for high resolution NMR imaging. (author)

  8. Fast iterative segmentation of high resolution medical images

    International Nuclear Information System (INIS)

    Hebert, T.J.

    1996-01-01

    Various applications in positron emission tomography (PET), single photon emission computed tomography (SPECT) and magnetic resonance imaging (MRI) require segmentation of 20 to 60 high resolution images of size 256×256 pixels in 3-9 seconds per image. This places particular constraints on the design of image segmentation algorithms. This paper examines the trade-offs in segmenting images based on fitting a density function to the pixel intensities using curve-fitting versus the maximum likelihood method. A quantized data representation is proposed and the EM algorithm for fitting a finite mixture density function to the quantized representation of an image is derived. A Monte Carlo evaluation of mean estimation error and classification error showed that the resulting quantized EM algorithm dramatically reduces the required computation time without loss of accuracy.
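    The quantized-data idea — fitting the finite mixture to a histogram of pixel intensities rather than to every pixel — can be sketched as follows. This is a generic histogram-weighted EM for a Gaussian mixture in Python/NumPy, offered only to illustrate where the speed-up comes from; it is not the authors' implementation, and the two-component initialization is an assumption.

```python
import numpy as np

def em_gaussian_mixture_quantized(counts, bin_centers, k=2, n_iter=50):
    """EM for a k-component Gaussian mixture fitted to quantized data
    (histogram counts) instead of to every individual pixel."""
    x = np.asarray(bin_centers, dtype=float)
    w = np.asarray(counts, dtype=float)
    n = w.sum()
    # Crude initialization: spread the means over the intensity range.
    mu = np.linspace(x.min(), x.max(), k)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities evaluated once per bin center.
        dens = (pi / np.sqrt(2 * np.pi * var)) * \
               np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)        # shape (bins, k)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weight every bin by its pixel count.
        nk = (w[:, None] * resp).sum(axis=0)
        mu = (w[:, None] * resp * x[:, None]).sum(axis=0) / nk
        var = (w[:, None] * resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
    return pi, mu, np.sqrt(var)

# Example: a 256-bin histogram from a synthetic 256x256 "image".
rng = np.random.default_rng(3)
pixels = np.concatenate([rng.normal(60, 10, 40000), rng.normal(160, 20, 25536)])
counts, edges = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))
centers = 0.5 * (edges[:-1] + edges[1:])
print(em_gaussian_mixture_quantized(counts, centers, k=2))
```

    The cost per iteration scales with the number of bins (here 256) rather than the number of pixels (65,536), which is where the reported speed-up comes from.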

  9. Use of geostatistics on broiler production for evaluation of different minimum ventilation systems during brooding phase

    Directory of Open Access Journals (Sweden)

    Thayla Morandi Ridolfi de Carvalho

    2012-01-01

    The objective of this research was to evaluate different minimum ventilation systems with respect to air quality and thermal comfort during the brooding phase, using geostatistics. The minimum ventilation systems were: Blue House I: exhaust fans + curtain management (end of the building); Blue House II: exhaust fans + side curtain management; and Dark House: exhaust fans + flag. The climate variables evaluated were dry bulb temperature, relative humidity, air velocity, and carbon dioxide and ammonia concentrations, measured during winter, at 9 a.m., at 80 equidistant points in the brooding area. Data were evaluated by geostatistical techniques. The results indicate that wider broiler houses (above 15.0 m width) present the greatest ammonia and humidity concentrations. Blue House II presented the best results in relation to air quality. However, none of the studied broiler houses presented ideal thermal comfort.

  10. Progress in high-resolution x-ray holographic microscopy

    International Nuclear Information System (INIS)

    Jacobsen, C.; Kirz, J.; Howells, M.; McQuaid, K.; Rothman, S.; Feder, R.; Sayre, D.

    1987-07-01

    Among the various types of x-ray microscopes that have been demonstrated, the holographic microscope has had the largest gap between promise and performance. The difficulties of fabricating x-ray optical elements have led some to view holography as the most attractive method for obtaining the ultimate in high resolution x-ray micrographs; however, we know of no investigations prior to 1987 that clearly demonstrated submicron resolution in reconstructed images. Previous efforts suffered from problems such as limited resolution and dynamic range in the recording media, low coherent x-ray flux, and aberrations and diffraction limits in visible light reconstruction. We have addressed the recording limitations through the use of an undulator x-ray source and high-resolution photoresist recording media. For improved results in the readout and reconstruction steps, we have employed metal shadowing and transmission electron microscopy, along with numerical reconstruction techniques. We believe that this approach will allow holography to emerge as a practical method of high-resolution x-ray microscopy. 30 refs., 4 figs

  11. Progress in high-resolution x-ray holographic microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsen, C.; Kirz, J.; Howells, M.; McQuaid, K.; Rothman, S.; Feder, R.; Sayre, D.

    1987-07-01

    Among the various types of x-ray microscopes that have been demonstrated, the holographic microscope has had the largest gap between promise and performance. The difficulties of fabricating x-ray optical elements have led some to view holography as the most attractive method for obtaining the ultimate in high resolution x-ray micrographs; however, we know of no investigations prior to 1987 that clearly demonstrated submicron resolution in reconstructed images. Previous efforts suffered from problems such as limited resolution and dynamic range in the recording media, low coherent x-ray flux, and aberrations and diffraction limits in visible light reconstruction. We have addressed the recording limitations through the use of an undulator x-ray source and high-resolution photoresist recording media. For improved results in the readout and reconstruction steps, we have employed metal shadowing and transmission electron microscopy, along with numerical reconstruction techniques. We believe that this approach will allow holography to emerge as a practical method of high-resolution x-ray microscopy. 30 refs., 4 figs.

  12. High-resolution spectroscopy of gases for industrial applications

    DEFF Research Database (Denmark)

    Fateev, Alexander; Clausen, Sønnik

    High-resolution spectroscopy of gases is a powerful technique with various fundamental and practical applications: in situ simultaneous measurements of gas temperature and gas composition, radiative transfer modelling, validation of existing databases and development of new ones, etc. Existing databases (e.g. HITRAN, HITEMP or CDSD) can normally be used for absorption spectra calculations over limited temperature/pressure ranges. Therefore, experimental measurements of absorption/transmission spectra of gases (e.g. CO2, H2O or SO2) at high resolution and elevated temperatures are essential both for the analysis of complex experimental data and for further development of the databases. High-temperature gas cell facilities available at DTU Chemical Engineering are presented and described. The gas cells and high-resolution spectrometers allow us to perform high-quality reference measurements of gases relevant...

  13. Short Communication. Using high resolution UAV imagery to estimate tree variables in Pinus pinea plantation in Portugal

    Directory of Open Access Journals (Sweden)

    Juan Guerra Hernandez

    2016-07-01

    Research highlights: The results demonstrate that tree variables can be automatically extracted from high-resolution imagery. We highlight the use of UAV systems as a fast, reliable and cost-effective technique for small-scale applications. Keywords: Unmanned aerial systems (UAS); forest inventory; tree crown variables; 3D image modelling; canopy height model (CHM); object-based image analysis (OBIA); structure-from-motion (SfM).

  14. Characterizing Arctic Sea Ice Topography Using High-Resolution IceBridge Data

    Science.gov (United States)

    Petty, Alek; Tsamados, Michel; Kurtz, Nathan; Farrell, Sinead; Newman, Thomas; Harbeck, Jeremy; Feltham, Daniel; Richter-Menge, Jackie

    2016-01-01

    We present an analysis of Arctic sea ice topography using high resolution, three-dimensional, surface elevation data from the Airborne Topographic Mapper, flown as part of NASA's Operation IceBridge mission. Surface features in the sea ice cover are detected using a newly developed surface feature picking algorithm. We derive information regarding the height, volume and geometry of surface features from 2009-2014 within the Beaufort/Chukchi and Central Arctic regions. The results are delineated by ice type to estimate the topographic variability across first-year and multi-year ice regimes.

  15. High-resolution precipitation database for the last two centuries in Italy: climatologies and anomalies

    Science.gov (United States)

    Crespi, Alice; Brunetti, Michele; Maugeri, Maurizio

    2017-04-01

    The availability of gridded high-resolution spatial climatologies and corresponding secular records has acquired increasing importance in recent years, both for research purposes and as decision-support tools in the management of natural resources and economic activities. High-resolution monthly precipitation climatologies for Italy were computed by gridding, onto a 30-arc-second-resolution Digital Elevation Model (DEM), the precipitation normals (1961-1990) obtained from a quality-controlled dataset of about 6200 stations covering the Italian territory and part of the neighbouring regions to the north. Starting from the assumption that the precipitation distribution is strongly influenced by orography, especially elevation, a local weighted linear regression (LWLR) of precipitation versus elevation was performed at each DEM cell. The regression coefficients for each cell were estimated from the stations with the highest weights, where the weights take into account the distances and the degree of similarity, in terms of orographic features, between the station cells and the considered grid cell. An optimisation procedure was then set up in order to define, for each month and for each grid cell, the most suitable decay coefficients for the weighting factors which enter the LWLR scheme. The model was validated by comparison with the results provided by inverse distance weighting (IDW) applied both to station normals and to the residuals of a global regression of station normals versus elevation. In both cases, the LWLR leave-one-out reconstructions show the best agreement with the observed station normals, especially when considering specific station clusters (high-elevation sites, for example). After producing the high-resolution precipitation climatological field, the temporal component on the high-resolution grid was obtained by following the anomaly method. It is based on the assumption that the spatio-temporal structure of the signal of a
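    The local weighted linear regression (LWLR) of precipitation normals versus elevation can be illustrated with a compact sketch. The Gaussian distance-decay weight used below, its decay length and the synthetic station data are assumptions for demonstration; the actual scheme also weights stations by orographic similarity and optimizes the decay coefficients per month and per grid cell, which is omitted here.

```python
import numpy as np

def lwlr_precip_at_cell(cell_xy, cell_elev, st_xy, st_elev, st_precip,
                        decay_km=25.0, n_stations=15):
    """Local weighted linear regression of precipitation vs. elevation at one DEM cell."""
    d = np.hypot(st_xy[:, 0] - cell_xy[0], st_xy[:, 1] - cell_xy[1])
    w = np.exp(-(d / decay_km) ** 2)                  # distance-decay weights (assumption)
    idx = np.argsort(w)[::-1][:n_stations]            # keep the most influential stations
    W = np.diag(w[idx])
    X = np.column_stack([np.ones(len(idx)), st_elev[idx]])
    y = st_precip[idx]
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted least squares
    return beta[0] + beta[1] * cell_elev              # precipitation normal at the cell

# Synthetic monthly normals with an elevation lapse plus noise (illustrative only).
rng = np.random.default_rng(4)
st_xy = rng.uniform(0, 200, size=(300, 2))            # km
st_elev = rng.uniform(0, 2500, size=300)              # m
st_precip = 60 + 0.03 * st_elev + rng.normal(0, 8, size=300)   # mm/month
print(round(lwlr_precip_at_cell((100.0, 100.0), 1200.0, st_xy, st_elev, st_precip), 1))
```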

  16. Geostatistics – a tool applied to the distribution of Legionella pneumophila in a hospital water system

    Directory of Open Access Journals (Sweden)

    Pasqualina Laganà

    2015-12-01

    Introduction. Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination by Legionella pneumophila (LP) in a large hospital in Italy through georeferenced statistical analysis, to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. Materials and Method. The distribution of LP serogroups 1 and 2-14 was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen and spatial geostatistics (FAIk kriging) was applied and compared with the results of classical statistical analysis. Results. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps clearly show the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. Conclusion. This work has demonstrated that geostatistical methods applied to the microbiological analysis of water matrices allow better modelling of the phenomenon under study, greater potential for risk management and a wider choice of prevention and environmental remediation measures than classical statistical analysis.

  17. Magnetic Particle Imaging for High Temporal Resolution Assessment of Aneurysm Hemodynamics.

    Directory of Open Access Journals (Sweden)

    Jan Sedlacik

    The purpose of this work was to demonstrate the capability of magnetic particle imaging (MPI) to assess the hemodynamics in a realistic 3D aneurysm model obtained by additive manufacturing. MPI was compared with magnetic resonance imaging (MRI) and dynamic digital subtraction angiography (DSA). The aneurysm model was of saccular morphology (7 mm dome height, 5 mm cross-section, 3-4 mm neck, 3.5 mm parent artery diameter) and connected to a peristaltic pump delivering a physiological flow (250 mL/min) and pulsation rate (70/min). High-resolution (4 h long) 4D phase-contrast flow quantification (4D pc-fq) MRI was used to directly assess the hemodynamics of the model. Dynamic MPI, MRI, and DSA were performed with contrast agent injections (3 mL volume in 3 s) through a proximally placed catheter. 4D pc-fq measurements showed distinct pulsatile flow velocities (20-80 cm/s) as well as lower flow velocities and a vortex inside the aneurysm. All three dynamic methods (MPI, MRI, and DSA) also showed a clear pulsation pattern as well as delayed contrast agent dynamics within the aneurysm, most likely caused by the vortex within the aneurysm. Due to the high temporal resolution of MPI and DSA, it was possible to track the contrast agent bolus through the model and to estimate the average flow velocity (about 60 cm/s), in accordance with the 4D pc-fq measurements. The ionizing-radiation-free, 4D high-resolution MPI method is a very promising tool for imaging and characterization of hemodynamics in humans. It offers the possibility of overcoming certain disadvantages of other modalities, such as the considerably lower temporal resolution of dynamic MRI and the limited 2D nature of DSA. Furthermore, additive manufacturing is key for translating powerful pre-clinical techniques into the clinic.
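    The bolus-tracking estimate of average flow velocity mentioned above reduces to dividing the distance between two positions along the model by the time lag between their contrast-agent time curves. The sketch below is a generic Python illustration with synthetic time curves; the sampling interval, probe separation and bolus shape are placeholders, not the acquisition parameters of the study.

```python
import numpy as np

def bolus_velocity(signal_a, signal_b, dt_s, separation_cm):
    """Average flow velocity from the time shift between two bolus passage curves,
    estimated via the lag that maximizes their cross-correlation."""
    a = signal_a - signal_a.mean()
    b = signal_b - signal_b.mean()
    xcorr = np.correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)            # samples by which b lags behind a
    delay_s = lag * dt_s
    return separation_cm / delay_s

# Synthetic example: 3 cm separation, 46 ms frame time, ~50 ms transit time (placeholders).
dt = 0.046                                           # s per frame
t = np.arange(0, 4, dt)
bolus = lambda t0: np.exp(-0.5 * ((t - t0) / 0.25) ** 2)
v = bolus_velocity(bolus(1.00), bolus(1.05), dt, separation_cm=3.0)
print(f"estimated velocity: {v:.0f} cm/s")
```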

  18. Verification of High Resolution Soil Moisture and Latent Heat in Germany

    Science.gov (United States)

    Samaniego, L. E.; Warrach-Sagi, K.; Zink, M.; Wulfmeyer, V.

    2012-12-01

    Improving our understanding of soil-land-surface-atmosphere feedbacks is fundamental to making reliable predictions of water and energy fluxes on land systems influenced by anthropogenic activities. Estimating, for instance, the likely consequences of changing climatic regimes for water availability and crop yield requires high-resolution soil moisture. Modeling it at large scales, however, is difficult and uncertain because of the interplay between state variables and fluxes and the significant parameter uncertainty of the predicting models. At larger scales, the sub-grid variability of the variables involved and the nonlinearity of the processes complicate the modeling exercise even further, because parametrization schemes might be scale dependent. Two contrasting modeling paradigms (WRF/Noah-MP and mHM) were employed to quantify the effects of model and data complexity on soil moisture and latent heat over Germany. WRF/Noah-MP was forced with ERA-Interim on the boundaries of the rotated CORDEX grid (www.meteo.unican.es/wiki/cordexwrf) with a spatial resolution of 0.11° covering Europe during the period from 1989 to 2009. Land cover and soil texture were represented in WRF/Noah-MP with 1×1 km MODIS images and a single-horizon, coarse-resolution European-wide soil map with 16 soil texture classes, respectively. To ease comparison, the process-based hydrological model mHM was forced with daily precipitation and temperature fields generated by WRF during the same period. The spatial resolution of mHM was fixed at 4×4 km. The multiscale parameter regionalization technique (MPR, Samaniego et al. 2010) was embedded in mHM to estimate effective model parameters using hyper-resolution input data (100×100 m) obtained from Corine land cover and detailed soil texture fields for various horizons comprising 72 soil texture classes for Germany, among other physiographical variables. mHM global parameters, in contrast with those of Noah-MP, were

  19. The Role of Resolution in the Estimation of Fractal Dimension Maps From SAR Data

    Directory of Open Access Journals (Sweden)

    Gerardo Di Martino

    2017-12-01

    This work is aimed at investigating the role of resolution in fractal dimension map estimation, analyzing the role of the different surface spatial scales involved in the considered estimation process. The study is performed using a data set of actual COSMO-SkyMed Synthetic Aperture Radar (SAR) images relevant to two different areas, the region of Bidi in Burkina Faso and the city of Naples in Italy, acquired in stripmap and enhanced spotlight modes. The behavior of fractal dimension maps in the presence of areas with distinctive characteristics from the viewpoint of land cover and surface features is discussed. Significant differences among the estimated maps are obtained in the presence of fine textural details, which significantly affect the fractal dimension estimation for the higher resolution spotlight images. The obtained results show that if we are interested in obtaining a reliable estimate of the fractal dimension of the observed natural scene, stripmap images should be chosen in view of both economic and computational considerations. In turn, the combination of fractal dimension maps obtained from stripmap and spotlight images can be used to identify areas on the scene presenting non-fractal behavior (e.g., urban areas). Along this guideline, a simple example of stripmap-spotlight data fusion is also presented.
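    One common way to build a fractal dimension map of the kind discussed above is to estimate, within a sliding window, the Hurst exponent H from the log-log slope of the structure function (mean squared increments versus lag) and then set D = 3 − H. The sketch below is a generic single-window estimator in Python/NumPy applied to a synthetic surface; it is not the processing chain of the paper, and the window size and lag range are placeholders.

```python
import numpy as np

def fractal_dimension(window, max_lag=8):
    """Estimate the fractal dimension of a surface patch from the slope of
    log E[|z(x+h)-z(x)|^2] vs. log h  (slope = 2H, D = 3 - H)."""
    lags = np.arange(1, max_lag + 1)
    sf = []
    for h in lags:
        dx = window[:, h:] - window[:, :-h]          # increments along rows
        dy = window[h:, :] - window[:-h, :]          # increments along columns
        sf.append(np.mean(np.concatenate([dx.ravel() ** 2, dy.ravel() ** 2])))
    slope, _ = np.polyfit(np.log(lags), np.log(sf), 1)
    hurst = np.clip(slope / 2.0, 0.0, 1.0)
    return 3.0 - hurst

# Synthetic rough surface: smoothed noise as an illustrative stand-in for retrieved heights.
rng = np.random.default_rng(5)
z = rng.normal(size=(64, 64))
kernel = np.ones((3, 3)) / 9.0
z_smooth = np.real(np.fft.ifft2(np.fft.fft2(z) * np.fft.fft2(kernel, s=z.shape)))
print("estimated fractal dimension:", round(fractal_dimension(z_smooth), 2))
```

    A map is obtained by sliding such a window across the retrieved surface; the window size fixes the range of spatial scales that actually enter the estimate, which is the resolution effect the paper investigates.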

  20. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery

    Science.gov (United States)

    Malambo, L.; Popescu, S. C.; Murray, S. C.; Putman, E.; Pugh, N. A.; Horne, D. W.; Richardson, G.; Sheridan, R.; Rooney, W. L.; Avant, R.; Vidrine, M.; McCutchen, B.; Baltensperger, D.; Bishop, M.

    2018-02-01

    Plant breeders and agronomists are increasingly interested in repeated plant height measurements over large experimental fields to study critical aspects of plant physiology, genetics and environmental conditions during plant growth. However, collecting such measurements using commonly used manual field methods is inefficient. 3D point clouds generated from unmanned aerial system (UAS) images using Structure from Motion (SfM) techniques offer a new option for efficiently deriving in-field crop height data. This study evaluated UAS/SfM for multitemporal 3D crop modelling and developed and assessed a methodology for estimating plant height data from point clouds generated using SfM. High-resolution images in the visible spectrum were collected weekly across 12 dates from April (planting) to July (harvest) 2016 over 288 maize (Zea mays L.) and 460 sorghum (Sorghum bicolor L.) plots using a DJI Phantom 3 Professional UAS. The study compared SfM point clouds with terrestrial lidar (TLS) at two dates to evaluate the ability of SfM point clouds to accurately capture ground surfaces and crop canopies, both of which are critical for plant height estimation. Extended plant height comparisons were carried out between SfM plant height (the 90th, 95th and 99th percentiles and maximum height) per plot and field plant height measurements at six dates throughout the growing season to test the repeatability and consistency of SfM estimates. High correlations were observed between SfM and TLS data (R2 = 0.88-0.97, RMSE = 0.01-0.02 m and R2 = 0.60-0.77, RMSE = 0.12-0.16 m for the ground surface and canopy comparisons, respectively). Extended height comparisons also showed strong correlations (R2 = 0.42-0.91, RMSE = 0.11-0.19 m for maize and R2 = 0.61-0.85, RMSE = 0.12-0.24 m for sorghum). In general, the 90th, 95th and 99th percentile height metrics had higher correlations to field measurements than the maximum metric, though differences among them were not statistically significant. The
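    Deriving per-plot height metrics from an SfM point cloud, as evaluated above, amounts to normalizing point elevations by a ground surface and taking upper percentiles per plot. A minimal Python/NumPy sketch follows; approximating the ground by a low percentile of the plot's own points and using a rectangular plot footprint are simplifying assumptions (a bare-earth DEM and true plot polygons would normally be used).

```python
import numpy as np

def plot_height_metrics(points, plot_bounds, ground_percentile=1.0):
    """Per-plot canopy height metrics from an (N, 3) array of SfM points (x, y, z)."""
    xmin, ymin, xmax, ymax = plot_bounds
    in_plot = ((points[:, 0] >= xmin) & (points[:, 0] <= xmax) &
               (points[:, 1] >= ymin) & (points[:, 1] <= ymax))
    z = points[in_plot, 2]
    if z.size == 0:
        return None
    ground = np.percentile(z, ground_percentile)          # approximate ground elevation
    heights = np.clip(z - ground, 0.0, None)              # normalized canopy heights
    metrics = {f"p{q}": float(np.percentile(heights, q)) for q in (90, 95, 99)}
    metrics["max"] = float(heights.max())
    return metrics

# Synthetic plot: ground points near 0 m plus a canopy around 1.2 m (illustrative only).
rng = np.random.default_rng(6)
ground_pts = np.column_stack([rng.uniform(0, 5, (2000, 2)), rng.normal(0.0, 0.02, 2000)])
canopy_pts = np.column_stack([rng.uniform(0, 5, (3000, 2)), rng.normal(1.2, 0.15, 3000)])
cloud = np.vstack([ground_pts, canopy_pts])
print(plot_height_metrics(cloud, (0.0, 0.0, 5.0, 5.0)))
```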

  1. Towards high-resolution positron emission tomography for small volumes

    International Nuclear Information System (INIS)

    McKee, B.T.A.

    1982-01-01

    Some arguments are made regarding the medical usefulness of high spatial resolution in positron imaging, even if limited to small imaged volumes. Then the intrinsic limitations to spatial resolution in positron imaging are discussed. The project to build a small-volume, high resolution animal research prototype (SHARP) positron imaging system is described. The components of the system, particularly the detectors, are presented and brief mention is made of data acquisition and image reconstruction methods. Finally, some preliminary imaging results are presented: a pair of isolated point sources and ¹⁸F in the bones of a rabbit. Although the detector system is not fully completed, these first results indicate that the goals of high sensitivity and high resolution (4 mm) have been realized. (Auth.)

  2. Spatio-temporal patterns of Cu contamination in mosses using geostatistical estimation

    International Nuclear Information System (INIS)

    Martins, Anabela; Figueira, Rui; Sousa, António Jorge; Sérgio, Cecília

    2012-01-01

    Several recent studies have reported temporal trends in metal contamination in mosses, but such assessments did not evaluate uncertainty in temporal changes, therefore providing weak statistical support for time comparisons. Furthermore, levels of contaminants in the environment change in both space and time, requiring space-time modelling methods for map estimation. We propose an indicator of spatial and temporal variation based on space-time estimation by indicator kriging, where uncertainty at each location is estimated from the local distribution function, thereby calculating variability intervals for comparison between several biomonitoring dates. This approach was exemplified using copper concentrations in mosses from four Portuguese surveys (1992, 1997, 2002 and 2006). Using this approach, we identified a general decrease in copper contamination, but spatial patterns were not uniform, and from the uncertainty intervals, changes could not be considered significant in the majority of the study area. - Highlights: ► We estimated copper contamination in mosses by spatio-temporal kriging between 1992 and 2006. ► We determined local distribution functions to define variation intervals at each location. ► Significance of temporal changes is assessed using an indicator based on uncertainty interval. ► There is general decrease in copper contamination, but spatial patterns are not uniform. - The contamination of copper in mosses was estimated by spatio-temporal kriging, with determination of uncertainty classes in the temporal variation.
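    The uncertainty-interval comparison described above can be reduced to three steps: build a local cumulative distribution from indicator-kriging probabilities at a set of thresholds, read off quantiles, and declare a temporal change significant only when the intervals from two surveys do not overlap. The Python sketch below is schematic, with made-up local probabilities and thresholds; it is not the authors' code, and the linear interpolation between thresholds is an assumption.

```python
import numpy as np

def local_quantiles(thresholds, cdf_probs, qs=(0.05, 0.5, 0.95)):
    """Quantiles of a local distribution estimated by indicator kriging:
    cdf_probs[i] = estimated Prob(Z <= thresholds[i]) at this location."""
    p = np.clip(np.asarray(cdf_probs, dtype=float), 0.0, 1.0)
    p = np.maximum.accumulate(p)                    # enforce a valid, monotone CDF
    return np.interp(qs, p, thresholds)             # linear interpolation between thresholds

def significant_change(q_old, q_new):
    """Flag a change only when the two uncertainty intervals do not overlap."""
    lo_old, _, hi_old = q_old
    lo_new, _, hi_new = q_new
    return hi_new < lo_old or lo_new > hi_old

thresholds = np.array([2.0, 5.0, 10.0, 20.0, 40.0])        # Cu in mosses, ug/g (placeholder)
cdf_1992 = [0.05, 0.25, 0.60, 0.90, 0.99]                   # made-up indicator estimates
cdf_2006 = [0.20, 0.55, 0.85, 0.98, 1.00]

q92 = local_quantiles(thresholds, cdf_1992)
q06 = local_quantiles(thresholds, cdf_2006)
print("1992 [q05, q50, q95]:", np.round(q92, 1))
print("2006 [q05, q50, q95]:", np.round(q06, 1))
print("significant decrease?", significant_change(q92, q06) and q06[1] < q92[1])
```

    With overlapping intervals, as in this toy example, the decrease in the median is not declared significant, which mirrors the paper's conclusion that temporal changes could not be considered significant over most of the study area.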

  3. High-resolution X-ray crystal structure of bovine H-protein using the high-pressure cryocooling method

    International Nuclear Information System (INIS)

    Higashiura, Akifumi; Ohta, Kazunori; Masaki, Mika; Sato, Masaru; Inaka, Koji; Tanaka, Hiroaki; Nakagawa, Atsushi

    2013-01-01

    Using the high-pressure cryocooling method, the high-resolution X-ray crystal structure of bovine H-protein was determined at 0.86 Å resolution. This is the first ultra-high-resolution structure obtained from a high-pressure cryocooled crystal. Recently, many technical improvements in macromolecular X-ray crystallography have increased the number of structures deposited in the Protein Data Bank and improved the resolution limit of protein structures. Almost all high-resolution structures have been determined using a synchrotron radiation source in conjunction with cryocooling techniques, which are required in order to minimize radiation damage. However, optimization of cryoprotectant conditions is a time-consuming and difficult step. To overcome this problem, the high-pressure cryocooling method was developed (Kim et al., 2005) and successfully applied to many protein-structure analyses. In this report, using the high-pressure cryocooling method, the X-ray crystal structure of bovine H-protein was determined at 0.86 Å resolution. Structural comparisons between high- and ambient-pressure cryocooled crystals at ultra-high resolution illustrate the versatility of this technique. This is the first ultra-high-resolution X-ray structure obtained using the high-pressure cryocooling method.

  4. High resolution drift chambers

    International Nuclear Information System (INIS)

    Va'vra, J.

    1985-07-01

    High-precision drift chambers capable of achieving resolutions of 50 μm or better are discussed. In particular, we compare so-called cool and hot gases, various charge collection geometries and several timing techniques, and we also discuss some systematic problems. We also present what we would consider an "ultimate" design of the vertex chamber. 50 refs., 36 figs., 6 tabs

  5. High resolution neutron spectroscopy for helium isotopes

    International Nuclear Information System (INIS)

    Abdel-Wahab, M.S.; Klages, H.O.; Schmalz, G.; Haesner, B.H.; Kecskemeti, J.; Schwarz, P.; Wilczynski, J.

    1992-01-01

    A high-resolution fast-neutron time-of-flight spectrometer is described. Neutron time-of-flight spectra are taken using a specially designed TDC connected to an on-line computer. The high time-of-flight resolution of 5 ps/m enabled the study of the total cross section of ⁴He for neutrons near the 3/2⁺ resonance in the ⁵He nucleus. The resonance parameters were determined by a single-level Breit-Wigner fit to the data. (orig.)

  6. Ore reserve evaluation, through geostatistical methods, in sector C-09, Pocos de Caldas, MG-Brazil

    International Nuclear Information System (INIS)

    Guerra, P.A.G.; Censi, A.C.; Marques, J.P.M.; Huijbregts, Ch.

    1978-01-01

    In sector C-09, Pocos de Caldas, in the state of Minas Gerais, geostatistical techniques have been used to evaluate the tonnage of U3O8 and associated minerals and to delimit ore from sterile areas. The reserve calculation was based on borehole information, including the results of chemical and/or radiometric analyses. Two- and three-dimensional evaluations were made following the existing geological models. Initially, the evaluation was based on chemical analyses using the classical geostatistical technique of kriging. This was followed by a second evaluation using the more recent technique of co-kriging, which permitted the incorporation of radiometric information in the calculations. The correlation between ore grade and radiometry was studied using the cross-covariance method. Subject to restrictions imposed by mining considerations, a probabilistic selection was made of blocks of appropriate dimensions so as to evaluate the grade-tonnage curve for each panel. (Author) [pt
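
    As a rough illustration of the kriging step only (the co-kriging of chemical and radiometric data is not reproduced here), the sketch below solves an ordinary-kriging system with a spherical variogram on hypothetical borehole grades; all coordinates, grades and variogram parameters are invented.

```python
# Hedged ordinary-kriging sketch with a spherical variogram (illustrative only).
import numpy as np

def spherical_variogram(h, nugget=0.1, sill=1.0, a=300.0):
    """Spherical variogram model; nugget, sill and range are hypothetical."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a, nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3), sill)
    return np.where(h > 0, g, 0.0)  # gamma(0) = 0 by definition

def ordinary_kriging(xy_obs, z_obs, xy_tgt):
    """Estimate z and the kriging variance at target points."""
    n = len(z_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    # Kriging system: [Gamma 1; 1^T 0] [w; mu] = [gamma_0; 1]
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = spherical_variogram(d_obs)
    A[:n, n] = A[n, :n] = 1.0
    est = np.empty(len(xy_tgt))
    var = np.empty(len(xy_tgt))
    for i, p in enumerate(xy_tgt):
        b = np.append(spherical_variogram(np.linalg.norm(xy_obs - p, axis=1)), 1.0)
        sol = np.linalg.solve(A, b)
        w, mu = sol[:n], sol[n]
        est[i] = w @ z_obs
        var[i] = w @ b[:n] + mu  # ordinary-kriging (estimation) variance
    return est, var

# Hypothetical borehole grades (x, y in metres; z in % U3O8)
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(50, 2))
z = np.clip(0.05 + 0.02 * rng.standard_normal(50), 0.001, None)
gx, gy = np.meshgrid(np.linspace(0, 1000, 11), np.linspace(0, 1000, 11))
grid = np.column_stack([gx.ravel(), gy.ravel()])
block_est, block_var = ordinary_kriging(xy, z, grid)
```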

  7. A high-resolution regional reanalysis for Europe

    Science.gov (United States)

    Ohlwein, C.

    2015-12-01

    Reanalyses are gaining ever more importance as a source of meteorological information for many purposes and applications. Several global reanalysis projects (e.g., ERA, MERRA, CFSR, JRA) produce and verify these data sets to provide time series as long as possible combined with high data quality. Because their spatial resolution is limited to 50-70 km and their output is typically 3-hourly, they are not suitable for small-scale problems (e.g., regional climate assessment, meso-scale NWP verification, input for subsequent models such as river runoff simulations). Regional reanalyses, based on a limited-area model along with a data assimilation scheme, can generate reanalysis data sets with high spatio-temporal resolution. Within the Hans-Ertel-Centre for Weather Research (HErZ), the climate monitoring branch concentrates its efforts on the assessment and analysis of regional climate in Germany and Europe. In joint cooperation with DWD (German Meteorological Service), a high-resolution reanalysis system based on the COSMO model has been developed. The regional reanalysis for Europe matches the domain of the CORDEX EURO-11 specifications, albeit at a higher spatial resolution, i.e., 0.055° (6 km) instead of 0.11° (12 km), and comprises the assimilation of observational data using the existing nudging scheme of COSMO, complemented by a special soil moisture analysis, with boundary conditions provided by ERA-Interim data. The reanalysis data set covers the past 20 years. Extensive evaluation of the reanalysis is performed using independent observations, with special emphasis on precipitation and high-impact weather situations, indicating a better representation of small-scale variability. Further, the evaluation shows an added value of the regional reanalysis with respect to the forcing ERA-Interim reanalysis and compared to a pure high-resolution dynamical downscaling approach without data assimilation.

  8. Geostatistical simulations for radon indoor with a nested model including the housing factor

    International Nuclear Information System (INIS)

    Cafaro, C.; Giovani, C.; Garavaglia, M.

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours and the authorities therefore need support in developing an appropriate health prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the geostatistical interpolator of choice for refining predictions with external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained with common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to a satisfactory extent. Furthermore, combined with Monte Carlo simulations, the approach provides a more conservative definition of radon-prone areas than the one previously obtained by lognormal kriging. - Highlights: • The housing class is inserted into co-kriging via an indicator function. • Inserting the housing classes into co-kriging improves predictions. • The housing class has a structured component in space. • A nested model is implemented in the multi-Gaussian algorithm. • A collection of risk maps is merged into one to create RPA.
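
    The sketch below illustrates one way a housing factor can enter a multivariate prediction: the class is one-hot (indicator) encoded and used to model a trend in log-radon, with the residuals kriged spatially. This regression-kriging-style workflow is a simplified stand-in for the paper's co-kriging / multi-Gaussian approach, and all data and parameters are invented.

```python
# Hedged sketch: indicator coding of the housing class + trend-plus-residual kriging.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(1)
n = 300
xy = rng.uniform(0, 50_000, size=(n, 2))                    # dwelling coordinates (m)
housing = rng.integers(0, 3, size=n)                        # three hypothetical housing classes
radon = rng.lognormal(mean=4.0 + 0.3 * housing, sigma=0.6)  # indoor radon (Bq/m3, synthetic)

ind = np.eye(3)[housing]                                    # indicator (one-hot) coding

# 1) Trend: regress log-radon on the housing indicators
trend = LinearRegression().fit(ind, np.log(radon))
resid = np.log(radon) - trend.predict(ind)

# 2) Spatial part: krige the residuals (GP regression ~ simple kriging)
gp = GaussianProcessRegressor(
    kernel=Matern(length_scale=5_000.0, nu=1.5) + WhiteKernel(noise_level=0.1),
    normalize_y=True,
)
gp.fit(xy, resid)

# Predict log-radon for a new dwelling of class 2 at a given location
xy_new, class_new = np.array([[25_000.0, 25_000.0]]), np.eye(3)[[2]]
log_rn = trend.predict(class_new) + gp.predict(xy_new)
print(np.exp(log_rn))  # back-transform is only indicative (ignores lognormal bias)
```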

  9. Multivariate Analysis and Modeling of Sediment Pollution Using Neural Network Models and Geostatistics

    Science.gov (United States)

    Golay, Jean; Kanevski, Mikhaïl

    2013-04-01

    The present research deals with the exploration and modeling of a complex dataset of 200 measurement points of sediment pollution by heavy metals in Lake Geneva. The fundamental idea was to use multivariate Artificial Neural Networks (ANN) along with geostatistical models and tools in order to improve the accuracy and interpretability of data modeling. The results obtained with ANN were compared to those of traditional geostatistical algorithms such as ordinary (co)kriging and (co)kriging with an external drift. Exploratory data analysis highlighted a great variety of relationships (i.e. linear, non-linear, independence) between the 11 variables of the dataset (i.e. Cadmium, Mercury, Zinc, Copper, Titanium, Chromium, Vanadium and Nickel, as well as the spatial coordinates of the measurement points and their depth). Then, exploratory spatial data analysis (i.e. anisotropic variography, local spatial correlations and moving-window statistics) was carried out. It was shown that the different phenomena to be modeled were characterized by high spatial anisotropies, complex spatial correlation structures and heteroscedasticity. A feature selection procedure based on General Regression Neural Networks (GRNN) was also applied to create subsets of variables that improve the predictions during the modeling phase. The basic modeling was conducted using a Multilayer Perceptron (MLP), a workhorse of ANN modeling. MLP models are robust and highly flexible tools which can incorporate, in a nonlinear manner, different kinds of high-dimensional information. In the present research, the input layer was made of either two neurons (spatial coordinates) or three neurons (when depth, as auxiliary information, could possibly capture an underlying trend), and the output layer was composed of one (univariate MLP) to eight neurons corresponding to the heavy metals of the dataset (multivariate MLP). MLP models with three input neurons can be referred to as Artificial Neural Networks with EXternal
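
    A minimal sketch of the multivariate MLP configuration described above: two or three input neurons (coordinates, optionally depth) and eight output neurons (the heavy metals). The data, architecture and hyper-parameters below are placeholders, not the study's.

```python
# Hedged sketch of a multivariate MLP for spatial prediction of eight metals.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n = 200                             # ~200 measurement points, as in the study
X = rng.uniform(0, 1, size=(n, 3))  # x, y, depth (synthetic, normalized)
Y = rng.normal(size=(n, 8))         # eight heavy metals (synthetic placeholders)

# Multivariate MLP: 3 input neurons -> one hidden layer -> 8 output neurons
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                 max_iter=5000, random_state=0),
)
model.fit(X, Y)                     # MLPRegressor handles multiple outputs natively
pred = model.predict(X[:5])
print(pred.shape)                   # (5, 8)
```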

  10. Quantitative accuracy of serotonergic neurotransmission imaging with high-resolution 123I SPECT

    International Nuclear Information System (INIS)

    Kuikka, J.T.

    2004-01-01

    Aim: Serotonin transporter (SERT) imaging can be used to study the role of regional abnormalities of neurotransmitter release in various mental disorders and to study the mechanism of action of therapeutic drugs or drugs of abuse. We examine the quantitative accuracy and reproducibility that can be achieved with high-resolution SPECT of serotonergic neurotransmission. Method: The binding potential (BP) of a 123I-labeled tracer specific for the midbrain SERT was assessed in 20 healthy persons. The effects of scatter, attenuation, partial volume, misregistration and statistical noise were estimated using phantom and human studies. Results: Without any correction, BP was underestimated by 73%. The partial volume error was the major component of this underestimation, whereas the most critical error for reproducibility was misplacement of the region of interest (ROI). Conclusion: Proper ROI registration and the use of a multiple-head gamma camera with transmission-based scatter correction produce more reliable results. However, owing to the small dimensions of the midbrain SERT structures and the poor spatial resolution of SPECT, the improvement without partial volume correction is not sufficient to restore the BP estimate to its true value. (orig.) [de
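
    The small numerical sketch below illustrates the abstract's main point, namely that correcting for everything except partial volume still leaves most of the bias in place. The 73% underestimation is taken from the abstract; the split between the recovery coefficient and the remaining (scatter/attenuation) factors is purely illustrative.

```python
# Hedged arithmetic sketch of multiplicative corrections to an underestimated BP.
bp_true = 2.0                        # assumed true binding potential
bp_measured = bp_true * (1 - 0.73)   # ~73% underestimation without any correction

recovery_coeff = 0.35                # hypothetical partial-volume recovery coefficient
other_factors = bp_measured / (bp_true * recovery_coeff)  # residual scatter/attenuation losses

# Correcting scatter/attenuation only still leaves the partial-volume bias:
bp_no_pv = bp_measured / other_factors
print(round(bp_no_pv / bp_true, 2))  # ~0.35 of the true BP

# Dividing out the recovery coefficient as well restores the estimate:
bp_full = bp_measured / (other_factors * recovery_coeff)
print(round(bp_full / bp_true, 2))   # ~1.0
```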

  11. Extended-range high-resolution dynamical downscaling over a continental-scale spatial domain with atmospheric and surface nudging

    Science.gov (United States)

    Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.

    2014-12-01

    Extended-range high-resolution mesoscale simulations with limited-area atmospheric models, when applied to downscale regional analysis fields over large spatial domains, can provide valuable information for many applications, including the weather-dependent renewable energy industry. Long-term simulations over a continental-scale spatial domain, however, require mechanisms to control large-scale deviations of the high-resolution simulated fields from the coarse-resolution driving fields. As enforcement of the lateral boundary conditions is insufficient to restrict such deviations, the large scales in the simulated high-resolution meteorological fields are spectrally nudged toward the driving fields. Different spectral nudging approaches, including the appropriate nudging length scales as well as the vertical profiles and temporal relaxations for nudging, have been investigated to propose an optimal nudging strategy. The impacts of time-varying nudging and of generating hourly analysis estimates are explored to circumvent problems arising from the coarse temporal resolution of the regional analysis fields. Although controlling the evolution of the atmospheric large scales generally improves the outputs of high-resolution mesoscale simulations within the surface layer, the prognostically evolving surface fields can nevertheless deviate from their expected values, leading to significant inaccuracies in the predicted surface-layer meteorology. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil moisture and snow conditions, toward their expected values obtained from a high-resolution offline surface scheme is therefore proposed to limit any considerable deviation. Finally, wind speed and temperature at wind-turbine hub height predicted by different spectrally nudged extended-range simulations are compared against observations to demonstrate possible improvements achievable using higher spatiotemporal
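
    The sketch below shows the core idea of spectral nudging in a stripped-down form: only the low-wavenumber part of a simulated 2-D field is relaxed toward the driving field, leaving the small scales free. The cut-off wavenumber, relaxation time and toy fields are assumptions for illustration, not the configuration used in the study.

```python
# Hedged sketch of spectral nudging of large scales toward a driving field.
import numpy as np

def spectral_nudge(sim, drv, dt, tau=3600.0, k_cut=4):
    """Return sim after one nudging step of length dt (seconds)."""
    ny, nx = sim.shape
    ky = np.fft.fftfreq(ny) * ny
    kx = np.fft.fftfreq(nx) * nx
    large_scale = (np.abs(ky)[:, None] <= k_cut) & (np.abs(kx)[None, :] <= k_cut)

    sim_hat = np.fft.fft2(sim)
    drv_hat = np.fft.fft2(drv)
    # Newtonian relaxation of the retained (large-scale) coefficients only
    sim_hat[large_scale] += (dt / tau) * (drv_hat[large_scale] - sim_hat[large_scale])
    return np.real(np.fft.ifft2(sim_hat))

# Toy fields: driving analysis = smooth large-scale pattern; simulation = drifted copy
y, x = np.mgrid[0:128, 0:256]
driving = np.sin(2 * np.pi * x / 256) + np.cos(2 * np.pi * y / 128)
simulated = 1.3 * driving + 0.3 * np.random.default_rng(0).standard_normal(driving.shape)

nudged = spectral_nudge(simulated, driving, dt=600.0, tau=3600.0, k_cut=4)
```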

  12. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time...... infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in resolution and accuracy...... infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and also that the ions responsible for the segregation can be identified. Furthermore, the procedure can automate the detection of unique species and unique metabolites....
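
    A minimal sketch of the kind of spectrum comparison outlined above: peak lists from direct-infusion runs are projected onto a common m/z grid (fixed-width bins) and the resulting vectors compared pairwise. The bin width, similarity measure and synthetic peak lists are assumptions, not the authors' exact procedure.

```python
# Hedged sketch: binning high-resolution spectra onto a common m/z grid and comparing them.
import numpy as np

def bin_spectrum(mz, intensity, mz_min=100.0, mz_max=1000.0, bin_width=0.01):
    """Project a peak list onto a fixed m/z grid by summing intensities per bin."""
    edges = np.arange(mz_min, mz_max + bin_width, bin_width)
    vec, _ = np.histogram(mz, bins=edges, weights=intensity)
    return vec

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Two synthetic extracts with partially overlapping ions
rng = np.random.default_rng(3)
mz1, int1 = rng.uniform(100, 1000, 500), rng.exponential(1.0, 500)
mz2 = np.concatenate([mz1[:300], rng.uniform(100, 1000, 200)])  # 300 shared ions
int2 = rng.exponential(1.0, 500)

v1, v2 = bin_spectrum(mz1, int1), bin_spectrum(mz2, int2)
print(cosine_similarity(v1, v2))
```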

  13. Achieving sensitive, high-resolution laser spectroscopy at CRIS

    Energy Technology Data Exchange (ETDEWEB)

    Groote, R. P. de [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Lynch, K. M., E-mail: kara.marie.lynch@cern.ch [EP Department, CERN, ISOLDE (Switzerland); Wilkins, S. G. [The University of Manchester, School of Physics and Astronomy (United Kingdom); Collaboration: the CRIS collaboration

    2017-11-15

    The Collinear Resonance Ionization Spectroscopy (CRIS) experiment, located at the ISOLDE facility, has recently performed high-resolution laser spectroscopy, with linewidths down to 20 MHz. In this article, we present the modifications to the beam line and the newly-installed laser systems that have made sensitive, high-resolution measurements possible. Highlights of recent experimental campaigns are presented.

  14. Use of high-resolution imagery acquired from an unmanned aircraft system for fluvial mapping and estimating water-surface velocity in rivers

    Science.gov (United States)

    Kinzel, P. J.; Bauer, M.; Feller, M.; Holmquist-Johnson, C.; Preston, T.

    2013-12-01

    surveyed in the reaches for detailed cross-section comparisons to the photogrammetric surface model. When operating the T-Hawk in a hover-and-stare flight pattern, natural tracers (floating algae) were successfully detected for use in estimating water-surface velocity. A subsequent flight with the T-Hawk is planned this fall to evaluate the ability of the UAS to detect geomorphic change between successive surveys. While the vertical accuracies and the spatial extent of the T-Hawk system are currently inferior to those of conventional airborne LiDAR mapping, further technical advancements could increase the accuracy of UAS products, providing a relatively low-cost solution for monitoring project-scale river management activities at high temporal resolution.
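
    As a rough illustration of how water-surface velocity can be estimated from hover-and-stare imagery, the sketch below cross-correlates an interrogation window between two frames (a PIV/PTV-style approach). The frame rate, ground sampling distance and synthetic tracer pattern are assumptions; the study's actual processing chain is not described in the abstract.

```python
# Hedged sketch: surface velocity from the displacement of tracers between two frames.
import numpy as np

def window_displacement(win_a, win_b):
    """Pixel displacement of win_b relative to win_a via FFT cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above Nyquist to negative shifts
    shift = [p - s if p > s // 2 else p for p, s in zip(peak, corr.shape)]
    return np.array(shift)  # (dy, dx) in pixels

rng = np.random.default_rng(7)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(2, 5), axis=(0, 1))  # tracers advected 2 px down, 5 px right

dt = 1 / 30.0   # time between frames in s (assumed 30 fps)
gsd = 0.03      # ground sampling distance in m per pixel (assumed)
dy, dx = window_displacement(frame1, frame2)
print(np.hypot(dx, dy) * gsd / dt, "m/s")
```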

  15. An atlas of high-resolution IRAS maps of nearby galaxies

    Science.gov (United States)

    Rice, Walter

    1993-01-01

    An atlas of far-infrared IRAS maps, with angular resolution near 1 arcmin, of 30 optically large galaxies is presented. The high-resolution IRAS maps were produced with the Maximum Correlation Method (MCM) image construction and enhancement technique developed at IPAC. The MCM technique, which recovers the spatial information contained in the overlapping detector data samples of the IRAS all-sky survey scans, is outlined, and tests to verify the structural reliability and photometric integrity of the high-resolution maps are presented. The infrared structure revealed in individual galaxies is discussed. The atlas complements the IRAS Nearby Galaxy High-Resolution Image Atlas, a set of high-resolution galaxy images encoded in FITS format that is provided to the astronomical community as an IPAC product.

  16. Evaluation of stationary and non-stationary geostatistical models for inferring hydraulic conductivity values at Aespoe

    International Nuclear Information System (INIS)

    La Pointe, P.R.

    1994-11-01

    This report describes the comparison of stationary and non-stationary geostatistical models for the purpose of inferring block-scale hydraulic conductivity values from packer tests at Aespoe. The comparison between models is made through the evaluation of cross-validation statistics for three experimental designs. The first experiment consisted of a 'Delete-1' test previously used at Finnsjoen, the second of a 'Delete-10%' test and the third of a 'Delete-50%' test. Preliminary data analysis showed that the 3 m and 30 m packer test data can be treated as a sample from a single population for the purposes of geostatistical analysis. Analysis of the 3 m data does not indicate any systematic statistical changes with depth, rock type, fracture zone versus non-fracture zone, or other mappable factors. Directional variograms are ambiguous to interpret owing to the clustered nature of the data, but do not show any obvious anisotropy that should be accounted for in the geostatistical analysis. Stationary analysis suggested that there is a sizeable spatially uncorrelated component ('nugget effect') in the 3 m data, on the order of 60% of the observed variance for the various models fitted. Four different nested models were automatically fitted to the data. Results for all models in terms of cross-validation statistics were very similar for the first set of validation tests. Non-stationary analysis established that both the order of the drift and the order of the intrinsic random functions are low. This study also suggests that conventional cross-validation studies and automatic variogram fitting do not necessarily evaluate how well a model will infer block-scale hydraulic conductivity values. 20 refs, 20 figs, 14 tabs
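
    A minimal sketch of the 'Delete-1' (leave-one-out) cross-validation described above, with Gaussian-process regression standing in for kriging with a fitted variogram; the packer-test locations and log-conductivity values are synthetic.

```python
# Hedged sketch: leave-one-out cross-validation of a spatial conductivity model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(11)
n = 120
xyz = rng.uniform(0, 500, size=(n, 3))     # packer-test midpoints (m), synthetic
log_k = -7 + 1.5 * rng.standard_normal(n)  # synthetic log10 hydraulic conductivity

model = GaussianProcessRegressor(
    kernel=Matern(length_scale=100.0, nu=0.5) + WhiteKernel(noise_level=0.5),
    normalize_y=True,
)

# Delete-1 test: predict each observation from all the others
pred = cross_val_predict(model, xyz, log_k, cv=LeaveOneOut())
residuals = log_k - pred
print("mean error:", residuals.mean(), "RMSE:", np.sqrt((residuals ** 2).mean()))
```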

  17. Los Angeles megacity: a high-resolution land–atmosphere modelling system for urban CO2 emissions

    Directory of Open Access Journals (Sweden)

    S. Feng

    2016-07-01

    Megacities are major sources of anthropogenic fossil fuel CO2 (FFCO2) emissions. The spatial extents of these large urban systems cover areas of 10 000 km2 or more, with complex topography and changing landscapes. We present a high-resolution land–atmosphere modelling system for urban CO2 emissions over the Los Angeles (LA) megacity area. The Weather Research and Forecasting (WRF)-Chem model was coupled to a very high-resolution FFCO2 emission product, Hestia-LA, to simulate atmospheric CO2 concentrations across the LA megacity at spatial resolutions as fine as ∼1 km. We evaluated multiple WRF configurations, selecting one that minimized errors in wind speed, wind direction, and boundary layer height as evaluated by its performance against meteorological data collected during the CalNex-LA campaign (May–June 2010). Our results show no significant difference between moderate-resolution (4 km) and high-resolution (1.3 km) simulations when evaluated against surface meteorological data, but the high-resolution configurations better resolved planetary boundary layer heights and vertical gradients in the horizontal mean winds. We coupled our WRF configuration with the Vulcan 2.2 (10 km resolution) and Hestia-LA (1.3 km resolution) fossil fuel CO2 emission products to evaluate the impact of the spatial resolution of the CO2 emission products and of the meteorological transport model on the representation of spatiotemporal variability in simulated atmospheric CO2 concentrations. We find that high spatial resolution in the fossil fuel CO2 emissions is more important than in the atmospheric model to capture CO2 concentration variability across the LA megacity. Finally, we present a novel approach that employs simultaneous correlations of the simulated atmospheric CO2 fields to qualitatively evaluate the greenhouse gas measurement network over the LA megacity. Spatial correlations in the atmospheric CO2 fields reflect the coverage of
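
    The sketch below illustrates the correlation-based network evaluation mentioned at the end of the abstract: the simulated CO2 time series at a candidate tower location is correlated with the series in every model grid cell, and the resulting map outlines the area that tower effectively "sees". The fields, grid and correlation threshold are synthetic assumptions.

```python
# Hedged sketch: correlation map of a simulated CO2 field against one tower location.
import numpy as np

rng = np.random.default_rng(5)
nt, ny, nx = 240, 40, 60                                 # hourly steps, toy grid dimensions
co2 = rng.standard_normal((nt, ny, nx)).cumsum(axis=0)   # synthetic CO2 anomaly field

def correlation_map(field, iy, ix):
    """Pearson correlation of the series at (iy, ix) with every grid cell."""
    ref = field[:, iy, ix]
    f = field - field.mean(axis=0)
    r = ref - ref.mean()
    num = (f * r[:, None, None]).sum(axis=0)
    den = np.sqrt((f ** 2).sum(axis=0) * (r ** 2).sum())
    return num / den

tower_map = correlation_map(co2, iy=20, ix=30)
coverage = (tower_map > 0.7).mean()   # fraction of the domain 'covered' by this tower
print(f"{coverage:.1%} of grid cells correlate above 0.7 with the tower site")
```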

  18. X-ray fluorescence in Member States (Italy): Full field X-ray fluorescence imaging with high-energy and high-spatial resolution

    Energy Technology Data Exchange (ETDEWEB)

    Romano, F. P.; Masini, N.; Pappalardo, L., E-mail: romanop@lns.infn.it [IBAM, CNR, Via Biblioteca 4, 95124 Catania (Italy); Cosentino, L.; Gammino, S.; Mascali, D.; Rizzo, F. [INFN-LNS, Via S. Sofia 62, 95123 Catania (Italy)

    2014-02-15

    A full-field X-ray camera for X-ray fluorescence imaging of materials with high energy and spatial resolution was designed and developed. The system was realized by coupling a pinhole collimator with a position-sensitive CCD detector. X-ray fluorescence is induced in the samples by irradiation with an external X-ray tube. The characteristic X-ray spectra of the investigated materials are obtained using multi-frame acquisition in single-photon counting. The energy resolution measured at the Fe-Kα line was 157 eV. The spatial resolution of the system was determined by the analysis of a sharp edge at different magnification values; it was estimated to be 90 μm at a magnification of 3.2x and 190 μm at 0.8x. The present set-up of the system is suited to analyzing samples with dimensions up to 5x4 cm². The typical measurement time is in the range of 1 h to 4 h. (author)
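
    A minimal sketch of the sharp-edge method used to quote the spatial resolution: the measured edge-spread function is differentiated to obtain the line-spread function, whose FWHM is converted to object-plane units through the magnification. The edge profile, pixel pitch and noise level below are synthetic, not the instrument's data.

```python
# Hedged sketch: spatial resolution from an edge profile (ESF -> LSF -> FWHM).
import numpy as np

pixel_pitch_um = 20.0   # detector pixel pitch (assumed)
magnification = 3.2     # pinhole magnification (value quoted in the abstract)

# Synthetic edge-spread function sampled across 200 pixels
x = np.arange(200, dtype=float)
esf = 0.5 * (1 + np.tanh((x - 100) / (np.sqrt(2) * 3.0)))
esf += 0.01 * np.random.default_rng(8).standard_normal(x.size)

lsf = np.clip(np.gradient(esf, x), 0, None)   # LSF = d(ESF)/dx

# FWHM from the half-maximum crossings of the LSF
half = lsf.max() / 2
above = np.where(lsf >= half)[0]
fwhm_px = above[-1] - above[0] + 1

# Convert the detector-plane FWHM to object-plane resolution
resolution_um = fwhm_px * pixel_pitch_um / magnification
print(f"estimated spatial resolution ~ {resolution_um:.0f} um")
```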

  19. Development of high speed integrated circuit for very high resolution timing measurements

    International Nuclear Information System (INIS)

    Mester, Christian

    2009-10-01

    A multi-channel, high-precision, low-power time-to-digital converter application-specific integrated circuit (ASIC) for high-energy physics applications has been designed and implemented in a 130 nm CMOS process. To reach a target resolution of 24.4 ps, a novel delay element has been conceived. This nominal resolution has been experimentally verified with a prototype, with a minimum resolution of 19 ps. To further improve the resolution, a new interpolation scheme has been described. The ASIC has been designed to use a reference clock at the LHC bunch-crossing frequency of 40 MHz and to generate all required timing signals internally, to ease its use within the framework of an LHC upgrade. Special care has been taken to minimise the power consumption. (orig.)
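
    A quick arithmetic check on the quoted numbers (an inference from the abstract, not an explicit statement in it): a 40 MHz reference clock has a 25 ns period, and subdividing it into 1024 bins gives almost exactly the 24.4 ps target resolution.

```python
# Hedged sanity check of the TDC bin size implied by the quoted figures.
clock_hz = 40e6
period_s = 1.0 / clock_hz          # 25 ns LHC bunch-crossing period
bins = 1024                        # assumed interpolation factor
lsb_ps = period_s / bins * 1e12
print(f"{lsb_ps:.1f} ps per bin")  # -> 24.4 ps
```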
