A geostatistical analysis of geostatistics
Hengl, T.; Minasny, B.; Gould, M.
2009-01-01
The bibliometric indices of the scientific field of geostatistics were analyzed using statistical and spatial data analysis. The publications and their citation statistics were obtained from the Web of Science (4000 most relevant), Scopus (2000 most relevant) and Google Scholar (5389). The focus was
Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.
1999-05-01
UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes, and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines.
Geostatistical methods applied to field model residuals
DEFF Research Database (Denmark)
Maule, Fox; Mosegaard, K.; Olsen, Nils
consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based...
Geostatistics and spatial analysis in biological anthropology.
Relethford, John H
2008-05-01
A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology.
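The variogram-then-kriging workflow described above begins with an experimental (sample) variogram. A minimal sketch of the classical Matheron estimator on a synthetic 1-D transect follows; all coordinates, values and parameters here are invented for illustration and are not the Irish anthropometric data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D transect of 60 sample sites: a smooth spatial signal
# plus noise, standing in for geographic locations and a biological
# distance measure.
x = np.linspace(0.0, 10.0, 60)
z = np.sin(x) + 0.1 * rng.standard_normal(60)

def experimental_variogram(coords, values, n_bins=8, max_lag=5.0):
    """Classical (Matheron) estimator: for each lag bin, half the mean
    squared difference over all pairs whose separation falls in the bin."""
    i, j = np.triu_indices(len(coords), k=1)     # all unordered pairs
    lags = np.abs(coords[i] - coords[j])
    sq = 0.5 * (values[i] - values[j]) ** 2
    edges = np.linspace(0.0, max_lag, n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    gamma = np.full(n_bins, np.nan)
    for b in range(n_bins):
        mask = (lags >= edges[b]) & (lags < edges[b + 1])
        if mask.any():
            gamma[b] = sq[mask].mean()
    return centers, gamma

h, g = experimental_variogram(x, z)
print(h)
print(g)   # semivariance rises with lag while spatial correlation persists
```

For spatially correlated data the semivariance grows with lag distance until it levels off at the sill, which is the behaviour a fitted variogram model then summarizes before kriging.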
Robust geostatistical analysis of spatial data
Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.
2013-04-01
Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R
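The robust sample-variogram estimation that earlier studies focused on can be illustrated with the classical Cressie-Hawkins (1980) estimator, which tempers the influence of outliers by averaging square roots of absolute differences instead of squared differences. This is a far simpler device than the robust REML method the abstract proposes; the transect below is synthetic, with two gross outliers injected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic transect with two injected gross outliers, to show that the
# Cressie-Hawkins estimate is less distorted than the classical one.
x = np.linspace(0.0, 10.0, 80)
z = np.sin(x) + 0.1 * rng.standard_normal(80)
z[10] += 8.0
z[55] -= 8.0

def variograms(coords, values, lag=1.0, tol=0.25):
    """Classical (Matheron) and robust (Cressie-Hawkins) semivariance
    estimates at a single lag distance, using pairs within lag +/- tol."""
    i, j = np.triu_indices(len(coords), k=1)
    d = np.abs(coords[i] - coords[j])
    sel = np.abs(d - lag) <= tol
    diffs = values[i][sel] - values[j][sel]
    n = diffs.size
    classical = 0.5 * np.mean(diffs ** 2)
    # Cressie-Hawkins: 2*gamma = (mean |diff|^(1/2))^4 / (0.457 + 0.494/n)
    robust = 0.5 * np.mean(np.sqrt(np.abs(diffs))) ** 4 / (0.457 + 0.494 / n)
    return classical, robust

c, r = variograms(x, z)
print(c, r)   # the classical estimate is inflated by the two outliers
```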
Geostatistics and Analysis of Spatial Data
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg
2007-01-01
This note deals with geostatistical measures for spatial correlation, namely the auto-covariance function and the semi-variogram, as well as deterministic and geostatistical methods for spatial interpolation, namely inverse distance weighting and kriging. Some semi-variogram models are mentioned, specifically the spherical, the exponential and the Gaussian models. Equations to carry out simple and ordinary kriging are deduced. Other types of kriging are mentioned, and references to international literature, Internet addresses and state-of-the-art software in the field are given. A very simple example to illustrate the computations and a more realistic example with height data from an area near Slagelse, Denmark, are given. Finally, a series of attractive characteristics of kriging are mentioned, and a simple sampling strategic consideration is given based on the dependence of the kriging variance...
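The ordinary kriging equations the note deduces amount to a small linear system: weights constrained to sum to one plus a Lagrange multiplier. A minimal sketch follows, assuming a spherical semivariogram with an illustrative sill and range (invented 1-D data, not the Slagelse height example).

```python
import numpy as np

def spherical(h, sill=1.0, rng_=4.0):
    """Spherical semivariogram model: rises to the sill at h = range."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, sill)

def ordinary_kriging(coords, values, target):
    """Ordinary kriging at one 1-D target location.
    System: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]."""
    n = len(coords)
    d = np.abs(coords[:, None] - coords[None, :])   # pairwise distances
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.abs(coords - target))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    estimate = w @ values
    variance = w @ b[:n] + mu                       # kriging variance
    return estimate, variance, w

coords = np.array([0.0, 1.0, 3.0, 5.0])
vals = np.array([2.0, 2.5, 3.5, 3.0])
est, var, w = ordinary_kriging(coords, vals, 2.0)
print(est, var)
print(w.sum())   # the unbiasedness constraint forces the weights to sum to 1
```

The kriging variance returned here is what the sampling-strategy remark at the end of the abstract depends on: it grows with distance from the data, so candidate sampling locations can be ranked without knowing the new values.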
Directory of Open Access Journals (Sweden)
Joelmir André Borssoi
2009-12-01
The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, which are applied in the interpolation of values at unsampled points by kriging techniques. However, parameter estimation can be strongly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques for Gaussian spatial linear models in geostatistics to evaluate the sensitivity of the maximum likelihood and restricted maximum likelihood estimators to small perturbations in the data. For this purpose, studies with simulated and experimental data were conducted. The results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.
Geostatistical analysis and kriging of Hexachlorocyclohexane residues in topsoil from Tianjin, China
Energy Technology Data Exchange (ETDEWEB)
Li, B.G. [College of Environmental Sciences, MOE Laboratory for Earth Surface Processes, Peking University, Beijing 100871 (China); Cao, J. [College of Environmental Sciences, MOE Laboratory for Earth Surface Processes, Peking University, Beijing 100871 (China); Liu, W.X. [College of Environmental Sciences, MOE Laboratory for Earth Surface Processes, Peking University, Beijing 100871 (China); Shen, W.R. [Environmental Protection Bureau, Tianjin 300191 (China); Wang, X.J. [College of Environmental Sciences, MOE Laboratory for Earth Surface Processes, Peking University, Beijing 100871 (China); Tao, S. [College of Environmental Sciences, MOE Laboratory for Earth Surface Processes, Peking University, Beijing 100871 (China)]. E-mail: taos@urban.pku.edu.cn
2006-08-15
A previously published data set of HCH isomer concentrations in topsoil samples from Tianjin, China, was subjected to geospatial analysis. Semivariograms were calculated and modeled using geostatistical techniques. Parameters of the semivariogram models were analyzed and compared for four HCH isomers. Two-dimensional ordinary block kriging was applied to the HCH isomer data set for mapping purposes. Dot maps and gray-scaled raster maps of HCH concentrations were produced based on the kriging results. The appropriateness of the kriging procedure for mapping purposes was evaluated based on the kriging errors and kriging variances. It was found that ordinary block kriging can be applied to interpolate HCH concentrations in Tianjin topsoil with acceptable accuracy for mapping purposes. - Geostatistical analysis and kriging were applied to HCH concentrations in topsoil of Tianjin, China, for mapping purposes.
Mercury emissions from coal combustion in Silesia, analysis using geostatistics
Zasina, Damian; Zawadzki, Jaroslaw
2015-04-01
Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission, being one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e. small household furnaces). Small sources contribute significantly to the total emission of mercury. Official statistical analyses, including those prepared for international purposes [2], do not provide data on the spatial distribution of mercury emitted to air, although a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economic and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized at two independent levels: an individual, bottom-up level derived from the national emission reporting system [5; 6], and a top-down level of regional data calculated from official statistics [7]. The analysis presented will compare the spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation covers three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http
Reducing spatial uncertainty in climatic maps through geostatistical analysis
Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier
2014-05-01
), applying different interpolation methods/parameters are shown. RMS (mm) error values obtained from the independent test set (20% of the samples) follow, in the order IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK:
LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7
To study these results, a geostatistical analysis of uncertainty has been done. Main results: variogram analysis of the error (using the test set) shows that the total sill is reduced (50% EU, 60% A3D) when using the two new approaches, while the spatialized standard deviation model calculated from the OK shows significantly lower values when compared to the RS. In conclusion, A3D and EU greatly improve on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that a non-random split increases the power of both the fitting and test steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.
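The IDW interpolator whose exponent is varied above (1.5, 2, 2.5, 3) can be sketched in a few lines. The points and values below are invented, not the climatic samples from the study; raising the exponent concentrates weight on the nearest samples.

```python
import numpy as np

def idw(coords, values, target, exponent=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at one 2-D target location."""
    d = np.linalg.norm(coords - target, axis=1)
    if np.any(d < eps):                      # exact hit on a sample point
        return values[np.argmin(d)]
    w = 1.0 / d ** exponent
    return float(w @ values / w.sum())

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([10.0, 20.0, 30.0, 40.0])
print(idw(pts, vals, np.array([0.5, 0.5])))   # equidistant: plain mean, 25.0
print(idw(pts, vals, np.array([0.0, 0.0])))   # exact hit: 10.0
```

With a higher exponent, an estimate near the corner at (0, 0) is pulled harder toward that sample's value, which is the smoothing-versus-fidelity trade-off the RMS comparison above explores.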
Directory of Open Access Journals (Sweden)
Juliano de Bastos Pazini
2015-06-01
Tibraca limbativentris (rice stem bug) is an insect highly injurious to the rice crop in Brazil. The aim of this research was to define the spatial distribution of T. limbativentris and to improve the sampling process by means of geostatistical techniques and the construction of prediction maps in a flooded rice field located in the "Planalto da Campanha" region, Rio Grande do Sul (RS), Brazil. The experiments were conducted in rice crops in the municipality of Itaqui, RS, in the crop years 2009/10, 2010/11 and 2011/12, counting fortnightly the number of nymphs and adults in a georeferenced grid with points spaced at 50 m in the first year and at 10 m in the following years. A geostatistical analysis was performed by fitting semivariograms and interpolating the numeric data by kriging, to verify the spatial dependence and for the subsequent population mapping. The results obtained indicated that the rice stem bug, T. limbativentris, has a strong spatial dependence. The prediction maps allow estimating the population density of the pest and visualizing its spatial distribution in flooded rice fields, enabling the improvement of the traditional sampling method for the rice stem bug.
Combining geostatistics with Moran's I analysis for mapping soil heavy metals in Beijing, China.
Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo
2012-03-01
Production of high quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. According to the Moran's I analysis, four characteristic distances were obtained and used as the active lag distance to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using the two distances where Moran's I and the standardized Moran's I, Z(I), reached a maximum as the active lag distance can improve the fitting accuracy of the semivariance. Then, spatial interpolation was produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran's I analysis was better than traditional geostatistics. Thus, Moran's I analysis is a useful complement for geostatistics to improve the spatial interpolation accuracy of heavy metals.
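Global Moran's I with a binary distance-band weight matrix, the quantity whose maximum picks the active lag distance above, can be sketched as follows. The coordinates and the smooth test surface are synthetic, not the Beijing soil data.

```python
import numpy as np

def morans_i(coords, values, band):
    """Global Moran's I with binary distance-band weights:
    I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2, z centered."""
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d > 0) & (d <= band)).astype(float)   # neighbors within the band
    W = w.sum()
    n = len(values)
    return (n / W) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 10.0, size=(100, 2))
# A smooth surface sampled at the points: positively autocorrelated.
values = np.sin(coords[:, 0]) + np.cos(coords[:, 1])
print(morans_i(coords, values, band=1.0))   # strongly positive at short bands
print(morans_i(coords, values, band=8.0))   # near zero at very wide bands
```

Scanning the band (lag) distance and keeping the values where I peaks mimics the selection of characteristic distances described in the abstract.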
Robidoux, P.; Roberge, J.; Urbina Oviedo, C. A.
2011-12-01
The origin of magmatism and the role of the subducted Cocos Plate in the Chichinautzin volcanic field (CVF), Mexico, are still a subject of debate. It has been established that mafic magmas of both calc-alkaline (subduction-related) and alkaline (OIB-like) type are produced in the CVF, and the two groups cannot be related by simple fractional crystallization. Therefore, many geochemical studies have been done and many models have been proposed. The main goal of the work presented here is to provide a new tool for the visualization and interpretation of geochemical data using geostatistics and geospatial analysis techniques. It contains a complete geodatabase built from published samples over the 2500 km2 area of the CVF and its neighbouring stratovolcanoes (Popocatepetl, Iztaccihuatl and Nevado de Toluca). From this database, maps of different geochemical markers were produced to visualize the geochemical signature geographically, to test the statistical distribution with a cartographic technique and to highlight any spatial correlations. The distribution and regionalization of the geochemical signatures can be viewed in two-dimensional space using specific spatial analysis tools from a Geographic Information System (GIS). The model of spatial distribution is tested with Linear Decrease (LD) and Inverse Distance Weighted (IDW) interpolation techniques because they best represent the geostatistical characteristics of the geodatabase. We found that the ratios Ba/Nb, Nb/Ta and Th/Nb show a first-order tendency, that is, visible spatial variation over a large-scale area. Monogenetic volcanoes in the center of the CVF have distinct values compared to those of the Popocatepetl-Iztaccihuatl polygenetic complex, which are spatially well defined. Inside the Valley of Mexico, a large number of monogenetic cones in the eastern portion of the CVF have ratios similar to the Iztaccihuatl and Popocatepetl complex. Other ratios, like alkalis vs SiO2, V/Ti, La/Yb and Zr/Y, show different spatial tendencies. In that case, second
Bayesian Analysis of Geostatistical Models With an Auxiliary Lattice
Park, Jincheol
2012-04-01
The Gaussian geostatistical model has been widely used for modeling spatial data. However, this model suffers from a severe difficulty in computation: it requires users to invert a large covariance matrix. This is infeasible when the number of observations is large. In this article, we propose an auxiliary lattice-based approach for tackling this difficulty. By introducing an auxiliary lattice to the space of observations and defining a Gaussian Markov random field on the auxiliary lattice, our model completely avoids the requirement of matrix inversion. It is remarkable that the computational complexity of our method is only O(n), where n is the number of observations. Hence, our method can be applied to very large datasets with reasonable computational (CPU) times. The numerical results indicate that our model can approximate Gaussian random fields very well in terms of predictions, even for those with long correlation lengths. For real data examples, our model can generally outperform conventional Gaussian random field models in both prediction errors and CPU times. Supplemental materials for the article are available online. © 2012 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
A Practical Primer on Geostatistics
Olea, Ricardo A.
2009-01-01
significant methodological implications. HISTORICAL REMARKS As a discipline, geostatistics was firmly established in the 1960s by the French engineer Georges Matheron, who was interested in the appraisal of ore reserves in mining. Geostatistics did not develop overnight. Like other disciplines, it has built on previous results, many of which were formulated with different objectives in various fields. PIONEERS Seminal ideas conceptually related to what today we call geostatistics or spatial statistics are found in the work of several pioneers, including: 1940s: A.N. Kolmogorov in turbulent flow and N. Wiener in stochastic processes; 1950s: D. Krige in mining; 1960s: B. Matérn in forestry and L.S. Gandin in meteorology. CALCULATIONS Serious applications of geostatistics require the use of digital computers. Although rudimentary implementation of most geostatistical techniques from scratch is fairly straightforward, coding programs from scratch is recommended only as a practice that may help users gain a better grasp of the formulations. SOFTWARE For professional work, the reader should employ software packages that have been thoroughly tested to handle any sampling scheme, that run as efficiently as possible, and that offer graphic capabilities for the analysis and display of results. This primer primarily employs the Stanford Geostatistical Modeling Software (SGeMS) - recently developed at the Energy Resources Engineering Department at Stanford University - as a way to show how to obtain results practically. This applied side of the primer should not be interpreted as the notes being a manual for the use of SGeMS. The main objective of the primer is to help the reader gain an understanding of the fundamental concepts and tools in geostatistics. ORGANIZATION OF THE PRIMER The chapters of greatest importance are those covering kriging and simulation. All other materials are peripheral and are included for better comprehension of the
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling...
Institute of Scientific and Technical Information of China (English)
ZENG ZhaoCheng; LEI LiPing; GUO LiJie; ZHANG Li; ZHANG Bing
2013-01-01
Observations of atmospheric carbon dioxide (CO2) from satellites offer new data sources to understand global carbon cycling. The correlation structure of satellite-observed CO2 can be analyzed and modeled by geostatistical methods, and CO2 values at unsampled locations can be predicted with a correlation model. Conventional geostatistical analysis only investigates the spatial correlation of CO2, and does not consider temporal variation in the satellite-observed CO2 data. In this paper, a spatiotemporal geostatistical method that incorporates temporal variability is implemented and assessed for analyzing the spatiotemporal correlation structure and prediction of monthly CO2 in China. The spatiotemporal correlation is estimated and modeled by a product-sum variogram model with a global nugget component. The variogram result indicates a significant degree of temporal correlation within satellite-observed CO2 data sets in China. Prediction of monthly CO2 using the spatiotemporal variogram model and a space-time kriging procedure is implemented. The prediction is compared with a spatial-only geostatistical prediction approach using a cross-validation technique. The spatiotemporal approach gives better results, with a higher correlation coefficient (r2) and lower mean absolute prediction error and root mean square error. Moreover, the monthly mapping result generated from the spatiotemporal approach has less prediction uncertainty and more detailed spatial variation of CO2 than that from the spatial-only approach.
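A product-sum space-time variogram of the kind fitted above can be sketched from exponential marginals. The sills, ranges and interaction coefficient k below are illustrative placeholders, not values estimated for the Chinese CO2 data, and in practice k must be chosen so the resulting model remains a valid covariance structure.

```python
import numpy as np

def exp_variogram(h, sill, rng_):
    """Exponential semivariogram; practical range rng_ (95% of sill)."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / rng_))

def product_sum(hs, ht, k=0.5, sill_s=1.0, rng_s=500.0,
                sill_t=0.8, rng_t=60.0):
    """Product-sum space-time model:
    gamma_st(hs, ht) = gamma_s(hs) + gamma_t(ht) - k*gamma_s(hs)*gamma_t(ht).
    hs in km, ht in days; all parameters are illustrative."""
    gs = exp_variogram(hs, sill_s, rng_s)
    gt = exp_variogram(ht, sill_t, rng_t)
    return gs + gt - k * gs * gt

# The spatial and temporal marginals are recovered at zero lag in the
# other dimension; the joint lag is less than the sum of the marginals.
print(product_sum(200.0, 0.0))    # equals gamma_s(200)
print(product_sum(0.0, 30.0))     # equals gamma_t(30)
print(product_sum(200.0, 30.0))   # joint space-time lag
```

The interaction term is what distinguishes this model from a purely additive (and from a spatial-only) structure, which is why the abstract's cross-validation favors the spatiotemporal fit.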
Energy Technology Data Exchange (ETDEWEB)
Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.
1982-09-01
The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a methodology demonstration: the methodology would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given.
Lanczos, Cornelius
2010-01-01
Basic text for graduate and advanced undergraduate deals with search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, more.
Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments
Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.
2015-12-01
The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represents this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represents the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
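The selection of mutually dissimilar training images can be sketched as a greedy filter over a dissimilarity distance. Here random binary arrays stand in for the flume snapshots, and plain Euclidean distance between flattened images stands in for whatever pattern dissimilarity the study actually uses; both are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "snapshots": 40 flattened 16x16 binary pattern images.
snapshots = rng.integers(0, 2, size=(40, 16 * 16)).astype(float)

def select_training_images(images, threshold):
    """Greedy selection: keep an image only if it is farther than
    `threshold` from every image already selected, so the kept set
    contains only mutually dissimilar patterns."""
    selected = [0]
    for i in range(1, len(images)):
        d = np.linalg.norm(images[i] - images[selected], axis=1)
        if np.all(d > threshold):
            selected.append(i)
    return selected

idx = select_training_images(snapshots, threshold=10.0)
print(len(idx), idx[:5])
```

By construction every pair of retained snapshots is separated by more than the threshold, which is the invariant a minimal prior set of training images needs before feeding an MPS algorithm.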
Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.
2015-04-01
Groundwater level is an important input in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram model is very important for optimal method performance. This work compares three different criteria, the least squares sum method, the Akaike Information Criterion and Cressie's Indicator, to assess the theoretical variogram that fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the Kriging estimator. Cross-validation analysis in terms of Ordinary Kriging is applied by sequentially using a different distance metric and the above three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that a combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the aforementioned approach improves the prediction efficiency of the Ordinary Kriging method in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
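The five distance functions compared above are easy to state side by side. Minimal sketches follow (Minkowski shown with an illustrative p = 3; the vectors are toy coordinates, not the hydraulic head data):

```python
import numpy as np

def euclidean(a, b):
    return float(np.sqrt(np.sum((a - b) ** 2)))

def minkowski(a, b, p=3.0):
    # p = 2 recovers Euclidean, p = 1 recovers Manhattan
    return float(np.sum(np.abs(a - b) ** p) ** (1.0 / p))

def manhattan(a, b):
    return float(np.sum(np.abs(a - b)))

def canberra(a, b):
    num, den = np.abs(a - b), np.abs(a) + np.abs(b)
    mask = den > 0                       # skip 0/0 terms by convention
    return float(np.sum(num[mask] / den[mask]))

def bray_curtis(a, b):
    return float(np.sum(np.abs(a - b)) / np.sum(np.abs(a + b)))

a = np.array([3.0, 4.0])
b = np.array([0.0, 0.0])
print(euclidean(a, b))     # 5.0
print(manhattan(a, b))     # 7.0
print(bray_curtis(a, b))   # 1.0
```

Because Canberra and Bray-Curtis normalize coordinate differences by magnitudes, swapping them in for Euclidean distance changes which observation pairs count as "close", and hence both the experimental variogram and the Kriging weights, which is the effect the study measures.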
Geostatistical analysis of soil properties at field scale using standardized data
Millan, H.; Tarquis, A. M.; Pérez, L. D.; Matos, J.; González-Posada, M.
2012-04-01
Identifying areas with physical degradation is a crucial step in ameliorating the effects of soil erosion. The quantification and interpretation of spatial variability is a key issue for site-specific soil management. Geostatistics has been the main methodological tool for implementing precision agriculture using field data collected at different spatial resolutions. Even though many works have made significant contributions to the body of knowledge on spatial statistics and its applications, some key points still need to be addressed before precise comparisons between soil properties can be made using geostatistical parameters. The objectives of the present work were (i) to quantify the spatial structure of different physical properties collected from a Vertisol, (ii) to search for potential correlations between different spatial patterns and (iii) to identify relevant components through multivariate spatial analysis. The study was conducted on a Vertisol (Typic Hapludert) dedicated to sugarcane (Saccharum officinarum L.) production during the last sixty years. We used six soil properties collected from a square grid (225 points): penetrometer resistance (PR), total porosity, fragmentation dimension (Df), vertical electrical conductivity (ECv), horizontal electrical conductivity (ECh) and soil water content (WC). All the original data sets were z-transformed before geostatistical analysis. Three different types of semivariogram model were necessary for fitting the individual experimental semivariograms, suggesting that the spatial variability patterns differ in nature. Soil water content rendered the largest nugget effect (C0 = 0.933), while soil total porosity showed the largest range of spatial correlation (A = 43.92 m). The bivariate geostatistical analysis also rendered significant cross-semivariance between different paired soil properties; however, four different semivariogram models were required in that case. This indicates an underlying co
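The z-transformation step above puts soil properties with different units on a common scale, so that variogram parameters (nugget C0, sill, range A) are comparable across properties. A short sketch on synthetic grid data (the Vertisol measurements themselves are assumptions, not the paper's data):

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# Hypothetical 15 x 15 sampling grid of one soil property (e.g. penetrometer resistance),
# with a mild spatial trend plus noise.
gx, gy = np.meshgrid(np.arange(15.0), np.arange(15.0))
coords = np.column_stack([gx.ravel(), gy.ravel()])
values = 2.0 + 0.1 * coords[:, 0] + rng.normal(0.0, 0.3, len(coords))

# z-transformation: zero mean, unit variance, so variograms of different
# properties can be compared on the same scale.
z = (values - values.mean()) / values.std()

def semivariogram(coords, z, max_lag=8.0, n_bins=8):
    d = pdist(coords)                          # all pairwise distances
    g = 0.5 * pdist(z[:, None], 'sqeuclidean')  # pairwise semivariances
    edges = np.linspace(0.0, max_lag, n_bins + 1)
    lags, semis = [], []
    for k in range(n_bins):
        m = (d > edges[k]) & (d <= edges[k + 1])
        if m.any():
            lags.append(d[m].mean())
            semis.append(g[m].mean())
    return np.array(lags), np.array(semis)

lags, semis = semivariogram(coords, z)
```

The fitted nugget is the semivariance extrapolated to lag zero; the range is the lag at which the curve levels off at the sill.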
Bellotti, F.; Capra, L.; Sarocchi, D.; D'Antonio, M.
2010-03-01
Grain size analysis of volcaniclastic deposits is mainly used to study flow transport and depositional processes, in most cases by comparing statistical parameters and how they change with distance from the source. In this work, geospatial and multivariate analyses are presented as a flexible geostatistical toolset applied to volcaniclastic deposits, providing an effective and relatively simple methodology for texture description, deposit discrimination and interpretation of depositional processes. We chose the case of Nevado de Toluca volcano (Mexico) because of existing knowledge of its geological evolution, stratigraphic succession and the spatial distribution of its volcaniclastic units. Grain size analyses and frequency distribution curves were carried out to characterize and compare the 28-ka block-and-ash flow deposit associated with a dome destruction episode, and the El Morral debris avalanche deposit that originated from the collapse of the south-eastern sector of the volcano. Geostatistical interpolation of sedimentological data makes it possible to produce two-dimensional maps draped over the volcano topography, showing the granulometric distribution, sorting and fine-material concentration across the whole deposit with respect to topographic changes. In this way, it is possible to analyze a continuous surface of the grain size distribution of volcaniclastic deposits and better understand flow transport processes. The application of multivariate statistical analysis (discriminant function) indicates that this methodology could be useful in discriminating deposits with different origins, or different depositional lithofacies within the same deposit. The proposed methodology could be an interesting approach to complement more classical analyses of volcaniclastic deposits, especially where a clear field classification appears problematic because of a homogeneous texture of the deposits or their scarce and discontinuous outcrops. Our study is an example of the
Geostatistical Analysis on the Temporal Patterns of the Yellow Rice Borer, Tryporyza incertulas
Institute of Scientific and Technical Information of China (English)
YUAN Zhe-ming; WANG Zhi; HU Xiang-yue
2005-01-01
In order to understand the temporal pattern of the larval population of the yellow rice borer, Tryporyza incertulas, and to provide valuable information for its forecast model, data series for each generation and for the over-wintered larvae from 1960 to 1990 in Dingcheng District, Changde City, Hunan Province, were analyzed with geostatistics. The year-to-year series of the total population, the 1st generation, the 3rd generation and the over-wintered larvae displayed fairly strong autocorrelation and good predictability. The generation-to-generation series and the year-to-year series of the 2nd and 4th generations, however, demonstrated poor autocorrelation, especially the 4th generation, whose autocorrelation was zero. The population dynamics of the yellow rice borer were clearly intermittent. A remarkable cycle of four generations (one year) was observed in the generation-to-generation series. Omitting a given generation, or interposing the over-wintered larvae, resulted in only a slight change in the autocorrelation of the whole generation-to-generation series. Crop system, food, climate and natural enemies therefore played more important roles in regulating the population dynamics than the base number of larvae. The basic techniques for applying geostatistics to temporal population dynamics are outlined.
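The temporal-geostatistics idea above can be sketched with a one-dimensional semivariogram over lags in generations. The series below is synthetic (a four-generation cycle plus noise, standing in for the Dingcheng counts, which are not reproduced here); a periodic population produces a semivariogram that dips back toward the noise level at the cycle length.

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical generation-to-generation larval-density series with a 4-generation
# (one-year) cycle, mimicking the pattern reported in the abstract.
gen = np.arange(124)
series = 100.0 + 30.0 * np.sin(2 * np.pi * gen / 4) + rng.normal(0.0, 5.0, 124)

def semivariance(x, lag):
    """Temporal semivariance at a given lag (in generations)."""
    d = x[lag:] - x[:-lag]
    return 0.5 * float(np.mean(d ** 2))

gamma = [semivariance(series, h) for h in range(1, 9)]
# For a 4-generation cycle, gamma at lag 4 drops to roughly the noise variance,
# while gamma at lag 2 (half a cycle, anti-phase) is large.
```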
Geostatistical analysis of soil moisture distribution in a part of Solani River catchment
Kumar, Kamal; Arora, M. K.; Hariprasad, K. S.
2016-03-01
The aim of this paper is to estimate soil moisture at the spatial level by applying geostatistical techniques to point observations of soil moisture in parts of the Solani River catchment in Haridwar district, India. Undisturbed soil samples were collected at 69 locations with a soil core sampler at a depth of 0-10 cm from the soil surface. Of these, discrete soil moisture observations at 49 locations were used to generate a spatial soil moisture distribution map of the region. Two geostatistical techniques, namely moving average and kriging, were adopted. The root mean square error (RMSE) between observed and estimated soil moisture at the remaining 20 locations was determined to assess the accuracy of the estimated soil moisture. Both techniques resulted in low RMSE at a small limiting distance, which increased with the limiting distance. The RMSE varied from 7.42 to 9.77 for the moving average method, and from 7.33 to 9.99 for kriging, indicating similar performance of the two techniques.
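The train/validate split above (49 estimation points, 20 hold-outs, RMSE as the score) can be sketched for the moving-average estimator. All locations, moisture values and limiting distances below are synthetic assumptions, not the Solani data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand-in for the catchment data: 69 sampled locations in a
# 10 x 10 km area, 49 used for estimation and 20 held out for validation.
pts = rng.uniform(0.0, 10.0, size=(69, 2))
moisture = 20.0 + 2.0 * np.sin(pts[:, 0]) + rng.normal(0.0, 0.5, 69)
train, test = pts[:49], pts[49:]
m_train, m_test = moisture[:49], moisture[49:]

def moving_average(x0, pts, vals, limiting_distance):
    """Average of all observations within the limiting distance of x0
    (falls back to the global mean if no neighbour is found)."""
    d = np.linalg.norm(pts - x0, axis=1)
    m = d <= limiting_distance
    return float(vals[m].mean()) if m.any() else float(vals.mean())

def rmse(limiting_distance):
    est = np.array([moving_average(x0, train, m_train, limiting_distance)
                    for x0 in test])
    return float(np.sqrt(np.mean((est - m_test) ** 2)))

rmse_small, rmse_large = rmse(2.0), rmse(8.0)
```

A larger limiting distance pulls in observations far from the prediction point, which tends to smooth over real spatial variation, consistent with the abstract's finding that RMSE grows with the limiting distance.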
3D Geostatistical Modeling and Uncertainty Analysis in a Carbonate Reservoir, SW Iran
Directory of Open Access Journals (Sweden)
Mohammad Reza Kamali
2013-01-01
The aim of geostatistical reservoir characterization is to utilize a wide variety of data, at different scales and accuracies, to construct reservoir models that represent geological heterogeneities and to quantify uncertainty by producing a number of equiprobable models. Since all geostatistical methods used to estimate reservoir parameters are inexact, modeling the estimation error in the form of an uncertainty analysis is very important. In this paper, Sequential Gaussian Simulation is reviewed and the construction of stochastic models based on it is discussed. Subsequently, the ranking and uncertainty quantification of those stochastically populated equiprobable models and a sensitivity study of the modeled properties are presented. The application of sensitivity analysis to stochastic models of reservoir horizons, petrophysical properties and stochastic oil-water contacts, and their effect on reserves, clearly shows that any alteration in the reservoir geometry has a significant effect on the oil in place. The studied reservoir is located in carbonate sequences of the Sarvak Formation, Zagros, Iran, and comprises three layers. The first, located beneath the cap rock, contains the largest portion of the reserve, while the other layers hold little oil. Simulations show that the average porosity and water saturation of the reservoir are about 20% and 52%, respectively.
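The ranking step described above can be sketched as follows. This is not actual Sequential Gaussian Simulation (which would condition on well data and honor a variogram); random porosity grids merely stand in for equiprobable realizations, and the cell volume and percentiles are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)
# 20 toy "equiprobable" porosity realizations on a 10 x 10 grid,
# standing in for the output of Sequential Gaussian Simulation.
realizations = np.clip(rng.normal(0.20, 0.05, (20, 10, 10)), 0.0, None)

cell_volume = 1.0e6  # m^3 per grid cell, hypothetical
# Rank realizations by a pore-volume proxy (mean porosity times bulk volume);
# in a full study this would be oil in place, including saturation and geometry.
pore_volume = realizations.mean(axis=(1, 2)) * cell_volume * 100
ranked = np.argsort(pore_volume)
p10, p50, p90 = np.percentile(pore_volume, [10, 50, 90])
```

The P10/P50/P90 realizations selected this way are the ones typically carried forward to flow simulation, so that the reported reserves carry an uncertainty band rather than a single value.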
geoCount: An R Package for the Analysis of Geostatistical Count Data
Directory of Open Access Journals (Sweden)
Liang Jing
2015-02-01
We describe the R package geoCount for the analysis of geostatistical count data. The package performs Bayesian analysis for the Poisson-lognormal and binomial-logitnormal spatial models, which are subclasses of the class of generalized linear spatial models proposed by Diggle, Tawn, and Moyeed (1998). The package implements the computationally intensive tasks in C++ using an R/C++ interface, and has parallel computation capabilities to speed up the computations. geoCount also implements group updating, Langevin-Hastings algorithms and a data-based parameterization, algorithmic approaches proposed by Christensen, Roberts, and Sköld (2006) to improve the efficiency of the Markov chain Monte Carlo algorithms. In addition, the package includes functions for simulation and visualization, as well as three geostatistical count datasets taken from the literature, one of which is used to illustrate the package's capabilities. Finally, we provide a side-by-side comparison between geoCount and the R packages geoRglm and INLA.
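The Poisson-lognormal spatial model the package fits can be illustrated generatively: a latent Gaussian field drives log-intensities, and counts are Poisson given the field. This sketch is generic Python (not the geoCount API); locations, range and seed are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# 40 hypothetical sampling locations in the unit square.
coords = rng.uniform(0.0, 1.0, (40, 2))
D = np.linalg.norm(coords[:, None] - coords[None], axis=2)

# Latent Gaussian field S with exponential correlation (range parameter 0.3),
# simulated via Cholesky factorization of the covariance matrix.
C = np.exp(-D / 0.3) + 1e-8 * np.eye(40)
S = np.linalg.cholesky(C) @ rng.standard_normal(40)

# Poisson-lognormal counts: Y_i | S_i ~ Poisson(exp(beta0 + S_i)).
counts = rng.poisson(np.exp(1.0 + S))
```

Bayesian inference then reverses this construction, sampling the latent field and parameters given the observed counts, which is the computationally intensive task geoCount delegates to C++.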
4th International Geostatistics Congress
1993-01-01
The contributions in this book were presented at the Fourth International Geostatistics Congress held in Tróia, Portugal, in September 1992. They provide a comprehensive account of the current state of the art of geostatistics, including recent theoretical developments and new applications. In particular, readers will find descriptions and applications of the more recent methods of stochastic simulation together with data integration techniques applied to the modelling of hydrocarbon reservoirs. In other fields there are stationary and non-stationary geostatistical applications to geology, climatology, pollution control, soil science, hydrology and the human sciences. The papers also provide an insight into new trends in geostatistics, particularly the increasing interaction with many other scientific disciplines. This book is a significant reference work for practitioners of geostatistics in both academia and industry.
Directory of Open Access Journals (Sweden)
Goovaerts Pierre
2005-12-01
Background: Cancer mortality maps are used by public health officials to identify areas of excess mortality and to guide surveillance and control activities. The quality of decision-making thus relies on an accurate quantification of risks from observed rates, which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performance of Poisson kriging with a few simple smoothers (i.e. population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results: The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (the kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but use different auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and a spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms the other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion: The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the
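The population-weighted smoother used as a baseline above can be sketched in a few lines; it shows why rates from sparsely populated units get damped toward their better-populated neighbours. The counts, populations and neighbourhood below are hypothetical, not the paper's cancer data:

```python
import numpy as np

# Hypothetical death counts and populations for 5 neighbouring geographical units.
deaths = np.array([3, 12, 7, 1, 9])
pop = np.array([1200, 15000, 8000, 400, 11000])
rates = deaths / pop  # observed (raw) mortality rates

def population_weighted_smoother(i, neighbours):
    """Smooth unit i's rate with its neighbours, weighting by population size.
    A sparsely populated unit gets a small kernel weight, so its unstable
    raw rate is pulled toward the rates of larger neighbouring units."""
    idx = [i] + list(neighbours)
    w = pop[idx] / pop[idx].sum()
    return float(np.sum(w * rates[idx]))

# Unit 3 has only 400 people: its raw rate (1/400) is highly unstable.
smoothed = population_weighted_smoother(3, [1, 2])
```

Poisson kriging improves on this by deriving the weights from a semivariogram of the risk, so spatial correlation (not just population size) shapes the smoothing.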
Muthusamy, Manoranjan; Schellart, Alma; Tait, Simon; Heuvelink, Gerard B. M.
2017-02-01
In this study we develop a method to estimate the spatially averaged rainfall intensity, together with the associated level of uncertainty, using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 m × 200 m urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of catchment-average rainfall intensity is obtained for multiple combinations of intensity ranges and temporal averaging intervals. The two main challenges addressed in this study are the scarcity of rainfall measurement locations and the non-normality of rainfall data, both of which need to be considered when adopting a geostatistical approach. Scarcity of measurement points is dealt with by pooling sample variograms of repeated rainfall measurements with similar characteristics. Normality of rainfall data is achieved through a normal score transformation. Geostatistical models in the form of variograms are derived for the transformed rainfall intensity. Next, spatial stochastic simulation, which is robust to nonlinear data transformation, is applied to produce realisations of rainfall fields. These realisations in transformed space are first back-transformed and then spatially aggregated to derive a random sample of the spatially averaged rainfall intensity. Results show that the prediction uncertainty comes mainly from two sources: spatial variability of rainfall and measurement error. At smaller temporal averaging intervals both effects are strong, resulting in relatively high prediction uncertainty. With longer temporal averaging intervals the uncertainty becomes lower due to stronger spatial correlation of rainfall data and relatively smaller measurement error. Results also show that the measurement error increases with decreasing rainfall intensity, resulting in a higher
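The normal score transformation mentioned above maps the ranks of a skewed variable to standard normal quantiles, and back-transforms simulated normal values through the empirical quantile function. A minimal sketch on synthetic lognormal-like intensities (the actual gauge data are not reproduced):

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)
# Hypothetical skewed rain-gauge intensities (mm/h).
intensity = rng.lognormal(mean=0.5, sigma=1.0, size=200)

# Normal score transform: ranks -> plotting positions in (0, 1) -> N(0, 1) quantiles.
ranks = rankdata(intensity)                 # 1..n (continuous data, so no ties)
p = (ranks - 0.5) / len(intensity)
z = norm.ppf(p)                             # transformed data, approximately N(0, 1)

def back_transform(zval):
    """Map a simulated normal value back to the original intensity scale
    via the empirical quantile function of the observed data."""
    q = norm.cdf(zval)
    return float(np.quantile(intensity, q))

median_est = back_transform(0.0)  # z = 0 maps back to the sample median
```

Simulation is carried out entirely in the transformed (Gaussian) space, and only the final realisations are back-transformed, which is why the approach tolerates the nonlinearity of the transform.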
Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems
Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros
2015-04-01
In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling, and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and can therefore become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations that efficiently span the range of possible conductivity values of the lognormal random field. In this work we investigate the efficiency of alternatives to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploiting the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
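The SR-versus-LH contrast above can be shown in one dimension: LH sampling draws exactly one value from each of N equal-probability strata, so the sample covers the conductivity distribution evenly with the same budget. The geometric mean and variance below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
N = 50  # number of realizations (Monte Carlo budget)

# Simple random (SR) sampling of a lognormal conductivity
# (assumed geometric mean 1e-5 m/s, log-standard deviation 1).
sr = np.exp(np.log(1e-5) + 1.0 * rng.standard_normal(N))

# Latin hypercube (LH) sampling: one uniform draw per equal-probability stratum
# [k/N, (k+1)/N), pushed through the inverse normal CDF, then exponentiated.
u = (np.arange(N) + rng.uniform(0.0, 1.0, N)) / N
rng.shuffle(u)  # random pairing order, as in multi-dimensional LH designs
lh = np.exp(np.log(1e-5) + 1.0 * norm.ppf(u))
```

By construction the LH sample contains a value from every decile, percentile band, etc. of the target distribution, whereas an SR sample of the same size can easily miss the tails that dominate transport behaviour.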
Diaz-Lacava, A. N.; Walier, M; D. Holler; Steffens, M; Gieger, C; C. Furlanello; Lamina, C; Wichmann, H E; Becker, T
2015-01-01
Aiming to investigate fine-scale patterns of genetic heterogeneity in modern humans from a geographic perspective, a genetic geostatistical approach framed within a geographic information system is presented. A sample collected for prospective studies in a small area of southern Germany was analyzed. No indication of genetic heterogeneity had been detected in previous analyses. Socio-demographic and genotypic data of German citizens were analyzed (212 SNPs; n = 728). Genetic heterogeneity was ev...
Wahid, Ali; Salim, Ahmed Mohamed Ahmed; Gaafar, Gamal Ragab; Yusoff, Wan Ismail Wan
2016-02-01
Geostatistics, or the statistical approach, is based on the study of temporal and spatial trends, which depend upon spatial relationships, to model known information of variable(s) at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging, also known as an optimal interpolation technique, generates the best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, a tectonostratigraphic chart with a short-term sea level curve, and the regional tectonics of the study area to establish the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built that help estimate parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify the spatial relationships between data, which helps establish the depositional history of the North West (NW) Bonaparte Basin.
Spatial analysis of lettuce downy mildew using geostatistics and geographic information systems.
Wu, B M; van Bruggen, A H; Subbarao, K V; Pennings, G G
2001-02-01
The epidemiology of lettuce downy mildew has been investigated extensively in coastal California. However, the spatial patterns of the disease and the distance that Bremia lactucae spores can be transported have not been determined. During 1995 to 1998, we conducted several field- and valley-scale surveys to determine spatial patterns of this disease in the Salinas Valley. Geostatistical analyses of the survey data at both scales showed that the influence range of downy mildew incidence at one location on incidence at other locations was between 80 and 3,000 m. A linear relationship was detected between semivariance and lag distance at the field scale, although no single statistical model could fit the semivariograms at the valley scale. Spatial interpolation by the inverse distance weighting method with a power of 2 resulted in plausible estimates of incidence throughout the valley. Cluster analysis in geographic information systems on the interpolated disease incidence from different dates demonstrated that the Salinas Valley could be divided into two areas, north and south of Salinas City, with high and low disease pressure, respectively. Seasonal and spatial trends along the valley suggested that the distinction between the downy mildew conducive and nonconducive areas might be determined by environmental factors.
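Inverse distance weighting with a power of 2, as used above for the valley-scale incidence maps, is a weighted average with weights proportional to 1/d². The survey locations and incidence values below are hypothetical placeholders, not the Salinas Valley data:

```python
import numpy as np

# Hypothetical disease-incidence observations (%) at 5 survey fields (km coordinates).
obs_xy = np.array([[0.0, 0.0], [2.0, 1.0], [5.0, 4.0], [7.0, 2.0], [9.0, 8.0]])
incidence = np.array([60.0, 45.0, 20.0, 15.0, 5.0])

def idw(x0, power=2.0, eps=1e-12):
    """Inverse distance weighted estimate at x0 with weights 1/d**power."""
    d = np.linalg.norm(obs_xy - x0, axis=1)
    if d.min() < eps:                 # exactly at an observation: return it
        return float(incidence[d.argmin()])
    w = 1.0 / d ** power
    return float(np.sum(w * incidence) / np.sum(w))

est = idw(np.array([4.0, 3.0]))  # estimate between the surveyed fields
```

Because the weights are non-negative and sum to one, every IDW estimate lies within the range of the observed values, which is why the interpolated maps stay plausible even where no model fits the valley-scale semivariograms.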
Institute of Scientific and Technical Information of China (English)
Mamattursun Eziz; Mihrigul Anwar; XinGuo Li
2016-01-01
Characterizing spatial and temporal variability of soil salinity is tremendously important for a variety of agronomic and environmental concerns in arid irrigation areas. This paper reviews the characteristics and spatial and temporal variations of soil salinization in the Ili River Irrigation Area by applying a geostatistical approach. Results showed that: (1) the soil salinity varied widely, with maximum value of 28.10 g/kg and minimum value of 0.10 g/kg, and was distributed mainly at the surface soil layer. Anions were mainly SO42− and Cl−, while cations were mainly Na+ and Ca2+; (2) the abundance of salinity of the root zone soil layer for different land use types was in the following order: grassland > cropland > forestland. The abundance of salinity of root zone soil layers for different periods was in the following order: March > June > September; (3) the spherical model was the most suitable variogram model to describe the salinity of the 0–3 cm and 3–20 cm soil layers in March and June, and the 3–20 cm soil layer in September, while the exponential model was the most suitable variogram model to describe the salinity of the 0–3 cm soil layer in September. Relatively strong spatial and temporal structure existed for soil salinity due to lower nugget effects; and (4) the maps of kriged soil salinity showed that higher soil salinity was distributed in the central parts of the study area and lower soil salinity was distributed in the marginal parts. Soil salinity tended to increase from the marginal parts to the central parts across the study area. Applying the kriging method is very helpful in detecting the problematic areas and is a good tool for soil resources management. Managing efforts on the appropriate use of soil and water resources in such areas is very important for sustainable agriculture, and more attention should be paid to these areas to prevent future problems.
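The spherical and exponential variogram models compared above have standard closed forms: the spherical model reaches its sill exactly at the range, while the exponential model approaches the sill asymptotically. A small sketch (the nugget, sill and range values passed in are illustrative, not the paper's fitted parameters):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical variogram: rises to the sill at range a, then stays flat."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, g)

def exponential(h, nugget, sill, a):
    """Exponential variogram: approaches the sill asymptotically
    (practical range is roughly 3a)."""
    h = np.asarray(h, dtype=float)
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / a))

# Illustrative evaluation over a set of lags (parameters are assumptions).
lags = np.linspace(0.0, 10.0, 6)
gamma_sph = spherical(lags, 0.05, 0.8, 6.0)
gamma_exp = exponential(lags, 0.05, 0.8, 6.0)
```

The nugget-to-sill ratio is the usual diagnostic of spatial structure: the "lower nugget effects" reported above correspond to a small ratio, i.e. strong spatial dependence.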
Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong
2016-01-01
Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted, global polynomial, local polynomial, tension spline, ordinary kriging, simple kriging and universal kriging interpolation) were used for interpolating the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R2) were applied to evaluate the accuracy of the different methods. The results show that the simple kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects from 2001 to 2013 were increasing, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the top and bottom areas. Owing to changes in land use, the groundwater level also shows temporal variation: the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth have caused over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the shrinking of farmland and the development of water-saving irrigation have reduced the quantity of water used by agriculture, so the decline rate of the groundwater level in agricultural areas is not significant.
Karimzadeh, R; Hejazi, M J; Helali, H; Iranipour, S; Mohammadi, S A
2011-10-01
Eurygaster integriceps Puton (Hemiptera: Scutelleridae) is the most serious insect pest of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.) in Iran. In this study, the spatio-temporal distribution of this pest in wheat was determined by using spatial analysis by distance indices (SADIE) and geostatistics. Global positioning and geographic information systems were used for spatial sampling and for mapping the distribution of this insect. The study was conducted over three growing seasons in Gharamalek, an agricultural region to the west of Tabriz, Iran. Weekly sampling began when E. integriceps adults migrated to wheat fields from overwintering sites and ended when the new-generation adults appeared at the end of the season. The adults were sampled using 1- by 1-m quadrat and distance-walk methods. A sweep net was used for sampling the nymphs, with five 180° sweeps as the sampling unit. The results of spatial analyses using geostatistics and SADIE indicated that E. integriceps adults were clumped after migration to fields and had significant spatial dependency. The second- and third-instar nymphs showed an aggregated spatial structure in the middle of the growing season. At the end of the season, the population distribution changed toward random or regular patterns, and the fourth and fifth instars had weaker spatial structure compared with the younger nymphs. In Iran, management measures for E. integriceps in wheat fields are mainly applied against overwintering adults, as well as second and third instars. Because of the aggregated distribution of these life stages, site-specific spraying of chemicals is feasible in managing E. integriceps.
Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.
Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale
2016-08-01
Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty.
Geostatistics and GIS: tools for characterizing environmental contamination.
Henshaw, Shannon L; Curriero, Frank C; Shields, Timothy M; Glass, Gregory E; Strickland, Paul T; Breysse, Patrick N
2004-08-01
Geostatistics is a set of statistical techniques used in the analysis of georeferenced data that can be applied to environmental contamination and remediation studies. In this study, the 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) contamination at a Superfund site in western Maryland is evaluated. Concern about the site and its future clean-up has triggered interest within the community because residential development surrounds the area. Spatial statistical methods, of which geostatistics is a subset, are becoming increasingly popular, in part due to the availability of geographic information system (GIS) software in a variety of application packages. In this article, the joint use of ArcGIS software and the R statistical computing environment is demonstrated as an approach for comprehensive geostatistical analyses. The spatial regression method kriging is used to provide predictions of DDE levels at unsampled locations both within the site and in the surrounding areas where residential development is ongoing.
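The kriging prediction mentioned above solves a small linear system: data-to-data covariances on the left, data-to-target covariances on the right, plus a Lagrange multiplier enforcing unbiasedness. A self-contained ordinary kriging sketch in Python (the sample sites, DDE values and covariance model are assumptions, not the Maryland data, and the study itself used R with ArcGIS):

```python
import numpy as np

# Hypothetical DDE concentrations (ppm) at 4 sampled sites (coordinates in metres).
sites = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
dde = np.array([5.0, 3.0, 4.0, 2.0])
x0 = np.array([50.0, 50.0])  # unsampled prediction location

def cov(h, sill=1.0, range_=150.0):
    # Exponential covariance model (assumed, not fitted to real data).
    return sill * np.exp(-h / range_)

D = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=2)
d0 = np.linalg.norm(sites - x0, axis=1)

# Ordinary kriging system:  [C  1] [w ]   [c0]
#                           [1' 0] [mu] = [1 ]
n = len(dde)
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(D)
A[n, n] = 0.0
b = np.append(cov(d0), 1.0)
sol = np.linalg.solve(A, b)
w = sol[:n]                 # kriging weights (sum to 1 by construction)
pred = float(w @ dde)       # predicted DDE at the unsampled location
```

With the target at the centre of a symmetric square, each site gets weight 1/4 and the prediction equals the sample mean; with irregular data the weights reflect both distance to the target and redundancy among clustered samples.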
GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...
Energy Technology Data Exchange (ETDEWEB)
Lee, K.H.
1997-09-01
Numerical and geostatistical analyses show that the artificial smoothing effect of kriging removes high-permeability flow paths from hydrogeologic data sets, reducing simulated contaminant transport rates in heterogeneous vadose zone systems. Therefore, kriging alone is not recommended for estimating the spatial distribution of soil hydraulic properties for contaminant transport analysis at vadose zone sites. Vadose zone transport is modeled more effectively by combining kriging with stochastic simulation to better represent the high degree of spatial variability usually found in the hydraulic properties of field soils. However, kriging is a viable technique for estimating the initial mass distribution of contaminants in the subsurface.
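The smoothing effect described above can be made concrete on a one-dimensional transect: a simple kriging estimate honors the sampled values exactly but decays toward the mean between them, damping the extremes that an unconditional stochastic simulation preserves. The covariance model, grid and sampled indices below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 60)
# Exponential covariance (sill 1, range parameter 2) for log-conductivity.
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

# Unconditional realization via Cholesky factorization: stochastic simulation
# of this kind retains the full spatial variability of the field.
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))
field = L @ rng.standard_normal(len(x))

# Simple kriging (zero mean) from 6 "sampled" points: exact at the data,
# but smoothed toward the mean in between, removing extreme flow paths.
obs_idx = np.array([0, 12, 24, 36, 48, 59])
C_oo = C[np.ix_(obs_idx, obs_idx)]
kriged = C[:, obs_idx] @ np.linalg.solve(C_oo, field[obs_idx])
```

Conditional simulation combines the two: it reproduces the data like kriging while restoring realistic between-sample variability, which is why the abstract recommends it for transport analysis.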
Directory of Open Access Journals (Sweden)
A. N. Diaz-Lacava
2015-01-01
Aiming to investigate fine-scale patterns of genetic heterogeneity in modern humans from a geographic perspective, a genetic geostatistical approach framed within a geographic information system is presented. A sample collected for prospective studies in a small area of southern Germany was analyzed. No indication of genetic heterogeneity had been detected in previous analyses. Socio-demographic and genotypic data of German citizens were analyzed (212 SNPs; n = 728). Genetic heterogeneity was evaluated with observed heterozygosity (HO). Best-fitting spatial autoregressive models were identified, using socio-demographic variables as covariates. Spatial analysis included surface interpolation and geostatistics of observed and predicted patterns. Prediction accuracy was quantified. Spatial autocorrelation was detected for both socio-demographic and genetic variables. Augsburg City and the eastern suburban areas showed higher HO values. The selected model gave the best predictions in suburban areas. Fine-scale patterns of genetic heterogeneity were observed. In accordance with the literature, more urbanized areas showed higher levels of admixture. This approach showed efficacy for detecting and analyzing subtle patterns of genetic heterogeneity within small areas. It is scalable in the number of loci, even up to whole-genome analysis. It may be suggested that this approach is applicable to investigating the underlying genetic history that is, at least partially, embedded in geographic data.
Model Selection for Geostatistical Models
Energy Technology Data Exchange (ETDEWEB)
Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.
2006-02-01
We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
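The AIC bookkeeping described above can be sketched numerically. The log-likelihood values and parameter counts below are hypothetical stand-ins, not from the paper; the point is only the mechanics: each candidate model is penalized by twice its parameter count, and the lower AIC wins.

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: lower is better."""
    return -2.0 * log_likelihood + 2.0 * n_params

# Hypothetical fits: an independent-error regression vs. a geostatistical
# model with an exponential covariance (one extra range parameter).
ll_independent = -152.3   # log-likelihood ignoring spatial correlation
ll_spatial = -147.9       # log-likelihood with spatial correlation
aic_ind = aic(ll_independent, 3)   # intercept, slope, variance
aic_spa = aic(ll_spatial, 4)       # ...plus a range parameter

best = "spatial" if aic_spa < aic_ind else "independent"
print(aic_ind, aic_spa, best)
```

Here the better fit of the spatial model outweighs its extra parameter, so AIC selects it, which is the pattern the simulations in the abstract report.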
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
Diggle, Peter J
2007-01-01
Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics, with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.
Hevesi, Joseph A.; Istok, Jonathan D.; Flint, Alan L.
1992-01-01
Values of average annual precipitation (AAP) are desired for hydrologic studies within a watershed containing Yucca Mountain, Nevada, a potential site for a high-level nuclear-waste repository. Reliable values of AAP are not yet available for most areas within this watershed because of a sparsity of precipitation measurements and the need to obtain measurements over a sufficient length of time. To estimate AAP over the entire watershed, historical precipitation data and station elevations were obtained from a network of 62 stations in southern Nevada and southeastern California. Multivariate geostatistics (cokriging) was selected as an estimation method because of a significant (p = 0.05) correlation of r = .75 between the natural log of AAP and station elevation. A sample direct variogram for the transformed variable, TAAP = ln [(AAP) 1000], was fitted with an isotropic, spherical model defined by a small nugget value of 5000, a range of 190 000 ft, and a sill value equal to the sample variance of 163 151. Elevations for 1531 additional locations were obtained from topographic maps to improve the accuracy of cokriged estimates. A sample direct variogram for elevation was fitted with an isotropic model consisting of a nugget value of 5500 and three nested transition structures: a Gaussian structure with a range of 61 000 ft, a spherical structure with a range of 70 000 ft, and a quasi-stationary, linear structure. The use of an isotropic, stationary model for elevation was considered valid within a sliding-neighborhood radius of 120 000 ft. The problem of fitting a positive-definite, nonlinear model of coregionalization to an inconsistent sample cross variogram for TAAP and elevation was solved by a modified use of the Cauchy-Schwarz inequality. A selected cross-variogram model consisted of two nested structures: a Gaussian structure with a range of 61 000 ft and a spherical structure with a range of 190 000 ft. Cross validation was used for model selection and for
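The fitted direct variogram for TAAP can be written out explicitly. The sketch below evaluates the standard spherical model (the functional form is assumed; the parameters are those quoted above: nugget 5000, range 190,000 ft, total sill 163,151).

```python
def spherical(h, nugget, sill, rng):
    """Spherical variogram; `sill` is the total sill (nugget + partial sill)."""
    if h == 0:
        return 0.0
    if h >= rng:
        return sill
    hr = h / rng
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr ** 3)

# Parameters fitted to TAAP = ln(AAP * 1000) in the abstract.
for h in (0.0, 95_000.0, 190_000.0, 250_000.0):
    print(h, round(spherical(h, 5000.0, 163_151.0, 190_000.0), 1))
```

Beyond the 190,000 ft range the model flattens at the sill, i.e., station pairs farther apart than the range are treated as spatially uncorrelated.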
Directory of Open Access Journals (Sweden)
Ibrahim Hassan
2016-06-01
Various geostatistical and deterministic techniques were used to analyse the spatial variations of groundwater depths. Two geostatistical methods, ordinary kriging and co-kriging, with four semivariogram models (spherical, exponential, circular and Gaussian), and four deterministic methods, inverse distance weighting (IDW), global polynomial interpolation (GPI), local polynomial interpolation (LPI) and radial basis functions (RBF), were used for the estimation of groundwater depths. The study area comprises three northwestern districts of Bangladesh. Groundwater depth data recorded from 132 observation wells in the study area over a period of 6 years (2004 to 2009) were considered for the analysis. The spatial interpolation of groundwater depths was then performed using the best-fit model, selected by comparing the RMSE values of the geostatistical and deterministic models and the empirical semivariogram models. Of the four semivariogram models, the spherical semivariogram with the co-kriging model was the best-fitted model for the study area. A sensitivity analysis conducted on the input parameters shows that the inputs have a strong influence on groundwater levels, and the statistical indicators RMSE and ME suggest that co-kriging works best with percolation in predicting the average groundwater table of the study area.
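A minimal sketch of the deterministic side of such a comparison: inverse distance weighting plus leave-one-out cross-validation to obtain the RMSE used for method selection. The well coordinates and depths below are invented for illustration.

```python
import math

def idw(x0, y0, pts, power=2.0):
    """Inverse-distance-weighted estimate at (x0, y0) from (x, y, value) points."""
    num = den = 0.0
    for x, y, v in pts:
        d = math.hypot(x - x0, y - y0)
        if d == 0:
            return v                      # exact interpolator at a data location
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Toy groundwater-depth observations (hypothetical, not the paper's data).
wells = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 11.0), (1, 1, 14.0), (2, 2, 18.0)]

# Leave-one-out cross-validation RMSE, the comparison metric in the abstract.
sq = 0.0
for i, (x, y, v) in enumerate(wells):
    rest = wells[:i] + wells[i + 1:]
    sq += (idw(x, y, rest) - v) ** 2
rmse = math.sqrt(sq / len(wells))
print(round(rmse, 3))
```

Running the same loop with each candidate interpolator (kriging, RBF, polynomial trends) and ranking by RMSE is the selection procedure the abstract describes.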
Xia, Peng-Liang; Wang, Rui; Tan, Jun
2014-03-01
Tobacco budworm (Helicoverpa assulta) larvae feed on tobacco leaves (Nicotiana sp.), resulting in significant losses in tobacco production. A geostatistical method was used in this paper to analyze H. assulta spatial patterns and dynamics. The results showed that H. assulta larvae appeared 40 days after the tobacco plants were transplanted and reached their peak at the early-mature period. The nested spherical and exponential model was the major model for tobacco budworm larvae in the field, suggesting an aggregated distribution. The spatial variability ratio C/(C0 + C) was larger than 0.75, which indicated that H. assulta larvae had wide structural variation and narrow random variation. There was a massive migration of tobacco budworm larvae in the fast-growing stage of tobacco. Their numbers became stable after that, especially at the mature stage of tobacco.
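The ratio C/(C0 + C) used above is simply the spatially structured share of the total sill; a short check with made-up nugget and partial-sill values:

```python
def structural_ratio(nugget, partial_sill):
    """C / (C0 + C): the spatially structured share of total variance."""
    return partial_sill / (nugget + partial_sill)

# Hypothetical fitted nugget (C0) and partial sill (C); a ratio above 0.75
# indicates strong spatial structure, as reported for H. assulta larvae.
c0, c = 0.2, 0.8
r = structural_ratio(c0, c)
print(round(r, 3))
```

A ratio near 1 means most variation is spatially structured (aggregation), while a ratio near 0 means the variogram is dominated by the nugget, i.e., unstructured noise.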
Zhang, Rong; Leng, Yun-fa; Zhu, Meng-meng; Wang, Fang
2007-11-01
Based on a geographic information system and geostatistics, the spatial structure of the Therioaphis trifolii population in different periods in Yuanzhou district of Guyuan City, southern Ningxia Province, was analyzed. The spatial distribution of the Therioaphis trifolii population was also simulated by ordinary kriging interpolation. The results showed that the Therioaphis trifolii population in different periods was spatially correlated in the study area. The semivariograms of Therioaphis trifolii could be described by an exponential model, indicating an aggregated spatial arrangement. The spatial variance varied from 34.13% to 48.77%, and the range varied from 8.751 to 12.049 km. The degree and direction of aggregation showed a trend that increased gradually from southwest to northeast. The dynamic change of the Therioaphis trifolii population in different periods could be analyzed intuitively on the simulated maps of the spatial distribution from the two aspects of time and space; the position and degree of Therioaphis trifolii occurrence at a given time could be determined easily.
Schröder, Winfried
2006-05-01
Using the example of environmental monitoring, some applications of geographic information systems (GIS), geostatistics, metadata banking, and Classification and Regression Trees (CART) are presented. These tools are recommended for mapping statistically estimated hot spots of vectors and pathogens. GIS were introduced as tools for spatially modelling the real world. The modelling can be done by mapping objects according to the spatial information content of data. Additionally, this can be supported by geostatistical and multivariate statistical modelling. This is demonstrated by the examples of modelling marine habitats of benthic communities and terrestrial ecoregions. Such ecoregionalisations may be used to predict phenomena based on the statistical relation between measurements of a phenomenon of interest, e.g. the incidence of medically relevant species, and correlated characteristics of the ecoregions. The combination of meteorological data and data on plant phenology can enhance the spatial resolution of the information on climate change. To this end, meteorological and phenological data have to be correlated. To enable this, the two data sets, which come from disparate monitoring networks, have to be spatially connected by means of geostatistical estimation. This is demonstrated by the example of the transformation of site-specific data on plant phenology into surface data. The analysis allows for spatial comparison of the phenology during the two periods 1961-1990 and 1991-2002, covering the whole of Germany. The changes in both plant phenology and air temperature were proved to be statistically significant. Thus, they can be combined by the GIS overlay technique to enhance the spatial resolution of the information on climate change and to use it for the prediction of vector incidences at the regional scale. The localisation of such risk hot spots can be done by geometrically merging surface data on promoting factors. This is demonstrated by the example of the
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Directory of Open Access Journals (Sweden)
Goovaerts Pierre
2006-02-01
Abstract Background: Smoothing methods have been developed to improve the reliability of cancer risk estimates from sparsely populated geographical entities. Filtering out local details of the spatial variation of the risk, however, leads to the detection of larger clusters of low or high cancer risk while most spatial outliers are filtered out. Static maps of risk estimates and the associated prediction variance also fail to depict the uncertainty attached to the spatial distribution of risk values and do not allow its propagation through local cluster analysis. This paper presents a geostatistical methodology to generate multiple realizations of the spatial distribution of risk values. These maps are then fed into spatial operators, such as local cluster analysis, allowing one to assess how spatial uncertainty in the risk translates into uncertainty about the location of spatial clusters and outliers. This novel approach is applied to age-adjusted breast and pancreatic cancer mortality rates recorded for white females in 295 US counties of the Northeast (1970–1994). A public-domain executable with example datasets is provided. Results: Geostatistical simulation generates risk maps that are more variable than the smooth risk map estimated by Poisson kriging and better reproduce the spatial pattern captured by the risk semivariogram model. Local cluster analysis of the set of simulated risk maps leads to a clear visualization of the lower reliability of the classification obtained for pancreatic cancer versus breast cancer: only a few counties in the large cluster of low risk detected in West Virginia and Southern Pennsylvania are significant in over 90% of all simulations. On the other hand, the cluster of high breast cancer mortality in Niagara county, detected after application of Poisson kriging, appears on 60% of simulated risk maps. Sensitivity analysis shows that 500 realizations are needed to achieve a stable classification for pancreatic cancer
Energy Technology Data Exchange (ETDEWEB)
Mundim, Evaldo Cesario; Johann, Paulo R. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Remacre, Armando Zaupa [Universidade Estadual de Campinas, SP (Brazil)
1999-07-01
In this work, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way similar to spectral analysis in the frequency domain. The incorporation of filtered attributes as secondary variables in the kriging system is discussed. Results show that Factorial Kriging is an efficient technique for the filtering of seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and hence the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced.
Lamichhane, Jay Ram; Fabi, Alfredo; Ridolfi, Roberto; Varvaro, Leonardo
2013-01-01
Incidence of Xanthomonas arboricola pv. corylina, the causal agent of hazelnut bacterial blight, was analyzed spatially in relation to pedoclimatic factors. Hazelnut grown in twelve municipalities situated in the province of Viterbo, central Italy, was studied. A consistent number of bacterial isolates were obtained from infected hazelnut tissues collected over three years (2010-2012). The isolates, characterized by phenotypic tests, did not show any differences among them. Spatial patterns of pedoclimatic data, analyzed by geostatistics, showed a strong positive correlation of disease incidence with higher values of rainfall, thermal shock and soil nitrogen; a weak positive correlation with soil aluminium content; and a strong negative correlation with the values of the Mg/K ratio. No correlation of disease incidence with soil pH was found. Disease incidence ranged from very low (<1%) to very high (almost 75%) across the orchards. Young plants (4 years old) were the most affected by the disease, confirming a weak negative correlation of disease incidence with plant age. Plant cultivars did not show any difference in susceptibility to the pathogen. The possible role of climate change in the epidemiology of the disease is discussed. Improved management practices are recommended for effective control of the disease.
Assessing TCE source bioremediation by geostatistical analysis of a flux fence.
Cai, Zuansi; Wilson, Ryan D; Lerner, David N
2012-01-01
Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial variability in concentration and the flow field are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data on concentration and hydraulic conductivity, thereby accounting for the spatial variability of both datasets. The magnitude and uncertainty of the mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment in a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%), and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE between the source zone and the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones.
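In its simplest uniform-panel form, the flux-fence calculation behind the mass-discharge metric reduces to summing concentration × Darcy flux × panel area over the transect; the geostatistical version replaces these point values with conditionally simulated fields. All panel values below are illustrative, not the paper's data.

```python
# Panels of a transect (flux fence): concentration [g/m^3], hydraulic
# conductivity K [m/d], and panel area A [m^2]; gradient i is dimensionless.
panels = [
    # (C g/m^3, K m/d, area m^2)
    (12.0, 5.0, 2.0),
    (55.0, 8.0, 2.0),
    (3.0, 2.0, 2.0),
]
gradient = 0.005

# Mass discharge = sum over panels of C * (K * i) * A, in g/d.
md = sum(c * (k * gradient) * a for c, k, a in panels)
print(round(md, 3))
```

Repeating this sum over many simulated realizations of the concentration and K fields, rather than over a single measured set, is what yields the uncertainty distribution of the discharge estimate.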
A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Liang, Faming
2013-03-01
The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
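The subsample-and-update idea can be illustrated on a toy problem: estimating a single mean parameter by Robbins–Monro stochastic approximation, touching only 50 of 10,000 points per iteration. This mimics the resampling scheme only; the paper's actual target is the covariance parameters of a Gaussian geostatistical model.

```python
import random

random.seed(42)

# Toy "large" dataset with population mean exactly 1.0; we recover the mean
# by stochastic approximation on small resamples.
data = [2.0 * i / 9999 for i in range(10000)]

theta = 0.0
for t in range(1, 2001):
    batch = random.sample(data, 50)                         # small resample
    grad = sum(theta - x for x in batch) / len(batch)       # gradient of 0.5*(theta - x)^2
    theta -= (1.0 / t) * grad                               # decaying gain a_t = 1/t
print(round(theta, 2))
```

With the gain sequence 1/t this update is equivalent to averaging the batch means, so the estimate settles close to 1.0 without ever visiting the full dataset in one step, which is the scalability argument of the abstract.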
Tobar, I.; Lee, J.; Black, F. W.; Babamaaji, R. A.
2014-12-01
The Komadugu-Yobe River Basin in northeastern Nigeria is an important tributary of Lake Chad and has experienced significant changes in population density and land cover in recent decades. The present study focuses on the application of geostatistical methods to examine the land cover and population density dynamics in the river basin. The geostatistical methods include spatial autocorrelation, overlapping neighborhood statistics with Pearson's correlation coefficient, Moran's I index analysis, and indicator variogram analysis with rose diagrams. The land cover and land use maps were constructed from USGS Landsat images and Globcover images from the European Space Agency. The target years of the analysis are 1970, 1986, 2000, 2005, and 2009. The calculation of net changes in land cover indicates significant variation in the changes of rainfed cropland, mosaic cropland, and grassland. Spatial autocorrelation analysis and Moran's I index analysis showed that the distribution of land cover is highly clustered. A new GIS geostatistical tool was designed to calculate the overlapping neighborhood statistics with Pearson's correlation coefficient between the land use/land cover and population density datasets. The 10×10 neighborhood cell unit showed a clear correlation between the variables in certain zones of the study area. The ranges calculated from the indicator variograms of land use/land cover and population density showed that cropland and sparse vegetation are most closely related to the spatial change of population density.
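The neighborhood statistic at the core of that overlapping-window tool is Pearson's r computed over the cells of a moving window; a self-contained version, with invented land-cover fractions and population densities standing in for one flattened 10×10 neighborhood:

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# One flattened moving-window neighborhood: land-cover fraction vs.
# population density (values invented for illustration).
lc = [0.1, 0.3, 0.5, 0.7, 0.9]
pop = [10.0, 25.0, 60.0, 80.0, 120.0]
r = pearson(lc, pop)
print(round(r, 3))
```

Sliding the window across the raster and mapping r per cell produces the local-correlation surface the study uses to locate zones where land cover and population density co-vary.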
Reducing complexity of inverse problems using geostatistical priors
DEFF Research Database (Denmark)
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
a posterior sample, can be reduced significantly using informed priors based on geostatistical models. We discuss two approaches to include such geostatistically based prior information. One is based on a parametric description of the prior likelihood that applies to 2-point based statistical models...
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Visser, A.; Singleton, M. J.; Moran, J. E.; Fram, M. S.; Kulongoski, J. T.; Esser, B. K.
2014-12-01
Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of the data set of tritium, dissolved noble gas and helium isotope analyses collected for the California State Water Resources Control Board's Groundwater Ambient Monitoring and Assessment (GAMA) and California Aquifer Susceptibility (CAS) programs. Over 4,000 tritium and noble gas analyses are available from wells across California. 25% of the analyzed samples contained less than 1 pCi/L, indicating that recharge occurred before 1950. The correlation length of tritium concentration is 120 km. Nearly 50% of the wells show a significant component of terrigenic helium. Over 50% of these samples show a terrigenic helium isotope ratio (Rter) that is significantly higher than the radiogenic helium isotope ratio (Rrad = 2×10⁻⁸). Rter values of more than three times the atmospheric isotope ratio (Ra = 1.384×10⁻⁶) are associated with known faults and volcanic provinces in Northern California. In the Central Valley, Rter varies from radiogenic to 2.25 Ra, complicating ³H/³He dating. Rter was mapped by kriging, showing a correlation length of less than 50 km. The locally predicted Rter was used to separate tritiogenic from atmospheric and terrigenic ³He. Regional groundwater recharge areas, indicated by young groundwater ages, are located in the southern Santa Clara Basin, the upper LA Basin, the eastern San Joaquin Valley, and along unlined canals carrying Colorado River water. Recharge in California is dominated by agricultural return flows, river recharge and managed aquifer recharge rather than by precipitation excess. The combined application of noble gases and other groundwater tracers reveals the impact of engineered groundwater recharge and proves invaluable for the study of complex groundwater systems. This work was performed under the
Directory of Open Access Journals (Sweden)
Adéla Volfová
2012-10-01
Geostatistics is a scientific field which provides methods for processing spatial data. In our project, geostatistics is used as a tool for describing spatial continuity and making predictions of some natural phenomena. An open-source statistical project called R is used for all calculations. Listeners will be provided with a brief introduction to R and its geostatistical packages, as well as the basic principles of the kriging and cokriging methods. Heavy mathematical background is omitted due to its complexity. In the second part of the presentation, several examples show how to make a prediction over the whole area of interest from observations made at just a few points. The results of these methods are compared.
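The kriging principle introduced in the talk condenses to a few lines: build the ordinary-kriging system from a covariance model, solve it, and combine the weights. The exponential covariance, its parameters, and the three data points here are all assumptions for illustration (the presentation itself uses R's geostatistical packages rather than hand-rolled code).

```python
import math

def solve(A, b):
    """Naive Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def cov(h, sill=1.0, rng=10.0):
    """Exponential covariance model (assumed, for illustration)."""
    return sill * math.exp(-3.0 * h / rng)

pts = [(0.0, 0.0, 5.0), (4.0, 0.0, 7.0), (0.0, 3.0, 6.0)]  # (x, y, value)
x0, y0 = 1.0, 1.0
n = len(pts)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Ordinary-kriging system: [C 1; 1^T 0] [w; mu] = [c0; 1]
A = [[cov(dist(pts[i], pts[j])) for j in range(n)] + [1.0] for i in range(n)]
A.append([1.0] * n + [0.0])
b = [cov(dist(p, (x0, y0))) for p in pts] + [1.0]

sol = solve(A, b)
w = sol[:n]                                   # kriging weights
est = sum(wi * p[2] for wi, p in zip(w, pts)) # prediction at (x0, y0)
print(round(sum(w), 6), round(est, 3))
```

The unbiasedness row forces the weights to sum to one, and the estimate is their weighted combination of the data values; cokriging extends the same system with cross-covariances to a secondary variable.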
A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...
Satellite Magnetic Residuals Investigated With Geostatistical Methods
DEFF Research Database (Denmark)
Fox Maule, Chaterine; Mosegaard, Klaus; Olsen, Nils
2005-01-01
(which consists of measurement errors and unmodeled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyze the residuals of the Oersted (09d/04) field model (www.dsri.dk/Oersted/Field models/IGRF 2005 candidates/), which is based...
Hack, Daniel R.
2005-01-01
Sand-and-gravel (aggregate) resources are a critical component of the Nation's infrastructure, yet aggregate-mining technologies lag far behind those of metalliferous mining and other sectors. Deposit-evaluation and site-characterization methodologies are antiquated, and few serious studies of the potential applications of spatial-data analysis and geostatistics have been published. However, because of commodity usage and the necessary proximity of a mine to end use, aggregate-resource exploration and evaluation differ fundamentally from comparable activities for metalliferous ores. Acceptable practices, therefore, can reflect this cruder scale. The increasing use of computer technologies is colliding with the need for sand-and-gravel mines to modernize and improve their overall efficiency of exploration, mine planning, scheduling, automation, and other operations. The emergence of megaquarries in the 21st century will also be a contributing factor. Preliminary research into the practical applications of exploratory-data analysis (EDA) has been promising. For example, EDA was used to develop a linear-regression equation to forecast freeze-thaw durability from absorption values for Lower Paleozoic carbonate rocks mined for crushed aggregate from quarries in Oklahoma. Applications of EDA within a spatial context, a method of spatial-data analysis, have also been promising, as with the investigation of undeveloped sand-and-gravel resources in the sedimentary deposits of Pleistocene Lake Bonneville, Utah. Formal geostatistical investigations of sand-and-gravel deposits are quite rare, and the primary focus of those studies that have been completed is on the spatial characterization of deposit thickness and its subsequent effect on ore reserves. A thorough investigation of a gravel deposit in an active aggregate-mining area in central Essex, U.K., emphasized the problems inherent in the geostatistical characterization of particle-size-analysis data. Beyond such factors
Jaime-Garcia, R; Orum, T V; Felix-Gastelum, R; Trinidad-Correa, R; Vanetten, H D; Nelson, M R
2001-12-01
Genetic structure of Phytophthora infestans, the causal agent of potato and tomato late blight, was analyzed spatially in a mixed potato and tomato production area in the Del Fuerte Valley, Sinaloa, Mexico. Isolates of P. infestans were characterized by mating type, allozyme analysis at the glucose-6-phosphate isomerase and peptidase loci, restriction fragment length polymorphism with probe RG57, metalaxyl sensitivity, and aggressiveness to tomato and potato. Spatial patterns of P. infestans genotypes were analyzed by geographic information systems and geostatistics during the 1994-95, 1995-96, and 1996-97 seasons. Spatial analysis of the genetic structure of P. infestans indicates that geographic substructuring of this pathogen occurs in this area. Maps displaying the probabilities of occurrence of mating types and genotypes of P. infestans, and of disease severity at a regional scale, are presented. Some genotypes that exhibited differences in epidemiologically important features, such as metalaxyl sensitivity and aggressiveness to tomato and potato, had a restricted spread and were localized in isolated areas. Analysis of late blight severity showed recurring patterns, such as the earliest onset of the disease in the area where both potato and tomato were grown, strengthening the hypothesis that infected potato tubers are the main source of primary inoculum. The information that geostatistical analysis provides might help improve management programs for late blight in the Del Fuerte Valley.
Analysis of field-scale spatial correlations and variations of soil nutrients using geostatistics.
Liu, Ruimin; Xu, Fei; Yu, Wenwen; Shi, Jianhan; Zhang, Peipei; Shen, Zhenyao
2016-02-01
Spatial correlations and soil nutrient variations are important for soil nutrient management. They help to reduce the negative impacts of agricultural nonpoint source pollution. Based on available nitrogen (AN), available phosphorus (AP), and available potassium (AK) soil nutrient data sampled in 2010, the spatial correlation was analyzed, and the probabilities of nutrient abundance or deficiency were discussed. This paper presents a statistical approach to spatial analysis, spatial correlation analysis (SCA), which was originally developed for describing heterogeneity in the presence of correlated variation and is based on ordinary kriging (OK) results. Indicator kriging (IK) was used to assess the risk of soil nutrient overabundance relative to crop needs. The kriged results showed a distinct spatial variability in the concentrations of all three soil nutrients. High concentrations of the three soil nutrients were found near Anzhou. As the distance from the center of town increased, the concentrations of the soil nutrients gradually decreased. Spatially, the relationship between AN and AP was negative, and the relationship between AP and AK was not clear. The IK results showed that there were few areas at risk of AN and AP overabundance. However, almost the entire study region was at risk of AK overabundance. Based on the soil nutrient distribution results, it is clear that the spatial variability of the soil nutrients differed throughout the study region. This spatial variability might be caused by different fertilizer types and fertilizing practices.
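The indicator approach behind the overabundance maps starts from a 0/1 transform at the crop-need threshold; interpolated indicators then read directly as exceedance probabilities. The sketch below substitutes a simple inverse-distance weighting for the kriging step, and the AK values and threshold are invented.

```python
import math

threshold = 120.0     # hypothetical AK cutoff, mg/kg
samples = [(0, 0, 150.0), (1, 0, 90.0), (0, 1, 130.0), (1, 1, 160.0)]

# Indicator transform: 1 where the nutrient exceeds the cutoff, else 0.
indicators = [(x, y, 1.0 if v > threshold else 0.0) for x, y, v in samples]

def idw(x0, y0, pts, power=2.0):
    """Inverse-distance weighting, used here as a stand-in for indicator kriging."""
    num = den = 0.0
    for x, y, v in pts:
        d = math.hypot(x - x0, y - y0) or 1e-12
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Interpolated indicator ~ estimated P(AK > threshold) at the target point.
p = idw(0.5, 0.5, indicators)
print(round(p, 2))
```

At the center of this toy grid all four samples are equidistant, so the estimate is simply the fraction of exceeding samples, 3 of 4; true indicator kriging would weight them by the indicator variogram instead.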
Visual Basic programs for one, two or three-dimensional geostatistical analysis
Carr, James R.; Mela, Kenneth
1998-07-01
Two previously published FORTRAN-77 programs, FGAM and JCBLOK, are rewritten in Visual Basic 5.0 for 32-bit Windows 95/NT and educational applications. Each program is applicable to spatial data in one, two or three dimensions. Graphics are added for displaying computed variograms and color density slices of kriging results within the same windows used to launch the programs. Dynamic array allocation is invoked automatically by the programs without user intervention, enabling efficient memory management independent of data set size. When analyzing one-dimensional strings of data (profiles), fractal dimensions are computed for four-lag increments of the variogram, enabling a scale-dependent analysis. Only the raw spatial data need to be in a separate file, because program options are set interactively using mouse click events in each program's window. These files may use a simplified Geo-EAS input format, or generic files with record lengths of up to 100 values per record.
Directory of Open Access Journals (Sweden)
Nima Babanouri
2013-12-01
Full Text Available Three-dimensional surface geometry of rock discontinuities and its evolution with shearing are of great importance in understanding the deformability and hydro-mechanical behavior of rock masses. In the present research, the surfaces of three natural rock fractures were digitized and studied before and after the direct shear test. The variography analysis of the surfaces indicated a strong non-linear trend in the data. Therefore, the spatial variability of the rock fracture surfaces was decomposed into one deterministic component characterized by a base polynomial function, and one stochastic component described by the variogram of the residuals. By using an image-processing technique, 343 damaged zones with different sizes, shapes, initial roughness characteristics, local stress fields, and asperity strength values were spatially located and clustered. In order to characterize the overall spatial structure of the degraded zones, the concept of a ‘pseudo-zonal variogram’ was introduced. The results showed that the spatial continuity at the damage locations increased due to asperity degradation. The increase in the variogram range was anisotropic and tended to be higher in the shear direction; thus, the direction of maximum continuity rotated towards the shear direction. Finally, the regression-kriging method was used to reconstruct the morphology of the intact surfaces and degraded areas. The cross-validation error of interpolation for the damaged zones was found to be smaller than that obtained for the intact surface.
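The trend-plus-residual decomposition described above is the first step of regression-kriging. A minimal numpy sketch of the linear-trend case, purely illustrative (the study fits higher-order base polynomials to the digitized fracture surfaces):

```python
import numpy as np

def detrend_surface(x, y, z):
    """Fit a deterministic base polynomial (here a plane, z = a + b*x + c*y)
    by least squares and return (coefficients, residuals). The residuals are
    the stochastic component whose variogram is then modeled."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - A @ coeffs
    return coeffs, residuals
```

The residuals, not the raw elevations, feed the variogram estimation; kriged residuals are later added back to the trend to reconstruct the surface.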
Ustaoglu, Beyza
2014-05-01
Rainfall is one of the most important climatic factors for environmental studies. Several methods (Thiessen polygons, Inverse Distance Weighting (IDW), kriging, etc.) have been used by researchers for spatial interpolation of rainfall data. Kriging is a geostatistical method based on spatial correlation between neighbouring observations to predict attribute values at unsampled locations. The study area, the Eastern Black Sea Basin, has one of the highest rainfall accumulations in Turkey according to the measured station data (1942 - 2011). The Eastern Black Sea Basin is the only basin in Turkey with an increased amount of winter (October, November, December) rainfall in 2013 in comparison to the long-term mean and the previous year's winter rainfall. Regarding future projections (Ustaoglu, 2011), this basin has one of the strongest increasing trends according to the A2 scenario analysis obtained from the RegCM3 regional climate model over ten-year periods (2011 - 2100). In this study, the 2013 winter rainfall in the basin is highlighted and compared with the past and future rainfall conditions of the basin. Keywords: Geostatistical Analysis, Winter Rainfall, Eastern Black Sea Basin
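Of the interpolators listed above, IDW is the simplest to state: each unsampled location receives a distance-weighted average of the gauge values. A minimal sketch for illustration, where the power parameter is an assumption (2 is a common default):

```python
import math

def idw(sample_pts, sample_vals, target, power=2):
    """Inverse Distance Weighting: weight each gauge by 1/d**power.
    Exact at sample locations (d == 0 returns the observed value)."""
    num = den = 0.0
    for p, v in zip(sample_pts, sample_vals):
        d = math.dist(p, target)
        if d == 0.0:
            return v
        w = d ** -power
        num += w * v
        den += w
    return num / den
```

Unlike kriging, IDW uses no variogram and therefore ignores the spatial correlation structure of the rainfall field, which is why kriging is often preferred in geostatistical studies.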
Mixed-point geostatistical simulation: A combination of two- and multiple-point geostatistics
Cordua, Knud Skou; Hansen, Thomas Mejer; Gulbrandsen, Mats Lundh; Barnes, Christophe; Mosegaard, Klaus
2016-09-01
Multiple-point-based geostatistical methods are used to model complex geological structures. However, a training image containing the characteristic patterns of the Earth model has to be provided. If no training image is available, two-point (i.e., covariance-based) geostatistical methods are typically applied instead because these methods provide fewer constraints on the Earth model. This study is motivated by the case where 1-D vertical training images are available through borehole logs, whereas little or no information about horizontal dependencies exists. This problem is solved by developing theory that makes it possible to combine information from multiple- and two-point geostatistics for different directions, leading to a mixed-point geostatistical model. An example of combining information from the multiple-point-based single normal equation simulation algorithm and two-point-based sequential indicator simulation algorithm is provided. The mixed-point geostatistical model is used for conditional sequential simulation based on vertical training images from five borehole logs and a range parameter describing the horizontal dependencies.
Conversation Analysis in Applied Linguistics
DEFF Research Database (Denmark)
Kasper, Gabriele; Wagner, Johannes
2014-01-01
For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned with understanding fundamental issues of talk in action and of intersubjectivity in human conduct. The field has expanded its scope from the analysis of talk, often phone calls, towards an integration of language with other semiotic resources for embodied action, including space and objects. Much of this expansion has been driven by applied work. After laying out CA's standard practices of data treatment and analysis, this article takes up the role of comparison as a fundamental analytical strategy and reviews recent developments into cross-linguistic and cross-cultural directions. The remaining article focuses...
Applied analysis and differential equations
Cârj, Ovidiu
2007-01-01
This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.
10th International Geostatistics Congress
Rodrigo-Ilarri, Javier; Rodrigo-Clavero, María; Cassiraga, Eduardo; Vargas-Guzmán, José
2017-01-01
This book contains selected contributions presented at the 10th International Geostatistics Congress held in Valencia from 5 to 9 September, 2016. This is a quadrennial congress that serves as the meeting point for any engineer, professional, practitioner or scientist working in geostatistics. The book contains carefully reviewed papers on geostatistical theory and applications in fields such as mining engineering, petroleum engineering, environmental science, hydrology, ecology, and other fields.
Applied survival analysis using R
Moore, Dirk F
2016-01-01
Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...
The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.
León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen
2012-01-01
Traditionally, phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time, and they can be used both for quantifying spatial correlation and for interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect the phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and low mountains in the "Sierra de Córdoba", were chosen to carry out the weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps.
Directory of Open Access Journals (Sweden)
Goovaerts Pierre
2006-11-01
Full Text Available Abstract Background Geostatistical techniques that account for spatially varying population sizes and spatial patterns in the filtering of choropleth maps of cancer mortality were recently developed. Their implementation was facilitated by the initial assumption that all geographical units are the same size and shape, which allowed the use of geographic centroids in semivariogram estimation and kriging. Another implicit assumption was that the population at risk is uniformly distributed within each unit. This paper presents a generalization of Poisson kriging whereby the size and shape of administrative units, as well as the population density, are incorporated into the filtering of noisy mortality rates and the creation of isopleth risk maps. An innovative procedure to infer the point-support semivariogram of the risk from aggregated rates (i.e., areal data) is also proposed. Results The novel methodology is applied to age-adjusted lung and cervix cancer mortality rates recorded for white females in two contrasting county geographies: (1) the state of Indiana, which consists of 92 counties of fairly similar size and shape, and (2) four states in the Western US (Arizona, California, Nevada and Utah) forming a set of 118 counties that are vastly different geographical units. Area-to-point (ATP) Poisson kriging produces risk surfaces that are less smooth than the maps created by a naïve point kriging of empirical Bayesian smoothed rates. The coherence constraint of ATP kriging also ensures that the population-weighted average of risk estimates within each geographical unit equals the areal data for this unit. Simulation studies showed that the new approach yields more accurate predictions and confidence intervals than point kriging of areal data where all counties are simply collapsed into their respective polygon centroids. Its benefit over point kriging increases as the county geography becomes more heterogeneous. Conclusion A major limitation of choropleth
Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.
Monestiez, P; Goulard, M; Charmet, G
1994-04-01
Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.
El Sebai, T; Lagacherie, B; Soulas, G; Martin-Laurent, F
2007-02-01
We assessed the spatial variability of isoproturon mineralization in relation to that of physicochemical and biological parameters in fifty soil samples regularly collected along a sampling grid delimited across a 0.36 ha field plot (40 x 90 m). Only faint relationships were observed between isoproturon mineralization and the soil pH, microbial C biomass, and organic nitrogen. Considerable spatial variability was observed for six of the nine parameters tested (isoproturon mineralization rates, organic nitrogen, genetic structure of the microbial communities, soil pH, microbial biomass and equivalent humidity). The map of isoproturon mineralization rates distribution was similar to that of soil pH, microbial biomass, and organic nitrogen but different from those of structure of the microbial communities and equivalent humidity. Geostatistics revealed that the spatial heterogeneity in the rate of degradation of isoproturon corresponded to that of soil pH and microbial biomass.
7th International Geostatistics Congress
Deutsch, Clayton
2005-01-01
The conference proceedings consist of approximately 120 technical papers presented at the Seventh International Geostatistics Congress held in Banff, Alberta, Canada in 2004. All the papers were reviewed by an international panel of leading geostatisticians. The five major sections are: theory, mining, petroleum, environmental and other applications. The first section showcases new and innovative ideas in the theoretical development of geostatistics as a whole; these ideas will have large impact on (1) the directions of future geostatistical research, and (2) the conventional approaches to heterogeneity modelling in a wide range of natural resource industries. The next four sections are focused on applications and innovations relating to the use of geostatistics in specific industries. Historically, mining, petroleum and environmental industries have embraced the use of geostatistics for uncertainty characterization, so these three industries are identified as major application areas. The last section is open...
Energy Technology Data Exchange (ETDEWEB)
Haghighi-Fashi, F.; Sharifi, F.; Kamali, K.
2014-06-01
Knowledge of infiltration characteristics is useful in hydrological studies of agricultural soils. Soil hydraulic parameters such as steady infiltration rate, sorptivity, and transmissivity can exhibit appreciable spatial variability. The main objectives of this study were to examine several mathematical models of infiltration and to analyze the spatial variability of the observed final infiltration rate and the estimated sorptivity and transmissivity in flood spreading and control areas in Ilam province, Iran. The suitability of geostatistics to describe such spatial variability was assessed using data from 30 infiltration measurements sampled along three lines. The Horton model provided the most accurate simulation of infiltration considering all measurements, and the Philip two-term model provided a less accurate simulation. A comparison of the measured values and the estimated final infiltration rates showed that the Kostiakov-Lewis, Kostiakov, and SCS models could not estimate the final infiltration rate as well as the Horton model. The estimated sorptivity and transmissivity parameters of the Philip two-term model, and the final infiltration rate, had spatial structure and were considered to be structural variables over the transect pattern. The Gaussian model provided the best-fit theoretical variogram for these three parameters. Variogram ranges were 99 m for sorptivity and 88 m for final infiltration rate, and 686 m (spherical model) and 384 m (Gaussian model) for transmissivity. The sorptivity, transmissivity and final infiltration attributes showed a high degree of spatial dependence: 0.99, 0.81 and 1, respectively. The results showed that kriging can be used to predict the studied parameters in the study area. (Author)
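The best- and second-best-performing models compared above have simple closed forms. A sketch of both, with illustrative parameter values only (f0 initial rate, fc steady final rate, k decay constant; S sorptivity, A the transmissivity-related constant):

```python
import math

def horton(t, f0, fc, k):
    """Horton infiltration rate: f(t) = fc + (f0 - fc) * exp(-k * t).
    Decays from the initial rate f0 to the steady final rate fc."""
    return fc + (f0 - fc) * math.exp(-k * t)

def philip(t, S, A):
    """Philip two-term infiltration rate: f(t) = 0.5 * S * t**-0.5 + A,
    with S the sorptivity and A the transmissivity-related constant."""
    return 0.5 * S * t ** -0.5 + A
```

Fitting either model to measured infiltration series (e.g. by nonlinear least squares) yields the site-specific parameters whose spatial variability the study then maps.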
Luo, Lian-Cong; Qin, Bo-Qiang; Zhu, Guang-Wei
2004-01-01
Sediment depth was investigated at 723 irregularly scattered measurement points covering all regions of Taihu Lake, China. A combination of a successive correction scheme and a geostatistical method was used to obtain values of recent sediment thickness on a 69 x 69 grid over the whole lake. The results showed a significant difference in sediment depth between the eastern area and the western region: most of the sediments are located along the western shoreline and in the northern regimes, with just a little in the central and eastern parts. The notable exception is the patch between the center and Xishan Island, where the maximum sediment depth is more than 4.0 m. This sediment distribution pattern is most likely related to the current circulation pattern induced by the prevailing wind forcing in Taihu Lake, and numerical simulation of the hydrodynamics strongly supports this conclusion. Sediment effects on water quality were also studied; the concentrations of TP, TN and SS in the western part are clearly larger than those in the eastern regime, suggesting that more nutrients can be released from thicker sediment areas.
Institute of Scientific and Technical Information of China (English)
李志强; 梁广文; 岑伊静
2008-01-01
The citrus red mite, Panonychus citri (McGregor), is a key pest of citrus. A geostatistical method was applied to study the spatial pattern of the citrus red mite population in a citrus orchard, using the spatial analysis software Variowin 2.1. The results indicated that the spatial pattern of the citrus red mite population can be described geostatistically: the semivariograms of the citrus red mite mainly fitted Gaussian models with ranges of 1.1-21.0 m. The citrus red mite population showed an aggregated distribution, and the aggregation intensities were relatively strong in March, August and September. The spatial pattern dynamics showed two occurrence peaks of the citrus red mite population, in April and October; in October especially, the population diffused rapidly. March and September were two crucial stages for monitoring and treatment of the citrus red mite.
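For reference, the Gaussian semivariogram model that the mite counts fitted has the closed form below. Parameterizations of the range parameter vary between packages, so this is one common convention rather than Variowin's exact definition, and the parameter values in the test are illustrative:

```python
import math

def gaussian_variogram(h, nugget, sill, a):
    """Gaussian semivariogram model:
    gamma(h) = c0 + (sill - c0) * (1 - exp(-(h / a)**2)),
    where c0 is the nugget and 'a' the range-related shape parameter.
    (Some packages instead use exp(-3*(h/a)**2) so that 'a' is the
    practical range; check the software's convention.)"""
    return nugget + (sill - nugget) * (1.0 - math.exp(-(h / a) ** 2))
```

A Gaussian model implies very smooth short-range behavior, consistent with the strongly aggregated mite distributions reported above.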
Conversation Analysis and Applied Linguistics.
Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David
2002-01-01
Offers biographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)
Herzfeld, Ute Christina; Mayer, Helmut; Higginson, Chris A.; Matassa, Michael
1996-01-01
Geostatistical methods for interpolation and extrapolation are used in glaciological data analysis. The results of a program involving the mapping of Antarctica from satellite radar altimeter data are discussed. A combination of high- and low-resolution techniques was applied in the analysis of the Bering Glacier (Alaska), using video data located by the Global Positioning System (GPS) and collected from small aircraft, together with ERS-1 synthetic aperture radar (SAR) images. From the perspective of SAR data analysis, the Bering Glacier surge was an opportunity to characterize the surface of fast-flowing ice and the rapid changes in surface roughness.
Daniele, Linda; Pulido Bosch, Antonio; Vallejos, Angela; Molina, Luis
2008-06-01
The Aguadulce aquifer unit in southeastern Spain is a complex hydrogeological system because of the varied lithology of the aquifer strata and the variability of the processes that can take place within the unit. Factorial analysis of the data allowed the number of variables to be reduced to three factors, which were found to be related to such physico-chemical processes as marine intrusion and leaching of saline deposits. Variographic analysis was applied to these factors, culminating in a study of spatial distribution using ordinary kriging. Mapping of the factors allowed rapid differentiation of some of the processes that affect the waters of the Gador carbonate aquifer within the Aguadulce unit, without the need to resort to purely hydrogeochemical techniques. The results indicate the existence of several factors related to salinity: marine intrusion, paleowaters, and/or leaching of marls and evaporitic deposits. The techniques employed are effective, and the results conform to those obtained using hydrogeochemical methods (vertical records of conductivity and temperature, ion ratios, and others). The findings of this study confirm that the application of such analytical methods can provide a useful assessment of factors affecting groundwater composition.
Hellies, Matteo; Deidda, Roberto; Langousis, Andreas
2016-04-01
We study the extreme rainfall regime of the Island of Sardinia in Italy, based on annual maxima of daily precipitation. The statistical analysis is conducted using 229 daily rainfall records with at least 50 complete years of observations, collected at different sites by the Hydrological Survey of the Sardinia Region. Preliminary analysis, and the L-skewness and L-kurtosis diagrams, show that the Generalized Extreme Value (GEV) distribution model performs best in describing daily rainfall extremes. The GEV distribution parameters are estimated using the method of Probability Weighted Moments (PWM). To obtain extreme rainfall estimates at ungauged sites, while minimizing uncertainties due to sampling variability, a regional and a geostatistical approach are compared. The regional approach merges information from different gauged sites, within homogeneous regions, to obtain GEV parameter estimates at ungauged locations. The geostatistical approach infers the parameters of the GEV distribution model at locations where measurements are available, and then spatially interpolates them over the study region. In both approaches we use local rainfall means as index-rainfall. In the regional approach we define homogeneous regions by applying a hierarchical cluster analysis based on Ward's method, with L-moment ratios (i.e. L-CV and L-Skewness) as metrics. The analysis results in four contiguous regions, which satisfy the Hosking and Wallis (1997) homogeneity tests. The latter have been conducted using a Monte-Carlo approach based on a 4-parameter Kappa distribution model, fitted to each station cluster. Note that the 4-parameter Kappa model includes the GEV distribution as a sub-case, when the fourth parameter h is set to 0. In the geostatistical approach we apply kriging for uncertain data (KUD), which accounts for the error variance in local parameter estimation and, therefore, may serve as a useful tool for spatial interpolation of metrics affected by high uncertainty. In
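The GEV-by-PWM step described above can be sketched compactly using Hosking's approximation for the shape parameter from the sample L-skewness. This is an illustrative implementation under that convention, not the authors' code; the synthetic Gumbel-quantile sample in the test (where the shape parameter should be near zero) is an assumption for checking purposes:

```python
import math

def gev_pwm(sample):
    """Estimate GEV parameters (location xi, scale alpha, shape k, in
    Hosking's sign convention) by probability weighted moments."""
    x = sorted(sample)
    n = len(x)
    # Unbiased sample PWMs b0, b1, b2 from the order statistics.
    b0 = sum(x) / n
    b1 = sum(i / (n - 1) * x[i] for i in range(n)) / n
    b2 = sum(i * (i - 1) / ((n - 1) * (n - 2)) * x[i] for i in range(n)) / n
    # First three L-moments and the L-skewness t3.
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    t3 = l3 / l2
    # Hosking's approximation for the shape; k near 0 means ~Gumbel.
    c = 2.0 / (3.0 + t3) - math.log(2.0) / math.log(3.0)
    k = 7.8590 * c + 2.9554 * c * c
    alpha = l2 * k / ((1.0 - 2.0 ** -k) * math.gamma(1.0 + k))
    xi = l1 - alpha * (1.0 - math.gamma(1.0 + k)) / k
    return xi, alpha, k
```

In the regional approach these estimates would be pooled within each homogeneous region; in the geostatistical approach they would be interpolated (e.g. by KUD) across the study area.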
Institute of Scientific and Technical Information of China (English)
陆锦; 任晓冬; 刘洪云
2015-01-01
Using the Geostatistical Analyst module of the ARCGIS platform, the spatial variability of soil nutrients in a karst area was studied, with Weizhai Village, Qianxi County, Guizhou Province as the study area. Cultivated soils in the study area were sampled to a depth of 0-20 cm, with the area divided into two parts sampled on 25 m x 25 m and 50 m x 50 m grids, respectively. Total nitrogen, available phosphorus, available potassium, organic matter content and pH were determined experimentally. GIS geostatistical analysis and conventional statistical methods were used to analyze the spatial variability of soil pH and nutrients (total nitrogen, available phosphorus, available potassium, organic matter) and to investigate a reasonable number of samples. The results showed that total nitrogen, available phosphorus, available potassium, pH and organic matter all followed lognormal distributions and all showed moderate variability; the optimal sampling scheme for the study area was the 50 m x 50 m grid.
Hydrogeologic unit flow characterization using transition probability geostatistics.
Jones, Norman L; Walker, Justin R; Carle, Steven F
2005-01-01
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
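The starting point of transition probability geostatistics is a matrix of facies-to-facies transition frequencies measured along boreholes; juxtapositional tendencies such as fining-upward sequences show up as asymmetries in this matrix. A minimal sketch of estimating one from a single vertical log (the facies names are hypothetical):

```python
def transition_matrix(log, states):
    """Estimate vertical transition probabilities between hydrofacies
    from consecutive entries of a borehole log (bottom to top)."""
    counts = {s: {t: 0 for t in states} for s in states}
    for a, b in zip(log, log[1:]):
        counts[a][b] += 1
    probs = {}
    for s in states:
        total = sum(counts[s].values())
        probs[s] = {t: (counts[s][t] / total if total else 0.0)
                    for t in states}
    return probs
```

In the full method these one-step probabilities are generalized to continuous lag distances via a Markov chain model before driving the stochastic simulation.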
Spatial Downscaling of TRMM Precipitation Using Geostatistics and Fine Scale Environmental Variables
Directory of Open Access Journals (Sweden)
No-Wook Park
2013-01-01
Full Text Available A geostatistical downscaling scheme is presented that can generate fine scale precipitation information from coarse scale Tropical Rainfall Measuring Mission (TRMM) data by incorporating auxiliary fine scale environmental variables. Within the geostatistical framework, the TRMM precipitation data are first decomposed into trend and residual components. Quantitative relationships between coarse scale TRMM data and environmental variables are then estimated via regression analysis and used to derive trend components at a fine scale. Next, the residual components, which are the differences between the trend components and the original TRMM data, are downscaled at a target fine scale via area-to-point kriging. The trend and residual components are finally added to generate fine scale precipitation estimates. Stochastic simulation is also applied to the residual components in order to generate multiple alternative realizations and to compute uncertainty measures. In an experiment using a digital elevation model (DEM) and normalized difference vegetation index (NDVI), the geostatistical downscaling scheme generated downscaling results that reflected detailed characteristics, with better predictive performance than downscaling without the environmental variables. Multiple realizations and uncertainty measures from simulation also provided useful information for interpretation and further environmental modeling.
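The final add-back step can be sketched in a few lines. Here the residual is simplified to a constant within each coarse cell (the scheme above uses area-to-point kriging instead), which still illustrates the coherence property: the mean of the fine-cell estimates reproduces the coarse observation.

```python
def downscale_block(coarse_value, fine_trend):
    """Distribute one coarse-cell precipitation value over its fine cells:
    regression-derived trend per fine cell, plus a residual that restores
    coherence with the coarse observation. The constant-residual choice is
    a simplification of the paper's area-to-point kriging."""
    n = len(fine_trend)
    residual = coarse_value - sum(fine_trend) / n
    return [t + residual for t in fine_trend]
```

With ATP kriging the residual varies smoothly within the block, but the block-mean constraint holds in the same way.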
Geostatistical and Statistical Classification of Sea-Ice Properties and Provinces from SAR Data
Directory of Open Access Journals (Sweden)
Ute C. Herzfeld
2016-07-01
Full Text Available Recent drastic reductions in the Arctic sea-ice cover have raised interest in understanding the role of sea ice in the global system and have pointed out a need to understand the physical processes that lead to such changes. Satellite remote-sensing data provide important information about remote ice areas, and Synthetic Aperture Radar (SAR) data have the advantages of penetrating the omnipresent cloud cover and of high spatial resolution. A challenge addressed in this paper is how to extract information on sea-ice types and sea-ice processes from SAR data. We introduce, validate and apply geostatistical and statistical approaches to automated classification of sea ice from SAR data, to be used as individual tools for mapping sea-ice properties and provinces or in combination. A key concept of the geostatistical classification method is the analysis of spatial surface structures and their anisotropies, more generally, of spatial surface roughness, at variable, intermediate-sized scales. The geostatistical approach utilizes vario parameters extracted from directional vario functions; the parameters can be mapped or combined into feature vectors for classification. The method is flexible with respect to window sizes and parameter types and detects anisotropies. In two applications to RADARSAT and ERS-2 SAR data from the area near Point Barrow, Alaska, it is demonstrated that vario-parameter maps may be utilized to distinguish regions of different sea-ice characteristics in the Beaufort Sea, the Chukchi Sea and in Elson Lagoon. In a third and a fourth case study the analysis is taken further by utilizing multi-parameter feature vectors as inputs for unsupervised and supervised statistical classification. Field measurements and high-resolution aerial observations serve as the basis for validation of the geostatistical-statistical classification methods. A combination of supervised classification and vario-parameter mapping yields the best results
Applying critical analysis - main methods
Directory of Open Access Journals (Sweden)
Miguel Araujo Alonso
2012-02-01
Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently done in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.
Energy Technology Data Exchange (ETDEWEB)
Soares, Carlos Moreira
1997-07-01
This study presents the characterization of the external geometry of deltaic oil reservoirs, including the description of their areal distribution using geostatistical tools such as variography and kriging. A high-resolution stratigraphic study was developed over a 25 km{sup 2} area, using data from 276 closely spaced wells of an oil-producing field in the Reconcavo Basin, northeastern Brazil. The studied succession records the progressive lacustrine transgression of a deltaic environment. Core data and stratigraphic cross sections suggest that the oil reservoirs are mostly amalgamated delta-front lobes and, subordinately, crevasse deposits. Some important geometrical elements were recognized by the detailed variographic analysis developed for each stratigraphic unit (zone). The average width of the groups of deltaic lobes of one zone was measured from the variographic feature informally named the hole effect. This procedure was not possible for the other zones due to the intense lateral amalgamation of sandstones, indicated by many variographic nested structures. Net-sand kriged maps for the main zones suggest a NNW-SSE orientation for the deltaic lobes, as well as their common amalgamation and compensation arrangements. High-resolution stratigraphic analyses should include a more regional characterization of the depositional system that comprises the studied succession. On the other hand, geostatistical studies should be developed only after recognition of the depositional processes acting in the study area and of the geological meaning of the variable to be treated, including its spatial variability scales as a function of sand body thickness, orientation and amalgamation. (author)
Essentials of applied dynamic analysis
Jia, Junbo
2014-01-01
This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Topics of special interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in these areas.
Application of factor analysis and geostatistics to geochemical data analysis
Institute of Scientific and Technical Information of China (English)
祁轶宏; 李晓晖; 霍立新
2013-01-01
Based on soil geochemical survey data from the Tongling mining camp, this paper extracted a number of major factors from the geochemical data using factor analysis, and carried out spatial variation analysis and interpolation for each major factor using geostatistical methods. The results indicate that each major factor acquired from factor analysis corresponds to different ore-forming information, and that the combined use of factor analysis, geostatistical analysis and interpolation can better display the spatial distribution trend of each major factor score and its correlation with known mineralization information, thereby serving metallogenic prognosis and mineral exploration.
Namysłowska-Wilczyńska, Barbara
2016-09-01
This paper presents selected results of research connected with the development of a 3D geostatistical hydrogeochemical model of the Kłodzko Drainage Basin, dedicated to the spatial variation in the different quality parameters of underground water in the water intake area (SW part of Poland). The research covers the period 2011-2012. Spatial analyses of the variation in various quality parameters, i.e., contents of iron, manganese, ammonium ion, nitrate ion, phosphate ion and total organic carbon, as well as pH, redox potential and temperature, were carried out on the basis of chemical determinations in underground water samples taken from the wells in the water intake area. Spatial variation in the parameters was analyzed on the basis of data obtained (November 2011) from tests of water taken from 14 existing wells with depths ranging from 9.5 to 38.0 m b.g.l. The latest data (January 2012) were obtained from 3 new piezometers, installed at other locations in the relevant area, with depths of 9-10 m. Data derived from the 14 wells (2011) and from the 14 wells plus 3 piezometers (2012) were subjected to spatial analyses using geostatistical methods. Basic statistics of the quality parameters, including histograms of their distributions, scatter diagrams and correlation coefficient values r, were evaluated and presented. The directional semivariogram function γ(h) and the ordinary (block) kriging procedure were used to build the 3D geostatistical model. The geostatistical parameters of the theoretical models of directional semivariograms of the water quality parameters under study, calculated along the well depth (taking into account the terrain elevation), were used in the ordinary (block) kriging estimation. The obtained results of estimation, i.e., block diagrams, allowed us to determine the levels of increased values of the estimated averages Z* of underground water quality parameters.
DEFF Research Database (Denmark)
Wieland, Kai; Rivoirard, J.
2001-01-01
… are included in the estimation without any correction for possible daylight effects. In the present study, ordinary kriging was used to correct for sampling irregularities, and external drift kriging with a day/night indicator or a cosine function of time of day was applied to account additionally for diurnal … differences in the catch rates. Only minor differences between the standard indices and the abundance estimates obtained by ordinary kriging were found. In contrast, the external drift kriging, particularly with time of day, yielded higher estimates of mean abundance for all years, with the differences … to ordinary kriging being most pronounced for years characterized by a high portion of night hauls and a low mean catch rate at night. This demonstrates that external drift kriging with a day/night indicator, but preferably with time of day, is capable of compensating successfully for daylight effects …
Directory of Open Access Journals (Sweden)
Eva Vidal Vázquez
2009-04-01
Full Text Available Surface roughness can be influenced by the type and intensity of soil tillage, among other factors. In tilled soils, microrelief may decay considerably as rain progresses. Geostatistics provides tools that may be useful for studying the dynamics of soil surface variability. The objective of this study was to show how geostatistics can be applied to analyze soil microrelief variability. Data were taken on an Oxisol under six tillage treatments, namely disk harrow, disk plow, chisel plow, disk harrow + disk level, disk plow + disk level and chisel plow + disk level. Measurements were made initially just after tillage and subsequently after cumulative natural rainfall events. Duplicate measurements were taken for each treatment and sampling date, yielding a total of 48 experimental surfaces. A pin microrelief meter was used for the surface roughness measurements. The plot area was 1.35 × 1.35 m and the sample spacing was 25 mm, yielding a total of 3,025 data points per measurement. Before geostatistical analysis, the trend was removed from the experimental data by two methods for comparison. Models were fitted to the semivariograms of each surface and the model parameters were analyzed. The trend removal method affected the geostatistical results. The geostatistical parameter dependence ratio showed that spatial dependence improved for most of the surfaces as the amount of cumulative rainfall increased.
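Semivariogram model fitting of the kind described, e.g. a spherical model with nugget, partial sill and range, can be sketched as a grid search over the range combined with a linear least-squares fit of the other two parameters; the function names and the candidate-range grid are my own illustrative choices, not the authors' procedure.

```python
import numpy as np

def spherical(h, a):
    """Unit spherical semivariogram structure with range a."""
    r = np.minimum(h / a, 1.0)
    return 1.5 * r - 0.5 * r ** 3

def fit_spherical(lags, gamma, candidate_ranges):
    """Fit gamma(h) ~ c0 + c * spherical(h, a) to an empirical semivariogram.

    For each candidate range a, the nugget c0 and partial sill c enter
    linearly and are found by least squares; the triple (a, c0, c) with
    the smallest sum of squared errors is kept.
    """
    best = None
    for a in candidate_ranges:
        X = np.column_stack([np.ones_like(lags), spherical(lags, a)])
        coef, *_ = np.linalg.lstsq(X, gamma, rcond=None)
        sse = float(np.sum((X @ coef - gamma) ** 2))
        if best is None or sse < best[0]:
            best = (sse, float(a), coef[0], coef[1])
    sse, a, c0, c = best
    return {"range": a, "nugget": c0, "psill": c, "sse": sse}
```

A dependence ratio like the one analyzed above would then be derived from the fitted parameters, e.g. nugget / (nugget + partial sill).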
Geostatistical enhancement of European hydrological predictions
Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter
2016-04-01
Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures, our experiment focuses on the prediction of flow-duration curves (FDCs) along the stream network, which have attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the sites where observed data are available, and vice versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a
Correspondence Analysis applied to psychological research
Laura Doey; Jessica Kurta
2011-01-01
Correspondence analysis is an exploratory data technique used to analyze categorical data (Benzecri, 1992). It is used in many areas such as marketing and ecology. Correspondence analysis has been used less often in psychological research, although it can be suitably applied. This article discusses the benefits of using correspondence analysis in psychological research and provides a tutorial on how to perform correspondence analysis using the Statistical Package for the Social Sciences (SPSS).
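Under the hood, correspondence analysis reduces to a weighted singular value decomposition of the standardized residuals of a contingency table; this numpy sketch (function and variable names are mine, not SPSS syntax) shows the core computation the SPSS procedure performs.

```python
import numpy as np

def correspondence_analysis(table):
    """Simple correspondence analysis of a two-way contingency table."""
    P = table / table.sum()                 # correspondence matrix
    r = P.sum(axis=1)                       # row masses
    c = P.sum(axis=0)                       # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]     # principal row coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]  # principal column coordinates
    inertia = sv ** 2                         # inertia (variance) per axis
    return rows, cols, inertia
```

A useful sanity check on any implementation is that the total inertia equals the table's chi-square statistic divided by the grand total.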
Correspondence Analysis applied to psychological research
Directory of Open Access Journals (Sweden)
Laura Doey
2011-04-01
Full Text Available Correspondence analysis is an exploratory data technique used to analyze categorical data (Benzecri, 1992). It is used in many areas such as marketing and ecology. Correspondence analysis has been used less often in psychological research, although it can be suitably applied. This article discusses the benefits of using correspondence analysis in psychological research and provides a tutorial on how to perform correspondence analysis using the Statistical Package for the Social Sciences (SPSS).
Concept analysis of culture applied to nursing.
Marzilli, Colleen
2014-01-01
Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.
Bayesian modelling of geostatistical malaria risk data
Directory of Open Access Journals (Sweden)
L. Gosoniu
2006-11-01
Full Text Available Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.
Bayesian modelling of geostatistical malaria risk data.
Gosoniu, L; Vounatsou, P; Sogoba, N; Smith, T
2006-11-01
Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.
Applied regression analysis a research tool
Pantula, Sastry; Dickey, David
1998-01-01
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
Source Apportionment of Heavy Metals in Soils Using Multivariate Statistics and Geostatistics
Institute of Scientific and Technical Information of China (English)
QU Ming-Kai; LI Wei-Dong; ZHANG Chuan-Rong; WANG Shan-Qin; YANG Yong; HE Li-Yuan
2013-01-01
The main objectives of this study were to introduce an integrated method for effectively identifying soil heavy metal pollution sources and apportioning their contributions, and to apply it to a case study. The method combines the principal component analysis/absolute principal component scores (PCA/APCS) receptor model and geostatistics. The case study was conducted in an area of 31 km2 in the urban-rural transition zone of Wuhan, a metropolis of central China. 124 topsoil samples were collected for measuring the concentrations of eight heavy metal elements (Mn, Cu, Zn, Pb, Cd, Cr, Ni and Co). PCA results revealed that three major factors were responsible for soil heavy metal pollution, which were initially identified as "steel production", "agronomic input" and "coal consumption". The APCS technique, combined with multiple linear regression analysis, was then applied for source apportionment. Steel production appeared to be the main source for Ni, Co, Cd, Zn and Mn, agronomic input for Cu, and coal consumption for Pb and Cr. Geostatistical interpolation using ordinary kriging was finally used to map the spatial distributions of the contributions of the pollution sources and further confirm the interpretation of the results. The introduced method appears to be an effective tool in soil pollution source apportionment and identification, and might provide valuable reference information for pollution control and environmental management.
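A minimal numpy sketch of the PCA/APCS receptor-model chain (standardize, extract components, score an artificial zero-concentration sample, regress concentrations on the absolute scores) might look as follows; the fixed choice of two factors and all names are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def pca_apcs(X, k=2):
    """Absolute principal component scores (APCS) from concentration data X.

    k is the number of retained factors (in practice chosen by an
    eigenvalue-greater-than-one rule or similar).
    """
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd                          # standardized concentrations
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    vecs = vecs[:, np.argsort(vals)[::-1]]     # eigenvectors, descending variance
    scores = Z @ vecs[:, :k]
    scores0 = ((0.0 - mu) / sd) @ vecs[:, :k]  # artificial zero-concentration sample
    return scores - scores0                    # absolute scores

def source_contributions(X, apcs):
    """Regress each element's concentration on the APCS columns."""
    A = np.column_stack([np.ones(len(X)), apcs])
    coef, *_ = np.linalg.lstsq(A, X, rcond=None)
    return coef  # row 0: unexplained constant; rows 1..k: per-source coefficients
```

The fitted per-source coefficients, multiplied by the APCS of each sample, give the source contributions that ordinary kriging would then interpolate into maps.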
Applying WCET Analysis at Architectural Level
Gilles, Olivier; Hugues, Jérôme
2008-01-01
Real-Time embedded systems must enforce strict timing constraints. In this context, achieving precise Worst Case Execution Time is a prerequisite to apply scheduling analysis and verify system viability. WCET analysis is usually a complex and time-consuming activity. It becomes increasingly complex when one also considers code generation strategies from high-level models. In this paper, we present an experiment made on the coupling of the WCET analysis tool Bound-T and our AADL to code ...
The role of geostatistics in medical geology
Goovaerts, Pierre
2014-05-01
Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, and economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies, however, have assessed the risks associated with exposure to low levels of arsenic (say prostate cancer mortality, including a study of populations residing in Utah. The information available for the present ecological study (i.e. analysis of aggregated health outcomes) consists of: 1) 9,188 arsenic concentrations measured at 8,212 different private wells that were sampled between 1993 and 2002, 2) prostate cancer incidence recorded at the township level over the period 1985-2002, and 3) block-group population density that served as proxy for
Farias, Paulo R S; Barbosa, José C; Busoli, Antonio C; Overal, William L; Miranda, Vicente S; Ribeiro, Susane M
2008-01-01
The fall armyworm, Spodoptera frugiperda (J.E. Smith), is one of the chief pests of maize in the Americas. The study of its spatial distribution is fundamental for designing correct control strategies, improving sampling methods, determining actual and potential crop losses, and adopting precision agriculture techniques. In São Paulo state, Brazil, a maize field was sampled at weekly intervals, from germination through harvest, for caterpillar densities, using quadrats. In each of 200 quadrats, 10 plants were sampled per week. Harvest weights were obtained in the field for each quadrat, and ear diameters and lengths were also sampled (15 ears per quadrat) and used to estimate the potential productivity of the quadrat. Geostatistical analyses of caterpillar densities showed the greatest ranges for small caterpillars when semivariograms were adjusted to a spherical model, which showed the best fit. As the caterpillars developed in the field, their spatial distribution became increasingly random, as shown by a model adjusted to a straight line, indicating a lack of spatial dependence among samples. Harvest weight and ear length followed the spherical model, indicating the existence of spatial variability of the production parameters in the maize field. Geostatistics shows promise for the application of precision methods in the integrated control of pests.
A geostatistical approach to estimate mining efficiency indicators with flexible meshes
Freixas, Genis; Garriga, David; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier
2014-05-01
Geostatistics is a branch of statistics developed originally to predict probability distributions of ore grades for mining operations by considering the attributes of a geological formation at unknown locations as a set of correlated random variables. Mining exploitations typically aim to maintain acceptable ore grades to produce commercial products based upon demand. In this context, we present a new geostatistical methodology to estimate strategic efficiency maps that incorporate hydraulic test data, the evolution of concentrations with time obtained from chemical analysis (packer tests and production wells), as well as hydraulic head variations. The methodology is applied to a salt basin in South America. The exploitation is based on the extraction of brines through vertical and horizontal wells. Thereafter, brines are precipitated in evaporation ponds to obtain target potassium and magnesium salts of economic interest. Lithium carbonate is obtained as a byproduct of the production of potassium chloride. Aside from providing an ensemble of traditional geostatistical methods, the strength of this study lies in the new methodology developed, which focuses on finding the best sites to exploit the brines while maintaining efficiency criteria. Thus, strategic efficiency-indicator maps have been developed under the specific criteria imposed by exploitation standards, to incorporate new extraction wells in new areas that would allow maintaining or improving production. Results show that the uncertainty quantification of the efficiency plays a dominant role and that the use of flexible meshes, which properly describe the curvilinear features associated with vertical stratification, provides a more consistent estimation of the geological processes. Moreover, we demonstrate that the vertical correlation structure at the given salt basin is essentially linked to variations in the formation thickness, which calls for flexible meshes and non-stationary stochastic processes.
Positive Behavior Support and Applied Behavior Analysis
Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.
2006-01-01
This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…
Directory of Open Access Journals (Sweden)
Md. Bodrud-Doza
2016-04-01
Full Text Available This study investigates the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points. Water evaluation indices and a number of statistical approaches, such as multivariate statistics and geostatistics, are applied to characterize water quality, which is a major factor in judging the groundwater's suitability for drinking purposes. The study reveals that the EC, TDS, Ca2+, total As and Fe values of the groundwater samples exceeded Bangladesh and international standards. The groundwater quality index (GWQI) showed that about 47% of the samples belong to good-quality water for drinking purposes. The heavy metal pollution index (HPI), degree of contamination (Cd) and heavy metal evaluation index (HEI) reveal that most of the samples belong to a low level of pollution; however, Cd provides a better alternative than the other indices. Principal component analysis (PCA) suggests that groundwater quality is mainly related to geogenic (rock-water interaction) and anthropogenic (agrogenic and domestic sewage) sources in the study area. Subsequently, the findings of cluster analysis (CA) and the correlation matrix (CM) are also consistent with the PCA results. The spatial distributions of the groundwater quality parameters are determined by geostatistical modeling. The exponential semivariogram model is validated as the best-fitted model for most of the index values. It is expected that the outcomes of the study will provide insights for decision makers taking proper measures for groundwater quality management in central Bangladesh.
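Water quality indices of the family mentioned here are typically weighted averages of per-parameter ratings against drinking-water standards. The sketch below uses the common weighted-arithmetic form with generic class bounds; the exact formula and class limits used in the paper may differ.

```python
import numpy as np

def gwqi(conc, standard, weight):
    """Weighted-arithmetic groundwater quality index.

    Each parameter is rated as qi = 100 * Ci / Si against its
    drinking-water standard Si; the index is the weighted mean rating.
    """
    q = 100.0 * np.asarray(conc, float) / np.asarray(standard, float)
    w = np.asarray(weight, float)
    return float((w * q).sum() / w.sum())

def classify(index):
    """Generic GWQI classes (illustrative bounds, not the paper's)."""
    for bound, label in [(50, "excellent"), (100, "good"),
                         (200, "poor"), (300, "very poor")]:
        if index < bound:
            return label
    return "unsuitable for drinking"
```

With all concentrations exactly at their standards and equal weights, the index is 100, the boundary of the "good" class under these bounds.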
Application of geostatistics to risk assessment.
Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M
2003-10-01
Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.
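The model-based (kriging) estimator contrasted here with design-based methods can be sketched, for a single estimation point, as the classical ordinary-kriging linear system; the isotropic spherical variogram and its default parameters below are illustrative assumptions, not values fitted to the skeet-range soil lead data.

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, a=50.0, nugget=0.0):
    """Ordinary kriging estimate and variance at one target location.

    Assumes an isotropic spherical variogram; in practice the model and
    its parameters come from fitting an empirical semivariogram.
    """
    def gamma(h):
        r = np.minimum(h / a, 1.0)
        return np.where(h > 0, nugget + sill * (1.5 * r - 0.5 * r ** 3), 0.0)

    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, :n] = A[:n, n] = 1.0          # unbiasedness constraint (weights sum to 1)
    A[n, n] = 0.0
    b = np.append(gamma(np.linalg.norm(coords - target, axis=1)), 1.0)
    sol = np.linalg.solve(A, b)        # kriging weights plus Lagrange multiplier
    estimate = sol[:n] @ values
    variance = sol @ b                 # kriging (estimation) variance
    return estimate, variance
```

With a zero nugget the estimator interpolates exactly: at a sampled location it returns that sample with zero kriging variance, which is the geospatial information a design-based estimator ignores.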
Applied Data Analysis in Energy Monitoring System
Directory of Open Access Journals (Sweden)
Kychkin А.V.
2016-08-01
Full Text Available The organization of a software and hardware system is presented as an example for building energy monitoring of multi-section lighting and climate control/conditioning needs. The system's key feature is applied analysis of office energy data, which allows recognition of the localized work mode of each type of hardware. It is based on the general energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis includes a primary data processing block, a smoothing filter, a time-stamp identification block, clustering and classification blocks, a state-change detection block, and a statistical data calculation block. The energy consumed in a time slot and the slot time stamp are taken as the main parameters for work mode classification. Experimental results of the applied energy data analysis, using HIL and the OpenJEVis visualization system, are provided for a chosen time period. Energy consumption and workload calculation, with identification of eight different states, have been carried out for two lighting sections and one climate control/conditioning emulating system from the integral energy consumption profile. The research has been supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».
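A toy version of the smoothing-plus-state-detection chain described above (moving-average filter, threshold classification per time slot, state-change time stamps) could look like this; the window size, thresholds and three-state labeling are illustrative assumptions, not the system's actual configuration.

```python
import numpy as np

def detect_states(power, window=5, thresholds=(10.0, 100.0)):
    """Smooth a consumption profile and label each time slot by thresholds.

    Returns per-slot labels (0 = off, 1 = idle, 2 = active) and the
    indices (time stamps) where the state changes.
    """
    kernel = np.ones(window) / window
    smooth = np.convolve(power, kernel, mode="same")   # smoothing filter
    labels = np.digitize(smooth, thresholds)            # threshold classification
    changes = np.flatnonzero(np.diff(labels)) + 1       # state-change time stamps
    return labels, changes
```

In a real deployment the per-slot labels would feed the statistical block, e.g. summing slot energy per recognized state to estimate workload per hardware section.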
2nd European Conference on Geostatistics for Environmental Applications
Soares, Amílcar; Froidevaux, Roland
1999-01-01
The Second European Conference on Geostatistics for Environmental Applications took place in Valencia, November 18-20, 1998. Two years have passed since the first meeting in Lisbon, and the geostatistical community has kept active in the environmental field. In these days of congress inflation, we feel that continuity can only be achieved by ensuring quality in the papers. For this reason, all papers in the book have been reviewed by at least two referees, and care has been taken to ensure that the reviewer comments have been incorporated in the final version of the manuscript. We are thankful to the members of the scientific committee for their timely review of the scripts. All in all, there are three keynote papers from experts in soil science, climatology and ecology, and 43 contributed papers, providing a good indication of the status of geostatistics as applied in the environmental field all over the world. We feel now confident that the geoENV conference series, seeded around a coffee table almost six...
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Energy Technology Data Exchange (ETDEWEB)
Mundim, Evaldo Cesario
1999-02-01
In this dissertation, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way similar to Spectral Analysis in the frequency domain. The incorporation of filtered attributes via External Drift Kriging and Collocated Cokriging in the estimation of reservoir properties is discussed. Its relevance for the reservoir porous volume calculation is also evaluated, based on a comparative analysis of the volume risk curves derived from stochastic conditional simulations with collocated variable and from stochastic conditional simulations with external drift. The results prove Factorial Kriging to be an efficient technique for the filtering of seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and thus the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)
Sneak analysis applied to process systems
Whetton, Cris
Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits; in that domain it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plants. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.
Directory of Open Access Journals (Sweden)
Nguyen Thi Thu Ha
2013-12-01
Sea eutrophication is a natural process of water enrichment caused by increased nutrient loading that severely affects coastal ecosystems by decreasing water quality. The degree of eutrophication can be assessed by chlorophyll-a concentration. This study aims to develop a remote sensing method suitable for estimating chlorophyll-a concentrations in tropical coastal waters with abundant phytoplankton using Moderate Resolution Imaging Spectroradiometer (MODIS)/Terra imagery, and to improve the spatial resolution of MODIS/Terra-based estimation from 1 km to 100 m by geostatistics. A model based on the ratio of green and blue band reflectance (rGBr) is proposed, considering the bio-optical properties of chlorophyll-a. Tien Yen Bay in northern Vietnam, a typical phytoplankton-rich coastal area, was selected as a case study site. The superiority of rGBr over two existing representative models, based on the blue-green band ratio and the red-near infrared band ratio, was demonstrated by a high correlation of the estimated chlorophyll-a concentrations at 40 sites with values measured in situ. Ordinary kriging was then shown to be highly capable of predicting the concentration for regions of the image covered by clouds and, thus, without sea surface data. Resultant space-time maps of concentrations over a year clarified that Tien Yen Bay is characterized by natural eutrophic waters, because the average chlorophyll-a concentration exceeded 10 mg/m3 in the summer. The temporal changes of chlorophyll-a concentrations were consistent with average monthly air temperatures and precipitation. Consequently, a combination of rGBr and ordinary kriging can effectively monitor water quality in tropical shallow waters.
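The calibration step behind a band-ratio model can be illustrated with a short sketch: regress in-situ chlorophyll-a on the green/blue reflectance ratio and check the correlation. The linear model form, coefficients and data below are hypothetical; the paper's actual rGBr formulation may differ.

```python
import numpy as np

# Hypothetical calibration of a green/blue band-ratio (rGBr) model against
# in-situ chlorophyll-a at 40 sites; values are synthetic, for illustration only.
rng = np.random.default_rng(4)
ratio = rng.uniform(0.8, 2.5, 40)                     # green/blue reflectance ratio
chla = 12.0 * ratio - 5.0 + rng.normal(0.0, 1.0, 40)  # synthetic "measured" chl-a (mg/m3)

# Ordinary least squares fit of chl-a on the band ratio.
A = np.column_stack([ratio, np.ones_like(ratio)])
(slope, intercept), *_ = np.linalg.lstsq(A, chla, rcond=None)

pred = slope * ratio + intercept
r = np.corrcoef(pred, chla)[0, 1]   # correlation of estimates with in-situ values
print(round(slope, 1), round(intercept, 1), round(r, 2))
```

A high correlation between predicted and measured concentrations, as computed here, is the kind of evidence the study uses to prefer rGBr over the alternative band-ratio models.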
Karacan, C. Özgen; Olea, Ricardo A.
2013-01-01
Coal seam degasification and its success are important for controlling methane, and thus for the health and safety of coal miners. During the course of degasification, properties of coal seams change. Thus, the changes in coal reservoir conditions and in-place gas content as well as methane emission potential into mines should be evaluated by examining time-dependent changes and the presence of major heterogeneities and geological discontinuities in the field. In this work, time-lapsed reservoir and fluid storage properties of the New Castle coal seam, Mary Lee/Blue Creek seam, and Jagger seam of Black Warrior Basin, Alabama, were determined from gas and water production history matching and production forecasting of vertical degasification wellbores. These properties were combined with isotherm and other important data to compute gas-in-place (GIP) and its change with time at borehole locations. Time-lapsed training images (TIs) of GIP and GIP difference corresponding to each coal and date were generated by using these point-wise data and Voronoi decomposition on the TI grid, which included faults as discontinuities for expansion of Voronoi regions. Filter-based multiple-point geostatistical simulations, which were preferred in this study due to anisotropies and discontinuities in the area, were used to predict time-lapsed GIP distributions within the study area. Performed simulations were used for mapping spatial time-lapsed methane quantities as well as their uncertainties within the study area.
On the geostatistical characterization of hierarchical media
Neuman, Shlomo P.; Riva, Monica; Guadagnini, Alberto
2008-02-01
The subsurface consists of porous and fractured materials exhibiting a hierarchical geologic structure, which gives rise to systematic and random spatial and directional variations in hydraulic and transport properties on a multiplicity of scales. Traditional geostatistical moment analysis allows one to infer the spatial covariance structure of such hierarchical, multiscale geologic materials on the basis of numerous measurements on a given support scale across a domain or "window" of a given length scale. The resultant sample variogram often appears to fit a stationary variogram model with constant variance (sill) and integral (spatial correlation) scale. In fact, some authors, who recognize that hierarchical sedimentary architecture and associated log hydraulic conductivity fields tend to be nonstationary, nevertheless associate them with stationary "exponential-like" transition probabilities and variograms, respectively, the latter being a consequence of the former. We propose that (1) the apparent ability of stationary spatial statistics to characterize the covariance structure of nonstationary hierarchical media is an artifact stemming from the finite size of the windows within which geologic and hydrologic variables are ubiquitously sampled, and (2) the artifact is eliminated upon characterizing the covariance structure of such media with the aid of truncated power variograms, which represent stationary random fields obtained upon sampling a nonstationary fractal over finite windows. To support our opinion, we note that truncated power variograms arise formally when a hierarchical medium is sampled jointly across all geologic categories and scales within a window; cite direct evidence that geostatistical parameters (variance and integral scale) inferred on the basis of traditional variograms vary systematically with support and window scales; demonstrate the ability of truncated power models to capture these variations in terms of a few scaling parameters
Exergy analysis applied to biodiesel production
Energy Technology Data Exchange (ETDEWEB)
Talens, Laura; Villalba, Gara [SosteniPra UAB-IRTA. Environmental Science and Technology Institute (ICTA), Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra, Cerdanyola del Valles, Barcelona (Spain); Gabarrell, Xavier [SosteniPra UAB-IRTA. Environmental Science and Technology Institute ICTA, Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Department of Chemical Engineering, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra Cerdanyola del Valles, Barcelona (Spain)
2007-08-15
Our aim to decrease the consumption of materials and energy and to promote the use of renewable resources, such as biofuels, raises the need to measure material and energy fluxes. This paper suggests the use of Exergy Flow Analysis (ExFA) as an environmental assessment tool to account for wastes and emissions, determine exergetic efficiency, and compare substitutes and other types of energy sources, all useful in defining environmental and economic policies for resource use. To illustrate how ExFA is used, it is applied to the process of biodiesel production. The results show that the production process has a low exergy loss (492 MJ). The exergy loss is reduced by using potassium hydroxide and sulphuric acid as process catalysts, and it can be further minimised by improving the quality of the used cooking oil. (author)
Directory of Open Access Journals (Sweden)
Annamaria Castrignanò
2015-07-01
Fundamental to the philosophy of Precision Agriculture (PA) is the concept of matching inputs to needs. Recent research in PA has focused on the use of Management Zones (MZ): field areas characterised by homogeneous attributes in landscape and soil conditions. Proximal sensing, such as Electromagnetic Induction (EMI), Ground Penetrating Radar (GPR) and X-ray fluorescence, can complement direct sampling, and a multi-sensor platform can enable us to map soil features unambiguously. Several methods of multi-sensor data analysis have been developed to determine the location of subfield areas. Modern geostatistical techniques, treating variables as continua in a joint attribute and geographic space, offer the potential to analyse such data effectively. The objective of this paper is to show the potential of multivariate geostatistics to create MZ in the perspective of PA by integrating field data from different types of sensors, describing two case studies. In the first case study, cokriging and factorial cokriging were employed to produce thematic maps of soil trace elements and to delineate homogeneous zones, respectively. In the second, a multivariate geostatistical data-fusion technique (multi-collocated cokriging) was applied to different geophysical sensor data (GPR and EMI) for stationary estimation of soil water content and for delineating within-field zones with different degrees of wetting. The results show that linking sensors of different types improves the overall assessment of soil, and that sensor data fusion could be effectively applied to delineate MZs in Precision Agriculture. However, techniques of data integration are urgently required as a result of the proliferation of data from different sources.
Geostatistical Study of Precipitation on the Island of Crete
Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.
2015-04-01
Understanding and predicting the spatiotemporal patterns of precipitation in the Mediterranean islands is an important topic of research, which is emphasized by alarming long-term predictions of increased drought conditions [4]. The analysis of records from drought-prone areas around the world has demonstrated that precipitation data are non-Gaussian. Typically, such data are fitted to the gamma distribution function and then transformed into a normalized index, the so-called Standardized Precipitation Index (SPI) [5]. The SPI can be defined for different time scales and has been applied to data from various regions [2]. Precipitation maps can be constructed using the stochastic method of Ordinary Kriging [1]. Such mathematical tools help to better understand the space-time variability and to plan water resources management. We present preliminary results of an ongoing investigation of the space-time precipitation distribution on the island of Crete (Greece). The study spans the time period from 1948 to 2012 and extends over an area of 8336 km2. The data comprise monthly precipitation measured at 56 stations. Analysis of the data showed that the most severe drought occurred in 1950 followed by 1989, whereas the wettest year was 2002 followed by 1977. A spatial trend was observed, with the spatially averaged annual precipitation about 450 mm higher in the West than in the East. Analysis of the data also revealed strong correlations between the precipitation in the western and eastern parts of the island. In addition to longitude, elevation (masl) was determined to be an important factor that exhibits strong linear correlation with precipitation. The precipitation data exhibit wet and dry periods, with strong variability even during the wet period. Thus, fitting the data to specific probability distribution models has proved challenging. Different time scales, e.g. monthly, biannual, and annual, have been investigated. Herein we focus on annual
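The SPI construction the abstract refers to (fit precipitation totals to a gamma distribution, then map the fitted CDF through the standard normal quantile function) can be sketched as follows; the precipitation values are synthetic stand-ins for station records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical annual precipitation totals (mm); real data would come from station records.
precip = rng.gamma(shape=4.0, scale=150.0, size=65)

# Fit a two-parameter gamma distribution (location fixed at 0).
shape, loc, scale = stats.gamma.fit(precip, floc=0)

# Standardized Precipitation Index: gamma CDF -> standard normal quantile.
spi = stats.norm.ppf(stats.gamma.cdf(precip, shape, loc=loc, scale=scale))
print(spi.round(2))   # values near 0 are typical years; strongly negative values flag drought
```

By construction the index is approximately standard normal, which is what allows droughts from different stations and time scales to be compared on one scale.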
DEFF Research Database (Denmark)
Kessler, Timo Christian; Klint, K.E.S.; Renard, P.;
2010-01-01
at a clay till outcrop in Denmark to characterise the shapes and the spatial variability. Further, geostatistics were applied to simulate the distribution and to develop a heterogeneity model that can be incorporated into an existing geological model of, for example, a contaminated site....
Social network analysis applied to team sports analysis
Clemente, Filipe Manuel; Mendes, Rui Sousa
2016-01-01
Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
Energy Technology Data Exchange (ETDEWEB)
Edwards, Lloyd A. [Leading Solutions, LLC.]; Paresol, Bernard [U.S. Department of Agriculture, Forest Service, Pacific Northwest Research Station, Portland, OR.]
2014-09-01
This report presents the geostatistical analysis results for the fire fuels response variables, custom reaction intensity and total dead fuels; it is one part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).
[Spatial distribution pattern of Chilo suppressalis analyzed by classical method and geostatistics].
Yuan, Zheming; Fu, Wei; Li, Fangyi
2004-04-01
Two original samples of Chilo suppressalis and their grid, random and sequence subsamples were analyzed by the classical method and by geostatistics to characterize the spatial distribution pattern of C. suppressalis. The limitations of spatial distribution analysis with the classical method, especially its sensitivity to the original position of the grid, were summarized. In contrast, geostatistics characterized well the spatial distribution pattern, aggregation intensity and spatial heterogeneity of C. suppressalis. According to the geostatistics, the population followed a Poisson distribution at low density. At higher density, the distribution was aggregative, with an aggregation intensity of 0.1056 and a dependence range of 193 cm. Spatial heterogeneity was also found in the higher-density population: its spatial correlation in the line direction was stronger than that in the row direction, with dependence ranges of 115 and 264 cm in the line and row directions, respectively.
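Dependence ranges like those reported above are read off a fitted semivariogram; the empirical semivariogram itself can be computed with the classical Matheron estimator. A minimal isotropic sketch on hypothetical quadrat counts (the grid, lags and counts below are invented for illustration):

```python
import numpy as np

def empirical_semivariogram(coords, values, lags):
    """Matheron estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over point pairs
    whose separation distance falls in each lag bin [lags[k], lags[k+1])."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical insect counts on a 10x10 grid of sampling quadrats.
rng = np.random.default_rng(1)
xy = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
z = rng.poisson(5, 100).astype(float)
gamma_hat = empirical_semivariogram(xy, z, np.arange(0.5, 6.5))
print(gamma_hat.round(2))
```

For spatially random (Poisson-like) counts the estimate stays flat near the sample variance; aggregation shows up as a curve that rises with distance until it levels off at the dependence range.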
Geostatistical modeling of topography using auxiliary maps
Hengl, T.; Bajat, B.; Blagojević, D.; Reuter, H.I.
2008-01-01
This paper recommends computational procedures for employing auxiliary maps, such as maps of drainage patterns, land cover and remote-sensing-based indices, directly in the geostatistical modeling of topography. The methodology is based on the regression-kriging technique, as implemented in the R pa
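The first half of the regression-kriging technique recommended above is an ordinary regression of the target variable (here, elevation) on the auxiliary maps; the residuals are what get kriged afterwards. A sketch of that trend step with invented predictors and coefficients:

```python
import numpy as np

# Regression-kriging, step 1: estimate the trend of elevation from auxiliary
# predictors (e.g. a land-cover index and a remote-sensing index) by ordinary
# least squares. The residuals would then be interpolated by kriging (step 2,
# not shown). All data here are hypothetical.
rng = np.random.default_rng(2)
n = 200
aux = rng.uniform(0.0, 1.0, (n, 2))          # two auxiliary maps sampled at n points
elev = 300 + 120 * aux[:, 0] - 80 * aux[:, 1] + rng.normal(0.0, 5.0, n)

X = np.column_stack([np.ones(n), aux])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, elev, rcond=None)
residuals = elev - X @ beta                  # input to the kriging step
print(beta.round(1))
```

The final regression-kriging prediction at any location is the regression trend evaluated there plus the kriged residual, which is why the two parts are kept separate.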
Colilert® applied to food analysis
Directory of Open Access Journals (Sweden)
Maria José Rodrigues
2014-06-01
Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in quality control routines for drinking, swimming pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli is confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, through the comparison of results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of foods were analyzed with Colilert®; these included pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. From these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods presented a better insertion in the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage, the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method, and the counts were similar to those obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.
Functional Analysis in Applied Mathematics and Engineering
DEFF Research Database (Denmark)
Pedersen, Michael
1997-01-01
Lecture notes for the course 01245 Functional Analysis. Consists of the first part of a monograph with the same title.
Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.
Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H
2014-01-10
Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.
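Cross-validation diagnostics like those reported here (mean prediction error near 0, small RMS errors) can be reproduced for any interpolator by leave-one-out cross-validation: predict each station from all the others, then summarise the errors. A sketch using inverse-distance weighting on hypothetical monitoring data (the station layout and ozone field are invented):

```python
import numpy as np

def loo_idw_cv(coords, values, power=2):
    """Leave-one-out cross-validation of inverse-distance weighted interpolation:
    predict each observation from all the others, then summarise the errors."""
    n = len(values)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        d = np.linalg.norm(coords[mask] - coords[i], axis=1)
        w = 1.0 / d ** power
        preds[i] = w @ values[mask] / w.sum()
    err = preds - values
    return err.mean(), np.sqrt(np.mean(err ** 2))   # mean error, RMSE

# Hypothetical monitoring-station data: a smooth spatial trend plus noise.
rng = np.random.default_rng(5)
stations = rng.uniform(0.0, 50.0, (30, 2))
ozone = 40 + 0.3 * stations[:, 0] + rng.normal(0.0, 2.0, 30)
me, rmse = loo_idw_cv(stations, ozone)
print(round(me, 2), round(rmse, 2))
```

A mean error near zero indicates the interpolator is unbiased over the network; the RMSE gives the typical magnitude of the prediction error, which is what the ArcGIS cross-validation report summarises.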
Assessment of spatial distribution of fallout radionuclides through geostatistics concept.
Mabit, L; Bernard, C
2007-01-01
After introducing the geostatistics concept and its utility in environmental science, especially in Fallout Radionuclide (FRN) spatialisation, a case study of cesium-137 (137Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different interpolation techniques [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a 137Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of 137Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This 137Cs semivariogram showed good autocorrelation (R^2 = 0.91) and was well structured (nugget-to-sill ratio of 4%). It also revealed that the sampling strategy was adequate to reveal the spatial correlation of 137Cs. The spatial redistribution of 137Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 × 10^7 Bq of 137Cs (around 30% of the total initial fallout) were missing, exported by physical processes (runoff and erosion) from the area under investigation. The cross-validation analysis showed that, in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution.
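Ordinary kriging with a spherical semivariogram, as used in this case study, can be sketched in a few lines. The nugget, sill and range below echo the fitted values quoted above (range 30 m, ~4% nugget-to-sill), but the sample points and values are synthetic, not the study's data.

```python
import numpy as np

def spherical(h, nugget, sill, rang):
    """Spherical semivariogram: 0 at h=0, rises to the sill at h=rang, flat beyond."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rang - 0.5 * (h / rang) ** 3)
    return np.where(h == 0.0, 0.0, np.where(h >= rang, sill, g))

def ordinary_kriging(coords, values, targets, nugget=0.04, sill=1.0, rang=30.0):
    """Solve the ordinary-kriging system (semivariogram form, with a Lagrange
    multiplier enforcing weights that sum to one) at each target point."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = spherical(d, nugget, sill, rang)
    A[n, :n] = A[:n, n] = 1.0            # unbiasedness constraint
    preds = np.empty(len(targets))
    for k, t in enumerate(targets):
        g = spherical(np.linalg.norm(coords - t, axis=1), nugget, sill, rang)
        w = np.linalg.solve(A, np.append(g, 1.0))[:n]
        preds[k] = w @ values
    return preds

# Synthetic demonstration: 25 sample points in a 100 m x 100 m field.
gen = np.random.default_rng(6)
pts = gen.uniform(0.0, 100.0, (25, 2))
vals = gen.normal(0.0, 1.0, 25)
est = ordinary_kriging(pts, vals, np.vstack([pts[0], [50.0, 50.0]]))
print(est.round(3))
```

Because the semivariogram is zero at zero distance, the predictor is exact: kriging at a sampled location returns the observed value, while predictions elsewhere are weighted averages governed by the fitted model.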
Introduction: Conversation Analysis in Applied Linguistics
Sert, Olcay; Seedhouse, Paul
2011-01-01
This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…
Applied surface analysis in magnetic storage technology
Windeln, Johannes; Bram, Christian; Eckes, Heinz-Ludwig; Hammel, Dirk; Huth, Johanna; Marien, Jan; Röhl, Holger; Schug, Christoph; Wahl, Michael; Wienss, Andreas
2001-07-01
This paper gives a synopsis of today's challenges and requirements for a surface analysis and materials science laboratory with a special focus on magnetic recording technology. The critical magnetic recording components, i.e. the protective carbon overcoat (COC), the disk layer structure, the read/write head including the giant-magnetoresistive (GMR) sensor, are described and options for their characterization with specific surface and structure analysis techniques are given. For COC investigations, applications of Raman spectroscopy to the structural analysis and determination of thickness, hydrogen and nitrogen content are discussed. Hardness measurements by atomic force microscopy (AFM) scratching techniques are presented. Surface adsorption phenomena on disk substrates or finished disks are characterized by contact angle analysis or so-called piezo-electric mass adsorption systems (PEMAS), also known as quartz crystal microbalance (QCM). A quickly growing field of applications is listed for various X-ray analysis techniques, such as disk magnetic layer texture analysis for X-ray diffraction, compositional characterization via X-ray fluorescence, compositional analysis with high lateral resolution via electron microprobe analysis. X-ray reflectometry (XRR) has become a standard method for the absolute measurement of individual layer thicknesses contained in multi-layer stacks and thus, is the successor of ellipsometry for this application. Due to the ongoing reduction of critical feature sizes, the analytical challenges in terms of lateral resolution, sensitivity limits and dedicated nano-preparation have been consistently growing and can only be met by state-of-the-art Auger electron spectrometers (AES), transmission electron microscopy (TEM) analysis, time-of-flight-secondary ion mass spectroscopy (ToF-SIMS) characterization, focused ion beam (FIB) sectioning and TEM lamella preparation via FIB. The depth profiling of GMR sensor full stacks was significantly
Functional Data Analysis Applied in Chemometrics
DEFF Research Database (Denmark)
Muller, Martha
In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well known by Ramsay & Silverman's monograph in 1997. In functional data analysis, the data are curves instead of data points. Each curve is measured at discrete points along a continuum, for example, time or frequency. It is assumed that the underlying process generating the curves is smooth. The work spans the worlds of statistics and chemometrics. We want to provide a glimpse of the essential and complex data pre-processing that is well known to chemometricians, but is generally unknown to statisticians. Pre-processing can potentially have a strong influence on the results of subsequent data analysis.
Applied time series analysis and innovative computing
Ao, Sio-Iong
2010-01-01
This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.
Spatial analysis methodology applied to rural electrification
Energy Technology Data Exchange (ETDEWEB)
Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)
2006-08-15
The use of geographical information systems (GISs) in studies of the regional integration of renewable energies provides advantages such as speed, amount of information and analysis capacity, among others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS. This makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method for rural electrification with renewable energies is proposed, structured in three stages, with the aim of finding out the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis; and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)
An elementary introduction to applied signal analysis
DEFF Research Database (Denmark)
Jacobsen, Finn
2000-01-01
An introduction to some of the most fundamental concepts and methods of signal analysis and signal processing is presented, with particular regard to acoustic measurements. The purpose is to give the reader enough basic knowledge of signal analysis to use modern digital equipment in some of the most important acoustic measurements, e.g. measurements of transfer functions of lightly damped multi-modal systems (rooms and structures).
An elementary introduction to applied signal analysis
DEFF Research Database (Denmark)
Jacobsen, Finn
1997-01-01
An introduction to some of the most fundamental concepts and methods of signal analysis and signal processing is presented, with particular regard to acoustic measurements. The purpose is to give the reader enough basic knowledge of signal analysis to use modern digital equipment in some of the most important acoustic measurements, e.g. measurements of transfer functions of lightly damped multi-modal systems (rooms and structures).
Applied modal analysis of wind turbine blades
DEFF Research Database (Denmark)
Pedersen, H.B.; Kristensen, O.J.D.
2003-01-01
In this project, modal analysis has been used to determine the natural frequencies, damping and mode shapes of wind turbine blades. Different methods to measure the position and adjust the direction of the measuring points are discussed. Different equipment for mounting the accelerometers is investigated by repeated measurements on the same wind turbine blade. Furthermore, the flexibility of the test set-up is investigated by use of accelerometers mounted on the flexible adapter plate during the measurement campaign. One experimental campaign investigated the results obtained from a loaded and an unloaded wind turbine blade. During this campaign the modal analysis was performed on a blade mounted in a horizontal and a vertical position, respectively. Finally, the results obtained from modal analysis carried out on a wind turbine blade are compared with results obtained from the Stig Øyes blade_EV1...
Applied quantitative analysis in the social sciences
Petscher, Yaacov; Compton, Donald L
2013-01-01
To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-reviewed journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis-of-variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model
Applying Image Matching to Video Analysis
2010-09-01
Applied Spectrophotometry: Analysis of a Biochemical Mixture
Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene
2013-01-01
Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
Applications of geostatistics in plant nematology.
Wallace, M K; Hawkins, D M
1994-12-01
The application of geostatistics to plant nematology was demonstrated by evaluating soil and nematode data acquired from 200 soil samples collected from the A(p) horizon of a reed canary-grass field in northern Minnesota. Geostatistical concepts relevant to nematology include semi-variogram modelling, kriging, and change of support calculations. Soil and nematode data generally followed a spherical semi-variogram model, with little random variability associated with soil data and large inherent variability for nematode data. Block kriging of soil and nematode data provided useful contour maps of the data. Change of support calculations indicated that most of the random variation in nematode data was due to short-range spatial variability in the nematode population densities.
Spatial prediction of soil penetration resistance using functional geostatistics
Directory of Open Access Journals (Sweden)
Diego Leonardo Cortés-D
ABSTRACT Knowledge of agricultural soils is a relevant factor for the sustainable development of farming activities. Studies on agricultural soils usually begin with the analysis of data obtained from sampling a finite number of sites in a particular region of interest. The variables measured at each site can be scalar (chemical properties) or functional (water infiltration or penetration resistance). The use of functional geostatistics (FG) allows one to perform spatial interpolation of curves, generating prediction curves (instead of single values) at sites that lack information. This study analyzed soil penetration resistance (PR) data measured between 0 and 35 cm depth at 75 sites within a 37 ha plot dedicated to livestock. The data from each site were converted to curves using non-parametric smoothing techniques; a B-spline basis of 18 functions was used to estimate the PR curve at each of the 75 sites. The applicability of FG as a spatial prediction tool for PR curves was then evaluated using cross-validation, and the results were compared with the classical spatial prediction methods (univariate geostatistics) generally used for this type of information. We concluded that FG is a reliable tool for analyzing PR because a high correlation was obtained between the observed and predicted curves (R2 = 94 %). In addition, descriptive analyses calculated from field data and from the FG models gave similar results for the observed and predicted values.
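The smoothing step (turning each site's depth profile into a curve) can be sketched with SciPy's smoothing splines. Note this is a hedged illustration: it uses `splrep`'s automatic knot selection rather than the study's fixed 18-function B-spline basis, and the profile values below are invented.

```python
import numpy as np
from scipy.interpolate import splrep, splev

# Hypothetical penetration-resistance profile at one site:
# depths 0-35 cm with noisy readings (values are illustrative only).
depth = np.linspace(0, 35, 36)
rng = np.random.default_rng(0)
pr = 1.5 + 0.04 * depth + 0.2 * np.sin(depth / 5) + rng.normal(0, 0.05, depth.size)

# Fit a cubic smoothing B-spline; s trades smoothness against fidelity.
tck = splrep(depth, pr, k=3, s=0.5)

# The fitted curve can now be evaluated at any depth in [0, 35],
# which is what makes curve-valued kriging possible downstream.
smoothed = splev(np.linspace(0, 35, 200), tck)
print(smoothed.shape)
```

Once every site is represented this way, the spatial prediction operates on whole curves rather than on scalars.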
Validating spatial structure in canopy water content using geostatistics
Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.
1995-01-01
Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analysis becomes more sophisticated, such as detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been extensively used in geological or hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.
Applying centrality measures to impact analysis: A coauthorship network analysis
Yan, Erjia
2010-01-01
Many studies on coauthorship networks focus on network topology and network statistical mechanics. This article takes a different approach by studying micro-level network properties, with the aim of applying centrality measures to impact analysis. Using coauthorship data from 16 journals in the field of library and information science (LIS) with a time span of twenty years (1988-2007), we construct an evolving coauthorship network and calculate four centrality measures (closeness, betweenness, degree and PageRank) for authors in this network. We find that the four centrality measures are significantly correlated with citation counts. We also discuss the usability of centrality measures in author ranking, and suggest that centrality measures can be useful indicators for impact analysis.
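A toy version of this centrality-vs-citations analysis can be sketched with NetworkX and SciPy; the coauthorship graph and citation counts below are invented purely to illustrate the correlation step, not data from the study.

```python
import networkx as nx
from scipy.stats import pearsonr

# Hypothetical coauthorship graph and per-author citation counts.
G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")])
citations = {"A": 30, "B": 25, "C": 60, "D": 40, "E": 20, "F": 5}

# The four centrality measures named in the abstract.
centrality = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "pagerank": nx.pagerank(G),
}

# Correlate each measure with citation counts across authors.
authors = sorted(G.nodes)
cites = [citations[a] for a in authors]
for name, scores in centrality.items():
    r, p = pearsonr([scores[a] for a in authors], cites)
    print(f"{name}: r = {r:.2f}")
```

On the real data the interesting question is which measure tracks citations best; here the tiny graph only demonstrates the mechanics.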
Thermal analysis applied to irradiated propolis
Energy Technology Data Exchange (ETDEWEB)
Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br
2002-03-01
Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 °C. Similarly, differential scanning calorimetry showed coinciding melting points for irradiated and unirradiated samples. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.
Artificial intelligence applied to process signal analysis
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
Photometric analysis applied in determining facial type
Directory of Open Access Journals (Sweden)
Luciana Flaquer Martins
2012-10-01
INTRODUCTION: In orthodontics, determining the facial type is a key element in reaching a correct diagnosis. In the early days of the specialty, observation and measurement of craniofacial structures were done directly on the face, in photographs or on plaster casts. With the development of radiographic methods, cephalometric analysis replaced direct facial analysis. Seeking to validate the analysis of facial soft tissues, this work compares two different methods used to determine facial type, the anthropometric and the cephalometric methods. METHODS: The sample consisted of sixty-four Brazilian individuals, adults, Caucasian, of both genders, who agreed to participate in this research. All individuals had lateral cephalograms and frontal facial photographs. The facial types were determined by the Vert Index (cephalometric) and the Facial Index (photographs). RESULTS: The agreement analysis (Kappa), made for both types of analysis, found an agreement of 76.5%. CONCLUSIONS: We concluded that the Facial Index can be used as an adjunct to orthodontic diagnosis, or as an alternative method for pre-selection of a sample, so that research subjects do not have to undergo unnecessary tests.
Multivariate analysis applied to tomato hybrid production.
Balasch, S; Nuez, F; Palomares, G; Cuartero, J
1984-11-01
Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic-house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each of them varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of characters, crossings among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
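The principal-components step described above can be sketched with an SVD-based PCA; the 60 x 20 matrix below is random stand-in data, not the tomato measurements.

```python
import numpy as np

# Illustrative stand-in for the plot-level data: 60 varieties x 20 characters.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 20))

# Principal components via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)   # variance share per component
scores = Xc @ Vt.T                # variety coordinates on the PCs

# The leading components summarize which characters co-vary most strongly.
print(explained[:3])
```

With real measurements, the loadings in `Vt` would indicate which characters dominate each component in each environment.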
Toward applied behavior analysis of life aloft
Brady, J. V.
1990-01-01
This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most
Digital photoelastic analysis applied to implant dentistry
Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.
2016-12-01
Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of the connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model making procedure, closely mimicking all the anatomical features of the human mandible, is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest were analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, a five-step method is used, and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data are also obtained, for the first time in implant dentistry, which could yield important information for improving the structural stability of implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.
Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process
Motley, Albert E., III
2000-01-01
One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.
Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing
Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)
2000-01-01
Geology for allowing us to use C&G as a vehicle to convey how geostatistics and geospatial techniques can be used to analyze remote sensing and other types of spatial data. We see this special issue of C&G, and its complementary issue of PE&RS, as a testament to the vitality of and interest in the application of geostatistical and geospatial techniques in remote sensing. We also see these special journal issues as the beginning of a fruitful, and hopefully long-term, relationship between American and British geographers and other researchers interested in geostatistical and geospatial techniques applied to remote sensing and other spatial data.
Applied research of environmental monitoring using instrumental neutron activation analysis
Energy Technology Data Exchange (ETDEWEB)
Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju
1997-08-01
This technical report is written as a guide book for applied research in environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter; analytical methodologies; data evaluation and interpretation; and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.
Hands on applied finite element analysis application with ANSYS
Arslan, Mehmet Ali
2015-01-01
Hands on Applied Finite Element Analysis Application with ANSYS is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions, guiding them with real-life hands-on experience.
Geo-statistical analysis of spatial patterns of Batocera davidis larvae
Institute of Scientific and Technical Information of China (English)
张思禄
2012-01-01
Batocera davidis Deyrolle is a new poplar pest. Geostatistical methods were applied to measure and analyze the spatial pattern of B. davidis larvae. The results showed that the population spatial pattern displayed an aggregation model with significant spatial dependence. The spatial range of aggregation fluctuated from 10 to 13 m across all sampling sites. Based on the variogram model parameters, the correlation distance was greater in the south-north orientation than in the east-west orientation, suggesting that the aggregation patches of this pest population are not round but elongated south-north; south-north is both the main orientation of aggregation and the main direction of dispersal. This offers a scientific basis for the monitoring and prediction of B. davidis.
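The variogram analysis underlying range estimates like the 10-13 m figure above can be illustrated with a classical (Matheron) empirical semivariogram; the sampling grid and values below are simulated stand-ins, not the study's field counts.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Matheron estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2
    over point pairs whose separation distance is within tol of h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)      # each pair counted once
    gamma = []
    for h in lags:
        mask = np.abs(d[i, j] - h) <= tol
        gamma.append(sq[i, j][mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical counts on a 10 x 10 grid with 3 m spacing (illustrative only).
rng = np.random.default_rng(2)
xy = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float) * 3.0
z = np.sin(xy[:, 0] / 8) + np.sin(xy[:, 1] / 8) + rng.normal(0, 0.1, len(xy))

lags = np.arange(3, 28, 3, dtype=float)
gamma = empirical_semivariogram(xy, z, lags, tol=1.5)
print(gamma)
```

Fitting a model (e.g. spherical) to such a curve yields the range parameter; computing the curve separately along south-north and east-west pairs is how the directional (anisotropic) ranges in the abstract are obtained.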
Medical Geography: a Promising Field of Application for Geostatistics.
Goovaerts, P
2009-01-01
The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970-1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. The area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Lastly, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah.
The application of geostatistics in erosion hazard mapping
Beurden, S.A.H.A. van; Riezebos, H.Th.
1988-01-01
Geostatistical interpolation or kriging of soil and vegetation variables has become an important alternative to other mapping techniques. Although a reconnaissance sampling is necessary and basic requirements of geostatistics have to be met, kriging has the advantage of giving estimates with a minim
DEFF Research Database (Denmark)
Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik;
2010-01-01
the geology of e.g. a contaminated site, it is not always possible to gather enough information to build a representative geological model. Mapping in analogue geological settings and applying geostatistical tools to simulate the spatial variability of heterogeneities can improve ordinary geological models that are predicated only on vertical borehole information. This study documents methods to map geological heterogeneity in clay till and ways to calibrate geostatistical models with field observations. A well-exposed cross-section in an excavation pit was used to measure and illustrate the occurrence and distribution of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...
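The transition-probability idea behind the Markov chain approach can be sketched in a minimal one-dimensional form; the clay/sand transition probabilities below are invented for illustration, not the values calibrated in the study.

```python
import numpy as np

# Facies transition matrix along a horizontal transect (invented values):
# row = current facies, column = next facies; rows sum to 1.
P = np.array([[0.95, 0.05],   # clay -> clay, clay -> sand
              [0.30, 0.70]])  # sand -> clay, sand -> sand

rng = np.random.default_rng(3)
state, seq = 0, []            # 0 = clay till, 1 = sand lens
for _ in range(500):
    seq.append(state)
    state = rng.choice(2, p=P[state])

seq = np.array(seq)
print("simulated sand proportion:", seq.mean())
```

The diagonal entries control how persistent each facies is, so the simulated lens lengths and sand proportion can be tuned to match transition frequencies measured on an exposed cross-section.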
Animal Research in the "Journal of Applied Behavior Analysis"
Edwards, Timothy L.; Poling, Alan
2011-01-01
This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…
Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.
Iwata, Brian A.
1987-01-01
The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…
Andrade, A I A S S; Stigter, T Y
2013-04-01
In this study multivariate and geostatistical methods are jointly applied to model the spatial and temporal distribution of arsenic (As) concentrations in shallow groundwater as a function of physicochemical, hydrogeological and land use parameters, as well as to assess the related uncertainty. The study site is located in the Mondego River alluvial body in Central Portugal, where maize, rice and some vegetable crops dominate. In a first analysis, scatter plots are used, followed by the application of principal component analysis to two different data matrices, of 112 and 200 samples, with the aim of detecting associations between As levels and other quantitative parameters. In the following phase, explanatory models of As are created through factorial regression based on correspondence analysis, integrating both quantitative and qualitative parameters. Finally, these are combined with indicator-geostatistical techniques to create maps indicating the predicted probability of As concentrations in groundwater exceeding the current global drinking water guideline of 10 μg/l. These maps further allow assessing the uncertainty and representativeness of the monitoring network. A clear effect of the redox state on the presence of As is observed, which, together with significant correlations with dissolved oxygen, nitrate, sulfate, iron, manganese and alkalinity, points towards the reductive dissolution of Fe (hydr)oxides as the essential mechanism of As release. The association of high As values with the rice crop, known to promote reducing environments due to ponding, further corroborates this hypothesis. An additional source of As from fertilizers cannot be excluded, as the correlation with As is higher where rice is associated with vegetables, normally linked to higher fertilization rates. The best explanatory model of As occurrence integrates the parameters season, crop type, well and water depth, nitrate and Eh, though a model without the last two parameters also gives
Applied Thinking for Intelligence Analysis: A Guide for Practitioners
Directory of Open Access Journals (Sweden)
Trista M. Bailey
2015-03-01
Book Review -- Applied Thinking for Intelligence Analysis: A Guide for Practitioners by Charles Vandepeer, PhD, Air Power Development Centre, Department of Defence, Canberra, Australia, 2014, 106 pages, ISBN 13: 9781925062045. Reviewed by Trista M. Bailey.
Applying Discourse Analysis in ELT: a Five Cs Model
Institute of Scientific and Technical Information of China (English)
肖巧慧
2009-01-01
Based on a discussion of definitions of discourse analysis, discourse is regarded as consisting of five layered elements: cohesion, coherence, culture, critique and context. Moreover, we focus on applying DA in ELT.
Geostatistical sampling optimization and waste characterization of contaminated premises
Energy Technology Data Exchange (ETDEWEB)
Desnoyers, Y.; Jeannee, N. [GEOVARIANCES, 49bis avenue Franklin Roosevelt, BP91, Avon, 77212 (France); Chiles, J.P. [Centre de geostatistique, Ecole des Mines de Paris (France); Dubot, D. [CEA DSV/FAR/USLT/SPRE/SAS (France); Lamadie, F. [CEA DEN/VRH/DTEC/SDTC/LTM (France)
2009-06-15
At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires a radiological assessment of the building structure residual activity. From this point of view, the set-up of an appropriate evaluation methodology is of crucial importance. The radiological characterization of contaminated premises can be divided into three steps. First, an exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive) control of the emergent signal is commonly performed using in situ measurement methods such as surface controls combined with in situ gamma spectrometry. Finally, in order to assess the contamination depth, samples are collected at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data allow the definition of a preliminary waste zoning. The exhaustive control of the emergent signal with surface measurements usually leads to inaccurate estimates, because of several factors: varying position of the measuring device, subtraction of an estimate of the background signal, etc. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The initial activity usually presents a spatial continuity within the premises, with preferential contamination of specific areas or existence of activity gradients. Taking into account this spatial continuity is essential to avoid bias while setting up the sampling plan. In such a case, geostatistics provides methods that integrate the spatial structure of the contamination. After the characterization of this spatial structure, most probable estimates of the surface activity at un-sampled locations can be derived using kriging techniques. Variants of these techniques also give access to estimates of the uncertainty associated with the spatial
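The kriging step described here (estimating surface activity at un-sampled locations from the modeled spatial structure) can be sketched as a minimal ordinary-kriging solver. The spherical variogram parameters and sample values below are illustrative assumptions, not fitted survey data.

```python
import numpy as np

def ordinary_kriging(coords, z, target, sill=1.0, a=10.0, nugget=0.05):
    """Ordinary kriging with an assumed spherical variogram
    (parameters are illustrative, not calibrated to any survey)."""
    def gamma(h):
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h == 0, 0.0, np.where(h >= a, sill, g))

    n = len(z)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system: semivariogram matrix bordered by the unbiasedness row.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(coords - target, axis=1))

    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w @ b              # kriging variance at the target
    return estimate, variance

# Four hypothetical surface-activity samples around an unmeasured point.
coords = np.array([[0., 0.], [5., 0.], [0., 5.], [5., 5.]])
z = np.array([2.0, 2.4, 1.8, 2.2])
est, var = ordinary_kriging(coords, z, np.array([2.5, 2.5]))
print(est, var)
```

The kriging variance returned alongside the estimate is what makes the uncertainty mapping mentioned at the end of the abstract possible.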
Geostatistical simulations for radon indoor with a nested model including the housing factor.
Cafaro, C; Giovani, C; Garavaglia, M
2016-01-01
The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours, and the authorities therefore ask for support to develop an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by means of external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained by applying common lognormal kriging. Here, instead, such a multivariate approach reduces the cross-validation residual variance to an extent deemed satisfactory. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative radon-prone-area definition than the one previously made by lognormal kriging.
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, in each step, selects the space-time point that minimizes a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting, from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
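The sequential selection idea (pick, at each step, the measurement that most reduces a variance criterion, then condition on it) can be sketched with a Gaussian/Kalman rank-1 conditioning update. The covariance model, noise level and candidate points below are invented for illustration, not the aquifer's fitted space-time covariance.

```python
import numpy as np

# Candidate monitoring points and an assumed exponential prior covariance.
rng = np.random.default_rng(4)
pts = rng.uniform(0, 10, size=(30, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
P = np.exp(-d / 3.0)          # prior covariance (illustrative model)
noise = 0.1                   # measurement-noise variance (assumed)

chosen = []
for _ in range(5):
    # Try each remaining candidate; keep the one minimizing total variance.
    best, best_trace = None, np.inf
    for j in range(len(pts)):
        if j in chosen:
            continue
        k = P[:, j]
        trial = P - np.outer(k, k) / (P[j, j] + noise)   # Kalman update
        if np.trace(trial) < best_trace:
            best, best_trace = j, np.trace(trial)
    # Commit the best measurement: condition the covariance on it.
    k = P[:, best]
    P = P - np.outer(k, k) / (P[best, best] + noise)
    chosen.append(best)

print("selected points:", chosen)
```

Each accepted measurement shrinks the variance everywhere its covariance reaches, which is exactly why redundant nearby points stop being selected.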
Weiland, E.F.; Connors, R.A.; Robinson, M.L.; Lindemann, J.W.; Meyer, W.T.
1982-01-01
A mineral assessment of the Arkansas Canyon Planning Unit was undertaken by Barringer Resources Inc., under the terms of contract YA-553-CTO-100 with the Bureau of Land Management, Colorado State Office. The study was based on a geochemical-geostatistical survey in which 700 stream sediment samples were collected and analyzed for 25 elements. Geochemical results were interpreted by statistical processing which included factor, discriminant, multiple regression and characteristic analysis. The major deposit types evaluated were massive sulfide-base metal, sedimentary and magmatic uranium, thorium vein, magmatic segregation, and carbonatite related deposits. Results of the single element data and multivariate geostatistical analysis indicate that limited potential exists for base metal mineralization near the Horseshoe, El Plomo, and Green Mountain Mines. Thirty areas are considered to be anomalous with regard to one or more of the geochemical parameters evaluated during this study. The evaluation of carbonatite related mineralization was restricted due to the lack of geochemical data specific to this environment.
Applied data analysis and modeling for energy engineers and scientists
Reddy, T Agami
2011-01-01
""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and
Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.
Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan
2015-01-01
The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis of the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlate, increasingly co-occur, and have an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
Park, No-Wook; Jang, Dong-Ho
2014-01-01
This paper compares the predictive performance of different geostatistical kriging algorithms for intertidal surface sediment facies mapping using grain size data. Indicator kriging, which maps facies types from conditional probabilities of predefined facies types, is first considered. In the second approach, grain size fractions are first predicted using cokriging and the facies types are then mapped. As grain size fractions are compositional data, their characteristics should be considered during spatial prediction. For efficient prediction of compositional data, additive log-ratio transformation is applied before cokriging analysis. The predictive performance of cokriging of the transformed variables is compared with that of cokriging of raw fractions in terms of both prediction errors of fractions and facies mapping accuracy. From a case study of the Baramarae tidal flat, Korea, the mapping method based on cokriging of log-ratio transformation of fractions outperformed the one based on cokriging of untransformed fractions in the prediction of fractions and produced the best facies mapping accuracy. Indicator kriging that could not account for the variation of fractions within each facies type showed the worst mapping accuracy. These case study results indicate that the proper processing of grain size fractions as compositional data is important for reliable facies mapping.
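The additive log-ratio (alr) step described above maps constrained grain-size fractions into an unconstrained space where cokriging can operate, and the inverse transform returns valid compositions. A minimal sketch of the transform pair, with hypothetical sand/silt/clay fractions (the kriging itself is omitted):

```python
import numpy as np

def alr(fracs, eps=1e-9):
    """Additive log-ratio transform: log of each fraction relative to
    the last component (sand, silt, clay -> 2 unconstrained values)."""
    f = np.asarray(fracs, dtype=float) + eps   # guard against zeros
    return np.log(f[..., :-1] / f[..., -1:])

def alr_inv(y):
    """Inverse transform: back to fractions that sum to 1."""
    e = np.exp(y)
    denom = 1.0 + e.sum(axis=-1, keepdims=True)
    return np.concatenate([e / denom, 1.0 / denom], axis=-1)

# hypothetical grain-size fractions (sand, silt, clay) at three sites
f = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.1, 0.1, 0.8]])
y = alr(f)          # interpolate / cokrige in this transformed space
back = alr_inv(y)   # then back-transform to valid compositions
```

Predicting the transformed variables and back-transforming guarantees non-negative fractions that sum to one, which raw cokriging of fractions does not.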
Ali, Azizi; Mohd Muslim, Aidy; Lokman Husain, Mohd; Fadzil Akhir, Mohd
2013-04-01
Sea surface temperature (SST) variation provides vital information for weather and ocean forecasting, especially when studying climate change. Conventional methods of collecting ocean parameters such as SST remain expensive and labor-intensive due to the large area coverage and complex analytical procedures required. Studies of the spatial and temporal distribution of ocean parameters are therefore needed. This study examines geostatistical methods for interpolating SST values and their impact on accuracy. Two spatial geostatistical techniques, kriging and inverse distance weighting (IDW), were applied to create variability distribution maps of SST for the southern South China Sea (SCS). Data from 72 sampling stations were collected in July 2012, covering an area of 270 km x 100 km up to 263 km from shore. These data provide the basis for the interpolation and accuracy analysis. After normalization, variograms were computed to fit the data sets, producing models with the least RSS value. Accuracy was later evaluated based on root mean squared error (RMSE) and root mean kriging variance (RMKV). Results show that kriging with an exponential model produced the most accurate estimates, reducing error by 17.3% compared with inverse distance weighting.
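The IDW baseline and the RMSE accuracy measure used in the comparison are simple to sketch. The data below are synthetic (a smooth temperature gradient plus noise standing in for the 72-station survey), so the numbers illustrate the mechanics only, not the study's results.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse-distance-weighted interpolation: each prediction is a
    weighted mean of observations, weights = 1 / distance**power."""
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)              # avoid division by zero
    w = 1.0 / d ** power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

def rmse(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

# synthetic "SST" field: smooth west-east gradient plus noise
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(72, 2))    # 72 hypothetical stations
sst = 28.0 + 0.02 * xy[:, 0] + rng.normal(0, 0.1, 72)
train, test = xy[:60], xy[60:]
z_train, z_test = sst[:60], sst[60:]
preds = idw(train, z_train, test)
err = rmse(preds, z_test)
```

Holding out stations and computing the RMSE of predictions against them, as above, is the standard way such kriging-versus-IDW comparisons are scored.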
Negative reinforcement in applied behavior analysis: an emerging technology.
Iwata, B A
1987-01-01
Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas...
Geostatistics, remote sensing and precision farming.
Mulla, D J
1997-01-01
Precision farming is possible today because of advances in farming technology, procedures for mapping and interpolating spatial patterns, and geographic information systems for overlaying and interpreting several soil, landscape and crop attributes. The key component of precision farming is the map showing spatial patterns in field characteristics. Obtaining information for this map is often achieved by soil sampling. This approach, however, can be cost-prohibitive for grain crops. Soil sampling strategies can be simplified by use of auxiliary data provided by satellite or aerial photo imagery. This paper describes geostatistical methods for estimating spatial patterns in soil organic matter, soil test phosphorus and wheat grain yield from a combination of Thematic Mapper imaging and soil sampling.
Geostatistical Estimations of Regional Hydraulic Conductivity Fields
Patriarche, D.; Castro, M. C.; Goovaerts, P.
2004-12-01
Direct and indirect measurements of hydraulic conductivity (K) are commonly performed, providing information on the magnitude of this parameter at the local scale (tens of centimeters to hundreds of meters) and at shallow depths. By contrast, field information on hydraulic conductivities at regional scales of tens to hundreds of kilometers and at greater depths is relatively scarce. Geostatistical methods allow sparsely sampled observations of a variable (primary information) to be complemented by a more densely sampled secondary attribute. Geostatistical estimations of the hydraulic conductivity field in the Carrizo aquifer, a major groundwater flow system extending along Texas, are performed using available primary (e.g., transmissivity, hydraulic conductivity) and secondary (specific capacity) information, for depths up to 2.2 km, and over three regional domains of increasing extent: 1) the domain corresponding to a three-dimensional groundwater flow model previously built (model domain); 2) the area corresponding to the ten counties encompassing the model domain (County domain); and 3) the full extension of the Carrizo aquifer within Texas (Texas domain). Two different approaches are used: 1) an indirect approach, where transmissivity (T) is estimated first and K is retrieved through division of the T estimate by the screening length of the wells; and 2) a direct approach, where K data are kriged directly. Prediction performances of the tested geostatistical procedures (kriging combined with linear regression, kriging with known local means, kriging of residuals, and cokriging) are evaluated through cross-validation for both log-transformed and back-transformed variables. For the indirect approach, kriging of log T residuals yields the best estimates for both log-transformed and back-transformed variables in the model domain. At larger regional scales (County and Texas domains), cokriging generally performs better than univariate kriging procedures.
Directory of Open Access Journals (Sweden)
Marco Barra
Full Text Available Geostatistical techniques were applied and a series of spatial indicators were calculated (occupation, aggregation, location, dispersion, spatial autocorrelation and overlap) to characterize the spatial distributions of European anchovy and sardine during summer. Two ecosystems were compared for this purpose, both located in the Mediterranean Sea: the Strait of Sicily (upwelling area) and the North Aegean Sea (continental shelf area, influenced by freshwater). Although the biomass of anchovy and sardine presented high interannual variability in both areas, the location of the centres of gravity and the main spatial patches of their populations were very similar between years. The size of the patches representing the dominant part of the abundance (80%) was mostly ecosystem- and species-specific. Occupation (area of presence) appears to be shaped by the extent of suitable habitats in each ecosystem, whereas aggregation patterns (how the populations are distributed within the area of presence) were species-specific and related to levels of population biomass. In the upwelling area, both species showed consistently higher occupation values compared to the continental shelf area. Certain characteristics of the spatial distribution of sardine (e.g. spreading area, overlapping with anchovy) differed substantially between the two ecosystems. Principal component analysis of geostatistical and spatial indicators revealed that biomass was significantly related to a suite of, rather than single, spatial indicators. At the spatial scale of our study, strong correlations emerged between biomass and the first principal component axis, with highly positive loadings for occupation, aggregation and patchiness, independently of species and ecosystem. Overlap between anchovy and sardine increased with increasing sardine biomass but decreased with increasing anchovy biomass. This contrasting pattern was attributed to the location of the respective major patches
Simultaneous inversion of petrophysical parameters based on geostatistical a priori information
Institute of Scientific and Technical Information of China (English)
Yin Xing-Yao; Sun Rui-Ying; Wang Bao-Li; Zhang Guang-Zhi
2014-01-01
The high-resolution nonlinear simultaneous inversion of petrophysical parameters is based on Bayesian statistics and combines petrophysics with geostatistical a priori information. We used the fast Fourier transform-moving average (FFT-MA) and gradual deformation method (GDM) to obtain a reasonable variogram by using structural analysis and geostatistical a priori information of petrophysical parameters. Subsequently, we constructed the likelihood function according to the statistical petrophysical model. Finally, we used the Metropolis algorithm to sample the posterior probability density and complete the inversion of the petrophysical parameters. We used the proposed method to process data from an oil field in China and found a good match between inverted and real data at high resolution. In addition, the direct inversion of petrophysical parameters avoids error accumulation, decreases uncertainty, and increases computational efficiency.
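The Metropolis step named above, sampling a posterior density by accept/reject moves, can be illustrated in one dimension. This is a deliberately minimal random-walk sketch on a synthetic Gaussian "posterior"; the paper's actual target is a high-dimensional petrophysical posterior built from the FFT-MA/GDM prior and the likelihood, none of which is reproduced here.

```python
import numpy as np

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, step); accept with
    probability min(1, post(x') / post(x)), else keep the current state."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.normal(0, step)
        lpp = log_post(xp)
        if np.log(rng.uniform()) < lpp - lp:   # accept/reject step
            x, lp = xp, lpp
        samples.append(x)
    return np.array(samples)

# toy "posterior": Gaussian centred on 2.0 with standard deviation 0.5
log_post = lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2
chain = metropolis(log_post, x0=0.0, n_steps=5000)
burned = chain[1000:]   # discard burn-in before summarizing
```

After burn-in the chain's sample mean and spread approximate the posterior's, which is the property the inversion relies on when it reports parameter estimates and uncertainties.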
Spectral analysis and filter theory in applied geophysics
Buttkus, Burkhard
2000-01-01
This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...
Directory of Open Access Journals (Sweden)
Jay Krishna Thakur
2015-08-01
Full Text Available The aim of this work is to investigate new approaches to the spatio-temporal optimization of groundwater monitoring networks using methods based on statistics and geostatistics. The formulated and integrated methods were tested with the groundwater quality data set of Bitterfeld/Wolfen, Germany. Spatially, the monitoring network was optimized using geostatistical methods. Temporal optimization of the monitoring network was carried out using Sen's method (1968). For geostatistical network optimization, a geostatistical spatio-temporal algorithm was used to identify redundant wells in 2- and 2.5-D Quaternary and Tertiary aquifers. The influences of interpolation block width, dimension, contaminant association, groundwater flow direction and aquifer homogeneity on statistical and geostatistical methods for monitoring network optimization were analysed. The integrated approach shows 37% and 28% redundancy in the monitoring networks of the Quaternary and Tertiary aquifers, respectively. The geostatistical method also recommends 41 and 22 new monitoring wells in the Quaternary and Tertiary aquifers, respectively. In temporal optimization, an overall optimized sampling interval was recommended in terms of the lower quartile (238 days), median (317 days) and upper quartile (401 days) in the research area of Bitterfeld/Wolfen. The demonstrated methods for improving a groundwater monitoring network can be used in real monitoring network optimization, with due consideration given to the influencing factors.
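Sen's (1968) method used for the temporal optimization is the median-of-pairwise-slopes trend estimator. A minimal sketch on a hypothetical groundwater-level series (all values synthetic, chosen only to exercise the estimator):

```python
import numpy as np

def sen_slope(t, y):
    """Sen's (1968) slope estimator: the median of all pairwise slopes
    (y[j] - y[i]) / (t[j] - t[i]); robust to outliers and non-normality."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i in range(len(t)) for j in range(i + 1, len(t))]
    return float(np.median(slopes))

# hypothetical groundwater-level series: slow decline plus noise
t = np.arange(0, 1000, 50)                        # sampling days
rng = np.random.default_rng(1)
y = 100.0 - 0.01 * t + rng.normal(0, 0.5, t.size) # synthetic levels
slope = sen_slope(t, y)                           # units per day (synthetic)
```

Because the estimate is a median, a few anomalous samples barely move it, which is why it suits sparse, noisy monitoring records when deciding how far apart samples can be spaced.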
Applying Frequency Map Analysis to the Australian Synchrotron Storage Ring
Tan, Yaw-Ren E; Le Blanc, Gregory Scott
2005-01-01
The technique of frequency map analysis has been applied to study the transverse dynamic aperture of the Australian Synchrotron Storage Ring. The results have been used to set the strengths of sextupoles to optimise the dynamic aperture. The effects of the allowed harmonics in the quadrupoles and dipole edge effects are discussed.
Research in applied mathematics, numerical analysis, and computer science
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
APPLYING OF GAS ANALYSIS IN DIAGNOSIS OF BRONCHOPULMONARY DISEASES
Directory of Open Access Journals (Sweden)
Ye. B. Bukreyeva
2014-01-01
Full Text Available Diseases of the bronchopulmonary system are among the leading causes of death. Most methods for diagnosing lung disease are invasive or unsuitable for children and patients with severe disease. One promising method for clinical diagnosis and for monitoring disease activity in the bronchopulmonary system is the analysis of human breath, using either directly exhaled breath or exhaled breath condensate. Breath analysis can be applied to diagnosis, long-term monitoring and evaluation of treatment efficacy in bronchopulmonary diseases. Differential diagnosis between chronic obstructive pulmonary disease (COPD) and bronchial asthma is complicated, even though the two differ in pathogenesis. Breath analysis makes it possible to explore features of COPD and bronchial asthma and to improve the differential diagnosis of these diseases. It can also be applied to the diagnosis of dangerous diseases such as tuberculosis and lung cancer. The analysis of exhaled air by spectroscopic methods is a new noninvasive way to diagnose bronchopulmonary diseases.
Grisotto, Laura; Consonni, Dario; Cecconi, Lorenzo; Catelan, Dolores; Lagazio, Corrado; Bertazzi, Pier Alberto; Baccini, Michela; Biggeri, Annibale
2016-04-18
In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling locations intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling.
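The preferential sampling problem described above, monitors placed where pollution is high, can be demonstrated with a tiny simulation. Everything below is synthetic (a smooth gradient stands in for the PM10 surface, and a crudely thinned inhomogeneous point process stands in for the network); the point is only that naive inference from preferentially placed monitors is biased.

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic pollution surface on a 50x50 grid: smooth west-east trend
x = np.linspace(0, 1, 50)
X, Y = np.meshgrid(x, x)
field = 20.0 + 30.0 * X        # concentration rises west -> east
true_mean = field.mean()       # the quantity naive inference targets

# preferential sampling: monitor placement probability increases with
# concentration (a crude thinning of an inhomogeneous point process)
prob = (field - field.min()) / (field.max() - field.min())
keep = rng.uniform(size=field.shape) < 0.05 * (0.2 + prob)

naive_mean = field[keep].mean()   # biased upward vs. true_mean
```

A model that, like the paper's, links the sampling-location intensity to the pollution surface through a shared spatial random component is one way to correct this bias; ignoring the link leaves `naive_mean`-style estimates systematically off.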
The Split-Apply-Combine Strategy for Data Analysis
Directory of Open Access Journals (Sweden)
Hadley Wickham
2011-04-01
Full Text Available Many data analysis problems involve the application of a split-apply-combine strategy, where you break up a big problem into manageable pieces, operate on each piece independently and then put all the pieces back together. This insight gives rise to a new R package that allows you to smoothly apply this strategy, without having to worry about the type of structure in which your data is stored. The paper includes two case studies showing how these insights make it easier to work with batting records for veteran baseball players and a large 3D array of spatio-temporal ozone measurements.
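The strategy itself is language-agnostic. The paper implements it in R (the plyr package); as an illustration only, here is the same split/apply/combine shape in Python, with hypothetical batting records echoing the paper's first case study:

```python
from collections import defaultdict

def split_apply_combine(rows, key, apply_fn):
    """Split rows into groups by `key`, apply `apply_fn` to each group
    independently, then combine the per-group results into one dict."""
    groups = defaultdict(list)
    for row in rows:                                     # split
        groups[row[key]].append(row)
    return {k: apply_fn(v) for k, v in groups.items()}   # apply + combine

# toy batting records (hypothetical values)
records = [
    {"player": "a", "hits": 120}, {"player": "a", "hits": 140},
    {"player": "b", "hits": 90},
]
career = split_apply_combine(records, "player",
                             lambda g: sum(r["hits"] for r in g))
```

Because each group is processed independently, the apply step parallelizes naturally, one of the practical payoffs the paper emphasizes.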
Razack, Moumtaz; Lasm, Théophile
2006-06-01
This work is aimed at estimating the transmissivity of highly fractured hard rock aquifers using a geostatistical approach. The studied aquifer is formed by the crystalline and metamorphic rocks of the western Ivory Coast (West Africa), in the Man Danané area. The study area covers 7290 km² (90 km × 81 km). The fracture network is dense and well connected, without a marked fracture direction. A database comprising 118 transmissivity (T) values and 154 specific capacity (Q/s) values was compiled. A significant empirical relationship between T and Q/s was found, which enabled the transmissivity data to be supplemented. The variographic analysis of the two variables showed that the variograms of T and Q/s (which are lognormal variables) are much more structured than those of log T and log Q/s (which are normal variables). This result is contrary to what was previously published and raises the question of whether normality is necessary in geostatistical analysis. Several inputs and geostatistical estimations of the transmissivity were tested using the cross-validation procedure: measured transmissivity data; supplemented transmissivity data; kriging; cokriging. The cross-validation results showed that the best estimation is provided by the kriging procedure, with the transmissivity field represented by the whole data sample (measured + estimated from specific capacity) and the structural model evaluated solely on the measured transmissivity. The geostatistical approach ultimately provided a reliable estimate of the transmissivity of the Man Danané aquifer, which will be used as an input in forthcoming modelling.
Bayesian geostatistical modeling of leishmaniasis incidence in Brazil.
Directory of Open Access Journals (Sweden)
Dimitrios-Alexios Karagiannis-Voules
Full Text Available BACKGROUND: Leishmaniasis is endemic in 98 countries, with an estimated 350 million people at risk and approximately 2 million cases annually. Brazil is one of the most severely affected countries. METHODOLOGY: We applied Bayesian geostatistical negative binomial models to analyze reported incidence data of cutaneous and visceral leishmaniasis in Brazil covering a 10-year period (2001-2010). Particular emphasis was placed on spatial and temporal patterns. The models were fitted using integrated nested Laplace approximations to perform fast approximate Bayesian inference. Bayesian variable selection was employed to determine the most important climatic, environmental, and socioeconomic predictors of cutaneous and visceral leishmaniasis. PRINCIPAL FINDINGS: For both types of leishmaniasis, precipitation and socioeconomic proxies were identified as important risk factors. The predicted numbers of cases in 2010 were 30,189 (standard deviation [SD]: 7,676) for cutaneous leishmaniasis and 4,889 (SD: 288) for visceral leishmaniasis. Our risk maps predicted the highest numbers of infected people in the states of Minas Gerais and Pará for visceral and cutaneous leishmaniasis, respectively. CONCLUSIONS/SIGNIFICANCE: Our spatially explicit, high-resolution incidence maps identified priority areas where leishmaniasis control efforts should be targeted, with the ultimate goal of reducing disease incidence.
Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description
2011-01-01
Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar traum...
An applied ethics analysis of best practice tourism entrepreneurs
2015-01-01
Ethical entrepreneurship and, by extension, wider best practice are noble goals for the future of tourism. However, questions arise as to which concepts (such as values, motivations, actions and challenges) underpin these goals. This thesis seeks to answer these questions and in so doing develop an applied ethics analysis for best practice entrepreneurs in tourism. The research is situated in sustainable tourism, which is ethically very complex and has thus far been dominated by the economic, social a...
Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling
Herrmann, Robert A.
2003-01-01
This is a Research and Instructional Development Project from the U. S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes int...
Magnetic Solid Phase Extraction Applied to Food Analysis
Directory of Open Access Journals (Sweden)
Israel S. Ibarra
2015-01-01
Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages compared with classical methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which reduces manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique as applied in food analysis.
High Performance Geostatistical Modeling of Biospheric Resources
Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.
2004-12-01
We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
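The unit of work being parallelized is the ordinary kriging prediction, one linear system per prediction location, so the locations can be farmed out across workers. A serial sketch with an assumed exponential variogram (the parallel codes distribute the per-location loop; none of their MPI/Xgrid machinery is reproduced here):

```python
import numpy as np

def ordinary_kriging(xy_obs, z_obs, xy_new, sill=1.0, range_par=10.0):
    """Ordinary kriging with an exponential variogram. Each location in
    xy_new is an independent solve, which is what gets parallelized."""
    def gamma(h):                                  # exponential variogram
        return sill * (1.0 - np.exp(-3.0 * h / range_par))
    n = len(xy_obs)
    d = np.linalg.norm(xy_obs[:, None] - xy_obs[None, :], axis=-1)
    A = np.ones((n + 1, n + 1))                    # OK system matrix
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                                  # Lagrange corner
    preds = []
    for p in xy_new:
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(xy_obs - p, axis=-1))
        w = np.linalg.solve(A, b)[:n]              # kriging weights
        preds.append(w @ z_obs)
    return np.array(preds)

# four observations at the corners of a square
xy = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
p_at_obs = ordinary_kriging(xy, z, xy[:1])            # exact at data
p_center = ordinary_kriging(xy, z, np.array([[2.5, 2.5]]))
```

Because the system matrix `A` depends only on the observations, a master can factor it once and ship it to slaves along with batches of prediction locations, which is the master/slave decomposition the abstract describes.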
Harmonic and applied analysis from groups to signals
Mari, Filippo; Grohs, Philipp; Labate, Demetrio
2015-01-01
This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as: an overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group; an introduction ...
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
Differential item functioning analysis by applying multiple comparison procedures.
Eusebi, Paolo; Kreiner, Svend
2015-01-01
Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items do not show evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate Multiple Comparison Procedures (MCP) for analysis of DIF relative to a variable defining a very large number of groups, with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on the subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
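False-discovery-rate control over many group-wise DIF tests can be illustrated with the Benjamini-Hochberg step-up procedure; this is a generic FDR sketch, not necessarily the authors' exact procedure, and the p-values shown are invented.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q (BH step-up).

    Sort p-values, find the largest rank i with p_(i) <= (i/m)*q,
    and reject all hypotheses up to that rank.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank / m * q:
            k = rank
    return sorted(order[:k])

# e.g. p-values from per-group Mantel-Haenszel DIF tests (invented)
rejected = benjamini_hochberg([0.001, 0.8, 0.01, 0.04, 0.6], q=0.05)
```

Here groups 0 and 2 would be flagged as showing DIF while the overall false discovery rate stays controlled at 5%.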
Classical mechanics approach applied to analysis of genetic oscillators.
Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha
2016-04-05
Biological oscillators are a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to specific oscillator topologies and/or give only qualitative answers, i.e., whether the dynamics of an oscillator over a given parameter space is oscillatory or not. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach is able to provide relatively accurate results regarding the type of behaviour the system reflects (i.e., oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of the amplified negative feedback oscillator.
Geostatistical inference using crosshole ground-penetrating radar
DEFF Research Database (Denmark)
Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou
2010-01-01
... the moisture content will reflect the variation of the physical properties of the subsurface, which determine the flow patterns in the unsaturated zone. Deterministic least-squares inversion of crosshole ground-penetrating radar (GPR) traveltimes results in smooth, minimum-variance estimates of the subsurface radar-wave velocity structure, which may diminish the utility of these images for geostatistical inference. We have used a linearized stochastic inversion technique to infer the geostatistical properties of the subsurface radar-wave velocity distribution using crosshole GPR traveltimes directly. Expanding ...
Shape analysis applied in heavy ion reactions near Fermi energy
Zhang, S.; Huang, M.; Wada, R.; Liu, X.; Lin, W.; Wang, J.
2017-03-01
A new method is proposed to perform shape analyses and to evaluate their validity in heavy ion collisions near the Fermi energy. In order to avoid erroneous values of shape parameters in the calculation, a test particle method is utilized in which each nucleon is represented by n test particles, similar to that used in the Boltzmann–Uehling–Uhlenbeck (BUU) calculations. The method is applied to the events simulated by an antisymmetrized molecular dynamics model. The geometrical shape of fragments is reasonably extracted when n = 100 is used. A significant deformation is observed for all fragments created in the multifragmentation process. The method is also applied to the shape of the momentum distribution for event classification. In the momentum case, the errors in the eigenvalue calculation become much smaller than those of the geometrical shape analysis and the results become similar between those with and without the test particle method, indicating that in intermediate heavy ion collisions the shape analysis of momentum distribution can be used for the event classification without the test particle method.
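One standard way to carry out such a shape analysis is to diagonalize the shape (inertia) tensor of the smeared test-particle cloud; the eigenvalue ratio then exposes any deformation. The 2-D sketch below is illustrative only: the nucleon positions, smearing width, and number of test particles are invented, and it is not the authors' AMD/BUU implementation.

```python
import random

def shape_eigenvalues(points):
    """Eigenvalues of the 2x2 shape (second-moment) tensor of a point cloud."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    # Closed-form eigenvalues of a symmetric 2x2 matrix
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    d = ((tr / 2) ** 2 - det) ** 0.5
    return tr / 2 - d, tr / 2 + d  # (small, large)

def smear(nucleons, n_test=100, width=0.5, seed=1):
    """Represent each nucleon by n_test Gaussian test particles."""
    rng = random.Random(seed)
    return [(x + rng.gauss(0, width), y + rng.gauss(0, width))
            for (x, y) in nucleons for _ in range(n_test)]

# A fragment elongated along x: the eigenvalue ratio reveals the deformation
nucleons = [(float(x), 0.0) for x in range(-3, 4)]
lam1, lam2 = shape_eigenvalues(smear(nucleons))
```

With only the 7 raw nucleon points the tensor would be singular in y; smearing with test particles regularizes the calculation, which is the motivation for the test particle method described above.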
Automated SEM Modal Analysis Applied to the Diogenites
Bowman, L. E.; Spilde, M. N.; Papike, James J.
1996-01-01
Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
Institute of Scientific and Technical Information of China (English)
Dou Lei; Zhou Yongzhang; Ma Jin; Li Yong; Cheng Qiuming; Xie Shuyun; Du Haiyan; You Yuanhang; Wan Hongfu
2008-01-01
Dongguan (东莞) City, located in the Pearl River Delta, South China, is famous for its rapid industrialization over the past 30 years. A total of 90 topsoil samples were collected from agricultural fields in the city, including vegetable and orchard soils, and eight heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb, and Zn) and other items (pH values and organic matter) were analyzed, to evaluate the influence of anthropic activities on the environmental quality of agricultural soils and to identify the spatial distribution and possible sources of trace elements. The elements Hg, Pb, and Cd have accumulated remarkably here, in comparison with the soil background content of elements in Guangdong (广东) Province. Pollution is more serious in the western plain and the central region, which are heavily occupied by industries and rivers. Multivariate and geostatistical methods have been applied to differentiate the influences of natural processes and human activities on the pollution of heavy metals in topsoils in the study area. The results of cluster analysis (CA) and factor analysis (FA) show that Ni, Cr, Cu, Zn, and As are grouped in factor F1, Pb in F2, and Cd and Hg in F3, respectively. The spatial pattern of the three factors is well demonstrated by geostatistical analysis. It is shown that the first factor can be considered a natural source controlled by parent rocks. The second factor can be referred to as "industrial and traffic pollution sources". The source of the third factor is mainly controlled by long-term anthropic activities, as a consequence of agricultural fossil fuel consumption and atmospheric deposition.
Patch-based iterative conditional geostatistical simulation using graph cuts
Li, Xue; Mariethoz, Gregoire; Lu, DeTang; Linde, Niklas
2016-08-01
Training image-based geostatistical methods are increasingly popular in groundwater hydrology, even if existing algorithms present limitations that often make real-world applications difficult. These limitations include a computational cost that can be prohibitive for high-resolution 3-D applications, the presence of visual artifacts in the model realizations, and a low variability between model realizations due to the limited pool of patterns available in a finite-size training image. In this paper, we address these issues by proposing an iterative patch-based algorithm which adapts a graph cuts methodology that is widely used in computer graphics. Our adapted graph cuts method optimally cuts patches of pixel values borrowed from the training image and assembles them successively, each time accounting for the information of previously stitched patches. The initial simulation result might display artifacts, which are identified as regions of high cost. These artifacts are reduced by iteratively placing new patches in high-cost regions. In contrast to most patch-based algorithms, the proposed scheme can also efficiently address point conditioning. An advantage of the method is that the cut process results in the creation of new patterns that are not present in the training image, thereby increasing pattern variability. To quantify this effect, a new measure of variability, the merging index, is developed, which quantifies the pattern variability in the realizations with respect to the training image. A series of sensitivity analyses demonstrates the stability of the proposed graph cuts approach, which produces satisfying simulations for a wide range of parameter values. Applications to 2-D and 3-D cases are compared to state-of-the-art multiple-point methods. The results show that the proposed approach obtains significant speedups and increases variability between realizations. Connectivity functions applied to 2-D models and transport simulations in 3-D models are used to ...
Finite element analysis applied to dentoalveolar trauma: methodology description.
da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W
2011-01-01
Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies, as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of the final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated.
Comparison Between two FMEA Analysis Applied to Dairy
Directory of Open Access Journals (Sweden)
Alexandre de Paula Peres
2010-06-01
The FMEA (Failure Mode and Effect Analysis) is a methodology that has been used in environmental risk assessment during the production process. Although environmental certification strengthens the corporate image and secures a company's place in the market, it is still very costly, particularly for small and medium businesses. Given this, FMEA can be a starting benchmark for companies to diagnose the environmental risks they cause. This methodology was used to diagnose differences in environmental concern and in the environmental controls exercised at two dairy plants in Lavras. By applying the method, differences between the two plants could be observed in the resulting risk tables: the diagnosis and confirmation of risks and of the controls adopted.
Applied research and development of neutron activation analysis
Energy Technology Data Exchange (ETDEWEB)
Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun
2000-05-01
This report presents the results of research and development in the following areas: improvement of the neutron irradiation facilities and counting system, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For identification and standardization of the analytical method, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter as an environmental indicator, trace-element concentrations of samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were then carried out to investigate emission sources. An international cooperative research project was carried out on the utilization of nuclear techniques.
Temporal Fourier analysis applied to equilibrium radionuclide cineangiography
Energy Technology Data Exchange (ETDEWEB)
Cardot, J.C.; Verdenet, J.; Bidet, A.; Bidet, R.; Berthout, P.; Faivre, R.; Bassand, J.P.; Maurat, J.P.
1982-08-01
Regional and global left ventricular wall motion was assessed in 120 patients using radionuclide cineangiography (RCA) and contrast angiography. Functional imaging procedures based on a temporal Fourier analysis of dynamic image sequences were applied to the study of cardiac contractility. Two images were constructed by taking the phase and amplitude values of the first harmonic of the Fourier transform for each pixel. These two images aided in determining the perimeter of the left ventricle to calculate the global ejection fraction. Regional left ventricular wall motion was studied by analyzing the phase values and by examining their distribution histogram. The accuracy of the global ejection fraction calculation was improved by the Fourier technique. The technique also increased the sensitivity of RCA for detecting segmental abnormalities, especially in the left anterior oblique (LAO) view.
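The per-pixel phase/amplitude construction described above amounts to taking the first-harmonic Fourier coefficient of each pixel's time-activity curve. The sketch below illustrates this on a synthetic two-pixel sequence (the frame data are invented; a real gated study would have many more pixels and counts):

```python
import cmath, math

def first_harmonic_images(frames):
    """Per-pixel amplitude and phase of the first Fourier harmonic.

    frames: list of T frames, each an H x W nested list of pixel values.
    Returns (amplitude, phase) as H x W nested lists.
    """
    T = len(frames)
    H, W = len(frames[0]), len(frames[0][0])
    amp = [[0.0] * W for _ in range(H)]
    pha = [[0.0] * W for _ in range(H)]
    for i in range(H):
        for j in range(W):
            # First DFT coefficient of this pixel's time-activity curve
            c = sum(frames[t][i][j] * cmath.exp(-2j * math.pi * t / T)
                    for t in range(T))
            amp[i][j] = abs(c) * 2 / T   # amplitude of the fitted cosine
            pha[i][j] = cmath.phase(c)   # contraction timing (phase)
    return amp, pha

# Synthetic 1x2 sequence: both pixels beat at the fundamental frequency,
# the second with a phase delay (as a hypokinetic segment would show)
T = 16
frames = [[[math.cos(2 * math.pi * t / T),
            math.cos(2 * math.pi * t / T - 1.0)]] for t in range(T)]
amp, pha = first_harmonic_images(frames)
```

Both pixels have unit amplitude, but the second pixel's phase lags by 1 radian, which is exactly the kind of regional timing abnormality the phase histogram analysis detects.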
Downside Risk analysis applied to Hedge Funds universe
Perello, J
2006-01-01
Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires high-precision risk evaluation and appropriate risk metrics. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping CAPM's simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index data.
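The central quantities of Downside Risk analysis, the downside (semi-)deviation of returns below the investor's target and the associated Sortino-style ratio, can be sketched as follows (the monthly return series is invented for illustration):

```python
def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the investor's target return.

    Unlike the standard deviation used in the Sharpe Ratio, only "bad"
    returns (below target) contribute to the risk measure.
    """
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return (sum(shortfalls) / len(returns)) ** 0.5

def sortino_ratio(returns, target=0.0):
    """Mean excess return over target, per unit of downside deviation."""
    mean_excess = sum(returns) / len(returns) - target
    dd = downside_deviation(returns, target)
    return mean_excess / dd if dd > 0 else float("inf")

monthly = [0.02, -0.01, 0.03, -0.04, 0.01, 0.02]
```

For a strongly skewed, non-Gaussian return distribution, the downside deviation and the ordinary standard deviation can differ markedly, which is precisely the gap in CAPM/Sharpe analysis that the abstract points out.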
Geostatistical Modeling of Evolving Landscapes by Means of Image Quilting
Mendes, J. H.; Caers, J.; Scheidt, C.
2015-12-01
Realistic geological representation of subsurface heterogeneity remains an important outstanding challenge. While many geostatistical methods exist for representing sedimentary systems, such as multiple-point geostatistics, rule-based methods or Boolean methods, the question of what the prior uncertainty on the parameters (or training images) of such algorithms is remains outstanding. In this initial work, we investigate the use of flume experiments to better constrain such prior uncertainty and to start understanding what information should be provided to geostatistical algorithms. In particular, we study the use of image quilting as a novel multiple-point method for generating fast geostatistical realizations once a training image is provided. Image quilting is a method emanating from computer graphics in which patterns are extracted from training images and then stochastically quilted along a raster path to create stochastic variations of the training image. In this initial study, we use a flume experiment and extract 10 training images as representative of the variability of the evolving landscape over a period of 136 minutes. The training images consist of wet/dry regions obtained from overhead shots taken of the flume experiment. To investigate whether image quilting reproduces the same variability of the evolving landscape in terms of wet/dry regions, we generate multiple realizations with all 10 training images and compare that variability with the variability seen in the entire flume experiment. With proper tuning of the quilting parameters we find generally reasonable agreement with the flume experiment.
Gstat: a program for geostatistical modelling, prediction and simulation
Pebesma, Edzer J.; Wesseling, Cees G.
1998-01-01
Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
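The variogram modelling that gstat automates begins with the classical (Matheron) empirical semivariogram, estimated from squared differences of sample pairs grouped by separation distance. Below is a minimal Python sketch of that estimator, not gstat code; the four-point transect is invented for illustration.

```python
def empirical_semivariogram(coords, values, lags, tol):
    """Matheron's classical estimator.

    gamma(h) = mean of (z_i - z_j)^2 / 2 over all point pairs whose
    separation distance falls within tol of each lag h.
    """
    gamma = []
    for h in lags:
        sq_diffs = []
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                d = ((coords[i][0] - coords[j][0]) ** 2 +
                     (coords[i][1] - coords[j][1]) ** 2) ** 0.5
                if abs(d - h) <= tol:
                    sq_diffs.append((values[i] - values[j]) ** 2)
        gamma.append(sum(sq_diffs) / (2 * len(sq_diffs)) if sq_diffs else None)
    return gamma

# Four samples on a transect with 1-unit spacing (invented data)
coords = [(0, 0), (1, 0), (2, 0), (3, 0)]
values = [1.0, 1.2, 1.7, 2.5]
gamma = empirical_semivariogram(coords, values, lags=[1, 2], tol=0.1)
```

A variogram model (spherical, exponential, etc.) would then be fitted to these experimental points before kriging or simulation, which is the interactive step gstat supports through gnuplot.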
Reducing uncertainty in geostatistical description with well testing pressure data
Energy Technology Data Exchange (ETDEWEB)
Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)
1997-08-01
Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and to geostatistical information represented as prior means for log-permeability and porosity and as variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and in the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.
Directory of Open Access Journals (Sweden)
Célia Regina Grego
2006-08-01
Experiments in agriculture usually consider the topsoil properties to be uniform in space and, for this reason, often make inadequate use of the results. The objective of this study was to assess the variability of soil moisture content using geostatistical techniques. The experiment was carried out on a Rhodic Ferralsol (Typic Haplorthox) in Campinas, SP, Brazil, in an area of 3.42 ha cultivated under the no-tillage system, and the sampling was made in a grid of 102 points spaced 10 m x 20 m. Access tubes were inserted down to one meter at each evaluation point in order to measure soil moisture contents (cm³ cm-3) at depths of 30, 60 and 90 cm with a neutron moisture gauge. Samplings were made between the months of August and September of 2003 and in January 2004. The soil moisture content for each sampling date was analyzed using classical statistics, in order to appropriately describe the central tendency and dispersion of the data, and then using geostatistics, to describe the spatial variability. The comparison between the spatial variability for different samplings was made by examining scaled semivariograms. Water content was mapped using values interpolated with punctual kriging. The semivariograms showed that, at the 60 cm depth, soil water content had moderate spatial dependence, with ranges between 90 and 110 m. However, no spatial dependence was found at the 30 and 90 cm depths in 2003. Sampling density was insufficient for an adequate characterization of the spatial variability of soil moisture contents at the 30 and 90 cm depths.
Multivariate Statistical Analysis Applied in Wine Quality Evaluation
Directory of Open Access Journals (Sweden)
Jieling Zou
2015-08-01
This study applies multivariate statistical approaches to wine quality evaluation. From 27 red wine samples, four factors were identified out of 12 parameters by principal component analysis, explaining 89.06% of the total variance of the data. As iterative weights calculated by a BP neural network revealed little difference from weights determined by the information entropy method, the latter was chosen to measure the importance of the indicators. Weighted cluster analysis performed well in classifying the sample group further into two sub-clusters. The second cluster of red wine samples, compared with the first, was lighter in color, tasted thinner and had a fainter bouquet. The weighted TOPSIS method was used to evaluate the quality of the wine in each sub-cluster. With the scores obtained, each sub-cluster was divided into three grades. On the whole, the quality of the lighter red wine was slightly better than that of the darker category. This study shows the necessity and usefulness of multivariate statistical techniques in both wine quality evaluation and parameter selection.
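The information entropy weighting method mentioned above assigns larger weights to indicators whose values vary more across samples (lower entropy means more discriminating power). A minimal sketch follows; the two-indicator decision matrix is invented for illustration and assumes all values are positive.

```python
import math

def entropy_weights(matrix):
    """Information-entropy weights for the criteria (columns) of a
    decision matrix. Lower-entropy (more discriminating) criteria
    receive larger weights."""
    m, n = len(matrix), len(matrix[0])
    raw = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [x / total for x in col]                      # normalize column
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        raw.append(1 - e)                                 # degree of divergence
    s = sum(raw)
    return [w / s for w in raw]

# Rows: wine samples; columns: two indicators (e.g. color depth, acidity)
w = entropy_weights([[10, 5.0], [12, 5.1], [30, 4.9]])
```

The first indicator varies strongly across the three samples and receives nearly all the weight, while the nearly constant second indicator is down-weighted, exactly the behaviour that makes entropy weights useful before a weighted TOPSIS ranking.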
Multiple Point Geostatistics for automated landform mapping
Karssenberg, D.; Vannametee, E.; Babel, L.; Schuur, J.; Hendriks, M.; Bierkens, M. F.
2011-12-01
Land-surface processes are often studied at the level of elementary landform units, e.g. geomorphological units. To avoid expensive and difficult field surveys and to ensure a consistent mapping scheme, automated derivation of these units is desirable. However, automated classification based on two-point statistics of topographical attributes (e.g. the semivariogram) is inadequate for reproducing complex, curvilinear landform patterns. Therefore, the spatial structure and configuration of terrain characteristics suitable for landform classification should be based on statistics from multiple points. In this study, a generic automated landform classification routine is developed which is based on Multiple Point Geostatistics (MPG), using information from a field map of geomorphology in a training area and a gridded Digital Elevation Model (DEM). The focus is on the classification of geomorphologic units, e.g. alluvial fan or river terrace. The approach is evaluated using data from the French Alps. In the first procedural step, spatial statistics of the geomorphologic units are retrieved from a training data set consisting of a digital elevation model and a geomorphologic map created using field observations and 37.5 x 37.5 m2 cells. For each grid cell in the training data set, the geomorphological unit of the grid cell and a set of topographical attributes (i.e. a pattern) of the grid cell are stored in a frequency database. The set of topographical attributes stored is chosen such that it represents the criteria used in field mapping. These are, for instance, topographical slope gradient, upstream area, or the geomorphological units mapped in the neighborhood of the cell. Continuous information (e.g. slope) is converted to categorical data (slope class), which is required in the MPG approach. The second step is to use the knowledge stored in the frequency database for mapping. The algorithm reads a set of attribute classes from a classification target cell and its surrounding cells, taking ...
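The two steps above, storing each cell's pattern of categorical attributes together with its mapped unit in a frequency database, then classifying a target cell from its pattern, can be sketched as follows. The toy grids, the slope classes, and the centre-plus-4-neighbour pattern definition are simplified assumptions for illustration, not the authors' exact implementation.

```python
from collections import Counter, defaultdict

def build_frequency_db(attribute_grid, unit_grid):
    """Training step: count the landform units observed for each pattern
    of categorical attributes (cell value plus its 4-neighbourhood)."""
    db = defaultdict(Counter)
    rows, cols = len(attribute_grid), len(attribute_grid[0])
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            pattern = (attribute_grid[r][c],
                       attribute_grid[r - 1][c], attribute_grid[r + 1][c],
                       attribute_grid[r][c - 1], attribute_grid[r][c + 1])
            db[pattern][unit_grid[r][c]] += 1
    return db

def classify(db, pattern):
    """Mapping step: most frequent unit for the pattern (None if unseen)."""
    return db[pattern].most_common(1)[0][0] if pattern in db else None

# Toy 3x3 training grids: slope classes -> geomorphological units
slope = [["low", "low", "low"],
         ["low", "steep", "low"],
         ["low", "low", "low"]]
units = [["fan", "fan", "fan"],
         ["fan", "terrace", "fan"],
         ["fan", "fan", "fan"]]
db = build_frequency_db(slope, units)
unit = classify(db, ("steep", "low", "low", "low", "low"))
```

A full implementation would draw from the frequency distribution stochastically rather than taking the mode, but the pattern-to-frequency database is the core MPG data structure described in the abstract.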
Energy Technology Data Exchange (ETDEWEB)
Kara G. Eby
2010-08-01
At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.
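Kriging, as used above for the Cs-137 prediction maps, estimates the value at an unsampled location as a weighted sum of nearby samples, with weights obtained by solving a small linear system built from a covariance model. A minimal ordinary kriging sketch follows; the sample locations, values, and exponential covariance parameters are invented for illustration, and no Bayesian or secondary-information extension is shown.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c]
                              for c in range(k + 1, n))) / M[k][k]
    return x

def ordinary_kriging(coords, values, target, cov):
    """Ordinary kriging estimate at `target` under covariance function `cov`."""
    n = len(coords)
    dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    # Kriging system: [C 1; 1' 0] [w; mu] = [c0; 1]
    A = [[cov(dist(coords[i], coords[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(dist(coords[i], target)) for i in range(n)] + [1.0]
    w = solve(A, b)[:n]
    return sum(w[i] * values[i] for i in range(n))

# Exponential covariance with assumed sill 1 and range parameter 50
cov = lambda h: math.exp(-h / 50.0)
est = ordinary_kriging([(0, 0), (100, 0), (0, 100)],
                       [2.0, 4.0, 6.0], (10, 10), cov)
```

Two properties worth noting: the weights sum to one (the "ordinary" constraint, enforced by the Lagrange multiplier mu), and the estimator is exact at data locations, which is why kriged Cs-137 maps honor the measured sample values.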
Jha, Sanjeev Kumar
2013-01-01
A downscaling approach based on multiple-point geostatistics (MPS) is presented. The key concept underlying MPS is to sample spatial patterns from within training images, which can then be used in characterizing the relationship between different variables across multiple scales. The approach is used here to downscale climate variables including skin surface temperature (TSK), soil moisture (SMOIS), and latent heat flux (LH). The performance of the approach is assessed by applying it to data derived from a regional climate model of the Murray-Darling basin in southeast Australia, using model outputs at two spatial resolutions of 50 and 10 km. The data used in this study cover the period from 1985 to 2006, with 1985 to 2005 used for generating the training images that define the relationships of the variables across the different spatial scales. Subsequently, the spatial distributions for the variables in the year 2006 are determined at 10 km resolution using the 50 km resolution data as input. The MPS geostatistical downscaling approach reproduces the spatial distribution of TSK, SMOIS, and LH at 10 km resolution with the correct spatial patterns over different seasons, while providing uncertainty estimates through the use of multiple realizations. The technique has the potential to not only bridge issues of spatial resolution in regional and global climate model simulations but also in feature sharpening in remote sensing applications through image fusion, filling gaps in spatial data, evaluating downscaled variables with available remote sensing images, and aggregating/disaggregating hydrological and groundwater variables for catchment studies.
To apply or not to apply: a survey analysis of grant writing costs and benefits.
von Hippel, Ted; von Hippel, Courtney
2015-01-01
We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
Correlation network analysis applied to complex biofilm communities.
Directory of Open Access Journals (Sweden)
Ana E Duran-Pinedo
The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm by applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well-known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of
Handbook of Systems Analysis: Volume 1. Overview. Chapter 2. The Genesis of Applied Systems Analysis
1981-01-01
The International Institute for Applied Systems Analysis is preparing a Handbook of Systems Analysis, which will appear in three volumes: Volume 1: Overview is aimed at a widely varied audience of producers and users of systems analysis studies. Volume 2: Methods is aimed at systems analysts and other members of systems analysis teams who need basic knowledge of methods in which they are not expert; this volume contains introductory overviews of such methods. Volume 3: Cases co...
Energy Technology Data Exchange (ETDEWEB)
Zhong, Buqing; Liang, Tao, E-mail: liangt@igsnrr.ac.cn; Wang, Lingqing; Li, Kexin
2014-08-15
An extensive soil survey was conducted to study pollution sources and delineate contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM) were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg were found to depend mostly on indicators of anthropogenic activities such as industrial type and distance from the urban area, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of the modeled background values for Cd, As, Pb, Hg and Cr were close to the background values previously reported in the study area, while the contamination of Cd and Hg was widespread in the study area, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. - Highlights: • Conditional inference tree can identify variables controlling metal distribution. • Finite mixture distribution model can partition natural and anthropogenic sources. • Geostatistics with stochastic models
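The indicator-kriging step described above can be sketched in a few lines: transform concentrations to exceedance indicators, then krige the indicators, so the estimate at an unsampled location reads as a probability of exceeding the threshold. The coordinates, Cd concentrations, threshold, and covariance parameters below are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical sample locations (km) and Cd concentrations (mg/kg).
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 1.5]])
cd = np.array([0.8, 1.5, 0.4, 2.1, 0.3])
threshold = 1.0  # assumed regulatory threshold

# Indicator transform: 1 where the threshold is exceeded.
ind = (cd > threshold).astype(float)

def cov(h, sill=0.25, rng=2.0):
    # Exponential covariance model for the indicator variable (assumed parameters).
    return sill * np.exp(-3.0 * h / rng)

def exceedance_probability(x0):
    # Ordinary kriging of the indicator data; the weighted sum of 0/1 values
    # is the estimated probability of exceeding the threshold at x0.
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0  # Lagrange row/column enforces weights summing to 1
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return float(np.clip(w @ ind, 0.0, 1.0))

p = exceedance_probability(np.array([0.7, 0.7]))
print(p)
```

The multivariate variant (MVIK) extends this by co-kriging indicators of several metals jointly.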
SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT
Directory of Open Access Journals (Sweden)
Cassio C. Montenegro Duarte
2012-05-01
This study evaluates the concept of success in project management as it applies to the IT universe, starting from the classical theory associated with project management techniques. It combines theoretical analysis of information technology in enterprises with the classic literature of traditional project management, focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology adopted was the multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature carries over to the management of IT projects. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects, which depends on the learning acquired over a long and continuous process and on the sponsorship of senior management, ultimately resulting in its merger into the company culture.
Applied genre analysis: a multi-perspective model
Directory of Open Access Journals (Sweden)
Vijay K Bhatia
2002-04-01
Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication-based curriculum contexts, they can be used as an analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.
Applying DNA computation to intractable problems in social network analysis.
Chen, Rick C S; Yang, Stephen J H
2010-09-01
From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA.
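As a concrete reminder of why the clique-type queries named above are intractable, here is a brute-force maximum-clique search on a toy graph (the graph is made up for illustration). The exhaustive subset enumeration is O(2^n), which is exactly the blow-up that DNA computing's massive parallelism is proposed to absorb:

```python
import itertools

# Toy social graph as adjacency sets (undirected, illustrative only).
adj = {
    1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3, 5}, 5: {4},
}

def is_clique(nodes):
    # Every pair of members must be directly connected.
    return all(v in adj[u] for u, v in itertools.combinations(nodes, 2))

def max_clique(adj):
    # Exhaustive search from largest subsets down: exponential in |V|.
    for k in range(len(adj), 0, -1):
        for sub in itertools.combinations(adj, k):
            if is_clique(sub):
                return sub
    return ()

print(max_clique(adj))  # → (1, 2, 3, 4)
```

N-clique, N-clan and N-club relax "directly connected" to distance constraints, but the combinatorial core is the same.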
Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace
Directory of Open Access Journals (Sweden)
Jianwen Sun
2012-02-01
Due to the ubiquitous nature of cyberspace and the abuse of anonymity there, it is difficult to trace criminal identities in cybercrime investigation. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author group level based methods.
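The variable-length character n-gram representation of writing style can be sketched as below; the n-gram range and sample text are arbitrary choices, and the paper's IG/GA feature selection stage is omitted:

```python
from collections import Counter

def char_ngrams(text, n_range=(2, 3)):
    # Profile a text as counts of all character n-grams whose length falls
    # in n_range; such profiles are then compared across candidate authors.
    grams = Counter()
    for n in range(n_range[0], n_range[1] + 1):
        grams.update(text[i:i + n] for i in range(len(text) - n + 1))
    return grams

profile = char_ngrams("the quick brown fox jumps over the lazy dog")
print(profile.most_common(3))
```

In practice the counts are normalized and only the most informative n-grams (selected by information gain) are kept as features.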
Geostatistics: a common link between medical geography, mathematical geology, and medical geology.
Goovaerts, P
2014-08-01
Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviours, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentration across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level.
Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula
2016-01-01
This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.
Stochastic Local Interaction (SLI) model: Bridging machine learning and geostatistics
Hristopulos, Dionissios T.
2015-12-01
Machine learning and geostatistics are powerful mathematical frameworks for modeling spatial data. Both approaches, however, suffer from poor scaling of the required computational resources for large data applications. We present the Stochastic Local Interaction (SLI) model, which employs a local representation to improve computational efficiency. SLI combines geostatistics and machine learning with ideas from statistical physics and computational geometry. It is based on a joint probability density function defined by an energy functional which involves local interactions implemented by means of kernel functions with adaptive local kernel bandwidths. SLI is expressed in terms of an explicit, typically sparse, precision (inverse covariance) matrix. This representation leads to a semi-analytical expression for interpolation (prediction), which is valid in any number of dimensions and avoids the computationally costly covariance matrix inversion.
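The computational advantage of an explicit sparse precision matrix can be illustrated with the standard Gaussian conditional mean, which is the kind of semi-analytical prediction formula a precision-based model exploits: x_p | x_o = mu_p - Q_pp^{-1} Q_po (x_o - mu_o), with no dense covariance inversion. The tridiagonal Q below is a toy nearest-neighbour stand-in, not the SLI energy functional:

```python
import numpy as np

n = 6
# Toy sparse precision matrix: nearest-neighbour interactions only.
Q = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -0.8), 1)
     + np.diag(np.full(n - 1, -0.8), -1))
mu = np.zeros(n)

obs_idx, pred_idx = [0, 1, 3, 5], [2, 4]   # observed vs. prediction sites
x_obs = np.array([1.0, 0.5, -0.2, 0.7])

# Gaussian conditional mean via the precision matrix; only the small
# prediction block Q_pp needs solving, which is what makes this scale.
Qpp = Q[np.ix_(pred_idx, pred_idx)]
Qpo = Q[np.ix_(pred_idx, obs_idx)]
x_pred = mu[pred_idx] - np.linalg.solve(Qpp, Qpo @ (x_obs - mu[obs_idx]))
print(x_pred)
```

For large grids `Qpp` would be stored and solved as a sparse system (e.g. with `scipy.sparse`), keeping cost far below a dense covariance inversion.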
Assessing the resolution-dependent utility of tomograms for geostatistics
Day-Lewis, F. D.; Lane, J.W.
2004-01-01
Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.
4th European Conference on Geostatistics for Environmental Applications
Carrera, Jesus; Gómez-Hernández, José
2004-01-01
The fourth edition of the European Conference on Geostatistics for Environmental Applications (geoENV IV) took place in Barcelona, November 27-29, 2002. As proof of the increasing interest in environmental issues in the geostatistical community, the conference attracted over 100 participants, mostly Europeans (up to 10 European countries were represented), but also from other countries in the world. Only 46 contributions, selected out of around 100 submitted papers, were invited to be presented orally during the conference. Additionally, 30 authors were invited to present their work in poster format during a special session. All oral and poster contributors were invited to submit their work to be considered for publication in this Kluwer series. All papers underwent a reviewing process, which consisted of two reviewers for oral presentations and one reviewer for posters. The book opens with one keynote paper by Philippe Naveau. It is followed by 40 papers that correspond to those presented orally d...
Energy Technology Data Exchange (ETDEWEB)
Paz, C. de la
2013-02-01
The main objective of this research is the development of a siting methodology for the optimal location of small hydro power (SHP) plants. To achieve this goal, a Multi-Criteria Evaluation (MCE) methodology has been developed and implemented through tools in a GIS environment: spatial analysis, geostatistical analysis, and hydrology. This methodology includes two different models based on the same MCE process. The substantial difference between the models lies in the input data and the tools applied to estimate the energy resource and the principal factor of the methodology (flow rate or accumulated flow). The first model is generated from flow data obtained in the study area (El Bierzo), and the second one from pluviometric data and a Digital Terrain Model (DTM). Both models produce viability maps highlighting the areas most suitable for locating SHP facilities. As an additional objective, the study contrasts the results of the two developed models to evaluate their similarity. (Author)
Energy Technology Data Exchange (ETDEWEB)
Maschio, Celio; Schiozer, Denis Jose [Universidade Estadual de Campinas (FEM/UNICAMP), SP (Brazil). Faculdade de Engenharia Mecanica], Emails: celio@dep.fem.unicamp.br, denis@dep.fem.unicamp.br; Vidal, Alexandre Campane [Universidade Estadual de Campinas (IG/UNICAMP), SP (Brazil). Inst. de Geociencias. Dept. de Geologia e Recursos Naturais], E-mail: vidal@ige.unicamp.br
2008-03-15
The production history matching process, by which the numerical model is calibrated to reproduce the observed field production, is normally carried out separately from geological modeling. Generally, the construction of the geological model and the history matching are performed by different teams, so there is commonly no coupling, or only weak coupling, between the two areas. In the history matching step, this can lead to inadequate changes in the geological model, sometimes resulting in geologically inconsistent models. This work proposes integration between geostatistical modeling and history matching through the incorporation of geostatistical realizations into the assisted process. In this way, reservoir parameters such as rock-fluid interaction properties, as well as the images resulting from the realizations, are considered in the history matching. In order to find the combination of parameters that best adjusts the model to the observed data, an optimization routine based on a genetic algorithm is used. The proposed methodology is applied to a synthetic but realistic reservoir model. The history matching is carried out both in the conventional manner and considering the geostatistical images as history parameters, and the two processes are then compared. The results show the feasibility and the advantages resulting from this integration between history matching and geostatistical modeling. (author)
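A minimal sketch of such a genetic-algorithm matching loop, under heavy assumptions: the "simulator" here is a lookup table standing in for a flow simulation run on one of three geostatistical realizations, and each individual pairs a permeability multiplier with a realization index. None of this is the authors' actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)

observed = np.array([10.0, 12.0, 15.0])  # "observed" production history (invented)

def simulate(mult, real_idx):
    # Stand-in for the flow simulator applied to geostatistical realization real_idx.
    base = np.array([[9.0, 11.0, 13.0], [8.0, 10.5, 14.0], [10.5, 12.5, 16.0]])
    return mult * base[real_idx]

def misfit(ind):
    mult, real_idx = ind
    return float(((simulate(mult, int(real_idx)) - observed) ** 2).sum())

# Minimal GA: truncation selection (elitist) + Gaussian mutation of the
# multiplier, occasional swap of the realization index.
pop = [(rng.uniform(0.5, 1.5), int(rng.integers(0, 3))) for _ in range(20)]
for _ in range(30):
    pop.sort(key=misfit)
    parents = pop[:10]
    children = [(m + rng.normal(0.0, 0.05),
                 int(rng.integers(0, 3)) if rng.random() < 0.2 else r)
                for m, r in parents]
    pop = parents + children

best = min(pop, key=misfit)
print(best, misfit(best))
```

The point mirrored from the abstract is that the realization index is itself a history-matching parameter, so the search stays inside the set of geologically consistent models.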
Geostatistical borehole image-based mapping of karst-carbonate aquifer pores
Michael Sukop,; Cunningham, Kevin J.
2016-01-01
Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.
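The variogram-analysis step mentioned above can be sketched with the classical Matheron estimator; the 1-D "borehole column" data below are synthetic placeholders, not the Biscayne aquifer image logs:

```python
import numpy as np

def semivariogram(coords, z, lags, tol):
    # Classical (Matheron) estimator: gamma(h) = 0.5 * mean squared difference
    # over point pairs whose separation falls within tol of lag h.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sq = (z[:, None] - z[None, :]) ** 2
    out = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # each pair counted once
        out.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(out)

# Synthetic stand-in for a porosity log along one borehole (depth in m).
rng = np.random.default_rng(42)
depth = np.arange(0, 30, 0.5)
coords = np.column_stack([np.zeros_like(depth), depth])
z = np.sin(depth / 3.0) + 0.1 * rng.standard_normal(depth.size)
gamma = semivariogram(coords, z, lags=[0.5, 1.0, 2.0, 4.0], tol=0.25)
print(gamma)
```

A model fitted to such an empirical variogram is what then drives the Gaussian simulation of megaporous versus matrix-porosity strata.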
Mota-Hernandez, Cinthya; Alvarado-Corona, Rafael
2014-01-01
Tectonic earthquakes of high magnitude can cause considerable losses in terms of human lives, economy and infrastructure, among others. According to an evaluation published by the U.S. Geological Survey, 30 earthquakes have greatly impacted Mexico from the end of the XIX century to the present one. Based upon data from the National Seismological Service, in the period between January 1, 2006 and May 1, 2013 there occurred 5,826 earthquakes whose magnitude was greater than 4.0 degrees on the Richter magnitude scale (25.54% of the total of earthquakes registered on the national territory), the Pacific Plate and the Cocos Plate being the most important ones. This document describes the development of an Artificial Neural Network (ANN) based on radial topology which seeks to generate a prediction, with an error margin lower than 20%, that can inform about the probability of a future earthquake; one of the main questions is: can artificial neural networks be applied in seismic forecast...
Geostatistical inference using crosshole ground-penetrating radar
DEFF Research Database (Denmark)
Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou;
2010-01-01
High-resolution tomographic images obtained from crosshole geophysical measurements have the potential to provide valuable information about the geostatistical properties of unsaturated-zone hydrologic-state variables such as moisture content. Under drained or quasi-steady-state conditions, the ...... reflection profile. Furthermore, the inferred values of the subsurface global variance and the mean velocity have been corroborated with moisture-content measurements, obtained gravimetrically from samples collected at the field site.
Mapping malaria risk in Bangladesh using Bayesian geostatistical models.
Reid, Heidi; Haque, Ubydul; Clements, Archie C A; Tatem, Andrew J; Vallely, Andrew; Ahmed, Syed Masud; Islam, Akramul; Haque, Rashidul
2010-10-01
Background: Malaria-control programs are increasingly dependent on accurate risk maps to effectively guide the allocation of interventions and resources. Advances in model-based geostatistics and geographical information systems (GIS) have enabled researchers to better understand factors affecting malaria transmission and thus, more accurately determine the limits of malaria transmission globally and nationally. Here, we construct Plasmodium falciparum risk maps for Bangladesh for 2007 at a scale enabling the malaria-control bodies to more accurately define the needs of the program. A comprehensive malaria-prevalence survey (N = 9,750 individuals; N = 354 communities) was carried out in 2007 across the regions of Bangladesh known to be endemic for malaria. Data were corrected to a standard age range of 2 to less than 10 years. Bayesian geostatistical logistic regression models with environmental covariates were used to predict P. falciparum prevalence for 2- to 10-year-old children (PfPR(2-10)) across the endemic areas of Bangladesh. The predictions were combined with gridded population data to estimate the number of individuals living in different endemicity classes. Across the endemic areas, the average PfPR(2-10) was 3.8%. Environmental variables selected for prediction were vegetation cover, minimum temperature, and elevation. Model validation statistics revealed that the final Bayesian geostatistical model had good predictive ability. Risk maps generated from the model showed a heterogeneous distribution of PfPR(2-10) ranging from 0.5% to 50%; 3.1 million people were estimated to be living in areas with a PfPR(2-10) greater than 1%. Contemporary GIS and model-based geostatistics can be used to interpolate malaria risk in Bangladesh. Importantly, malaria risk was found to be highly varied across the endemic regions, necessitating the targeting of resources to reduce the burden in these areas.
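The final step above, combining a predicted prevalence surface with gridded population to count people per endemicity class, reduces to a masked sum; the two 2×2 grids below are invented for illustration, not the Bangladesh data:

```python
import numpy as np

# Illustrative grids: predicted PfPR(2-10) per cell and population counts.
pfpr = np.array([[0.004, 0.02], [0.15, 0.40]])
pop = np.array([[10000, 8000], [5000, 2000]])

# People living in each endemicity class.
classes = {
    "<1%": pfpr < 0.01,
    "1-10%": (pfpr >= 0.01) & (pfpr < 0.10),
    ">=10%": pfpr >= 0.10,
}
counts = {name: int(pop[mask].sum()) for name, mask in classes.items()}
print(counts)  # → {'<1%': 10000, '1-10%': 8000, '>=10%': 7000}
```

In the study the same aggregation is run over the model's posterior prevalence predictions on a national grid.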
Abdu, Hiruy
Knowledge of the spatial distribution of soil textural properties at the watershed scale is important for understanding spatial patterns of water movement, and in determining soil moisture storage and soil hydraulic transport properties. Capturing the heterogeneous nature of the subsurface without exhaustive and costly sampling presents a significant challenge. Soil scientists and geologists have adapted geophysical methods that measure a surrogate property related to the vital underlying process. Apparent electrical conductivity (ECa) is such a proxy, providing a measure of charge mobility due to application of an electric field, and is highly correlated to the electrical conductivity of the soil solution, clay percentage, and water content. Electromagnetic induction (EMI) provides the possibility of obtaining high resolution images of ECa across a landscape to identify subtle changes in subsurface properties. The aim of this study was to better characterize subsurface textural properties using EMI mapping and geostatistical analysis techniques. The effects of variable temperature environments on EMI instrumental response and on the ECa-depth relationship were first determined. Then a procedure of repeated EMI mapping at varying soil water content was developed and integrated with temporal stability analysis to capture the time-invariant properties of spatial soil texture on an agricultural field. In addition, an EMI imaging approach of densely sampling the subsurface of the Reynolds Mountain East watershed was presented, using kriging to interpolate and Sequential Gaussian Simulation to estimate the uncertainty in the maps. Due to the relative time-invariant characteristics of textural properties, it was possible to correlate clay samples collected over three seasons to ECa data of one mapping event. Kriging methods [ordinary kriging (OK), cokriging (CK), and regression kriging (RK)] were then used to integrate various levels of information (clay percentage, ECa
A reservoir skeleton-based multiple point geostatistics method
Institute of Scientific and Technical Information of China (English)
Anonymous
2009-01-01
Traditional stochastic reservoir modeling, including object-based and pixel-based methods, cannot solve the problem of reproducing continuous and curvilinear reservoir objects. The paper first reviews the various stochastic modeling methods and extracts their merits, then proposes skeleton-based multiple point geostatistics (SMPS) for the fluvial reservoir. The core idea is using the skeletons of reservoir objects to restrict the selection of data patterns. The skeleton-based multiple point geostatistics consists of two steps. First, predicting the channel skeleton (namely, the channel centerline) by using the method from object-based modeling; the paper proposes a new search-window method to predict the skeleton. Then, forecasting the distributions of reservoir objects using multiple point geostatistics under the restriction of the channel skeleton. With the restriction of the channel centerline, the selection of data events becomes more reasonable and the realization is achieved more realistically. Checks with a conceptual model and a real reservoir show that SMPS is much better than Sisim (sequential indicator simulation), Snesim (Single Normal Equation Simulation) and Simpat (simulation with patterns) in building the fluvial reservoir model. This new method will contribute both to the theoretical research of stochastic modeling and to oilfield development through the construction of highly precise reservoir geological models.
Breast carcinoma, intratumour heterogeneity and histological grading, using geostatistics.
Sharifi-Salamatian, V; de Roquancourt, A; Rigaut, J P
2000-01-01
Tumour progression is currently believed to result from genetic instability. Chromosomal patterns specific to a type of cancer are frequent even though phenotypic spatial heterogeneity is omnipresent. The latter is the usual cause of histological grading imprecision, a well documented problem, without any fully satisfactory solution up to now. The present article addresses this problem in breast carcinoma. The assessment of a genetic marker for human tumours requires quantifiable measures of intratumoral heterogeneity. If any invariance paradigm representing a stochastic or geostatistical function could be discovered, this might help in solving the grading problem. A novel methodological approach using geostatistics to measure heterogeneity is used. Twenty tumours from the three usual (Scarff-Bloom and Richardson) grades were obtained and paraffin sections were stained with MIB-1 (Ki-67) using peroxidase staining. Whole two-dimensional sections were sampled. Morphometric grids of variable sizes allowed a simple and fast recording of positions of epithelial nuclei, marked or not by MIB-1. The geostatistical method is based here upon the asymptotic behaviour of dispersion variance. Measurement of the asymptotic exponent of dispersion variance shows an increase from grade 1 to grade 3. Preliminary results are encouraging: grades 1 and 3 on one hand and 2 and 3 on the other hand are totally separated. The final proof of an improved grading using this measure will of course require comparison with the results of survival studies.
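The key statistic above, the asymptotic exponent of the dispersion variance, can be estimated as the log-log slope of the variance of window means against window size. The marked-nuclei sequence below is simulated (for independent 0/1 marks the slope is near -1; spatially clustered marks decay more slowly):

```python
import numpy as np

def dispersion_exponent(marks, window_sizes):
    # For each window size w, average the marks over non-overlapping windows,
    # take the variance of those window means, and fit the log-log slope.
    variances = []
    for w in window_sizes:
        k = marks.size // w
        means = marks[: k * w].reshape(k, w).mean(axis=1)
        variances.append(means.var())
    return float(np.polyfit(np.log(window_sizes), np.log(variances), 1)[0])

# Simulated 0/1 MIB-1 marks for 4096 nuclei positions (independent here).
rng = np.random.default_rng(7)
marks = (rng.random(4096) < 0.3).astype(float)
slope = dispersion_exponent(marks, [2, 4, 8, 16, 32, 64])
print(round(slope, 2))
```

In the grading application the exponent is computed per tumour section, and its value discriminates between the histological grades.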
Geostatistical regionalization of low-flow indices: PSBI and Top-Kriging
Directory of Open Access Journals (Sweden)
S. Castiglioni
2010-09-01
Recent studies highlight that geostatistical interpolation, which was originally developed for the spatial interpolation of point data, can be effectively applied to the problem of regionalization of hydrometric information. This study compares two innovative geostatistical approaches for the prediction of low-flows in ungauged basins. The first one, named Physiographic-Space Based Interpolation (PSBI), performs the spatial interpolation of the desired streamflow index (e.g., annual streamflow, low-flow index, flood quantile, etc.) in the space of catchment descriptors. The second technique, named Topological kriging or Top-Kriging, predicts the variable of interest along river networks taking both the area and nested nature of catchments into account. PSBI and Top-Kriging are applied for the regionalization of Q_{355} (i.e., the streamflow that is equalled or exceeded 355 days in a year, on average) over a broad geographical region in central Italy, which contains 51 gauged catchments. Both techniques are cross-validated through a leave-one-out procedure at all available gauges and applied to a subregion to produce a continuous estimation of Q_{355} along the river network extracted from a 90 m DEM. The results of the study show that Top-Kriging and PSBI present complementary features and have comparable performances (Nash-Sutcliffe efficiencies in cross-validation of 0.89 and 0.83, respectively). Both techniques provide plausible and accurate predictions of Q_{355} in ungauged basins and represent promising opportunities for regionalization of low-flows.
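The cross-validation score used above, the Nash-Sutcliffe efficiency, and the leave-one-out loop are easy to sketch. The gauge values and the placeholder "regionalization model" (a simple mean of the remaining gauges) are illustrative only, standing in for PSBI or Top-Kriging:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit,
    # 0 means no better than predicting the observed mean everywhere.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

# Leave-one-out skeleton: drop each gauge, predict it from the rest with a
# regionalization model (placeholder: mean of the remaining gauges), then score.
q355 = np.array([1.2, 0.8, 2.5, 1.9, 0.6])  # invented Q_355 values (m^3/s)
preds = [np.delete(q355, i).mean() for i in range(len(q355))]
nse = nash_sutcliffe(q355, preds)
print(round(nse, 3))
```

Replacing the placeholder predictor with an interpolator in catchment-descriptor space (PSBI) or along the river network (Top-Kriging) reproduces the paper's validation setup.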
Cognitive task analysis: Techniques applied to airborne weapons training
Energy Technology Data Exchange (ETDEWEB)
Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))
1989-01-01
This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.
Empirical modeling and data analysis for engineers and applied scientists
Pardo, Scott A
2016-01-01
This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...
Energy Technology Data Exchange (ETDEWEB)
Costa Reis, L.
2001-01-01
We have developed in this thesis a methodology of integrated characterization of heterogeneous reservoirs, from geologic modeling to history matching. This methodology is applied to the reservoir PBR, situated in Campos Basin, offshore Brazil, which has been producing since June 1979. This work is an extension of two other theses concerning geologic and geostatistical modeling of the reservoir PBR from well data and seismic information. We extended the geostatistical litho-type model to the whole reservoir by using a particular approach of the non-stationary truncated Gaussian simulation method. This approach facilitated the application of the gradual deformation method to history matching. The main stages of the methodology for dynamic data integration in a geostatistical reservoir model are presented. We constructed a reservoir model, and the initial difficulties in the history matching led us to modify some choices in the geological, geostatistical and flow models. These difficulties show the importance of dynamic data integration in reservoir modeling. The petrophysical property assignment within the litho-types was done by using well test data. We used an inversion procedure to evaluate the petrophysical parameters of the litho-types. Up-scaling is a necessary stage to reduce the flow simulation time. We compared several up-scaling methods and we show that the passage from the fine geostatistical model to the coarse flow model should be done very carefully. The choice of the fitting parameter depends on the objective of the study. In the case of the reservoir PBR, where water is injected in order to improve the oil recovery, the water rate of the producing wells is directly related to the reservoir heterogeneity. Thus, the water rate was chosen as the fitting parameter. We obtained significant improvements in the history matching of the reservoir PBR. First, by using a method we have proposed, called patchwork. This method allows us to build a coherent
An Analysis of the Economy Principle Applied in Cyber Language
Institute of Scientific and Technical Information of China (English)
肖钰敏
2015-01-01
With the development of network technology, cyber language, a new social dialect, is widely used in our life. The author analyzes how the economy principle is applied in cyber language from three aspects: word-formation, syntax and non-linguistic symbols. The author also collects, summarizes and analyzes the relevant language materials to prove the economy principle's real existence in chat rooms and the reason why the economy principle is applied widely in cyberspace.
Applied Missing Data Analysis. Methodology in the Social Sciences Series
Enders, Craig K.
2010-01-01
Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…
An applied general equilibrium model for Dutch agribusiness policy analysis.
Peerlings, J.H.M.
1993-01-01
The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest. The model is fairly
System Analysis Applying to Talent Resource Development Research
Institute of Scientific and Technical Information of China (English)
WANG Peng-tao; ZHENG Gang
2001-01-01
In talent resource development research, the most important aspects of talent resource forecasting and optimization are the structure of the talent resource, the numbers required and talent quality. The article establishes a factor reconstruction analysis forecast and a talent quality model using the method of system reconstruction analysis, ensuring the most effective factor levels in the system, as presented by G. J. Klirti, B. Jonesque. A dynamic analysis of an example is also performed.
Analysis of OFDM Applied to Powerline High Speed Digital Communication
Institute of Scientific and Technical Information of China (English)
ZHUANG Jian; YANG Gong-xu
2003-01-01
The low voltage powerline is becoming a powerful solution for home networking, building automation, and internet access as a result of its wide distribution, easy access and little maintenance. The character of the powerline channel is very complicated because it is an open network. This article analysed the character of the powerline channel, introduced the basics of OFDM (Orthogonal Frequency Division Multiplexing), and studied OFDM applied to powerline high speed digital communication.
Geostatistical Evaluation of Spring Water Quality in an Urbanizing Carbonate Aquifer
McGinty, A.; Welty, C.
2003-04-01
As part of an investigation of the impacts of urbanization on the hydrology and ecology of Valley Creek watershed near Philadelphia, Pennsylvania, we have analyzed the chemical composition of 110 springs to assess the relative influence of geology and anthropogenic activities on water quality. The 60 km^2 watershed is underlain by productive fractured rock aquifers composed of Cambrian and Ordovician carbonate rocks in the central valley and Cambrian crystalline and siliciclastic rocks (quartzite and phyllite) in the north and south hills that border the valley. All tributaries of the surface water system originate in the crystalline and siliciclastic hills. The watershed is covered by 17% impervious area and contains 6 major hazardous waste sites, one active quarrying operation and one golf course; 25% of the area utilizes septic systems for sewage disposal. We identified 172 springs, 110 of which had measurable flow rates ranging from 0.002 to 5 l/s. The mapped surficial geology appears as an anisotropic pattern, with long bands of rock formations paralleling the geographic orientation of the valley. Mapped development appears as a more isotropic pattern, characterized by isolated patches of land use that are not coincident with the evident geologic pattern. Superimposed upon these characteristics is a dense array of depressions and shallow sinkholes in the carbonate rocks, and a system of major faults at several formation contacts. We used indicator geostatistics to quantitatively characterize the spatial extent of the major geologic formations and patterns of land use. Maximum correlation scales for the rock types corresponded with strike direction and ranged from 1000 to 3000 m. Anisotropy ratios ranged from 2 to 4. Land-use correlation scales were generally smaller (200 to 500 m) with anisotropy ratios of around 1.2, i.e., nearly isotropic as predicted. Geostatistical analysis of spring water quality parameters related to geology (pH, specific conductance
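The indicator-geostatistics step above (quantifying correlation scales for rock type and land use) rests on the empirical variogram of a 0/1 indicator variable. A minimal sketch, with synthetic coordinates and an invented carbonate/non-carbonate indicator; the lag distances echo the 200-3000 m scales reported, but nothing here is the study's data:

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    # Classical (Matheron) estimator: gamma(h) = half the mean squared
    # difference over point pairs whose separation falls within tol of lag h.
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    i, j = np.triu_indices(len(values), k=1)
    sep, sq = d[i, j], (values[i] - values[j]) ** 2
    gamma = []
    for h in lags:
        mask = np.abs(sep - h) <= tol
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Indicator transform: 1 where the mapped unit is (say) carbonate, 0 otherwise.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 3000, size=(200, 2))            # synthetic locations, metres
indicator = (xy[:, 0] < 1500).astype(float)          # a crude geological band
gamma = empirical_variogram(xy, indicator, lags=[250, 500, 1000, 2000], tol=125)
```

Fitting a model variogram to `gamma` along different directions is what yields the correlation scales and anisotropy ratios the abstract reports.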
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Factorial kriging analysis applied to geological data from petroleum exploration
Energy Technology Data Exchange (ETDEWEB)
Jaquet, O.
1989-10-01
A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.
Applying Galois compliance for data analysis in information systems
Directory of Open Access Journals (Sweden)
Kozlov Sergey
2016-03-01
Full Text Available The article deals with the data analysis in information systems. The author discloses the possibility of using Galois compliance to identify the characteristics of the information system structure. The author reveals the specificity of the application of Galois compliance for the analysis of information system content with the use of invariants of graph theory. Aspects of introduction of mathematical apparatus of Galois compliance for research of interrelations between elements of the adaptive training information system of individual testing are analyzed.
Signed directed social network analysis applied to group conflict
DEFF Research Database (Denmark)
Zheng, Quan; Skillicorn, David; Walther, Olivier
2015-01-01
Real-world social networks contain relationships of multiple different types, but this richness is often ignored in graph-theoretic modelling. We show how two recently developed spectral embedding techniques, for directed graphs (relationships are asymmetric) and for signed graphs (relationships are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can…
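The signed half of the combination can be sketched with the standard signed-Laplacian spectral embedding (the directed half requires the paper's own technique and is not shown). The four-node adjacency matrix below is invented: two internally friendly groups joined by negative (conflict) edges.

```python
import numpy as np

def signed_laplacian_embedding(A, dim=2):
    # Signed Laplacian L = D - A, where D holds the *absolute* degree sums,
    # so negative edges push their endpoints apart in the embedding.
    A = np.asarray(A, float)
    D = np.diag(np.abs(A).sum(axis=1))
    L = D - A
    w, v = np.linalg.eigh(L)      # eigenvalues in ascending order
    return v[:, :dim]             # coordinates from the smallest eigenvectors

# Nodes 0,1 are allied, nodes 2,3 are allied; the groups are in conflict.
A = np.array([[ 0,  1, -1, -1],
              [ 1,  0, -1, -1],
              [-1, -1,  0,  1],
              [-1, -1,  1,  0]], float)
coords = signed_laplacian_embedding(A)
```

The first embedding coordinate separates the two camps by sign, which is exactly the structure a conflict analysis looks for.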
Applied network security monitoring collection, detection, and analysis
Sanders, Chris
2013-01-01
Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di
Joint regression analysis and AMMI model applied to oat improvement
Oliveira, A.; Oliveira, T. A.; Mejza, S.
2012-09-01
In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board for a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model to study and to estimate phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R software for the AMMI model analysis.
Structured Analysis and Supervision Applied on Heavy Fuel Oil Tanks
Directory of Open Access Journals (Sweden)
LAKHOUA Mohamed Najeh
2016-05-01
Full Text Available This paper introduces the need for the structured analysis and real time (SA-RT) method for control-command applications in a thermal power plant (TPP) using a supervisory control and data acquisition (SCADA) system. Then, the architecture of a SCADA system in a TPP is presented. A significant example of a control-command application is presented: the heavy fuel oil tanks of a TPP. An application of a structured analysis method, generally used in industry, on the basis of the SA-RT formalism is then presented. Different modules are represented and described: Context Diagram, Data Flows Diagram, Control Flows Diagram, State Transition Diagram, Timing Specifications and Requirements Dictionary. Finally, this functional and operational analysis allows us to assist the different steps of the specification, the programming and the configuration of a new tabular in a SCADA system.
Systems design analysis applied to launch vehicle configuration
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Applying Content Analysis to Web-based Content
Kim, Inhwa; Kuljis, Jasna
2010-01-01
Using Content Analysis on Web-based content, in particular the content available on Web 2.0 sites, is investigated. The relative strengths and limitations of the method are described. To illustrate how content analysis may be used, we provide a brief overview of a case study that investigates cultural impacts on the use of design features with regard to self-disclosure on the blogs of South Korean and United Kingdom users. In this study we took a standard approach to conducting the content an...
Applying Adult Learning Theory through a Character Analysis
Baskas, Richard S.
2011-01-01
The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple" through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…
Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia
Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann
2011-01-01
Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…
Applying an Activity System to Online Collaborative Group Work Analysis
Choi, Hyungshin; Kang, Myunghee
2010-01-01
This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…
Action, Content and Identity in Applied Genre Analysis for ESP
Flowerdew, John
2011-01-01
Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…
Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J
2016-08-15
In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression Kriging (RK) and ordinary Kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models tended to fit better more often and worse less often. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their
Structural dynamic responses analysis applying differential quadrature method
Institute of Scientific and Technical Information of China (English)
PU Jun-ping; ZHENG Jian-jun
2006-01-01
Unconditionally stable higher-order accurate time step integration algorithms based on the differential quadrature method (DQM) for second-order initial value problems were applied, and the quadrature rules of the DQM, the computation of the weighting coefficients and the choice of sampling grid points were discussed. Some numerical examples dealing with the heat transfer problem, the second-order differential equation of imposed vibration of linear single-degree-of-freedom and double-degree-of-freedom systems, the nonlinear differential equation of motion and a beam forced by a changing load were computed, respectively. The results indicated that the algorithm can produce highly accurate solutions with minimal time consumption, and that the system total energy can remain conservative in the numerical computation.
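The weighting coefficients the abstract mentions are the core of the DQM: a derivative at a grid point becomes a weighted sum of function values at all grid points. A minimal sketch of the standard first-order coefficients on an arbitrary grid, using the usual Lagrange-polynomial formulas (the paper's higher-order time-stepping machinery is not reproduced here):

```python
import numpy as np

def dq_weights(x):
    # First-order DQ weighting coefficients:
    #   a_ij = M(x_i) / ((x_i - x_j) * M(x_j))   for i != j,
    #   a_ii = -sum_{j != i} a_ij,
    # where M(x_i) = prod_{k != i} (x_i - x_k).
    x = np.asarray(x, float)
    diff = x[:, None] - x[None, :]
    np.fill_diagonal(diff, 1.0)            # placeholder to avoid division by zero
    M = diff.prod(axis=1)                  # M(x_i)
    A = M[:, None] / (diff * M[None, :])
    np.fill_diagonal(A, 0.0)
    np.fill_diagonal(A, -A.sum(axis=1))    # rows sum to zero (constants -> 0)
    return A

# Derivatives at the grid points are then a single matrix-vector product:
x = np.linspace(0.0, 1.0, 7)
A = dq_weights(x)
f = x ** 3
df = A @ f        # exact for low-degree polynomials on a 7-point grid
```

Because the weights come from the degree-6 interpolating polynomial, `df` matches 3x² to machine precision here.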
Thermal Analysis Applied to Verapamil Hydrochloride Characterization in Pharmaceutical Formulations
Directory of Open Access Journals (Sweden)
Maria Irene Yoshida
2010-04-01
Full Text Available Thermogravimetry (TG) and differential scanning calorimetry (DSC) are useful techniques that have been successfully applied in the pharmaceutical industry to reveal important information regarding the physicochemical properties of drug and excipient molecules, such as polymorphism, stability, purity and formulation compatibility, among others. Verapamil hydrochloride shows thermal stability up to 180 °C and melts at 146 °C, followed by total degradation. The drug is compatible with all the excipients evaluated. The drug showed degradation when subjected to oxidizing conditions, suggesting that the degradation product is 3,4-dimethoxybenzoic acid derived from alkyl side chain oxidation. Verapamil hydrochloride does not present the phenomenon of polymorphism under the conditions evaluated. Assessing the drug degradation kinetics, the drug had a shelf life (t90) of 56.7 years, and a pharmaceutical formulation showed a t90 of 6.8 years, demonstrating their high stability.
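The t90 shelf-life figures can be related to a degradation rate constant; the sketch below assumes first-order kinetics (a common convention in stability work, though the abstract does not state the reaction order the authors fitted):

```python
import math

def t90_first_order(k):
    # Time for potency to fall to 90% under C(t) = C0 * exp(-k t):
    #   exp(-k * t90) = 0.9  =>  t90 = ln(10/9) / k
    return math.log(10.0 / 9.0) / k

def k_from_t90(t90):
    # Invert the relation to recover the rate constant from a reported t90.
    return math.log(10.0 / 9.0) / t90

# The abstract reports t90 = 56.7 years for the drug substance; under the
# first-order assumption this corresponds to a rate constant (per year) of:
k_drug = k_from_t90(56.7)
```

The same relation applied to the formulation's t90 of 6.8 years gives a roughly eightfold larger rate constant, consistent with the excipient matrix accelerating degradation.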
Geostatistical inference using crosshole ground-penetrating radar
DEFF Research Database (Denmark)
Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou;
2010-01-01
of the subsurface are used to evaluate the uncertainty of the inversion estimate. We have explored the full potential of the geostatistical inference method using several synthetic models of varying correlation structures and have tested the influence of different assumptions concerning the choice of covariance function and data noise level. In addition, we have tested the methodology on traveltime data collected at a field site in Denmark. There, inferred correlation structures indicate that structural differences exist between two areas located approximately 10 m apart, an observation confirmed by a GPR…
A Spatial Lattice Model Applied for Meteorological Visualization and Analysis
Directory of Open Access Journals (Sweden)
Mingyue Lu
2017-03-01
Full Text Available Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze meteorological information for better identification and forecasting of meteorological weather so as to reduce meteorological disaster loss, modeling meteorological information based on a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element that contains the meteorological information, together with the location where the particle is placed and a time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, if these surfaces and voxels can occupy a certain space, then this space can be represented using these spatial sampling particles with their point locations and meteorological information. In this case, the full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, and extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, curved surface space, and stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of spaces. Cases
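The sampling-particle idea can be sketched as a lattice keyed by integer indices, with each particle carrying a bundle of meteorological elements and queries snapping a location to the nearest particle. All names and numbers below are our own illustration, not the article's implementation:

```python
import numpy as np

class SpatialLattice:
    """Minimal sketch: meteorological values stored on a regular lattice of
    point 'particles', queried by snapping a location to the nearest one."""

    def __init__(self, origin, spacing):
        self.origin = np.asarray(origin, float)
        self.spacing = float(spacing)
        self.values = {}        # (i, j, k) index -> dict of meteorological elements

    def put(self, index, **elements):
        # Attach meteorological elements to the particle at a lattice index.
        self.values[index] = elements

    def query(self, location):
        # Snap a real-world location to its nearest lattice particle.
        idx = tuple(np.round((np.asarray(location, float) - self.origin)
                             / self.spacing).astype(int))
        return self.values.get(idx)

lat = SpatialLattice(origin=(0.0, 0.0, 0.0), spacing=1000.0)
lat.put((2, 3, 0), temperature=15.2, humidity=0.63)
obs = lat.query((2100.0, 2900.0, 200.0))   # snaps to particle (2, 3, 0)
```

Refining the resolution, as the article describes, amounts to re-populating the dictionary at a smaller spacing; the three space types would use different index-to-location mappings.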
Institute of Scientific and Technical Information of China (English)
魏义长; 赵东保; 李小根; 张富; 杨成杰; 姚志宏
2011-01-01
Geostatistical analysis is one of the most important technical methods in spatial analysis, yet many students majoring in geographic information systems (GIS) do not attach enough importance to learning the theory and techniques of geostatistics, and some know almost nothing of it. To promote GIS students' understanding of the geostatistical analysis course and to advance the development of geostatistics, the authors, drawing on years of experience in teaching and researching geostatistical analysis and on an extensive review of the domestic and international literature, first discuss the differences and relationships between geostatistics, classical statistics and GIS. They then analyze in detail the status and role of the geostatistical analysis course in GIS education, explore methods for learning the course, and finally offer an outlook on its future development. The authors expect the paper to increase recognition of geostatistics among students majoring in GIS and to advance the development of the field.
LAMQS analysis applied to ancient Egyptian bronze coins
Energy Technology Data Exchange (ETDEWEB)
Torrisi, L., E-mail: lorenzo.torrisi@unime.i [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caridi, F.; Giuffrida, L.; Torrisi, A. [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Mondio, G.; Serafino, T. [Dipartimento di Fisica della Materia ed Ingegneria Elettronica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caltabiano, M.; Castrizio, E.D. [Dipartimento di Lettere e Filosofia dell' Universita di Messina, Polo Universitario dell' Annunziata, 98168 Messina (Italy); Paniz, E.; Salici, A. [Carabinieri, Reparto Investigazioni Scientifiche, S.S. 114, Km. 6, 400 Tremestieri, Messina (Italy)
2010-05-15
Some Egyptian bronze coins, dated to the 6th-7th centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analyses, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.
Current Human Reliability Analysis Methods Applied to Computerized Procedures
Energy Technology Data Exchange (ETDEWEB)
Ronald L. Boring
2012-06-01
Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.
Ion beam analysis techniques applied to large scale pollution studies
Energy Technology Data Exchange (ETDEWEB)
Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)
1993-12-31
Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.
Applying Cognitive Work Analysis to Time Critical Targeting Functionality
2004-10-01
Excerpts recovered from the report: the Dynamic Target List/Dynamic Target Queue (DTL/DTQ) table is a critical component of the TCTF CUI; a GUI working group was used to brainstorm the order of its columns, with successful results. Figure 4-27 shows the task steps involved in achieving Goal 7. Acronyms: CWA, Cognitive Work Analysis; DTD, Display Task Description; DTL/DTQ, Dynamic Target List/Dynamic Target Queue; FDO, Fighter Duty Officer; FEBA, Forward Edge of the Battle Area.
Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis
Árpád Gyéresi; Eleonora Mircia; Brigitta Simon; Aura Rusu; Gabriel Hancu
2013-01-01
Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve...
Improving Credit Scorecard Modeling Through Applying Text Analysis
Directory of Open Access Journals (Sweden)
Omar Ghailan
2016-04-01
Full Text Available In credit card scoring and loans management, the prediction of the applicant's future behavior is an important decision support tool and a key factor in reducing the risk of loan default. A lot of data mining and classification approaches have been developed for the credit scoring purpose. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard model technique that improves credit scorecard modeling through employing textual data analysis. This study uses a sample of loan application forms of a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes defining the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the textual attributes analysis achieves higher classification effectiveness and outperforms the other traditional numerical data analysis techniques.
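The text-mining-plus-logistic-regression pipeline can be sketched with bag-of-words features and plain gradient descent. The documents, vocabulary and labels below are invented English placeholders (the study's actual attributes are Arabic-language form fields), and the hand-rolled estimator merely stands in for the paper's model:

```python
import numpy as np

def bag_of_words(docs, vocab):
    # Term-count features for the textual attributes of an application form.
    return np.array([[doc.split().count(w) for w in vocab] for doc in docs], float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    # Plain gradient-descent logistic regression with an intercept column.
    X = np.hstack([np.ones((len(X), 1)), X])
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict_proba(w, X):
    X = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X @ w))

# Toy applications: 1 = good applicant, 0 = likely default.
docs = ["stable salary employer", "no income arrears",
        "salary employer", "arrears no income"]
y = np.array([1.0, 0.0, 1.0, 0.0])
vocab = ["salary", "employer", "income", "arrears"]
w = fit_logistic(bag_of_words(docs, vocab), y)
p = predict_proba(w, bag_of_words(docs, vocab))
```

In practice the numeric form attributes would be appended to the text features, which is exactly the combination the paper evaluates against numeric-only scorecards.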
Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.
2015-01-01
In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
An assessment of gas emanation hazard using a geographic information system and geostatistics.
Astorri, F; Beaubien, S E; Ciotoli, G; Lombardi, S
2002-03-01
This paper describes the use of geostatistical analysis and GIS techniques to assess gas emanation hazards. The Mt. Vulsini volcanic district was selected for this study because of the wide range of natural phenomena locally present that affect gas migration in the near surface. In addition, soil gas samples collected in this area should allow for a calibration between the generated risk/hazard models and the measured distribution of toxic gas species at the surface. The approach used during this study consisted of three general stages. First, data were digitally organized into thematic layers; then software functions in the GIS program "ArcView" were used to compare and correlate these various layers; finally, the produced "potential-risk" map was compared with radon soil gas data in order to validate the model and/or to select zones for further, more detailed soil gas investigations.
Xie, Zheng-miao; Li, Jing; Wang, Bi-ling; Chen, Jian-jun
2006-10-01
Contents of heavy metals (Pb, Zn, Cd, Cu) in soils and vegetables from Dongguan town in Shangyu city, China were studied using geostatistical analysis and GIS techniques to evaluate environmental quality. Based on the evaluation criteria, the spatial variability of heavy metals in the soil-vegetable system was mapped and analyzed. The results showed that the distribution of soil heavy metals in a large number of soil samples in Dongguan town was asymmetric. The contents of Zn and Cu were lower than those of Cd and Pb. The spatial distributions of Pb, Zn, Cd and Cu concentrations in soils and vegetables differed in their variability. There was a close relationship between total and available contents of heavy metals in soil. The contents of Pb and Cd in green vegetables were higher than those of Zn and Cu and exceeded the national sanitation standards for vegetables.
Introduction to this Special Issue on Geostatistics and Scaling of Remote Sensing
Quattrochi, Dale A.
1999-01-01
The germination of this special PE&RS issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997, held at the University of Exeter in Exeter, England. The cold and snow of an England winter were greatly tempered by the friendly and cordial discussions that ensued at the meeting on possible ways to foster both dialog and research across "the Big Pond" between geographers in the US and the UK on the use of geostatistics and geospatial techniques for remote sensing of land surface processes. It was decided that one way to stimulate and enhance cooperation on the application of geostatistics and geospatial methods in remote sensing was to hold parallel sessions on these topics at appropriate meeting venues in 1998 in both the US and the UK. Selected papers given at these sessions would be published as a special issue of PE&RS on the US side, and as a special issue of Computers and Geosciences (C&G) on the UK side, to highlight the commonality in research on geostatistics and geospatial methods in remote sensing and spatial data analysis on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). A similar session was held at the RGS-IBG annual meeting in Guildford, Surrey, England in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). The six papers that, in part, comprise this issue of PE&RS are the US complement to such a dual journal publication effort. Both of us are co-editors of each of the journal special issues, with the lead editor of each journal being from their respective side of the Atlantic where the journals are published. The special
A Multifactorial Analysis of Reconstruction Methods Applied After Total Gastrectomy
Directory of Open Access Journals (Sweden)
Oktay Büyükaşık
2010-12-01
Full Text Available Aim: The aim of this study was to evaluate the reconstruction methods applied after total gastrectomy in terms of postoperative symptomatology and nutrition. Methods: This retrospective study was conducted on 31 patients who underwent total gastrectomy due to gastric cancer in the 2nd Clinic of General Surgery, SSK Ankara Training Hospital. Six different reconstruction methods were used and analyzed in terms of age, sex and postoperative complications. One biopsy specimen from the esophagus and two from the jejunum were taken through upper gastrointestinal endoscopy from all cases, and late-period morphological and microbiological changes were examined. Postoperative weight change, dumping symptoms, reflux esophagitis, solid/liquid dysphagia, early satiety, postprandial pain, diarrhea and anorexia were assessed. Results: Of the 31 patients, 18 were male and 13 female; the youngest was 33 years old and the oldest 69. Reconstruction without a pouch was performed in 22 cases and with a pouch in 9 cases. Early satiety, postprandial pain, dumping symptoms, diarrhea and anemia were found most commonly in cases with reconstruction without a pouch. The rate of bacterial colonization of the jejunal mucosa was identical in both groups. Reflux esophagitis was seen most commonly in omega esophagojejunostomy (EJ) and least in Roux-en-Y, Tooley and Tanner 19 EJ. Conclusion: Reconstruction with a pouch performed after total gastrectomy is still a preferable method. (The Medical Bulletin of Haseki 2010; 48:126-31)
Applying temporal network analysis to the venture capital market
Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene
2015-10-01
Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
Image analysis technique applied to lock-exchange gravity currents
Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Mario Jorge Rodrigues Pereira da
2013-01-01
An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...
Methods of analysis applied on the e-shop Arsta
Flégl, Jan
2013-01-01
This bachelor thesis summarizes methods of e-shop analysis. The first chapter summarizes and describes the basics of e-commerce and e-shops in general. The second chapter deals with search engines, how they function and in what ways it is possible to influence the order of search results. Special attention is paid to search engine optimization and search engine marketing. The third chapter summarizes the basic tools of Google Analytics. The fourth chapter uses findings of all the previous cha...
Dynamical Systems Analysis Applied to Working Memory Data
Directory of Open Access Journals (Sweden)
Fidan eGasimova
2014-07-01
Full Text Available In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of two years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.
Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis
Directory of Open Access Journals (Sweden)
Árpád Gyéresi
2013-02-01
Full Text Available Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques were rarely used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are electrophoretically neutral, so separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration, so that micelles form; these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects of the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
Operational modal analysis applied to the concert harp
Chomette, B.; Le Carrou, J.-L.
2015-05-01
Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
Sensitivity Analysis Applied in Design of Low Energy Office Building
DEFF Research Database (Denmark)
Heiselberg, Per; Brohus, Henrik
2008-01-01
Building performance can be expressed by different indicators such as primary energy use, environmental load and/or indoor environmental quality, and a building performance simulation can provide the decision maker with a quantitative measure of the extent to which an integrated design solution satisfies the design requirements and objectives. In the design of sustainable buildings it is beneficial to identify the most important design parameters in order to develop alternative design solutions more efficiently or reach optimized design solutions. A sensitivity analysis makes it possible to identify the most important parameters in relation to building performance and to focus design and optimization of sustainable buildings on these fewer, but most important, parameters. The sensitivity analyses will typically be performed at a reasonably early stage of the building design process, where...
Neutron activation analysis applied to nutritional and foodstuff studies
Energy Technology Data Exchange (ETDEWEB)
Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de, E-mail: vmaihara@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Avegliano, Roseane P., E-mail: pagliaro@usp.b [Universidade de Sao Paulo (USP), SP (Brazil). Coordenadoria de Assistencia Social. Div. de Alimentacao
2009-07-01
Neutron Activation Analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP are presented: a Brazilian total diet study of nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)
Wavelets Applied to CMB Maps a Multiresolution Analysis for Denoising
Sanz, J L; Cayon, L; Martínez-González, E; Barreiro, R B; Toffolatti, L
1999-01-01
Analysis and denoising of Cosmic Microwave Background (CMB) maps are performed using wavelet multiresolution techniques. The method is tested on $12^{\circ}.8\times 12^{\circ}.8$ maps with resolution resembling the experimental one expected for future high resolution space observations. Semianalytic formulae of the variance of wavelet coefficients are given for the Haar and Mexican Hat wavelet bases. Results are presented for the standard Cold Dark Matter (CDM) model. Denoising of simulated maps is carried out by removal of wavelet coefficients dominated by instrumental noise. CMB maps with a signal-to-noise $S/N \sim 1$ are denoised with an error improvement factor between 3 and 5. Moreover we have also tested how well the CMB temperature power spectrum is recovered after denoising. We are able to reconstruct the $C_{\ell}$'s up to $\ell \sim 1500$ with errors always below $20\%$ in cases with $S/N \ge 1$.
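The denoising step described in this abstract, removing wavelet coefficients dominated by noise, can be sketched in its simplest form: a one-level Haar transform of a 1-D signal with hard thresholding of the detail coefficients. The signal, noise level and threshold below are illustrative, not taken from the paper:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising: transform, hard-threshold the
    detail coefficients, then invert. Signal length must be even."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # low-pass (scaling) coefficients
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # high-pass (wavelet) coefficients
    detail[np.abs(detail) < threshold] = 0.0      # drop noise-dominated coefficients
    out = np.empty_like(s)                        # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

rng = np.random.default_rng(0)
clean = np.repeat([1.0, 3.0, 2.0, 4.0], 8)        # piecewise-constant toy "map" row
noisy = clean + 0.1 * rng.standard_normal(clean.size)
denoised = haar_denoise(noisy, threshold=0.3)
```

Because the Haar transform is orthonormal, zeroing small detail coefficients removes noise energy without touching the coarse structure, which is where the error improvement comes from.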
Downside Risk analysis applied to the Hedge Funds universe
Perelló, Josep
2007-09-01
Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM's simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
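The central Downside Risk quantity, deviation below the investor's target return, is simple to compute: only returns under the target contribute, unlike ordinary volatility. This is a generic sketch with illustrative returns, not the paper's indicators or the index data:

```python
import math

def downside_deviation(returns, target=0.0):
    """Root mean square of shortfalls below the investor's target return.
    Returns at or above the target contribute nothing ("good" returns)."""
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return math.sqrt(sum(shortfalls) / len(returns))

# illustrative monthly returns (not Hedge Fund index data)
returns = [0.04, -0.02, 0.01, -0.05, 0.03]
dd = downside_deviation(returns)
```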
Applying importance-performance analysis to evaluate banking service quality
Directory of Open Access Journals (Sweden)
André Luís Policani Freitas
2012-11-01
Full Text Available In an increasingly competitive market, identifying the most important aspects of service quality and measuring it as perceived by customers are important actions taken by organizations seeking competitive advantage. This scenario is typical of the Brazilian banking sector. In this context, this article presents an exploratory case study in which Importance-Performance Analysis (IPA) was used to identify the strong and weak points of the services provided by a bank. In order to check the reliability of the questionnaire, Cronbach's alpha and correlation analyses were used. The results are presented and some actions are proposed to improve service quality.
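Cronbach's alpha, used in the study to check questionnaire reliability, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of respondent totals). A minimal sketch with illustrative responses, not the study's survey data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: a list of k lists, one per questionnaire item, each holding
    the n respondents' scores for that item (same respondent order)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(item) for item in items) / var(totals))

# illustrative: 4 respondents scoring 3 items on a 1-5 scale
alpha = cronbach_alpha([[4, 5, 3, 4], [3, 5, 4, 4], [4, 4, 3, 5]])
```

When all items move together (perfectly correlated responses) the ratio of summed item variances to total variance is 1/k, and alpha reaches its maximum of 1.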
Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.
2010-09-01
There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point location but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED) using radar as the drift variable and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processing for visibility, ground clutter and beam shielding, and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps with their inherent challenges at daily and hourly time resolution. The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at
Regional flow duration curves: Geostatistical techniques versus multivariate regression
Pugliese, Alessio; Farmer, William H.; Castellarin, Attilio; Archfield, Stacey A.; Vogel, Richard M.
2016-01-01
A period-of-record flow duration curve (FDC) represents the relationship between the magnitude and frequency of daily streamflows. Prediction of FDCs is of great importance for locations characterized by sparse or missing streamflow observations. We present a detailed comparison of two methods which are capable of predicting an FDC at ungauged basins: (1) an adaptation of the geostatistical method, Top-kriging, employing a linear weighted average of dimensionless empirical FDCs, standardised with a reference streamflow value; and (2) regional multiple linear regression of streamflow quantiles, perhaps the most common method for the prediction of FDCs at ungauged sites. In particular, Top-kriging relies on a metric for expressing the similarity between catchments computed as the negative deviation of the FDC from a reference streamflow value, which we termed total negative deviation (TND). Comparisons of these two methods are made in 182 largely unregulated river catchments in the southeastern U.S. using a three-fold cross-validation algorithm. Our results reveal that the two methods perform similarly throughout flow-regimes, with average Nash-Sutcliffe Efficiencies of 0.566 and 0.662 (0.883 and 0.829 on log-transformed quantiles) for the geostatistical and the linear regression models, respectively. The differences in the reproduction of FDCs occurred mostly for low flows with exceedance probability (i.e. duration) above 0.98.
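The Nash-Sutcliffe efficiency used above to compare the two prediction methods is NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2): 1 is a perfect fit and 0 means the model does no better than the mean of the observations. A minimal sketch with illustrative streamflow values, not the study's quantiles:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency of simulated vs. observed values."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))  # model error
    den = sum((o - mean_obs) ** 2 for o in observed)              # mean-benchmark error
    return 1.0 - num / den

# illustrative daily streamflow quantiles (not from the 182 catchments)
obs = [10.0, 8.0, 6.0, 4.0, 2.0]
sim = [9.0, 8.5, 6.0, 4.5, 2.0]
nse = nash_sutcliffe(obs, sim)
```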
Applying data mining for the analysis of breast cancer data.
Liou, Der-Ming; Chang, Wei-Pin
2015-01-01
Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female who has diabetes or hypertension is more likely to suffer a stroke within the next 5 years; a physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, decision tree, logistic regression, and genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716, specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of breast cancer patient data in terms of the overall accuracy of
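The evaluation indicators reported in this abstract (accuracy, sensitivity, specificity) all derive from a 2x2 confusion matrix. The counts below are hypothetical, chosen for round numbers, and are not the Wisconsin dataset results:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (recall) and specificity from confusion-matrix counts:
    tp/fp = true/false positives, tn/fn = true/false negatives."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

# hypothetical counts for illustration only
acc, sens, spec = classification_metrics(tp=90, fp=5, tn=95, fn=10)
```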
Karami, Hossein
2015-01-01
Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…
Cross-covariance functions for multivariate geostatistics
Genton, Marc G.
2015-05-01
Continuously indexed datasets with multiple variables have become ubiquitous in the geophysical, ecological, environmental and climate sciences, and pose substantial analysis challenges to scientists and statisticians. For many years, scientists developed models that aimed at capturing the spatial behavior for an individual process; only within the last few decades has it become commonplace to model multiple processes jointly. The key difficulty is in specifying the cross-covariance function, that is, the function responsible for the relationship between distinct variables. Indeed, these cross-covariance functions must be chosen to be consistent with marginal covariance functions in such a way that the second-order structure always yields a nonnegative definite covariance matrix. We review the main approaches to building cross-covariance models, including the linear model of coregionalization, convolution methods, the multivariate Matérn and nonstationary and space-time extensions of these among others. We additionally cover specialized constructions, including those designed for asymmetry, compact support and spherical domains, with a review of physics-constrained models. We illustrate select models on a bivariate regional climate model output example for temperature and pressure, along with a bivariate minimum and maximum temperature observational dataset; we compare models by likelihood value as well as via cross-validation co-kriging studies. The article closes with a discussion of unsolved problems. © Institute of Mathematical Statistics, 2015.
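The linear model of coregionalization reviewed above builds a valid cross-covariance as C(h) = sum_k B_k * rho_k(h), where each coregionalization matrix B_k is positive semidefinite and each rho_k is a valid correlation function, which guarantees a nonnegative definite joint covariance. A minimal bivariate sketch with exponential correlations; all parameter values are illustrative:

```python
import numpy as np

def lmc_cov(locs, B, ranges):
    """Bivariate linear model of coregionalization on 1-D locations:
    C(h) = sum_k B[k] * exp(-|h| / ranges[k]).
    Each 2x2 matrix B[k] must be positive semidefinite.
    Returns the 2n x 2n joint covariance of (Z1(locs), Z2(locs))."""
    locs = np.asarray(locs, dtype=float)
    h = np.abs(locs[:, None] - locs[None, :])   # pairwise distances
    n = len(locs)
    C = np.zeros((2 * n, 2 * n))
    for Bk, rk in zip(B, ranges):
        rho = np.exp(-h / rk)                   # exponential correlation, range rk
        C += np.kron(np.asarray(Bk, dtype=float), rho)
    return C

locs = np.linspace(0.0, 1.0, 5)
B = [[[1.0, 0.5], [0.5, 1.0]],                  # short-range structure (PSD)
     [[0.8, -0.3], [-0.3, 0.6]]]                # long-range structure (PSD)
C = lmc_cov(locs, B, ranges=[0.2, 1.0])
```

Since the Kronecker product of two positive semidefinite matrices is positive semidefinite, and sums of such matrices remain so, the construction is valid by design, exactly the consistency requirement the abstract emphasizes.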
Perturbation Method of Analysis Applied to Substitution Measurements of Buckling
Energy Technology Data Exchange (ETDEWEB)
Persson, Rolf
1966-11-15
Calculations with two-group perturbation theory on substitution experiments with homogenized regions show that a condensation of the results into a one-group formula is possible, provided that a transition region is introduced in a proper way. In heterogeneous cores the transition region comes in as a consequence of a new cell concept. By making use of progressive substitutions the properties of the transition region can be regarded as fitting parameters in the evaluation procedure. The thickness of the region is approximately equal to the sum of 1/(1/τ + 1/L²)^(1/2) for the test and reference regions. Consequently a region where L² ≫ τ, e.g. D₂O, contributes √τ to the thickness. In cores where τ ≫ L², e.g. H₂O assemblies, the thickness of the transition region is determined by L. Experiments on rod lattices in D₂O and on test regions of D₂O alone (where B² = −1/L²) are analysed. The lattice measurements, where the pitches differed by a factor of √2, gave excellent results, whereas the determination of the diffusion length in D₂O by this method was not quite successful. Even regions containing only one test element can be used in a meaningful way in the analysis.
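The quoted thickness formula is easy to evaluate and to check in its two limits (it tends to √τ when L² ≫ τ, and to L when τ ≫ L²). A small sketch; the numerical parameter values are illustrative, not taken from the report:

```python
import math

def transition_thickness(tau_test, L2_test, tau_ref, L2_ref):
    """Transition-region thickness: the sum over the test and reference regions
    of 1 / sqrt(1/tau + 1/L^2), with tau the Fermi age and L^2 the diffusion
    area of each region (both in cm^2)."""
    def term(tau, L2):
        return 1.0 / math.sqrt(1.0 / tau + 1.0 / L2)
    return term(tau_test, L2_test) + term(tau_ref, L2_ref)

# illustrative values only (cm^2): a D2O-like test region and a lattice-like reference
thickness = transition_thickness(tau_test=120.0, L2_test=2.0e4,
                                 tau_ref=125.0, L2_ref=180.0)
```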
Application of geostatistical inversion to thin reservoir prediction
Institute of Scientific and Technical Information of China (English)
王香文; 刘红; 滕彬彬; 王连雨
2012-01-01
Taking the M1 thin reservoir in the H-N oilfield, southern Ecuador, as an example, this paper documents the challenges and problems of thin reservoir prediction and presents relevant techniques and methods to tackle them. Based on analysis of the geophysical characteristics of reservoirs and surrounding rocks, a geostatistical inversion technique is applied in this case to identify the thin (1-25 ft) reservoirs with rapid lateral changes and strong concealment. Sand distribution is refined through correlation between different data volumes, including seismic interpretation, CSSI (Constrained Sparse Spike Inversion) and geostatistical inversion, and is further checked against non-well, random-well and newly drilled well data. The accuracy of thin reservoir prediction is greatly enhanced, to a vertical resolution of up to 5 ft. This technique was successfully applied in the H-N oilfield; new drilling data show that all the predicted thin sand layers were encountered and the drilling coincidence rate is 82%.
Improving the flash flood frequency analysis applying dendrogeomorphological evidences
Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.
2009-09-01
Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait
Geostatistical three-dimensional modeling of oolite shoals, St. Louis Limestone, southwest Kansas
Qi, L.; Carr, T.R.; Goldstein, R.H.
2007-01-01
In the Hugoton embayment of southwestern Kansas, reservoirs composed of relatively thin (oil. The geometry and distribution of oolitic deposits control the heterogeneity of the reservoirs, resulting in exploration challenges and relatively low recovery. Geostatistical three-dimensional (3-D) models were constructed to quantify the geometry and spatial distribution of oolitic reservoirs, and the continuity of flow units within Big Bow and Sand Arroyo Creek fields. Lithofacies in uncored wells were predicted from digital logs using a neural network. The tilting effect from the Laramide orogeny was removed to construct restored structural surfaces at the time of deposition. Well data and structural maps were integrated to build 3-D models of oolitic reservoirs using stochastic simulations with geometry data. Three-dimensional models provide insights into the distribution, the external and internal geometry of oolitic deposits, and the sedimentologic processes that generated reservoir intervals. The structural highs and general structural trend had a significant impact on the distribution and orientation of the oolitic complexes. The depositional pattern and connectivity analysis suggest an overall aggradation of shallow-marine deposits during pulses of relative sea level rise followed by deepening near the top of the St. Louis Limestone. Cemented oolitic deposits were modeled as barriers and baffles and tend to concentrate at the edge of oolitic complexes. Spatial distribution of porous oolitic deposits controls the internal geometry of rock properties. Integrated geostatistical modeling methods can be applicable to other complex carbonate or siliciclastic reservoirs in shallow-marine settings. Copyright © 2007. The American Association of Petroleum Geologists. All rights reserved.
Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano
2015-01-01
Observations can be used to reduce model uncertainties through data assimilation, but how should data assimilation be performed at locations not covered by observations when the observations cannot cover the whole model area, owing to limited spatial availability or instrument capability? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easier to parallelize and more efficient for large-scale analysis. This paper evaluates OL for soil moisture profile characterization, in which a geostatistical semivariogram is used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding the grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs, with the one nearest observation assimilated; 5_Obs, with no more than the five nearest local observations assimilated; and 9_Obs, with no more than the nine nearest local observations assimilated; scenarios with no more than 16, 25 and 36 local observations were also compared. The results show that involving more local observations in the assimilation improves the estimates, with an upper bound of 9 observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot fully cover the study area in space due to vegetation effects.
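The semivariogram-based weighting of local observations described above can be sketched as follows. This is a minimal illustration assuming an exponential semivariogram with made-up sill and range values and hypothetical distances; it is not the paper's fitted model or its LETKF implementation:

```python
import numpy as np

def localization_weight(h, sill=1.0, rng=30.0):
    """Correlation-style weight from a fitted exponential semivariogram:
    rho(h) = 1 - gamma(h)/sill, floored at zero (sill and range are made up)."""
    gamma = sill * (1.0 - np.exp(-3.0 * h / rng))
    return np.clip(1.0 - gamma / sill, 0.0, None)

# distances (km) from the grid cell being analyzed to nearby observations
dists = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
w = localization_weight(dists)

# keep only the k nearest observations, as in the 1_Obs/5_Obs/9_Obs scenarios
k = 3
nearest = np.argsort(dists)[:k]
```

The weights decay with distance, so remote observations contribute little to the analysis at the grid cell, which is the essence of observation localization.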
Geostatistical prediction of flow-duration curves in an index-flow framework
Pugliese, A.; Castellarin, A.; Brath, A.
2014-09-01
An empirical period-of-record flow-duration curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over a historical period of time. In many practical applications one has to construct FDCs in basins that are ungauged or where very few observations are available. We present an application strategy of top-kriging which makes the geostatistical procedure capable of predicting FDCs in ungauged catchments. Previous applications of top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.); here the procedure is used to predict the entire curve at ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. In particular, we propose to standardise empirical FDCs by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation × the drainage area) and to compute the overall negative deviation of the curves from this reference value. We then propose to use these values, which we term the total negative deviation (TND), for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We focus on the prediction of FDCs for 18 unregulated catchments located in central Italy, and we quantify the accuracy of the proposed technique under various operational conditions through an extensive cross-validation and sensitivity analysis. The cross-validation points out that top-kriging is a reliable approach for predicting FDCs, with Nash-Sutcliffe efficiency measures ranging from 0.85 to 0.96 (depending on the model settings), very low biases over the entire duration range, and an enhanced representation of the low-flow regime relative to other regionalisation models that were recently developed for the same study region.
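The standardisation of empirical FDCs and the total negative deviation (TND) can be sketched as follows. This is a toy illustration with made-up streamflow values, using the mean flow as the reference index; the function names are illustrative, not from the paper:

```python
import numpy as np

def empirical_fdc(q):
    """Empirical flow-duration curve: flows sorted in decreasing order
    against exceedance probability (Weibull plotting position)."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    n = len(q)
    duration = np.arange(1, n + 1) / (n + 1)   # fraction of time flow is exceeded
    return duration, q

def total_negative_deviation(q, index_flow):
    """TND: overall negative deviation of the standardised FDC from the
    reference value 1 (only durations where the curve falls below 1 count)."""
    z = np.sort(np.asarray(q, dtype=float))[::-1] / index_flow
    return float(np.sum(np.clip(1.0 - z, 0.0, None)))

q = np.array([10.0, 4.0, 2.0, 1.0, 0.5])       # made-up daily flows
d, fdc = empirical_fdc(q)
tnd = total_negative_deviation(q, index_flow=q.mean())
```

Catchments with similar TND values would then be treated as hydrologically similar when deriving the kriging weights.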
Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism
Boutot, E. Amanda; Hume, Kara
2012-01-01
Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…
Ciotoli, G; Voltaggio, M; Tuccimei, P; Soligo, M; Pasculli, A; Beaubien, S E; Bigi, S
2017-01-01
In many countries, assessment programmes are carried out to identify areas where people may be exposed to high radon levels. These programmes often involve detailed mapping, followed by spatial interpolation and extrapolation of the results based on the correlation of indoor radon values with other parameters (e.g., lithology, permeability and airborne total gamma radiation) to optimise the radon hazard maps at the municipal and/or regional scale. In the present work, Geographically Weighted Regression and geostatistics are used to estimate the Geogenic Radon Potential (GRP) of the Lazio Region, assuming that the radon risk only depends on the geological and environmental characteristics of the study area. A wide geodatabase has been organised, including about 8000 samples of soil-gas radon as well as other proxy variables, such as the radium and uranium content of homogeneous geological units, rock permeability, and the faults and topography often associated with radon production/migration in the shallow environment. All these data have been processed in a Geographic Information System (GIS) using geospatial analysis and geostatistics to produce base thematic maps in a 1000 m × 1000 m grid format. Global Ordinary Least Squares (OLS) regression and local Geographically Weighted Regression (GWR) have been applied and compared, assuming that the relationships between radon activities and the environmental variables are not spatially stationary but vary locally according to the GRP. The spatial regression model has been elaborated considering soil-gas radon concentrations as the response variable and developing proxy variables as predictors through the use of a training dataset. Then a validation procedure was used to predict soil-gas radon values using a test dataset. Finally, the predicted values were interpolated using the kriging algorithm to obtain the GRP map of the Lazio region. The map shows some high GRP areas corresponding to the volcanic terrains (central
Directory of Open Access Journals (Sweden)
Melissa Oda-Souza
2008-06-01
arrangement (non-randomized) of the plants and the high sensitivity to missing values. The aim of this work was to describe the geostatistical model and associated methods of inference in the context of analyzing a non-randomized experiment, reporting applied results to identify the spatial dependence in a fan systematic design of Eucalyptus dunnii. Furthermore, different alternatives for treating missing values, which can arise from flaws and/or mortality of plants, were proposed, analyzed and compared. Data were analyzed by three models that differed, through covariates, in the form of modeling missing data values. A semivariogram was built for each model, adjusting three correlation function models, with the parameters estimated through the maximum likelihood method and selected by Akaike's criterion. These models, with and without the spatial component, were compared by the likelihood ratio test. The results showed that: (1) the covariates interacted positively with the response variable, avoiding data being discarded; (2) the model comparison, with and without the spatial component, did not confirm the existence of dependence; (3) the incorporation of the spatial dependence structure into the observational models recovered the capacity to make valid inferences in the absence of randomization, overcoming operational problems and guaranteeing that the data can be subjected to classic analysis.
Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming
2008-12-01
One hundred and eighteen surface soil samples were collected from Dongguan City and analyzed for concentrations of Cu, Zn, Ni, Cr, Pb, Cd, As and Hg, as well as pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS techniques. The results indicated that concentrations of Cu, Zn, Ni, Pb, Cd and Hg exceeded the soil background values for Guangdong province, with Pb, Cd and Hg exceeding them greatly. Factor analysis groups Cu, Zn, Ni, Cr and As into Factor 1, Pb and Hg into Factor 2, and Cd into Factor 3. The spatial maps based on geostatistical analysis show a definite association of Factor 1 with the soil parent material, while Factor 2 was mainly affected by industries. The spatial distribution of Factor 3 was attributed to anthropogenic influence.
Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data
Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.
2015-12-01
For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, and in which the rock mass is fragmented from past structural deformations. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied in order to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. This approach is intended to apply conditional probability methods to transform seismic velocities to directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated by including geophysical survey data in a co-kriging approach. The method applied to a real tunnel project shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.
Shiavi, Richard
2007-01-01
Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will coast through the practical introduction and move on to signal analysis techniques, commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical
Łopata, Michał; Popielarczyk, Dariusz; Templin, Tomasz; Dunalska, Julita; Wiśniewski, Grzegorz; Bigaj, Izabela; Szymański, Daniel
2014-01-01
We investigated changes in the spatial distribution of phosphorus (P) and nitrogen (N) in the deep, mesotrophic Lake Hańcza. The raw data collection, supported by global navigation satellite system (GNSS) positioning, was conducted at 79 sampling points. A geostatistical method (kriging) was applied for spatial interpolation. Despite the relatively small area of the lake (3.04 km(2)), its compact shape (shore development index of 2.04) and low horizontal exchange of water (retention time 11.4 years), chemical gradients in the surface waters were found. The largest variation concerns the main biogenic element, phosphorus: the average value was 0.032 mg L(-1), with extreme values of 0.019 to 0.265 mg L(-1) (coefficient of variation 87%). Smaller differences relate to nitrogen compounds (0.452-1.424 mg L(-1), with an average value of 0.583 mg L(-1) and a coefficient of variation of 20%). The parts of the lake fed by tributaries are the richest in phosphorus. The water quality of the oligo-mesotrophic Lake Hańcza has been deteriorating in recent years. Our results indicate that inferences about trends in the evolution of the examined lake's trophic status should be based on an analysis of the data that takes into account the local variation in water chemistry.
2015-08-13
Final Report: Stochastic Analysis and Applied Probability (3.3.1): Topics in the Theory and Applications of Stochastic Analysis. Research partially supported by this grant culminated in the submission of twenty-eight new research papers.
Bayesian geostatistics in health cartography: the perspective of malaria.
Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I
2011-06-01
Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
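The step of converting a sample of candidate maps into a prediction with an appropriate level of precision can be sketched as follows. This toy example uses synthetic "posterior" maps in place of real BG output; the grid, region mask and distribution are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical BG output: 500 posterior sample maps of prevalence on a 20 x 20 grid
samples = rng.beta(2.0, 5.0, size=(500, 20, 20))

# boolean mask for the region whose average prevalence we want to predict
region = np.zeros((20, 20), dtype=bool)
region[5:15, 5:15] = True

# each posterior map yields one regional average -> a predictive distribution
regional = samples[:, region].mean(axis=1)
estimate = regional.mean()                       # point prediction
lo95, hi95 = np.quantile(regional, [0.025, 0.975])  # predictive interval
```

Because every map in the sample contributes one value, the spread of `regional` directly expresses how well the data constrain the regional average.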
A Geostatistical Approach to Indoor Surface Sampling Strategies
DEFF Research Database (Denmark)
Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg
1990-01-01
Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics are most developed. The theory is summarized, and particulate surface contamination, sampled from small areas on a table, has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method.
Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density EEG
Wingeier, Brett M.; Nunez, Paul L.; Silberstein, Richard B.
2000-01-01
We demonstrate an application of spherical harmonic decomposition to analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to analysis of hemispherical, irregularly sampled data. Performance of the methods and spatial sampling requirements are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wavenumber relationship in some bands.
Directory of Open Access Journals (Sweden)
S. Ly
2011-07-01
Spatial interpolation of precipitation data is of great importance for hydrological modelling. Geostatistical methods (kriging) are widely applied for spatial interpolation from point measurements to continuous surfaces. The first step in kriging computation is semivariogram modelling, which usually uses only one variogram model for all the data. The objective of this paper was to develop different algorithms of spatial interpolation for daily rainfall on 1 km² regular grids in the catchment area and to compare the results of geostatistical and deterministic approaches. This study relied on 30 yr of daily rainfall data from 70 raingages in the hilly landscape of the Ourthe and Ambleve catchments in Belgium (2908 km²). This area lies between 35 and 693 m in elevation and consists of river networks which are tributaries of the Meuse River. For the geostatistical algorithms, seven semivariogram models (logarithmic, power, exponential, Gaussian, rational quadratic, spherical and penta-spherical) were fitted to the daily sample semivariogram on a daily basis. These seven variogram models were also adopted to avoid negative interpolated rainfall. The elevation, extracted from a digital elevation model, was incorporated into multivariate geostatistics. Seven validation raingages and cross validation were used to compare the interpolation performance of these algorithms applied to different densities of raingages. We found that, among the seven variogram models used, the Gaussian model was most frequently the best fit. Using seven variogram models can avoid negative daily rainfall in ordinary kriging. Negative kriging estimates were observed more for convective than for stratiform rain. The performance of the different methods varied only slightly with the density of raingages, particularly between 8 and 70 raingages, but was much different for interpolation using 4 raingages. Spatial interpolation with the geostatistical and
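Fitting several candidate semivariogram models to a sample semivariogram and keeping the best one can be sketched as follows. This is a rough illustration with two of the seven models named above, synthetic sample semivariances and a crude grid-search fit; it is not the procedure or parameterisation used in the study:

```python
import numpy as np

def gamma_exponential(h, sill, rng):
    """Exponential semivariogram model."""
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def gamma_gaussian(h, sill, rng):
    """Gaussian semivariogram model."""
    return sill * (1.0 - np.exp(-3.0 * (h / rng) ** 2))

# hypothetical sample semivariogram: lag distances and sample semivariances
lags = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
emp = gamma_gaussian(lags, 2.0, 20.0)   # pretend these came from daily rainfall data

def fit_sse(model, lags, emp):
    """Crude fit: scan the range parameter; the sill enters linearly,
    so it is solved in closed form at each candidate range."""
    best = np.inf
    for r in np.linspace(1.0, 60.0, 300):
        basis = model(lags, 1.0, r)
        sill = (basis @ emp) / (basis @ basis)
        best = min(best, float(np.sum((sill * basis - emp) ** 2)))
    return best

candidates = {"exponential": gamma_exponential, "gaussian": gamma_gaussian}
scores = {name: fit_sse(m, lags, emp) for name, m in candidates.items()}
best_name = min(scores, key=scores.get)
```

Repeating this selection on each day's sample semivariogram is what allows a different model (here, correctly, the Gaussian one) to be the best fit on different days.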
Energy Technology Data Exchange (ETDEWEB)
Cassiraga, E.F.; Gomez-Hernandez, J.J. [Departamento de Ingenieria Hidraulica y Medio Ambiente, Universidad Politecnica de Valencia, Valencia (Spain)
1996-10-01
The main objective of this report is to describe the different geostatistical techniques for making use of geophysical and hydrological parameters. We analyze the characteristics of the estimation methods used in other studies.
Model Proposition for the Fiscal Policies Analysis Applied in Economic Field
Directory of Open Access Journals (Sweden)
Larisa Preda
2007-05-01
This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is the proposal of a new model for fiscal and budgetary choices. This model is applied to the data of the Romanian case.
Research in progress in applied mathematics, numerical analysis, and computer science
1990-01-01
Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.
Illman, Walter A.; Berg, Steven J.; Zhao, Zhanfeng
2015-05-01
The robust performance of hydraulic tomography (HT) based on geostatistics has been demonstrated through numerous synthetic, laboratory, and field studies. While geostatistical inverse methods offer many advantages, one key disadvantage is its highly parameterized nature, which renders it computationally intensive for large-scale problems. Another issue is that geostatistics-based HT may produce overly smooth images of subsurface heterogeneity when there are few monitoring interval data. Therefore, some may question the utility of the geostatistical inversion approach in certain situations and seek alternative approaches. To investigate these issues, we simultaneously calibrated different groundwater models with varying subsurface conceptualizations and parameter resolutions using a laboratory sandbox aquifer. The compared models included: (1) isotropic and anisotropic effective parameter models; (2) a heterogeneous model that faithfully represents the geological features; and (3) a heterogeneous model based on geostatistical inverse modeling. The performance of these models was assessed by quantitatively examining the results from model calibration and validation. Calibration data consisted of steady state drawdown data from eight pumping tests and validation data consisted of data from 16 separate pumping tests not used in the calibration effort. Results revealed that the geostatistical inversion approach performed the best among the approaches compared, although the geological model that faithfully represented stratigraphy came a close second. In addition, when the number of pumping tests available for inverse modeling was small, the geological modeling approach yielded more robust validation results. This suggests that better knowledge of stratigraphy obtained via geophysics or other means may contribute to improved results for HT.
Energy Technology Data Exchange (ETDEWEB)
Kolotilina, L.; Nikishin, A.; Yeremin, A. [and others]
1994-12-31
The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit preconditioners and explicit preconditioners. Implicit preconditioners (e.g. incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations per iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
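The explicit/implicit trade-off can be illustrated with the simplest explicit preconditioner, a Jacobi (diagonal) one, inside conjugate gradients. This is a sketch on a small synthetic SPD system, not the FSAI construction presented by the authors:

```python
import numpy as np

def pcg_jacobi(A, b, tol=1e-10, maxiter=1000):
    """Conjugate gradients with a Jacobi (diagonal) preconditioner.
    Applying M^-1 is an elementwise product, so it parallelizes trivially,
    at the cost of weaker preconditioning quality than implicit methods."""
    minv = 1.0 / np.diag(A)            # M^-1 for M = diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = minv * r
    p = z.copy()
    rz = r @ z
    for it in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, it

# small synthetic SPD system (diagonal dominance plus a rank-one coupling)
n = 50
A = np.diag(np.arange(1.0, n + 1)) + 0.1 * np.ones((n, n))
b = np.ones(n)
x, iters = pcg_jacobi(A, b)
```

FSAI keeps the same apply-by-multiplication structure while approximating the inverse much more closely, which is why it preconditions far better than the diagonal while remaining parallel.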
Geostatistical analysis of GPS trajectory data: Space-time densities
Hengl, T.; van Loon, E.E.; Shamoun-Baranes, J.; Bouten, W.; Zhang, J.; Goodchild, M.F.
2008-01-01
Creation of density maps and estimation of home ranges are problematic for observations of animal movement at irregular intervals. We propose a technique to estimate space-time densities by separately modeling animal movement paths and velocities, both as continuous fields. First the length of traject
Moradkhani, Hamid; Yan, Hongxiang
2016-04-01
Soil moisture simulation and prediction are increasingly used to characterize agricultural droughts but the process suffers from data scarcity and quality. The satellite soil moisture observations could be used to improve model predictions with data assimilation. Remote sensing products, however, are typically discontinuous in spatial-temporal coverages; while simulated soil moisture products are potentially biased due to the errors in forcing data, parameters, and deficiencies of model physics. This study attempts to provide a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a fully distributed hydrologic model, with the use of recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. A geostatistical model is introduced to overcome the satellite soil moisture discontinuity issue where satellite data does not cover the whole study region or is significantly biased, and the dominant land cover is dense vegetation. The results indicate that joint assimilation of soil moisture and streamflow has minimal effect in improving the streamflow prediction, however, the surface soil moisture field is significantly improved. The combination of DA and geostatistical approach can further improve the surface soil moisture prediction.
Efficient Geostatistical Inversion under Transient Flow Conditions in Heterogeneous Porous Media
Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf
2014-05-01
a reasonable range. Transient inversion, however, requires time series of measurements and therefore typically leads to a large number of observations, and under these circumstances the existing methods become infeasible. We present an extension of the existing inversion methods to instationary flow regimes. Our approach uses a Conjugate Gradients scheme preconditioned with the prior covariance matrix Q_YY to avoid both multiplications with Q_YY^(-1) and the explicit assembly of Hz. Instead, one combined adjoint model run is used for all observations at once. As the computing time of our approach is largely independent of the number of measurements used for inversion, the presented method can be applied to large data sets. This facilitates the treatment of applications with variable boundary conditions (nearby rivers, precipitation). We integrate the geostatistical inversion method into the software framework DUNE, enabling the use of high-performance-computing techniques and full parallelization. The feasibility of our approach is demonstrated through the joint inversion of several synthetic data sets in two and three dimensions, e.g. estimation of hydraulic conductivity using hydraulic head values and tracer concentrations, and the scalability of the new method is analyzed. A comparison of the new method with existing geostatistical inversion approaches highlights its advantages and drawbacks and demonstrates scenarios in which our scheme can be beneficial.
The Geostatistical Framework for Spatial Prediction
Institute of Scientific and Technical Information of China (English)
张景雄; 姚娜
2008-01-01
Geostatistics provides a coherent framework for spatial prediction and uncertainty assessment, whereby spatial dependence, as quantified by variograms, is utilized for best linear unbiased estimation of a regionalized variable at unsampled locations. Geostatistics for prediction of continuous regionalized variables is reviewed, with the key methods underlying the derivation of the major variants of univariate kriging described in an easy-to-follow manner. This paper will contribute to the demystification and, hence, popularization of geostatistics in geoinformatics communities.
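The best linear unbiased estimation step can be sketched for a single unsampled location. This is generic ordinary kriging with made-up coordinates and an assumed exponential semivariogram, not code from the paper:

```python
import numpy as np

def exp_semivariogram(h, nugget=0.0, sill=1.0, rng=10.0):
    """Exponential semivariogram model gamma(h) (parameters are made up)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(coords, values, target, gamma):
    """Best linear unbiased prediction at one unsampled location.
    Solves the ordinary kriging system with a Lagrange multiplier so the
    weights sum to one (the unbiasedness constraint)."""
    n = len(values)
    # pairwise semivariances between the data points
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    A[n, n] = 0.0
    # semivariances between the data points and the target location
    b = np.empty(n + 1)
    b[:n] = gamma(np.linalg.norm(coords - target, axis=1))
    b[n] = 1.0
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    estimate = w @ values
    variance = w @ b[:n] + mu          # kriging (estimation) variance
    return estimate, variance

coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
values = np.array([1.0, 2.0, 1.5, 2.5])
est, var = ordinary_kriging(coords, values, np.array([5.0, 5.0]), exp_semivariogram)
```

With no nugget, the predictor is an exact interpolator: at a data location it returns the observed value with zero kriging variance, while between samples the variance quantifies the prediction uncertainty.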
Matson, Johnny L.; Coe, David A.
1992-01-01
This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-01-01
This exploratory study compares and contrasts two types of critical thinking techniques: one is a philosophical analysis technique and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised, to demonstrate their ability to develop the ethical analysis skills of…
Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily
2009-01-01
Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…
Directory of Open Access Journals (Sweden)
Pawłowski Dominik
2014-12-01
Full Text Available Geostatistical methods for 2D and 3D modelling of the spatial variability of selected physicochemical properties of biogenic sediments were applied to a small valley mire in order to identify the processes that lead to the formation of various types of peat. A sequential Gaussian simulation was performed to reproduce the statistical distribution of the input data (pH and organic matter) and their semivariances, as well as to honour the data values, yielding more ‘realistic’ models that show microscale spatial variability, despite the fact that the input sample cores were sparsely distributed in the X-Y space of the study area. The stratigraphy of peat deposits in the Ldzań mire shows a record of long-term evolution of water conditions, which is associated with the variability in water supply over time. Ldzań is a fen (a rheotrophic mire) with a through-flow of groundwater. Additionally, the vicinity of the Grabia River is marked by seasonal inundations of the southwest part of the mire and an increased share of mineral matter in the peat. In turn, the upper peat layers of some of the central part of the Ldzań mire are rather spongy, and these peat-forming phytocoenoses probably formed during permanent waterlogging.
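The sequential Gaussian simulation workflow mentioned above — visit nodes in random order, krige a local mean and variance from the data plus previously simulated values, then draw from that distribution — can be sketched in one dimension. This is a toy illustration under assumed settings (zero mean, unit-sill exponential covariance, simple kriging), not the study's implementation.

```python
import numpy as np

def sgs_1d(grid_x, data_x, data_v, rng_len=5.0, seed=0):
    """Toy sequential Gaussian simulation on a 1-D grid.

    Each grid node is visited in random order; a simple-kriging mean and
    variance are computed from all points known so far, and a value is
    drawn from the resulting Gaussian and added to the conditioning set."""
    rng = np.random.default_rng(seed)
    cov = lambda h: np.exp(-3.0 * np.abs(h) / rng_len)  # exponential covariance
    known_x = list(data_x)
    known_v = list(data_v)
    out = {}
    for x in rng.permutation(grid_x):                   # random visiting order
        kx, kv = np.array(known_x), np.array(known_v)
        C = cov(kx[:, None] - kx[None, :])
        c = cov(kx - x)
        # tiny jitter keeps the system solvable once duplicates accumulate
        w = np.linalg.solve(C + 1e-10 * np.eye(len(kx)), c)
        mean = w @ kv
        var = max(1.0 - w @ c, 0.0)
        val = rng.normal(mean, np.sqrt(var))
        out[x] = val
        known_x.append(x)
        known_v.append(val)
    return np.array([out[x] for x in grid_x])
```

Where a grid node coincides with a conditioning datum, the kriging variance collapses to (almost) zero and the simulation honours the datum, which is the property the abstract emphasizes.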
Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)
Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys
2016-02-01
Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but fail to represent the groundwater flow system, which manifests as an interpolated water table lying above the topography. A methodology is developed in order to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points in a partly urban domain covering 25.6 km2 close to Bordeaux city, France. To select the best method, a geographic information system was used to visualize the surfaces reconstructed with each method. A cross-validation was carried out to evaluate the predictive performance of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).
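The cross-validation used to rank the kriging variants is simple to reproduce in outline: each observation is removed in turn, re-predicted from the remaining ones, and the errors are summarized, e.g. as an RMSE. A minimal sketch, with a nearest-neighbour predictor standing in for the kriging variants compared in the study:

```python
import numpy as np

def loo_rmse(coords, values, predict):
    """Leave-one-out cross-validation: re-predict each datum from the
    others with `predict(train_xy, train_v, target_xy)` and return the
    root-mean-square error."""
    n = len(values)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        est = predict(coords[mask], values[mask], coords[i])
        errs.append(est - values[i])
    return float(np.sqrt(np.mean(np.square(errs))))

def nearest_neighbour(train_xy, train_v, target):
    """Simplest possible spatial predictor, used here only as a
    stand-in for OK / CoK / CoCoK / KED."""
    return float(train_v[np.argmin(np.linalg.norm(train_xy - target, axis=1))])
```

Running `loo_rmse` once per interpolator on the same data set gives directly comparable scores, which is the basis on which KED was selected in the study.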
Study on the spatial pattern of rainfall erosivity based on geostatistics in Hebei Province,China
Institute of Scientific and Technical Information of China (English)
Mingxin MEN; Zhenrong YU; Hao XU
2008-01-01
The objective of this article was to study the spatial distribution pattern of rainfall erosivity. The precipitation data from each climatological station in Hebei Province, China were collected, analyzed and modeled with SPSS and ArcGIS. A simple model for estimating rainfall erosivity was developed based on the weather station data, and the annual average rainfall erosivity was calculated with this model. The prediction errors, statistical feature values and prediction maps obtained using different interpolation methods were compared. The results indicated that the second-order ordinary kriging method performed better than both the zero- and first-order ordinary kriging methods. Within the second-order trend method, the Gaussian semivariogram model performed better than the spherical or exponential models. Applying geostatistics to study the spatial pattern of rainfall erosivity will help to evaluate soil erosion risk accurately and quantitatively. Our research also provides digital maps that can assist policy making in regional soil and water conservation planning and management strategies.
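The three semivariogram families compared in the study differ mainly in their behavior near the origin, which is why the Gaussian model can suit smoothly varying fields such as rainfall erosivity. A sketch of the three models, using the common "practical range" convention for the exponential and Gaussian forms (an assumption, since the paper does not give its parameterization):

```python
import numpy as np

def spherical(h, sill=1.0, rng=10.0):
    """Reaches the sill exactly at the range."""
    h = np.asarray(h, dtype=float)
    return np.where(h >= rng, sill, sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3))

def exponential(h, sill=1.0, rng=10.0):
    """Approaches the sill asymptotically; rng is the practical range
    (about 95 % of the sill)."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / rng))

def gaussian(h, sill=1.0, rng=10.0):
    """Parabolic near the origin -- the smoothest of the three."""
    return sill * (1.0 - np.exp(-3.0 * (np.asarray(h, dtype=float) / rng) ** 2))
```

Near the origin the Gaussian curve rises most slowly and the exponential most steeply, so a field whose empirical semivariogram is flat at short lags (a very smooth field) is fitted best by the Gaussian model, consistent with the result reported above.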
A geostatistical approach to data harmonization - Application to radioactivity exposure data
Baume, O.; Skøien, J. O.; Heuvelink, G. B. M.; Pebesma, E. J.; Melles, S. J.
2011-06-01
Environmental issues such as air, groundwater pollution and climate change are frequently studied at spatial scales that cross boundaries between political and administrative regions. It is common for different administrations to employ different data collection methods. If these differences are not taken into account in spatial interpolation procedures then biases may appear and cause unrealistic results. The resulting maps may show misleading patterns and lead to wrong interpretations. Also, errors will propagate when these maps are used as input to environmental process models. In this paper we present and apply a geostatistical model that generalizes the universal kriging model such that it can handle heterogeneous data sources. The associated best linear unbiased estimation and prediction (BLUE and BLUP) equations are presented and it is shown that these lead to harmonized maps from which estimated biases are removed. The methodology is illustrated with an example of country bias removal in a radioactivity exposure assessment for four European countries. The application also addresses multicollinearity problems in data harmonization, which arise when both artificial bias factors and natural drifts are present and cannot easily be distinguished. Solutions for handling multicollinearity are suggested and directions for further investigations proposed.
Directory of Open Access Journals (Sweden)
Rina Agustina
2013-05-01
Full Text Available E-learning is easily found by browsing the internet, mostly free of charge and providing various learning materials. Spellingcity.com was one of the e-learning websites for teaching and learning English spelling, vocabulary and writing, offering various games and activities for young learners, 6- to 8-year-old learners in particular. Having considered those constraints, this paper aimed to analyse the website from two different views: (1) critical applied linguistics (CAL) and (2) critical discourse analysis (CDA). After analysing the website using CAL and CDA, it was found that the website was adequate for beginners, in that it provided fun learning through games and challenged learners to test their vocabulary. Despite these strengths, there were several issues requiring further thought in terms of learners’ broader knowledge; for example, some of the learning materials focused on states in America, which would be quite difficult for EFL learners without adequate general knowledge. Thus, the findings implied that the website could be used as supporting learning material, accompanying textbooks and vocabulary exercise books.
Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich
2015-01-01
Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded wi
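Recurrence-plot construction of the kind applied to the electrograms reduces, in its simplest scalar form, to thresholding pairwise distances between samples; the recurrence rate is then the most basic RQA measure. A minimal sketch (the threshold `eps` is a free parameter here, not a value from the study):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when samples i and j are
    closer than eps (scalar signal, absolute-difference metric)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points -- the most basic RQA measure."""
    return R.sum() / R.size
```

A perfectly repetitive signal yields a recurrence rate of 1, while a monotonically drifting signal recurs only on the diagonal; intermediate values quantify how organized the dynamics are, which is the idea behind grading AF complexity.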
High-resolution geostatistical petrophysical-parameter inversion
Institute of Scientific and Technical Information of China (English)
姜文龙; 杨锴
2012-01-01
Geostatistical inversion can characterize thin layers well owing to its high resolution. We discuss the relationship between geostatistical inversion and high resolution, as well as the problems geostatistical inversion faces in petrophysical-parameter inversion, and we study algorithms for reducing the uncertainty of the inversion. The results show that as the variogram range changes, the resolution of the geostatistical inversion result changes as well; however, the conventional kriging algorithm destroys the continuity of the original geological formations when the resolution is improved by reducing the range. On this basis, we introduce constraints such as geologically interpreted horizons and stratal dip into the geostatistical inversion. The method was applied to the inversion of carbonate mineral components in the ODP Site 1144 area of the South China Sea, and it clearly revealed the depositional characteristics of the carbonate minerals in this area.
Caeiro, Sandra; Goovaerts, Pierre; Painho, Marco; Costa, M Helena
2003-09-15
The Sado Estuary is a coastal zone in the south of Portugal where conflicts between conservation and development exist because of its location near industrialized urban zones and its designation as a natural reserve. The aim of this paper is to evaluate a set of multivariate geostatistical approaches for delineating spatially contiguous regions of sediment structure for the Sado Estuary. These areas will be the supporting infrastructure of an environmental management system for this estuary. The boundaries of each homogeneous area were derived from three sediment characterization attributes through three different approaches: (1) cluster analysis of a dissimilarity matrix that is a function of geographical separation, followed by indicator kriging of the cluster data; (2) discriminant analysis of kriged values of the three sediment attributes; and (3) a combination of methods 1 and 2. The final maximum likelihood classification was integrated into a geographical information system. All methods generated fairly contiguous management areas that reproduce the environment of the estuary well. Map comparison techniques based on kappa statistics showed that the resulting three maps are similar, supporting the choice of any of the methods as appropriate for management of the Sado Estuary. However, the results of method 1 seem to be in better agreement with estuary behavior, assessment of contamination sources, and previous work conducted at this site.
He, Xingdong; Gao, Yubao; Zhao, Wenzhi; Cong, Zili
2004-09-01
The results of the present study showed that plant communities exhibited typical concentric-circle distribution patterns along the habitat gradient from dune top and slope to interdune on a few large fixed dunes in the middle part of the Korqin Sandy Land. In order to explain this phenomenon, the water content of the sand layers and its spatial heterogeneity at different dune locations were analyzed. In these dunes, water contents in the sand layers of the tops were lower than those of the slopes, and both were lower than those of the interdunes. According to the geostatistical analysis, on both shifting and fixed dunes the spatial heterogeneity of water content changed regularly from the top and the slope to the interdune: the nugget-to-sill ratios and the ranges decreased gradually, while the fractal dimension increased gradually. These regular changes indicate that random spatial heterogeneity decreased gradually while autocorrelated spatial heterogeneity increased gradually. The regular changes in sand-layer water content and its spatial heterogeneity at different dune locations may thus be an important cause of the formation of the concentric-circle patterns of the plant communities on these fixed dunes.
Institute of Scientific and Technical Information of China (English)
刘少军; 黄彦彬; 张京红; 李天富; 陈汇林; 陈德明
2006-01-01
Although vegetation indices extracted from remote sensing imagery reflect crop conditions well in space, they cannot by themselves predict the spatial range over which a vegetation index varies. If the mean vegetation index of each city and county in each season is known, the overall vegetation state of the region can be analyzed quantitatively and the vegetation index can be predicted over large areas. Using the ArcGIS Geostatistical Analyst module, which combines a geographic information system (GIS) with geostatistics, the seasonal trend of the normalized difference vegetation index (NDVI) over Hainan Island was analyzed by kriging interpolation of the quarterly mean NDVI of each city and county extracted from MODIS remote sensing data, and the predictions were compared with actual sampled values. The results show that the kriging interpolation in ArcGIS Geostatistical Analyst predicts the spatial distribution of the vegetation index well.
August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching
Joyce, Bonnie; Moxley, Roy A.
1988-01-01
August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the...
Sokolowsky, Martina; Fischer, Ulrich
2012-06-30
Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis of bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is static measurement amongst other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from the bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance differentiated the wines significantly with regard to all measured bitterness parameters obtained from the three sensory techniques. Comparing the information of all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.
Geostatistical interpolation for modelling SPT data in northern Izmir
Indian Academy of Sciences (India)
Selim Altun; A Burak Göktepe; Alper Sezer
2013-12-01
In this study, it was aimed to map the corrected Standard Penetration Test(SPT) values in Karşıyaka city center by kriging approach. Six maps were prepared by this geostatistical approach at depths of 3, 6, 9, 13.5, 18 and 25.5m. Borehole test results obtained from 388 boreholes in central Karşıyaka were used to model the spatial variation of $(\\text{N}_1)_{\\text{60cs}}$ values in an area of 5.5 km2. Corrections were made for depth, hammer energy, rod length, sampler, borehole diameter and fines content, to the data in hand. At various depths, prepared variograms and the kriging method were used together to model the variation of corrected SPT data in the region, which enabled the estimation of missing data in the region. The results revealed that the estimation ability of the models were acceptable, which were validated by a number of parameters as well as the comparisons of the actual and estimated data. Outcomes of this study can be used in microzonation studies, site response analyses, calculation of bearing capacity of subsoils in the region and producing a number of parameters which are empirically related to corrected SPT number as well.
A Classification for a Geostatistical Index of Spatial Dependence
Directory of Open Access Journals (Sweden)
Enio Júnior Seidel
Full Text Available ABSTRACT: In geostatistical studies, spatial dependence can generally be described by means of the semivariogram or, in complementary form, with a single index followed by its categorization to classify the degree of such dependence. The objective of this study was to construct a categorization for the spatial dependence index (SDI) proposed by Seidel and Oliveira (2014) in order to classify spatial variability in terms of weak, moderate, and strong dependence. Theoretical values were constructed for different degrees of spatial dependence, which served as a basis for calculation of the SDI. In view of the form of the distribution and the SDI descriptive measures, we developed a categorization for posterior classification of spatial dependence, specific to each semivariogram model. The SDI categorization was based on its median and 3rd quartile, allowing us to classify spatial dependence as weak, moderate, or strong. We established that for the spherical semivariogram: SDISpherical(%) ≤ 7 % (weak spatial dependence), 7 % < SDISpherical(%) ≤ 15 % (moderate spatial dependence), and SDISpherical(%) > 15 % (strong spatial dependence); for the exponential semivariogram: SDIExponential(%) ≤ 6 % (weak), 6 % < SDIExponential(%) ≤ 13 % (moderate), and SDIExponential(%) > 13 % (strong); and for the Gaussian semivariogram: SDIGaussian(%) ≤ 9 % (weak), 9 % < SDIGaussian(%) ≤ 20 % (moderate), and SDIGaussian(%) > 20 % (strong). The proposed categorization allows the user to transform the numerical values calculated for the SDI into categories of variability of spatial dependence, with adequate power for explanation and comparison.
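The proposed categorization can be expressed directly as a lookup. The thresholds below are the ones stated in the abstract (7/15 %, 6/13 %, 9/20 %); the open/closed placement of the class boundaries is an assumption of this sketch:

```python
def classify_sdi(sdi_percent, model):
    """Map an SDI value (in percent) to the weak/moderate/strong classes
    of the Seidel-Oliveira categorization, per semivariogram model."""
    thresholds = {
        "spherical": (7.0, 15.0),
        "exponential": (6.0, 13.0),
        "gaussian": (9.0, 20.0),
    }
    weak_max, moderate_max = thresholds[model]
    if sdi_percent <= weak_max:
        return "weak"
    if sdi_percent <= moderate_max:
        return "moderate"
    return "strong"
```

Note that the same numerical SDI can fall into different classes under different models (e.g. 8 % is "moderate" for the spherical model but "weak" for the Gaussian), which is why the categorization is model-specific.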
Determination of homogeneous zones for liming recommendations of black pepper using geostatistics
Directory of Open Access Journals (Sweden)
Ivoney Gontijo
Full Text Available ABSTRACT Studies aimed at determining homogeneous zones and the spatial variability of soil characteristics may improve the efficiency of agricultural input applications. The purpose of this study was to determine homogeneous zones for liming applications and to characterize the spatial variability of characteristics related to soil acidity and productivity in an Oxisol cultivated with black pepper (Piper nigrum L.). The study was carried out in São Mateus, state of Espírito Santo, Brazil. The experimental site was 100 × 120 m. A grid with 126 sampling points was established. Three soil sub-samples were collected at each sampling point in the black pepper canopy areas, at a depth of 0-0.20 m. Crop productivity was estimated by harvesting the three plants neighboring each sampling point. Descriptive statistics and geostatistical analyses were performed. Homogeneous management zones were defined based on the map of liming needs. The mathematical models fitted to the semivariograms indicated that all of the studied variables exhibited spatial dependency. An analysis of the spatial variability together with the definition of homogeneous zones can be used to increase the efficiency of soil liming.
Using Predictions Based on Geostatistics to Monitor Trends in Aspergillus flavus Strain Composition.
Orum, T V; Bigelow, D M; Cotty, P J; Nelson, M R
1999-09-01
ABSTRACT Aspergillus flavus is a soil-inhabiting fungus that frequently produces aflatoxins, potent carcinogens, in cottonseed and other seed crops. A. flavus S strain isolates, characterized on the basis of sclerotial morphology, are highly toxigenic. Spatial and temporal characteristics of the percentage of the A. flavus isolates that are S strain (S strain incidence) were used to predict patterns across areas of more than 30 km2. Spatial autocorrelation in S strain incidence in Yuma County, AZ, was shown to extend beyond field boundaries to adjacent fields. Variograms revealed both short-range (2 to 6 km) and long-range (20 to 30 km) spatial structure in S strain incidence. S strain incidence at 36 locations sampled in July 1997 was predicted with a high correlation between expected and observed values (R = 0.85, P = 0.0001) by kriging data from July 1995 and July 1996. S strain incidence at locations sampled in October 1997 and March 1998 was markedly less than predicted by kriging data from the same months in prior years. Temporal analysis of four locations repeatedly sampled from April 1995 through July 1998 also indicated a major reduction in S strain incidence in the Texas Hill area after July 1997. Surface maps generated by kriging point data indicated a similarity in the spatial pattern of S strain incidence among all sampling dates despite temporal changes in the overall S strain incidence. Geostatistics provided useful descriptions of variability in S strain incidence over space and time.
Usage of multivariate geostatistics in interpolation processes for meteorological precipitation maps
Gundogdu, Ismail Bulent
2017-01-01
Long-term meteorological data are very important both for the evaluation of meteorological events and for the analysis of their effects on the environment. Prediction maps constructed by different interpolation techniques often provide explanatory information. Conventional techniques, such as surface spline fitting, global and local polynomial models, and inverse distance weighting, may not be adequate. Multivariate geostatistical methods can be more effective, especially when secondary variables are available, because secondary variables can directly affect the precision of prediction. In this study, the mean annual and mean monthly precipitation from 1984 to 2014 for 268 meteorological stations in Turkey was used to construct country-wide maps. Besides linear regression, inverse-square-distance weighting and ordinary co-kriging (OCK) were used and compared with each other. Elevation, slope, and aspect data for each station were also taken into account as secondary variables, whose use reduced errors by up to a factor of three. OCK gave the smallest errors (1.002 cm) when aspect was included.
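Of the interpolators compared, inverse-distance weighting is the simplest to sketch. This illustration is not the study's implementation; the power parameter defaults to 2 to match the inverse-square scheme mentioned above:

```python
import numpy as np

def inverse_distance(coords, values, target, power=2.0):
    """Inverse-distance weighted estimate at `target`; power=2 gives the
    inverse-square-distance scheme."""
    d = np.linalg.norm(np.asarray(coords, dtype=float) - target, axis=1)
    if np.any(d == 0.0):                  # exact at a sampled station
        return float(values[int(np.argmin(d))])
    w = d ** -power
    return float(w @ values / w.sum())
```

Because the weights are positive and normalized, the estimate is always a convex combination of the station values; unlike co-kriging, however, the scheme has no way to exploit secondary variables such as elevation or aspect, which is the limitation the study addresses with OCK.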
Radon risk mapping in southern Belgium: an application of geostatistical and GIS techniques.
Zh, H C; Charlet, J M; Poffijn, A
2001-05-14
A data set of long-term radon measurements in approximately 2200 houses in southern Belgium has been collected in an on-going national radon survey. The spatial variation of indoor Rn concentrations is modelled by variograms. A radon distribution map is produced using the log-normal kriging technique. A GIS is used to digitise, process and integrate a variety of data, including geological maps, Rn concentrations associated with house locations, an administrative map, etc. It also allows evaluation of the relationships between various spatial data sets, with the goal of producing radon risk maps. Based on geostatistical mapping and spatial analysis, we define three categories of risk areas: high-risk, medium-risk and low-risk areas. The correlation between radon concentrations and geological features is demonstrated in this study. High and medium Rn risk zones are predominantly situated in bedrock from the Cambrian to the Lower Devonian, although a few medium-risk zones lie within the Jurassic. It is evident that high-risk zones are related to a strongly folded and fractured context.
Directory of Open Access Journals (Sweden)
Marcin Kiedrzyński
2014-07-01
Full Text Available Attempts to study biodiversity hotspots on a regional scale should combine compositional and functionalist criteria. The detection of hotspots in this study uses one ecologically similar group of high-conservation-value species as hotspot indicators, as well as focal habitat indicators, to detect the distribution of suitable environmental conditions. The method is assessed with reference to thermophilous forests in Poland – key habitats for many rare and relict species. Twenty-six high-conservation-priority species were used as hotspot indicators, and ten plant taxa characteristic of the Quercetalia pubescenti-petraeae phytosociological order were used as focal habitat indicators. Species distribution data were based on a 10 × 10 km grid. The number of species per grid square was interpolated by the ordinary kriging geostatistical method. Our analysis largely determined the distribution of areas with concentrations of thermophilous forest flora, and also revealed regional disjunctions and geographical barriers. Indicator species richness can be interpreted as a reflection of the actual state of habitat conditions. It can also be used to determine the location of potential species refugia and possible past and future migration routes.
Campos Campos, Ana Jenssie
2011-01-01
The following article is a synthesis of a research project analyzing the institutional evaluation approach applied to the educational management of a private school in the San José Norte Educational Region. The objectives of the analysis lay in identifying the institutional evaluation approach from the characteristics of the three approaches proposed, as well as in determining the dimensions of the approach used, and a third objective was determining the staff perceptio...
Institute of Scientific and Technical Information of China (English)
成伟伟
2016-01-01
As an emerging, multidisciplinary type of institution, the university of applied technology is characterized by its focus on applied technology, and its technical graduates have drawn great attention from society. However, how the talent-cultivation objective is positioned is critical to redressing the mismatch between the talent supply of universities and the talent demand of society. This article analyzes the problems local institutes face in positioning talent-cultivation objectives during the transition from academic universities to universities of applied technology, and clarifies the characteristics of talent-cultivation objective positioning in developed Western countries. On this basis, it proposes how domestic universities of applied technology can position and reconstitute their talent-cultivation objectives.
August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.
Joyce, B; Moxley, R A
1988-01-01
August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.
Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup
2016-02-01
Experiments were conducted to quantify the applied dose for quarantine control of irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by the DNA Comet Assay, and the observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA % of the comets were used for the interpretation of the comets. Irradiated citrus fruits showed tails increasingly separated from the head of the comet as the applied dose increased from 0.1 to 1.5 kGy. The mean tail length and mean tail moment of irradiated citrus fruits at all doses are significantly different. The DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it was possible to estimate applied doses as low as 0.1 kGy when the assay is combined with image analysis.
Simulation of Solute Flow and Transport in a Geostatistically Generated Fractured Porous System
Assteerawatt, A.; Helmig, R.; Haegland, H.; Bárdossy, A.
2007-12-01
Fractured aquifer systems have provided important natural resources such as petroleum, gas, water and geothermal energy and have recently also been under investigation for their suitability as storage sites for high-level nuclear waste. Resource exploitation and potential utilization have led to extensive studies aimed at understanding, characterizing and finally predicting the behavior of fractured aquifer systems. By applying a discrete model approach to study flow and transport processes, fractures are represented discretely and the effect of individual fractures can be explicitly investigated. The critical step for the discrete model is the generation of a representative fracture network, since the development of flow paths within a fractured system strongly depends on its structure. The geostatistical fracture generation (GFG) developed in this study aims to create a representative fracture network that combines the spatial structure and connectivity of a fracture network with the statistical distribution of fracture geometries. The spatial characteristics are derived from indicator fields, which are evaluated from fracture trace maps. A global optimization method, simulated annealing, is utilized as the generation technique, and the spatial characteristics are formulated into its objective function. We apply the GFG to a case study at a Pliezhausen field block, a sandstone of high fracture density. The fracture networks generated by the GFG are compared with statistically generated fracture networks in terms of structure and hydraulic behavior. As the GFG is based on a stochastic concept, several realizations of the same description can be generated; hence, the overall behavior of the fracture-matrix system has to be investigated over various realizations, which leads to a problem of computational demand. In order to overcome this problem, a streamline method for solute transport in a fractured porous system is presented. The results obtained
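Simulated annealing of the kind used for the GFG's objective function can be illustrated generically: propose a perturbed state, always accept improvements, and accept deteriorations with probability exp(-Δ/T) under a gradually cooled temperature T. The sketch below minimizes a stand-in objective; the geometric cooling schedule and the proposal mechanism are arbitrary choices for the illustration, not those of the study:

```python
import numpy as np

def simulated_annealing(objective, x0, perturb, n_iter=5000, t0=1.0,
                        cooling=0.999, seed=0):
    """Generic simulated annealing.

    Accepts a worse state with probability exp(-delta/T), which lets the
    search escape local minima early on; T shrinks geometrically so the
    walk settles into a (near-)optimal state."""
    rng = np.random.default_rng(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_iter):
        y = perturb(x, rng)
        fy = objective(y)
        # Metropolis acceptance rule
        if fy < fx or rng.random() < np.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest
```

In the GFG setting, the "state" would be a candidate fracture network and the objective a misfit between its indicator statistics and those observed on the trace maps; here a simple quadratic stands in for that objective.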
Directory of Open Access Journals (Sweden)
Dempsey Mary E
2009-06-01
areas led to a significant decrease (~44%) in the number of times the larviciding threshold was reached. This reduction, in turn, resulted in a significant decrease (~74%) in the number of larvicide applications in the treatment areas post-project. The remaining larval habitat in the treatment areas had a different geographic distribution and was largely confined to the restored marsh surface (i.e., filled-in mosquito ditches); however, only ~21% of the restored marsh surface supported mosquito production. Conclusion The geostatistical analysis showed that OMWM demonstrated considerable potential for effective mosquito control and compatibility with other natural resource management goals such as restoration, wildlife habitat enhancement, and invasive species abatement. GPS and GIS tools are invaluable for large-scale project design, data collection, and data analysis, with geostatistical methods serving as an alternative or a supplement to conventional inference statistics in evaluating the project outcome.
Bayesian geostatistical modeling of Malaria Indicator Survey data in Angola.
Directory of Open Access Journals (Sweden)
Laura Gosoniu
Full Text Available The 2006-2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was better able to capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person was 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities.
Unsupervised classification of multivariate geostatistical data: Two algorithms
Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques
2015-12-01
With the increasing development of remote sensing platforms and the evolution of sampling facilities in the mining and oil industries, spatial datasets are becoming increasingly large, describe a growing number of variables, and cover ever wider areas. It is therefore often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into subdomains that are homogeneous with respect to the values taken by the variables at hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to moderate sample sizes and a small number of variables. In this work, we propose two algorithms that adapt classical algorithms to multivariate geostatistical data. Both algorithms are model-free and can handle large volumes of multivariate, irregularly spaced data. The first proceeds by agglomerative hierarchical clustering, with spatial coherence ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in the coordinate space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performance of both algorithms is assessed on toy examples and a mining dataset.
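The proximity-constrained merge rule described above can be sketched in a few lines. The toy data, the k-nearest-neighbour graph, and the merge criterion (absolute difference of cluster means) below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Toy data: two spatially separated zones with different attribute values.
g = np.linspace(0.0, 1.0, 5)
gx, gy = np.meshgrid(g, g)
zone = np.column_stack([gx.ravel(), gy.ravel()])   # 25 points in a unit square
coords = np.vstack([zone, zone + [4.0, 0.0]])      # second zone far to the east
values = np.concatenate([np.full(25, 0.0), np.full(25, 5.0)])

def spatial_agglomerative(coords, values, n_clusters, n_neighbors=4):
    """Greedy agglomerative clustering in which two clusters may merge only
    if they are linked by an edge of a k-nearest-neighbour graph built in
    the coordinate space (the 'proximity condition')."""
    n = len(values)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    np.fill_diagonal(d, np.inf)
    edges = {(min(i, j), max(i, j))
             for i in range(n) for j in np.argsort(d[i])[:n_neighbors]}
    labels = np.arange(n)
    while len(set(labels)) > n_clusters:
        means = {c: values[labels == c].mean() for c in set(labels)}
        best, best_cost = None, np.inf
        for i, j in edges:
            ci, cj = labels[i], labels[j]
            if ci != cj and abs(means[ci] - means[cj]) < best_cost:
                best, best_cost = (ci, cj), abs(means[ci] - means[cj])
        if best is None:   # no spatially adjacent clusters left to merge
            break
        labels[labels == best[1]] = best[0]
    return labels

labels = spatial_agglomerative(coords, values, n_clusters=2)
```

Because the two zones share no graph edges, the proximity condition alone keeps them in separate clusters, which is exactly the spatial coherence the abstract describes.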
Directory of Open Access Journals (Sweden)
Patrícia Alexandra Gregório Ramos
2013-07-01
Geostatistics has been successfully used to analyse and characterize the spatial variability of environmental properties. Besides providing estimated values at unsampled locations, geostatistics measures the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. This work uses universal block kriging to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign. The aim is to distinguish the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents, which are valuable in assessing environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume's dilution are rare, these studies may be very helpful in the future for validating dispersion models.
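As a concrete illustration of the kriging machinery such studies rely on, a minimal ordinary kriging solver is sketched below (numpy only; the exponential variogram and its parameters are assumptions for the example, not the variogram fitted in the paper, and ordinary rather than universal block kriging is shown for brevity):

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=5.0):
    # Exponential semivariogram with effective range ~rng (assumed model).
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, z, x0):
    """Ordinary kriging estimate at x0: solve the kriging system with a
    Lagrange multiplier so that the weights sum to one."""
    n = len(z)
    h = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(h)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w[:n] @ b[:n] + w[n]   # kriging variance
    return estimate, variance, w[:n]

rng_ = np.random.default_rng(0)
xy = rng_.uniform(0.0, 10.0, (15, 2))      # synthetic sample locations
z = np.sin(xy[:, 0]) + 0.1 * xy[:, 1]      # synthetic salinity-like values
est, var, w = ordinary_kriging(xy, z, xy[0])   # predict at a sampled point
```

At a sampled location the estimate reproduces the datum exactly and the kriging variance collapses to zero, which is the "measures the accuracy of the estimate" property the abstract highlights.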
Use of geostatistics for remediation planning to transcend urban political boundaries.
Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A
2012-11-01
Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where contamination by both lead and arsenic is present. Past clean-up efforts estimated contamination levels from point samples, but remediation sites were defined by parcel and agency jurisdiction boundaries rather than by geostatistical models of the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible.
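A minimal inverse-distance-weighting estimator, one of the three methods compared in the study, can be written in a few lines; the data points below are made up for illustration:

```python
import numpy as np

def idw(xy, z, x0, power=2.0):
    """Inverse-distance-weighted estimate at x0; returns the sample value
    exactly when x0 coincides with a data point."""
    d = np.linalg.norm(xy - x0, axis=1)
    if d.min() < 1e-12:
        return z[d.argmin()]
    w = d ** -power
    return (w @ z) / w.sum()

# Four hypothetical soil samples at the corners of a unit square.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
center = idw(xy, z, np.array([0.5, 0.5]))   # equidistant -> plain average
```

Unlike kriging, IDW attaches no variance to its estimate, which is one reason the study favours geostatistical surfaces for defensible remediation boundaries.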
Evaluation of spatial variability of metal bioavailability in soils using geostatistics
DEFF Research Database (Denmark)
Owsianiak, Mikolaj; Hauschild, Michael Zwicky; Rosenbaum, Ralph K.
2012-01-01
for different soils. Here, variography is employed to analyse spatial variability of bioavailability factors (BFs) of metals at the global scale. First, published empirical regressions are employed to calculate BFs of metals for 7180 topsoil profiles. Next, geostatistical interpretation of calculated BFs...... is performed using ArcGIS Geostatistical Analyst. Results show that BFs of copper span a range of 6 orders of magnitude and have significant spatial variability at local and continental scales. The model nugget variance is significantly higher than zero, suggesting the presence of spatial variability...... at lags smaller than those in the data set. Geostatistical analyses indicate, however, that BFs exhibit no significant spatial correlation at a range beyond 3200 km. Because BF is spatially correlated, its values at unsampled locations can be predicted, as demonstrated using the ordinary kriging method......
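The variography step, computing an experimental semivariogram from point data, can be sketched as follows; 1-D coordinates and a linear trend are used purely so the expected semivariances can be checked by hand:

```python
import numpy as np

def experimental_variogram(x, z, lags, tol=0.5):
    """Experimental semivariogram: half the mean squared difference over all
    point pairs whose separation falls within tol of each lag."""
    d = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        pair = np.triu((d > h - tol) & (d <= h + tol))  # count each pair once
        gamma.append(sq[pair].mean())
    return np.array(gamma)

x = np.arange(10.0)
z = x.copy()   # pure linear trend: gamma(h) = h**2 / 2 at integer lags
gamma = experimental_variogram(x, z, lags=[1, 2, 3])
```

A nugget effect, like the one reported above, would show up as a positive intercept when such a curve is extrapolated to lag zero.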
TiConverter: A training image converting tool for multiple-point geostatistics
Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael
2016-11-01
TiConverter is a tool developed to ease the application of multiple-point geostatistics, whether with the open-source Stanford Geostatistical Modeling Software (SGeMS) or other available commercial software. TiConverter has a user-friendly interface and allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing: ASCII (.txt), geostatistical software library (GSLIB) (.txt), Isatis (.dat), and VTK. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools, including image resizing, smoothing, and segmenting. The purpose of this study is to introduce TiConverter and to demonstrate its application and advantages with several examples from the literature.
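The core conversion step such a tool performs, mapping each RGB colour in a training image to an integer facies code, can be sketched as below; this is a guess at the idea, not TiConverter's actual code:

```python
import numpy as np

def rgb_to_codes(img):
    """Map each distinct RGB colour in a training image to an integer facies
    code. Codes follow the lexicographic order of the colours (np.unique)."""
    flat = img.reshape(-1, img.shape[-1])
    colours, inverse = np.unique(flat, axis=0, return_inverse=True)
    return inverse.reshape(img.shape[:2]), colours

# 2x2 training image with two facies colours: white and black.
img = np.array([[[255, 255, 255], [0, 0, 0]],
                [[0, 0, 0], [255, 255, 255]]], dtype=np.uint8)
codes, palette = rgb_to_codes(img)
```

Writing `codes` out row by row would then give the GSLIB-style numerical grid a multiple-point simulator expects.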
DEFF Research Database (Denmark)
Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John
The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, ii) measurement uncertainty, and iii) uncertain source zone and transport parameters. The method generates multiple equally likely realisations of the spatial flow and concentration distribution, which all honour the measured data at a multilevel control plane. The flow realisations are generated by co-simulating the hydraulic conductivity...... The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion, compared to existing methods that are either too simple or computationally demanding. It may therefore be of practical relevance to practitioners......
Energy Technology Data Exchange (ETDEWEB)
Souza, Paulo Garcia de [Invensys Brasil Ltda., Sao Paulo, SP (Brazil)
2009-11-01
Stress wave analysis is the technology of analysing data (stress profiles - ultrasound spectra) collected by high-frequency acoustic sensors. Monitoring and analysis of rotating equipment is a crucial element in predictive maintenance and condition-based maintenance (CBM) projects and, in a broader context, of performance management and optimization of assets. This article discusses the application of stress wave analysis to rotating machines in the context of asset optimization and CBM. (author)
Institute of Scientific and Technical Information of China (English)
Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG
2006-01-01
Landslides are a kind of geologic hazard that happens all over the world and brings huge losses to human life and property; it is therefore very important to research them. This study focused on combining the analysis of single and regional landslides, and on combining traditional slope stability analysis with reliability analysis methods. Methods for slope prediction and reliability analysis were also discussed.
Energy Technology Data Exchange (ETDEWEB)
Kolski, Jeffrey S. [Los Alamos National Laboratory; Macek, Robert J. [Los Alamos National Laboratory; McCrady, Rodney C. [Los Alamos National Laboratory; Pang, Xiaoying [Los Alamos National Laboratory
2012-05-14
Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
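A toy version of the blind-source-separation idea can be demonstrated with a minimal FastICA-style fixed-point iteration (cube nonlinearity). The two synthetic sources and the mixing matrix below are invented for the demo and are unrelated to the beam position data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2000)
s1 = np.sin(2 * np.pi * 5 * t)            # smooth oscillation (toy source)
s2 = np.sign(np.sin(2 * np.pi * 3 * t))   # square wave (toy source)
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.5, 1.0]])    # 'unknown' mixing matrix
X = A @ S                                  # what the sensors observe

def fastica(X, n_iter=200):
    """Symmetric FastICA with the cube nonlinearity on whitened data."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X   # whitening
    W = np.linalg.qr(rng.normal(size=(2, 2)))[0]
    for _ in range(n_iter):
        WZ = W @ Z
        W_new = (WZ ** 3) @ Z.T / Z.shape[1] - 3.0 * W
        U, _, Vt = np.linalg.svd(W_new)      # symmetric decorrelation
        W = U @ Vt
    return W @ Z

S_hat = fastica(X)
C = np.corrcoef(np.vstack([S_hat, S]))[:2, 2:]   # recovered vs. true sources
```

Up to the usual sign and permutation ambiguity, each recovered component should correlate strongly with exactly one true source, which is the separation property the abstract exploits for betatron and longitudinal signals.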
DEFF Research Database (Denmark)
Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus
2012-01-01
We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear...... into account during the inversion. The suggested inversion strategy is tested on synthetic tomographic crosshole ground-penetrating radar full-waveform data using multiple-point-based a priori information. This is, to our knowledge, the first example of obtaining a posteriori realizations of a full-waveform inverse problem. Benefits of the proposed methodology compared with deterministic inversion approaches include: (1) The a posteriori model variability reflects the states of information provided by the data uncertainties and a priori information, which provides a means of obtaining resolution analysis. (2......
DEFF Research Database (Denmark)
He, Xiulan
parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... was analyzed using both a traditional two-point geostatistical approach and multiple-point geostatistics (MPS). Our results documented that model structure is as important as model parameters regarding groundwater modeling uncertainty. Under certain circumstances the inaccuracy of the model structure can......
Methodology and applications in non-linear model-based geostatistics
DEFF Research Database (Denmark)
Christensen, Ole Fredslund
Today geostatistics is used in a number of research areas, among others agricultural and environmental sciences. This thesis concerns data and applications where the classical Gaussian spatial model is not appropriate. A transformation could be used in an attempt to obtain data that are approximat......
DEFF Research Database (Denmark)
Troldborg, Mads; Nowak, Wolfgang; Lange, Ida Vedel
2012-01-01
Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty...... Box-Cox transformed concentration data are used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical......
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
VON NEUMANN STABILITY ANALYSIS OF SYMPLECTIC INTEGRATORS APPLIED TO HAMILTONIAN PDEs
Institute of Scientific and Technical Information of China (English)
Helen M. Regan
2002-01-01
Symplectic integration of separable Hamiltonian ordinary and partial differential equations is discussed. A von Neumann analysis is performed to achieve general linear stability criteria for symplectic methods applied to a restricted class of Hamiltonian PDEs. In this treatment, the symplectic step is performed prior to the spatial step, as opposed to the standard approach of spatially discretising the PDE to form a system of Hamiltonian ODEs to which a symplectic integrator can be applied. In this way stability criteria are achieved by considering the spectra of linearised Hamiltonian PDEs rather than the spatial step size.
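The flavour of such a stability analysis can be reproduced for the simplest separable Hamiltonian system, the harmonic oscillator u'' = -ω²u: the symplectic Euler update matrix has unit determinant, and its eigenvalues stay on the unit circle only while ω·Δt ≤ 2. This is a generic textbook check, not the restricted PDE class analysed in the paper:

```python
import numpy as np

def symplectic_euler_matrix(omega, dt):
    # One step of kick-then-drift symplectic Euler for u'' = -omega**2 * u,
    # acting on the state vector (u, p).
    return np.array([[1.0 - (omega * dt) ** 2, dt],
                     [-omega ** 2 * dt, 1.0]])

def spectrally_stable(omega, dt):
    """Von Neumann-style criterion: all eigenvalues of the one-step update
    matrix must lie on or inside the unit circle."""
    lam = np.linalg.eigvals(symplectic_euler_matrix(omega, dt))
    return bool(np.max(np.abs(lam)) <= 1.0 + 1e-9)

stable = spectrally_stable(1.0, 1.0)         # omega*dt = 1 < 2: stable
unstable = not spectrally_stable(1.0, 2.5)   # omega*dt = 2.5 > 2: unstable
```

The unit determinant reflects the symplectic (area-preserving) character of the map; stability is then governed entirely by the trace, mirroring the spectral criteria derived in the paper.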
Ossenkopf, V; Stutzki, J
2008-01-01
The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. In paper I we proposed essential improvements to the Delta-variance analysis. In this paper we apply the improved Delta-variance analysis to i) a hydrodynamic turbulence simulation with prominent density and velocity structures, ii) an observed intensity map of rho Oph with irregular boundaries and variable uncertainties of the different data points, and iii) a map of the turbulent velocity structure in the Polaris Flare affected by the intensity dependence on the centroid velocity determination. The tests confirm the extended capabilities of the improved Delta-variance analysis. Prominent spatial scales were accurately identified and artifacts from a variable reliability of the data were removed. The analysis of the hydrodynamic simulations showed that the injection of a turbulent velocity structure creates the most prominent density structures on a sca...
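A crude stand-in for the Delta-variance filter (core average minus surrounding average at scale L) shows the expected behaviour: smoothing a map removes small-scale structure and lowers its small-scale Delta-variance. The box filters here are a simplification of the paper's improved filter shapes:

```python
import numpy as np
from scipy import ndimage

def delta_variance(img, L):
    """Variance of the map filtered with a crude 'French hat': the average
    over a core box of size L minus the average over a 3L surrounding box."""
    core = ndimage.uniform_filter(img, size=L, mode='wrap')
    halo = ndimage.uniform_filter(img, size=3 * L, mode='wrap')
    return np.var(core - halo)

rng = np.random.default_rng(2)
noise = rng.normal(size=(64, 64))                       # synthetic map
smooth = ndimage.uniform_filter(noise, size=5, mode='wrap')
dv_noise = delta_variance(noise, 2)
dv_smooth = delta_variance(smooth, 2)
```

Sweeping L and plotting the Delta-variance against scale is what reveals the prominent spatial scales the paper identifies.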
Institute of Scientific and Technical Information of China (English)
Qin Tao
2005-01-01
The Geostatistical Analyst module of the mainstream GIS package ArcGIS 9 provides two broad classes of spatial interpolation methods: deterministic interpolation and geostatistical interpolation. The application and handling of the various interpolation methods in the software are introduced, and worked examples are used to compare the range of applicability of each method.
A connectionist-geostatistical approach for classification of deformation types in ice surfaces
Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.
2014-12-01
Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a Neural Network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of the glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we have developed semi-automated pre-training software to adapt the Neural Network to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. This method works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network to handle less repetitive images in order to analyze imagery collected over the Arctic sea ice and to assess the percentage of deformed ice for model calibration.
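The directional-variogram parameterization can be illustrated on a synthetic striped image: pixel pairs taken across the stripes show high semivariance, pairs along them show none. The image below is a stand-in for the crevasse imagery:

```python
import numpy as np

def directional_variogram(img, lag, axis):
    """Semivariance of pixel pairs separated by `lag` pixels along `axis`."""
    a = np.moveaxis(img, axis, 0)
    diff = a[lag:] - a[:-lag]
    return 0.5 * np.mean(diff ** 2)

# Synthetic 'crevasse field': sinusoidal stripes varying along axis 0 only.
img = np.tile(np.sin(0.5 * np.arange(64))[:, None], (1, 64))
g_across = directional_variogram(img, lag=3, axis=0)   # across the stripes
g_along = directional_variogram(img, lag=3, axis=1)    # along the stripes
```

The strong anisotropy between the two directions is exactly the kind of signature a classifier can use to distinguish crevasse orientations and types.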
Caseri, Angelica; Ramos, Maria Helena; Javelle, Pierre; Leblois, Etienne
2016-04-01
Floods are responsible for a major part of the total damage caused by natural disasters. Nowcasting systems providing public alerts to flash floods are very important for preventing damage from extreme events and reducing their socio-economic impacts. The major challenge of these systems is to capture high-risk situations in advance, with good accuracy in the intensity, location and timing of future intense precipitation events. Flash flood forecasting has been studied by several authors in different affected areas. The majority of the studies combine rain gauge data with radar imagery advection to improve prediction for the next few hours. Outputs of Numerical Weather Prediction (NWP) models have also been increasingly used to predict ensembles of extreme precipitation events that might trigger flash floods. One of the challenges of using NWP for ensemble nowcasting is to generate ensemble forecasts of precipitation quickly enough that flood forecasts can be produced with sufficient lead time to issue flash flood alerts. In this study, we investigate an alternative space-time geostatistical framework to generate multiple scenarios of future rainfall for flash flood nowcasting. The approach is based on conditional simulation and an advection method applied within the Turning Bands Method (TBM). Ensemble forecasts of precipitation fields are generated based on space-time properties given by radar images and precipitation data collected from rain gauges during the development of the rainfall event. The results show that the approach can be an interesting alternative for capturing precipitation uncertainties in location and intensity and for generating ensemble forecasts of rainfall that can improve alerts for flash floods, especially in small areas.
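The turning-bands idea, building a 2-D random field by superposing 1-D random processes along random directions, can be sketched as follows (unconditional and with a Gaussian spectrum; the conditioning and advection steps of the paper are omitted):

```python
import numpy as np

def turning_bands(coords, n_bands=300, corr_len=5.0, seed=3):
    """Toy turning-bands / spectral simulation: superpose cosine waves with
    random direction, frequency and phase. The normalized sum approximates
    a zero-mean, unit-variance Gaussian random field."""
    rng = np.random.default_rng(seed)
    field = np.zeros(len(coords))
    for _ in range(n_bands):
        theta = rng.uniform(0.0, 2.0 * np.pi)
        u = np.array([np.cos(theta), np.sin(theta)])   # band direction
        freq = rng.normal(scale=1.0 / corr_len)        # Gaussian spectrum
        phase = rng.uniform(0.0, 2.0 * np.pi)
        field += np.sqrt(2.0) * np.cos(2.0 * np.pi * freq * (coords @ u) + phase)
    return field / np.sqrt(n_bands)

gx, gy = np.meshgrid(np.arange(40.0), np.arange(40.0))
coords = np.column_stack([gx.ravel(), gy.ravel()])
field = turning_bands(coords)   # one unconditional rainfall-like realization
```

Running this with many seeds yields the kind of equally likely rainfall scenarios that an ensemble nowcasting chain feeds into a flood model.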
Anderson, Cynthia M; Kincaid, Donald
2005-01-01
School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis.
Stability analysis of multi-infeed HVDC system applying VSC-HVDC
DEFF Research Database (Denmark)
Liu, Yan; Chen, Zhe
2010-01-01
This paper presents a general model of a dual infeed HVDC system applying VSC-HVDC, which can be used as an element in a large multi infeed HVDC system. The model may have a different structure under different grid faults because of the action of breakers. Hence the power flow of the system based on this model is analyzed under steady state and different kinds of grid fault transient situations. Two main control methods applied on the VSC-HVDC link in this dual infeed HVDC system are investigated, and a comparative analysis of them under transient situations is presented. A simulation model is built in PSCAD/EMTDC to verify the theoretical analysis. Simulation results indicate that this dual infeed HVDC system can realize higher stability than a single infeed HVDC system, and that different control strategies on the VSC-HVDC link may result in different influence on AC voltage and active power oscillation during transient......
Common cause evaluations in applied risk analysis of nuclear power plants. [PWR]
Energy Technology Data Exchange (ETDEWEB)
Taniguchi, T.; Ligon, D.; Stamatelatos, M.
1983-04-01
Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
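The arithmetic behind the beta-factor family of models is simple enough to show in a few lines; the failure probability and beta value below are invented round numbers, not figures from the study:

```python
# Beta-factor model for a 1-out-of-3 redundant train system: a fraction
# beta of each train's failure probability is attributed to a shared cause
# that defeats all three trains at once (illustrative numbers only).
lam = 1.0e-3    # per-demand failure probability of one train (assumed)
beta = 0.1      # assumed common-cause fraction

q_indep = ((1.0 - beta) * lam) ** 3   # all three trains fail independently
q_ccf = beta * lam                    # one shared cause fails all three
q_system = q_indep + q_ccf            # CCF term dominates by orders of magnitude
```

Because the common-cause term enters linearly while the independent term is cubed, redundancy alone buys little; this is why refined CCF models give more realistic system failure probabilities than naive independence assumptions.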
TAPPS Release 1: Plugin-Extensible Platform for Technical Analysis and Applied Statistics
Directory of Open Access Journals (Sweden)
Justin Sam Chew
2016-01-01
We present the first release of TAPPS (Technical Analysis and Applied Statistics System), a Python implementation of a thin software platform aimed at technical analyses and applied statistics. The core of TAPPS is a container for 2-dimensional data frame objects and a TAPPS command language. The TAPPS language is not meant to be a programming language for script and plugin development but for operational purposes. In this respect, the TAPPS language takes on the flavor of SQL rather than R, resulting in a shallower learning curve. All analytical functions are implemented as plugins. This results in a well-defined plugin system, which enables rapid development and incorporation of analysis functions. TAPPS Release 1 is released under the GNU General Public License 3 for academic and non-commercial use. The TAPPS code repository can be found at http://github.com/mauriceling/tapps.
Directory of Open Access Journals (Sweden)
Renata C. B. Madeo
2013-08-01
Lately, there has been an increasing interest in hand gesture analysis systems. Recent works have employed pattern recognition techniques and have focused on the development of systems with more natural user interfaces. These systems may use gestures to control interfaces or recognize sign language gestures, which can provide systems with multimodal interaction; or consist in multimodal tools to help psycholinguists understand new aspects of discourse analysis and to automate laborious tasks. Gestures are characterized by several aspects, mainly by movements and sequences of postures. Since data referring to movements or sequences carry temporal information, this paper presents a literature review about temporal aspects of hand gesture analysis, focusing on applications related to natural conversation and psycholinguistic analysis, using the Systematic Literature Review methodology. In our results, we organized works according to type of analysis, methods, highlighting the use of Machine Learning techniques, and applications.
MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.
Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben
2016-01-01
Network analysis can make variant analysis better. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).
Directory of Open Access Journals (Sweden)
Síglia Pimentel Höher Camargo
2013-11-01
Autism spectrum disorder (ASD) is a lifelong pervasive developmental disorder with no known cause or cure. However, educational and behavioral interventions with a foundation in applied behavior analysis (ABA) have been shown to improve a variety of skill areas, such as the communication, social, academic, and adaptive behaviors of individuals with ASD. The goal of this work is to present the definition, features and philosophical concepts that underlie ABA and make this science an effective intervention method for people with autism.
Applied analysis of ventilation technology in residential buildings in Changjiang river valley
Institute of Scientific and Technical Information of China (English)
杨露露; 卢军; 王曦; 甘灵丽
2009-01-01
Making use of ventilation technology can decrease building energy consumption, improve the indoor thermal environment, and ameliorate indoor air quality. Drawing on the meteorological characteristics of the Changjiang river valley and focusing on Chongqing, this work analyzes the feasibility of intermittent mechanical ventilation. By comparing various ventilation modes, it summarizes suitable ventilation strategies for different weather conditions, combining field measurements and experimental data.
Application of Geostatistics in Solid Mines
Institute of Scientific and Technical Information of China (English)
刘焕荣; 燕永锋; 杨海涛
2013-01-01
As a young interdisciplinary subject, geostatistics has developed greatly over nearly 50 years of research and practice, and in recent years has also become known as spatial information statistics. Production practice at home and abroad shows that geostatistical study has obvious advantages in the geosciences and has been increasingly widely used in solid mines. This paper mainly introduces the application of geostatistics to the calculation of mineral resource reserves, the distributional characteristics of minerals, the classification of reserves, the optimization of exploration grid spacing, and mineral exploration.
RGB photoelasticity applied to the analysis of membrane residual stress in glass
Ajovalasit, A.; Petrucci, G.; Scafidi, M.
2012-02-01
The measurement of residual stresses is of great relevance in the glass industry. The analysis of residual stress in glass is usually carried out by photoelastic methods because glass is a photoelastic material. This paper considers the determination of membrane residual stresses in glass plates by automatic digital photoelasticity in white light (RGB photoelasticity). The proposed method is applied to the analysis of membrane residual stresses in several tempered glasses. It can effectively replace manual methods based on the use of white light, which are currently provided for by some technical standards.
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-09-01
This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.
The evolution of applied harmonic analysis models of the real world
Prestini, Elena
2016-01-01
A sweeping exploration of the development and far-reaching applications of harmonic analysis such as signal processing, digital music, optics, radio astronomy, crystallography, medical imaging, spectroscopy, and more. Featuring a wealth of illustrations, examples, and material not found in other harmonic analysis books, this unique monograph skillfully blends together historical narrative with scientific exposition to create a comprehensive yet accessible work. While only an understanding of calculus is required to appreciate it, there are more technical sections that will charm even specialists in harmonic analysis. From undergraduates to professional scientists, engineers, and mathematicians, there is something for everyone here. The second edition of The Evolution of Applied Harmonic Analysis contains a new chapter on atmospheric physics and climate change, making it more relevant for today’s audience. Praise for the first edition: "…can be thoroughly recommended to any reader who is curious about the ...
AASC Recommendations for the Education of an Applied Climatologist
Nielsen-Gammon, J. W.; Stooksbury, D.; Akyuz, A.; Dupigny-Giroux, L.; Hubbard, K. G.; Timofeyeva, M. M.
2011-12-01
The American Association of State Climatologists (AASC) has developed curricular recommendations for the education of future applied and service climatologists. The AASC was founded in 1976. Membership of the AASC includes state climatologists and others who work in state climate offices; climate researchers in academia and educators; applied climatologists in NOAA and other federal agencies; and the private sector. The AASC is the only professional organization dedicated solely to the growth and development of applied and service climatology. The purpose of the recommendations is to offer a framework for existing and developing academic climatology programs. These recommendations are intended to serve as a road map and to help distinguish the educational needs of future applied climatologists from those of operational meteorologists or other scientists and practitioners. While the home department of climatology students may differ from one program to the next, the most essential factor is that students can demonstrate a breadth and depth of understanding in the knowledge and tools needed to be an applied climatologist. Because the training of an applied climatologist requires significant depth and breadth, the Master's degree is recommended as the minimum level of education needed. This presentation will highlight the AASC recommendations. These include a strong foundation in: - climatology (instrumentation and data collection, climate dynamics, physical climatology, synoptic and regional climatology, applied climatology, climate models, etc.) - basic natural sciences and mathematics including calculus, physics, chemistry, and biology/ecology - fundamental atmospheric sciences (atmospheric dynamics, atmospheric thermodynamics, atmospheric radiation, and weather analysis/synoptic meteorology) and - data analysis and spatial analysis (descriptive statistics, statistical methods, multivariate statistics, geostatistics, GIS, etc.). The recommendations also include a
PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems
Directory of Open Access Journals (Sweden)
Tharcius Augusto Pivetta
2016-07-01
Full Text Available In recent decades, critical systems have increasingly been built with computers and software, even in the space domain, where the project approach is usually very conservative. Projects involving rockets, satellites and their facilities, such as ground support systems, simulators and other operations critical to the space mission, require a hazard analysis. The ELICERE process was created to perform hazard analysis, mainly of critical computer systems, in order to define or evaluate their safety and dependability requirements; it is strongly based on the Hazard and Operability Study (HAZOP) and Failure Mode and Effect Analysis (FMEA) techniques. It aims to improve the project design or to clarify the potential hazards of existing systems, refining their functional and non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. At first, the process was carried out manually, step by step. A software tool called PRO-ELICERE has since been developed to facilitate the analysis process and to store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small space-related case study is presented, based on a hypothetical rocket of the Cruzeiro do Sul family developed by the Instituto de Aeronáutica e Espaço in Brazil.
Directory of Open Access Journals (Sweden)
Avdakovic Samir
2014-08-01
Full Text Available Analysis of power consumption is a very important issue for power distribution system operators. Power system processes such as planning, demand forecasting and development require a complete understanding of the behaviour of power consumption in the observed area, which in turn requires appropriate techniques for analyzing the available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from a real power distribution transformer substation in the urban part of the city of Sarajevo. Using the continuous wavelet transform (CWT) with the wavelet power spectrum and the global wavelet spectrum, some properties of the analysed time series are determined. Empirical mode decomposition (EMD) and the Hilbert-Huang transform (HHT) are then applied to the same time series, and the results show that both approaches can provide very useful information about the behaviour of power consumption over the observed time interval and in different period (frequency) bands. It can also be noticed that the results obtained from the global wavelet spectrum and the marginal Hilbert spectrum are very similar, confirming that both approaches could be used to identify the main properties of active and reactive power consumption time series.
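The wavelet and Hilbert-Huang machinery above identifies which period (frequency) bands dominate a consumption series. As a much simpler stand-in for CWT/EMD, the sketch below uses a plain discrete Fourier transform to recover the dominant period of a synthetic hourly load series; the 24-hour cycle, amplitudes, and series length are illustrative assumptions, not data from the paper.

```python
import cmath
import math

def power_spectrum(x, kmax):
    """Naive DFT power spectrum |X_k|^2 for k = 0..kmax (O(N^2), fine for short series)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))) ** 2
            for k in range(kmax + 1)]

# Synthetic "hourly load": 10 days of a 24-hour cycle plus a weak 12-hour harmonic.
n = 240
load = [100 + 20 * math.sin(2 * math.pi * t / 24) + 5 * math.sin(2 * math.pi * t / 12)
        for t in range(n)]

spec = power_spectrum(load, n // 2)
# Ignore k = 0 (the mean); k > N/2 would just mirror the spectrum.
k_peak = max(range(1, n // 2 + 1), key=lambda k: spec[k])
print("dominant period:", n / k_peak, "hours")  # 240 / 10 = 24.0
```

A CWT or EMD analysis adds what this plain spectrum lacks: localization in time, so that changes in the daily cycle across seasons become visible.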
System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO
Olds, John R.
1994-01-01
This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
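SSA assembles a system-level total derivative from discipline-level partial sensitivities via the chain rule. The sketch below illustrates the idea on a deliberately tiny two-discipline toy; the propulsion and weights functions are invented for illustration and bear no relation to the actual RD-701 vehicle analysis.

```python
def propulsion(x):
    """Stand-in 'propulsion' discipline: returns a coupling variable p(x)."""
    return 3.0 * x + 1.0

def weights(p, x):
    """Stand-in 'weights & sizing' discipline: dry weight as a function of p and x."""
    return p ** 2 + 4.0 * x

def central_diff(f, v, h=1e-5):
    """Finite-difference partial derivative, as each standalone analysis code would supply."""
    return (f(v + h) - f(v - h)) / (2 * h)

x0 = 2.0
p0 = propulsion(x0)

# Discipline-level sensitivities, each obtained from its own analysis.
dp_dx = central_diff(propulsion, x0)
dw_dp = central_diff(lambda p: weights(p, x0), p0)
dw_dx = central_diff(lambda x: weights(p0, x), x0)

# System-level total derivative assembled by the chain rule.
df_dx = dw_dp * dp_dx + dw_dx
print(df_dx)  # analytic value: 6*(3*x0 + 1) + 4 = 46
```

The payoff of SSA is that each discipline differentiates only itself; the system-level gradient is then assembled cheaply, instead of finite-differencing the entire converged multidisciplinary analysis for every design variable.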
Directory of Open Access Journals (Sweden)
V. Yadav
2010-02-01
Full Text Available A coupled Bayesian model selection and geostatistical regression modeling approach is adopted for empirical analysis of gross primary productivity (GPP) at six AmeriFlux sites, including the Kennedy Space Center Scrub Oak, Vaira Ranch, Tonzi Ranch, Blodgett Forest, Morgan Monroe State Forest, and Harvard Forest sites. The analysis is performed at a continuum of temporal scales ranging from daily to monthly, for a period of seven years. A total of 10 covariates representing environmental stimuli and indices of plant physiology are considered in explaining variations in GPP. Like other statistical methods, the proposed approach estimates regression coefficients and uncertainties associated with the covariates in a selected regression model. However, unlike traditional regression methods, the presented approach also estimates the uncertainty associated with the selection of a single "best" model of GPP. In addition, the approach provides an enhanced understanding of how the importance of specific covariates changes with temporal resolution. An examination of trends in the importance of specific covariates reveals scaling thresholds above or below which covariates become significant in explaining GPP. Results indicate that most sites (especially those with a stronger seasonal cycle) exhibit at least one prominent scaling threshold between the daily and 20-day temporal scales. This demonstrates that environmental variables that explain GPP at synoptic scales are different from those that capture its seasonality. At shorter time scales, radiation, temperature, and vapor pressure deficit exert the most significant influence on GPP at most examined sites. However, at coarser time scales, the importance of these covariates in explaining GPP declines. Overall, unique best models are identified at most sites at the daily scale, whereas multiple competing models are identified at larger time scales. In addition, the selected models are able to explain a larger
Directory of Open Access Journals (Sweden)
V. Yadav
2010-09-01
Full Text Available A coupled Bayesian model selection and geostatistical regression modeling approach is adopted for empirical analysis of gross primary productivity (GPP) at six AmeriFlux sites, including the Kennedy Space Center Scrub Oak, Vaira Ranch, Tonzi Ranch, Blodgett Forest, Morgan Monroe State Forest, and Harvard Forest sites. The analysis is performed at a continuum of temporal scales ranging from daily to monthly, for a period of seven years. A total of 10 covariates representing environmental stimuli and indices of plant physiology are considered in explaining variations in GPP. Similarly to other statistical methods, the presented approach estimates regression coefficients and uncertainties associated with the covariates in a selected regression model. Unlike traditional regression methods, however, the approach also estimates the uncertainty associated with the selection of a single "best" model of GPP. In addition, the approach provides an enhanced understanding of how the importance of specific covariates changes with the examined timescale (i.e., temporal resolution). An examination of changes in the importance of specific covariates across timescales reveals thresholds above or below which covariates become important in explaining GPP. Results indicate that most sites (especially those with a stronger seasonal cycle) exhibit at least one prominent scaling threshold between the daily and 20-day temporal scales. This demonstrates that environmental variables that explain GPP at synoptic scales are different from those that capture its seasonality. At shorter time scales, radiation, temperature, and vapor pressure deficit exert the most significant influence on GPP at most examined sites. At coarser time scales, however, the importance of these covariates in explaining GPP declines. Overall, unique best models are identified at most sites at the daily scale, whereas multiple competing models are identified at longer time scales.
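The model selection idea above, scoring every candidate set of covariates rather than committing to one, can be sketched with the closely related Bayesian information criterion (BIC), a common large-sample approximation to Bayesian model comparison. The covariates, sample size, and synthetic "GPP" series below are invented for illustration; they are not the paper's data or its exact method.

```python
import math

def ols_rss(X, y):
    """Ordinary least squares via normal equations; returns the residual sum of squares."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)] for a in range(p)]
    b = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for c in range(p):                      # Gaussian elimination with partial pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):
        beta[c] = (b[c] - sum(A[c][k] * beta[k] for k in range(c + 1, p))) / A[c][c]
    return sum((y[i] - sum(X[i][a] * beta[a] for a in range(p))) ** 2 for i in range(n))

# Synthetic response driven by covariate x1 only; x2 is an irrelevant candidate.
n = 60
x1 = [i / n for i in range(n)]
x2 = [math.sin(7.0 * i) for i in range(n)]
y = [2.0 + 3.0 * x1[i] + 0.01 * (-1) ** i for i in range(n)]

best = None
for subset in [(), ("x1",), ("x2",), ("x1", "x2")]:
    cols = {"x1": x1, "x2": x2}
    X = [[1.0] + [cols[name][i] for name in subset] for i in range(n)]
    rss = ols_rss(X, y)
    k = 1 + len(subset)
    bic = n * math.log(rss / n) + k * math.log(n)  # Schwarz criterion: fit + complexity penalty
    if best is None or bic < best[0]:
        best = (bic, subset)

print("selected covariates:", best[1])  # expected: ('x1',)
```

The penalty term is what drives the scaling-threshold behaviour described above: a covariate survives selection only when its explanatory gain at that temporal resolution outweighs the cost of the extra parameter.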
Adapting geostatistics to analyze spatial and temporal trends in weed populations
Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...
Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model
Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef
2016-10-01
We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.
Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation
Minasny, B.; Vrugt, J.A.; McBratney, A.B.
2011-01-01
This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior distributi
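A minimal illustration of MCMC parameter inference is a random-walk Metropolis sampler, a simpler ancestor of the DREAM algorithm used in the paper. The sketch below infers a single location parameter from synthetic data under a flat prior; all numbers (data, proposal width, chain length) are illustrative choices, not the paper's setup.

```python
import math
import random

random.seed(42)

# Deterministic "observations": location 5 with spread of order 1.
data = [5.0 + math.sin(3.0 * i) for i in range(40)]
sigma = 1.0  # assumed known measurement scale

def log_likelihood(mu):
    return -0.5 * sum((d - mu) ** 2 for d in data) / sigma ** 2

# Random-walk Metropolis over the single parameter mu (flat prior).
mu, ll = 0.0, log_likelihood(0.0)
samples = []
for step in range(5000):
    prop = mu + random.gauss(0.0, 0.5)          # symmetric proposal
    ll_prop = log_likelihood(prop)
    if math.log(random.random()) < ll_prop - ll:  # Metropolis accept/reject
        mu, ll = prop, ll_prop
    if step >= 1000:                              # discard burn-in
        samples.append(mu)

post_mean = sum(samples) / len(samples)
sample_mean = sum(data) / len(data)
print(round(post_mean, 2), round(sample_mean, 2))
```

With a flat prior and known sigma the posterior mean should match the sample mean; in soil geostatistics the same machinery is run over variogram parameters (nugget, sill, range), where the posterior is far from Gaussian and adaptive samplers such as DREAM earn their keep.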
Energy Technology Data Exchange (ETDEWEB)
Fuente Martin, P.; Gonzalez Marroquin, V.M.; Fernandez de Castro Fernandez Sahw, F.; Saez Garcia, E. (HUNOSA, Oviedo (Spain))
1989-06-01
The aim of this project, which has been financed by Ocicarbon, is to develop, both in theory and in practice, the use of geostatistics to predict the geological behaviour of coal seams in virgin panels, using data from panels already worked. Examples of seams selected for full mechanisation are given. 3 figs., 3 tabs.
Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics
Directory of Open Access Journals (Sweden)
J.D. Clayton
2016-08-01
Full Text Available Principles of dimensional analysis are applied in a new interpretation of the penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation, in terms of inverse velocity, of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long-rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. The test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory) demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., hypervelocity and thick-target experiments), the current analysis demonstrates a dominant dependence of penetration depth on target mass density alone. Such comparisons suggest a transition from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
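The linear form of the power series, normalized depth P/L = a - b/v, can be fitted by ordinary least squares against inverse velocity. The sketch below recovers assumed coefficients from synthetic data; the values of a and b are made up and are not the fitting parameter reported in the paper.

```python
# Synthetic normalized penetration depth following P/L = a - b/v exactly.
a_true, b_true = 1.2, 1.5                    # hypothetical asymptote and slope
velocities = [1.5, 2.0, 2.5, 3.0, 3.5, 4.0]  # impact velocities, km/s
depth = [a_true - b_true / v for v in velocities]

# Linear least squares of P/L against u = 1/v (closed-form slope and intercept).
u = [1.0 / v for v in velocities]
n = len(u)
mu_u = sum(u) / n
mu_d = sum(depth) / n
slope = sum((ui - mu_u) * (di - mu_d) for ui, di in zip(u, depth)) / \
        sum((ui - mu_u) ** 2 for ui in u)
intercept = mu_d - slope * mu_u
print(intercept, -slope)  # recovers (a_true, b_true)
```

The intercept is the hydrodynamic limit reached as v grows without bound, and the slope captures the velocity-dependent target resistance discussed above.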
Directory of Open Access Journals (Sweden)
S. W. Lyon
2006-01-01
Full Text Available Shallow water tables near streams often lead to saturated, overland-flow-generating areas in catchments in humid climates. While these saturated areas are assumed to be principal biogeochemical hot spots and important for issues such as non-point pollution sources, the spatial and temporal behavior of shallow water tables, and the associated saturated areas, is not completely understood. This study demonstrates how geostatistical methods can be used to characterize the spatial and temporal variation of the shallow water table in the near-stream region. Event-based and seasonal changes in the spatial structure of the shallow water table, which influences the spatial pattern of surface saturation and related runoff generation, can be identified and used in combination to characterize the hydrology of an area. This is accomplished through semivariogram analysis and indicator kriging, producing maps that combine soft data (i.e., proxy information for the variable of interest) representing general shallow water table patterns with hard data (i.e., actual measurements) representing variation in the spatial structure of the shallow water table per rainfall event. The study area was a hillslope in the Catskill Mountains region of New York State. The shallow water table was monitored in a 120 m×180 m near-stream region at 44 sampling locations at 15-min intervals. Outflow of the area was measured at the same time interval. These data were analyzed at a short time interval (15 min) and at a long time interval (months) to characterize changes in the hydrologic behavior of the hillslope. Indicator semivariograms based on binary-transformed ground water table data (i.e., 1 if exceeding the time-variable median depth to water table and 0 if not) were created for both short and long time intervals. For the short time interval, the indicator semivariograms showed a high degree of spatial structure in the shallow water table for the spring, with increased range
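An indicator semivariogram of the kind described above can be computed directly from a binary-transformed transect. The sketch below uses a made-up 1-D indicator series (1 = shallower than the median water table, 0 = deeper) with a single wet block near the stream, and shows semivariance growing with lag, the signature of spatial structure.

```python
def indicator_semivariogram(z, lags):
    """Empirical semivariogram of a 1-D indicator transect:
    gamma(h) = mean of 0.5 * (I(x) - I(x+h))^2 over all pairs at lag h."""
    gamma = {}
    for h in lags:
        pairs = [(z[i], z[i + h]) for i in range(len(z) - h)]
        gamma[h] = sum(0.5 * (a - b) ** 2 for a, b in pairs) / len(pairs)
    return gamma

# Hypothetical binary-transformed water-table depths along a 100-point transect:
# the first 50 points (nearest the stream) exceed the median depth.
transect = [1 if i < 50 else 0 for i in range(100)]

gamma = indicator_semivariogram(transect, lags=[1, 5, 10, 20])
print(gamma)
```

Only pairs straddling the wet/dry boundary contribute, so semivariance rises steadily with lag; a flat indicator semivariogram would instead indicate a spatially random saturation pattern.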
Energy Technology Data Exchange (ETDEWEB)
Vejbaek, O.V.
1998-12-31
The aim of this report was to demonstrate possible uses of seismic impedances as soft data for reservoir characterization. To illustrate the impact of the results, an attempt to calculate oil in place was also carried out. It must, however, be emphasized that these results apply only to the Callovian portion of the Middle Jurassic West Lulu reservoir, and thus do not provide estimates for the entire Middle Jurassic West Lulu accumulation. It is important to realise that stochastic simulation does not provide exact predictions in areas outside the control of hard data. It does, however, offer the possibility of exploiting every known or surmised property of the desired (target) data population. These properties include, for example, the mean, spread, spatial continuity (measured by variograms), horizontal and vertical trends, and correlation to supporting soft data (e.g. seismic impedances). Nor are predictions exact even though the term 'narrowed solution space' is applied; this term merely implies that the error in prediction at any point may be less than the full range of the parameter. The quality of the predictions depends mainly on meticulous handling of the data: avoiding errors such as bad stratigraphic alignment of the data, obtaining good variograms, avoiding errors in the construction of the target populations, and including every pertinent attribute of the data. The result thus also depends on a full geological understanding of the problem (and the morals of the modeller). The most important quality is the ability to produce a great number of equi-probable realisations that equally well satisfy any known or surmised property of the target data population. The goal of this study was to investigate the use of inversion-derived seismic impedances for geostatistical reservoir characterisation in a complex clastic reservoir, exemplified by the West Lulu reservoir of the Harald Field. The well database is rather modest, so substantial support has been gained from the
Rhodes, Elena M; Liburd, Oscar E; Grunwald, Sabine
2011-08-01
Flower thrips (Frankliniella spp.) are one of the key pests of southern highbush blueberries (Vaccinium corymbosum L. x V. darrowii Camp), a high-value crop in Florida. Thrips' feeding and oviposition injury to flowers can result in fruit scarring that renders the fruit unmarketable. Flower thrips often form areas of high population, termed "hot spots", in blueberry plantings. The objective of this study was to model thrips spatial distribution patterns with geostatistical techniques. Semivariogram models were used to determine optimum trap spacing and two commonly used interpolation methods, inverse distance weighting (IDW) and ordinary kriging (OK), were compared for their ability to model thrips spatial patterns. The experimental design consisted of a grid of 100 white sticky traps spaced at 15.24-m and 7.61-m intervals in 2008 and 2009, respectively. Thirty additional traps were placed randomly throughout the sampling area to collect information on distances shorter than the grid spacing. The semivariogram analysis indicated that, in most cases, spacing traps at least 28.8 m apart would result in spatially independent samples. Also, the 7.61-m grid spacing captured more of the thrips spatial variability than the 15.24-m grid spacing. IDW and OK produced maps with similar accuracy in both years, which indicates that thrips spatial distribution patterns, including "hot spots," can be modeled using either interpolation method. Future studies can use this information to determine if the formation of "hot spots" can be predicted using flower density, temperature, and other environmental factors. If so, this development would allow growers to spot treat the "hot spots" rather than their entire field.
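Of the two interpolators compared above, IDW is the simpler: a weighted average whose weights decay as an inverse power of distance. The sketch below estimates a thrips count at an unsampled point from four hypothetical trap locations; the coordinates, counts, and power parameter are invented for illustration.

```python
def idw(points, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                       # exact interpolator at a sample location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical trap counts at four grid locations (spacing in metres);
# the high counts in one corner mimic a "hot spot".
traps = [(0.0, 0.0, 2.0), (15.0, 0.0, 3.0), (0.0, 15.0, 20.0), (15.0, 15.0, 25.0)]

est = idw(traps, 3.0, 12.0)   # a point inside the hot-spot corner
print(round(est, 2))
```

The estimate is pulled strongly toward the nearest trap (count 20), which is exactly the behaviour that lets IDW reproduce hot spots; ordinary kriging differs in deriving its weights from the fitted semivariogram rather than from distance alone.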
Yorgun, M. S.; Rood, R. B.
2010-12-01
The behavior of atmospheric models is sensitive to the algorithms that are used to represent the equations of motion. Typically, comprehensive models are conceived in terms of the resolved fluid dynamics (i.e., the dynamical core) and subgrid, unresolved physics represented by parameterizations. Deterministic weather predictions are often validated with feature-by-feature comparison. Probabilistic weather forecasts and climate projections are evaluated with statistical methods. We seek to develop model evaluation strategies that identify “objects”: coherent systems with an associated set of measurable parameters. This makes it possible to evaluate processes in models without needing to reproduce the time and location of, for example, a particular observed cloud system. Process- and object-based evaluation preserves information in the observations by avoiding the need for extensive spatial and temporal averaging. As a concrete example, we focus on analyzing how the choice of dynamical core impacts the representation of precipitation in the Pacific Northwest of the United States, Western Canada, and Alaska; this brings attention to the interaction of the resolved and the parameterized components of the model. Two dynamical cores are considered within the Community Atmosphere Model: the Spectral (Eulerian) core, which relies on global basis functions, and the Finite Volume (FV) core, which uses only local information. We introduce the concept of "meteorological realism"; that is, do local representations of large-scale phenomena, for example fronts and orographic precipitation, look like the observations? A follow-on question is, does the representation of these phenomena improve with resolution? Our approach to quantifying meteorological realism starts with methods of geospatial statistics. Specifically, we employ variography, a geostatistical method used to measure the spatial continuity of a regionalized variable, and principal component
Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.
Welch, Christie D; Polatajko, H J
2016-01-01
Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically.
The Renormalization-Group Method Applied to Asymptotic Analysis of Vector Fields
Kunihiro, T
1996-01-01
The renormalization group method of Goldenfeld, Oono and their collaborators is applied to the asymptotic analysis of vector fields. The method is formulated on the basis of the theory of envelopes, as was done for scalar fields. This formulation actually completes the discussion of the previous work for scalar equations. It is shown in a generic way that the method applied to equations with a bifurcation leads to the Landau-Stuart and the (time-dependent) Ginzburg-Landau equations. It is confirmed that this method is as powerful a tool for the reduction of dynamics as the reductive perturbation method. Some examples of ordinary differential equations, such as the forced Duffing, the Lotka-Volterra and the Lorenz equations, are worked out with this method: the time evolution of the solution of the Lotka-Volterra equation is given explicitly, while the center manifolds of the Lorenz equation are constructed in a simple way within the RG method.
An uncertainty analysis of the PVT gauging method applied to sub-critical cryogenic propellant tanks
Energy Technology Data Exchange (ETDEWEB)
Van Dresar, Neil T. [NASA Glenn Research Center, Cleveland, OH (United States)
2004-08-01
The PVT (pressure, volume, temperature) method of liquid quantity gauging in low-gravity is based on gas law calculations assuming conservation of pressurant gas within the propellant tank and the pressurant supply bottle. There is interest in applying this method to cryogenic propellant tanks since the method requires minimal additional hardware or instrumentation. To use PVT with cryogenic fluids, a non-condensable pressurant gas (helium) is required. With cryogens, there will be a significant amount of propellant vapor mixed with the pressurant gas in the tank ullage. This condition, along with the high sensitivity of propellant vapor pressure to temperature, makes the PVT method susceptible to substantially greater measurement uncertainty than is the case with less volatile propellants. A conventional uncertainty analysis is applied to example cases of liquid hydrogen and liquid oxygen tanks. It appears that the PVT method may be feasible for liquid oxygen. Acceptable accuracy will be more difficult to obtain with liquid hydrogen. (Author)
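The core PVT calculation can be sketched with the ideal gas law: the helium that left the supply bottle must now occupy the tank ullage, and Dalton's law separates the helium partial pressure from the propellant vapor pressure. All numbers below are illustrative assumptions; a real gauging analysis would use real-gas properties and measured vapor-pressure curves, and the uncertainty analysis in the paper quantifies how errors in each reading propagate.

```python
R = 8.314  # J/(mol*K), universal gas constant

# Assumed instrument readings (illustrative only, ideal-gas behaviour).
V_bottle = 0.10                           # m^3, helium supply bottle
P_b_initial, P_b_final = 20.0e6, 15.0e6   # Pa, bottle pressure before/after pressurization
T_bottle = 280.0                          # K
V_tank = 5.0                              # m^3, total propellant tank volume
P_tank = 300.0e3                          # Pa, total ullage pressure
T_ullage = 90.0                           # K, for a liquid-oxygen tank
P_vapor = 100.0e3                         # Pa, approximate O2 vapor pressure at 90 K

# Helium conservation: moles released from the bottle now reside in the ullage.
n_out = (P_b_initial - P_b_final) * V_bottle / (R * T_bottle)

# Dalton's law: helium partial pressure = total ullage pressure - propellant vapor pressure.
P_helium = P_tank - P_vapor

# Ideal-gas ullage volume, then liquid quantity by difference.
V_ullage = n_out * R * T_ullage / P_helium
V_liquid = V_tank - V_ullage
print(round(V_liquid, 3), "m^3")
```

The sensitivity problem described above is visible in the P_helium line: for hydrogen, a small error in the ullage temperature shifts the vapor-pressure estimate enough to swamp the helium partial pressure, which is why the method is harder to apply to liquid hydrogen than to liquid oxygen.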
Social Network Analysis and Big Data tools applied to the Systemic Risk supervision
Directory of Open Access Journals (Sweden)
Mari-Carmen Mochón
2016-03-01
Full Text Available After the financial crisis that began in 2008, international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden in the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify propagation risk and market behavior, without the need for expensive and demanding technical architectures.
Analysis of possibility of applying the PVDF foil in industrial vibration sensors
Wróbel, A.
2015-11-01
There are many machines that use piezoelectric effects. Systems with smart materials are often used because they have high application potential; for example, transducers can be applied to obtain the required characteristics of a designed system. Every engineer and designer knows how important a proper mathematical model and method of analysis are. It is also important to consider all parameters of the analyzed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all of the system's components, because omitting the influence of any one of them results in inaccuracy in the analysis of the system. The article covers the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. A method for analyzing vibrating sensor systems is presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on these characteristics. The main scientific point of the project is to analyze and demonstrate the possibility of applying a new construction with PVDF foil, or any other material belonging to the group of smart materials, in industrial sensors. Currently, vibration level sensors from practically all manufacturers use piezoelectric ceramic plates to generate and detect the vibration of the fork.
A comparative analysis of three metaheuristic methods applied to fuzzy cognitive maps learning
Directory of Open Access Journals (Sweden)
Bruno A. Angélico
2013-12-01
Full Text Available This work analyses the performance of three different population-based metaheuristic approaches applied to fuzzy cognitive map (FCM) learning in the qualitative control of processes. Fuzzy cognitive maps make it possible to include prior specialist knowledge in the control rule. In particular, particle swarm optimization (PSO), a genetic algorithm (GA) and an ant colony optimization (ACO) are considered for obtaining appropriate weight matrices when learning the FCM. A statistical convergence analysis over 10000 simulations of each algorithm is presented. In order to validate the proposed approach, two industrial control process problems previously described in the literature are considered in this work.
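Of the three metaheuristics compared, particle swarm optimization is the most compact to sketch. The toy below minimizes a sphere function standing in for the FCM weight-matrix fitting error; the population size, inertia and acceleration coefficients, and iteration count are arbitrary illustrative choices, not the paper's settings.

```python
import random

random.seed(1)

def objective(v):
    """Toy fitting error (sphere function) standing in for the FCM learning objective."""
    return sum(x * x for x in v)

dim, n_particles, iters = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5   # inertia weight and cognitive/social coefficients

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]                  # each particle's best position so far
gbest = min(pbest, key=objective)[:]         # swarm's best position so far

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
            if objective(pbest[i]) < objective(gbest):
                gbest = pbest[i][:]

print(objective(gbest))  # should be close to 0
```

In FCM learning the position vector would hold the candidate weight matrix entries, and the objective would measure how far the map's simulated concept values stray from the desired control behaviour.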
Institute of Scientific and Technical Information of China (English)
陈锦赋
2012-01-01
Kriging is one of the main tools of geostatistics. In the statistical sense, starting from the relationships among variables and their variability, it is a method for unbiased, optimal estimation of the values of a regionalized variable within a bounded region; in the interpolation sense, it is a method for linear, optimal, unbiased interpolation of spatially distributed data. The applicable condition for kriging is that the regionalized variable exhibits spatial correlation. Taking carbon dioxide in Linan as the regionalized variable, ordinary kriging was applied with different theoretical semivariogram models. Comparative analysis shows that this geostatistics-based interpolation method, when a suitable theoretical semivariogram model is selected according to the semivariogram cloud and the principle of minimum test variance, can simulate the continuous spatial distribution pattern of a regionalized variable well and achieves good results.
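The ordinary kriging estimate described above is a weighted linear combination of the samples, with weights obtained by solving a linear system built from the covariance (or semivariogram) model and a Lagrange multiplier enforcing unbiasedness. A minimal 1-D sketch, with an assumed exponential model and made-up sample locations:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for c in reversed(range(n)):
        x[c] = (M[c][n] - sum(M[c][k] * x[k] for k in range(c + 1, n))) / M[c][c]
    return x

def cov(h, sill=1.0, rng=10.0):
    """Exponential covariance model (illustrative sill and range)."""
    return sill * math.exp(-3.0 * abs(h) / rng)

# Made-up sample locations along a transect, and the prediction point.
xs = [0.0, 4.0, 7.0, 13.0]
x0 = 5.0

n = len(xs)
# Ordinary kriging system: sample covariances plus the Lagrange row/column.
A = [[cov(xs[i] - xs[j]) for j in range(n)] + [1.0] for i in range(n)]
A.append([1.0] * n + [0.0])
b = [cov(xs[i] - x0) for i in range(n)] + [1.0]

sol = solve(A, b)
weights = sol[:n]
print([round(w, 3) for w in weights], "sum =", round(sum(weights), 6))
```

The Lagrange constraint forces the weights to sum to one, which is what makes the estimator unbiased regardless of the unknown mean; the weights themselves depend entirely on the chosen semivariogram model, which is why model selection matters so much in the study above.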
Geostatistical independent simulation of spatially correlated soil variables
Boluwade, Alaba; Madramootoo, Chandra A.
2015-12-01
The selection of best management practices to reduce soil and water pollution often requires estimation of soil properties. It is important to find an efficient and robust technique to simulate spatially correlated soil parameters. Co-kriging and co-simulation are techniques that can be used, but these methods are limited in terms of computer simulation due to the problem of solving large co-kriging systems and difficulties in fitting a valid model of coregionalization; the order of complexity increases as the number of covariables increases. This paper presents a technique for the conditional simulation of a non-Gaussian vector random field at the point support scale, termed Independent Component Analysis (ICA). The basic principle underlying ICA is the determination of a linear representation of non-Gaussian data such that the components are statistically independent. With such a representation, it is easier and more computationally efficient to develop direct variograms for the components. The process is presented in two stages: the first involves the ICA decomposition; the second involves sequential Gaussian simulation of the components generated in the first stage. The technique was applied to spatially correlated extractable cations, magnesium (Mg) and iron (Fe), in a Canadian watershed. This paper has strong applications in the stochastic quantification of uncertainty in soil attributes for soil remediation and soil rehabilitation.
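The first stage, transforming correlated variables into components that can be handled independently, can be illustrated in its simplest form: a rotation to principal axes that exactly removes the cross-covariance of two variables. Full ICA additionally optimizes for non-Gaussianity (statistical independence, not just decorrelation), which this sketch omits; the two "soil variable" series below are synthetic.

```python
import math

# Two correlated synthetic "soil variables" (e.g., Mg and Fe) along a transect.
n = 200
mg = [math.sin(0.1 * i) + 0.3 * math.cos(0.5 * i) for i in range(n)]
fe = [0.8 * math.sin(0.1 * i) + 0.4 * math.sin(0.33 * i) for i in range(n)]

def centered(v):
    m = sum(v) / len(v)
    return [x - m for x in v]

def cov(a, b):
    return sum(x * y for x, y in zip(a, b)) / len(a)

mg_c, fe_c = centered(mg), centered(fe)
cxx, cyy, cxy = cov(mg_c, mg_c), cov(fe_c, fe_c), cov(mg_c, fe_c)

# Rotation angle that diagonalizes the 2x2 covariance matrix (principal axes).
theta = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)
c, s = math.cos(theta), math.sin(theta)
comp1 = [c * x + s * y for x, y in zip(mg_c, fe_c)]
comp2 = [-s * x + c * y for x, y in zip(mg_c, fe_c)]

print("cross-covariance before:", round(cxy, 4), "after:", round(cov(comp1, comp2), 12))
```

Once the components are uncorrelated (or, with full ICA, independent), each can be given its own direct variogram and simulated separately, sidestepping the large co-kriging systems mentioned above; back-rotating the simulated components restores the original correlated variables.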
Building on crossvalidation for increasing the quality of geostatistical modeling
Olea, R.A.
2012-01-01
The random function is a mathematical model commonly used in the assessment of uncertainty associated with a spatially correlated attribute that has been partially sampled. There are multiple algorithms for modeling such random functions, all sharing the requirement of specifying various parameters that have critical influence on the results. The importance of finding ways to compare the methods and set parameters so as to obtain results that better model uncertainty has increased as these algorithms have grown in number and complexity. Crossvalidation has been used in spatial statistics, mostly in kriging, for the analysis of mean square errors. An appeal of this approach is its ability to work with the same empirical sample available for running the algorithms. This paper goes beyond checking estimates by formulating a function sensitive to conditional bias. Under ideal conditions, such a function turns into a straight line, which can be used as a reference for preparing measures of performance. Applied to kriging, deviations from the ideal line provide a sensitivity to the semivariogram that is lacking in crossvalidation of kriging errors, and are more sensitive to conditional bias than analyses of errors. For stochastic simulation, in addition to finding better parameters, the deviations allow comparison of the realizations resulting from the application of different methods. Examples show improvements of about 30% in the deviations and approximately 10% in the square root of mean square errors between reasonable starting models and the solutions according to the new criteria. © 2011 US Government.
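Crossvalidation of a spatial estimator is easy to sketch: remove each sample in turn, re-estimate it from the others, and summarize the errors. The toy below applies leave-one-out crossvalidation to an IDW estimator on made-up 1-D data; the paper's criterion goes further, examining conditional bias rather than just mean square error.

```python
import math

def idw_estimate(samples, x0, power=2.0):
    """Inverse-distance-weighted estimate at x0 from (x, value) samples."""
    num = den = 0.0
    for x, v in samples:
        d = abs(x - x0)
        if d == 0.0:
            return v
        w = d ** -power
        num += w * v
        den += w
    return num / den

# 1-D samples of a smooth attribute (deterministic toy data).
data = [(float(x), math.sin(0.3 * x)) for x in range(0, 20, 2)]

# Leave-one-out crossvalidation: re-estimate each sample from the others.
errors = []
for i, (x, v) in enumerate(data):
    others = data[:i] + data[i + 1:]
    errors.append(idw_estimate(others, x) - v)

rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5
print("LOO RMSE:", round(rmse, 4))
```

Because the same sample both fits and tests the model, the loop works without any extra data; plotting re-estimated against true values, rather than collapsing the errors into one RMSE, is the step toward the conditional-bias diagnostic proposed above.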
Botelho, Fabio
2014-01-01
This book introduces the basic concepts of real and functional analysis. It presents the fundamentals of the calculus of variations, convex analysis, duality, and optimization that are necessary to develop applications to physics and engineering problems. The book includes introductory and advanced concepts in measure and integration, as well as an introduction to Sobolev spaces. The problems presented are nonlinear, with non-convex variational formulation. Notably, the primal global minima may not be attained in some situations, in which cases the solution of the dual problem corresponds to an appropriate weak cluster point of minimizing sequences for the primal one. Indeed, the dual approach more readily facilitates numerical computations for some of the selected models. While intended primarily for applied mathematicians, the text will also be of interest to engineers, physicists, and other researchers in related fields.
Valente, Thomas W; Pitts, Stephanie R
2017-03-20
The use of social network theory and analysis methods as applied to public health has expanded greatly in the past decade, yielding a significant academic literature that spans almost every conceivable health issue. This review identifies several important theoretical challenges that confront the field but also provides opportunities for new research. These challenges include (a) measuring network influences, (b) identifying appropriate influence mechanisms, (c) the impact of social media and computerized communications, (d) the role of networks in evaluating public health interventions, and (e) ethics. Next steps for the field are outlined and the need for funding is emphasized. Recently developed network analysis techniques, technological innovations in communication, and changes in theoretical perspectives to include a focus on social and environmental behavioral influences have created opportunities for new theory and ever broader application of social networks to public health topics.
Graphical Analysis of PET Data Applied to Reversible and Irreversible Tracers
Energy Technology Data Exchange (ETDEWEB)
Logan, Jean
1999-11-18
Graphical analysis refers to the transformation of multiple time measurements of plasma and tissue uptake data into a linear plot, the slope of which is related to the number of available tracer binding sites. This type of analysis allows easy comparisons among experiments. No particular model structure is assumed, however it is assumed that the tracer is given by bolus injection and that both tissue uptake and the plasma concentration of unchanged tracer are monitored following tracer injection. The requirement of plasma measurements can be eliminated in some cases when a reference region is available. There are two categories of graphical methods which apply to two general types of ligands--those which bind reversibly during the scanning procedure and those which are irreversible or trapped during the time of the scanning procedure.
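For a reversible tracer following a one-tissue-compartment model, the graphical (Logan) transform can be sketched as below; rate constants and the plasma input function are illustrative, not from the report. The slope of the late-time linear plot recovers the distribution volume K1/k2:

```python
import numpy as np

# One-tissue-compartment simulation: dCt/dt = K1*Cp - k2*Ct.
K1, k2 = 0.1, 0.05          # ml/min/g and 1/min (illustrative values)
dt = 0.1
t = np.arange(0, 90, dt)    # 90-min scan
Cp = 10 * np.exp(-0.1 * t) + np.exp(-0.01 * t)   # plasma input after bolus
Ct = np.zeros_like(t)
for i in range(1, len(t)):
    Ct[i] = Ct[i - 1] + dt * (K1 * Cp[i - 1] - k2 * Ct[i - 1])

# Logan transform: y = int(Ct)/Ct vs x = int(Cp)/Ct; slope -> V_T = K1/k2.
iCt = np.cumsum(Ct) * dt
iCp = np.cumsum(Cp) * dt
late = t > 30               # keep only points after pseudo-equilibration
slope = np.polyfit(iCp[late] / Ct[late], iCt[late] / Ct[late], 1)[0]
print(round(float(slope), 2), K1 / k2)
```

For this model the transformed data are exactly linear once the transient has passed, which is why the plot allows easy comparison across experiments without assuming a particular model structure.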
Analysis of the concept of nursing educational technology applied to the patient
Directory of Open Access Journals (Sweden)
Aline Cruz Esmeraldo Áfio
2014-04-01
This study aims to analyze the concept of educational technology, as produced by nursing and applied to the patient. Rodgers' Evolutionary Method of Concept Analysis was used, identifying antecedents, attributes and consequences. Thirteen articles were selected for analysis, in which the following antecedents were identified: knowledge deficiency, shortage of nursing professionals' time, the need to optimize nursing work, and the need to achieve the patients' goals. Attributes: tool, strategy, innovative approach, pedagogical approach, mediator of knowledge, creative way to encourage the acquisition of skills, and health production instrument. Consequences: improved quality of life, encouragement of healthy behavior, empowerment, reflection and bonding. The study emphasizes the importance of educational technologies for nursing care and for boosting health education activities.
Selection of Forklift Unit for Warehouse Operation by Applying Multi-Criteria Analysis
Directory of Open Access Journals (Sweden)
Predrag Atanasković
2013-07-01
This paper presents research related to the choice of the criteria that can be used to perform an optimal selection of a forklift unit for warehouse operation. The analysis was done with the aim of exploring the requirements and defining the relevant criteria that are important when an investment decision on forklift procurement is made, and, based on the conducted research, of applying multi-criteria analysis to determine the appropriate parameters and their relative weights, which form the input data for the selection of the optimal handling unit. The paper presents an example of choosing the optimal forklift based on the selected criteria for the purpose of making the relevant investment decision.
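A minimal sketch of one common multi-criteria scheme, simple additive weighting, applied to a hypothetical forklift decision matrix (the criteria, values and weights are invented for illustration; the paper does not specify this particular method):

```python
import numpy as np

# Hypothetical decision matrix: 3 forklift options x 4 criteria (invented).
options = ["Forklift A", "Forklift B", "Forklift C"]
X = np.array([[30000, 2.0, 1200, 9.0],     # purchase $, capacity t,
              [42000, 2.5, 1000, 7.5],     # maintenance $/yr, energy kWh/h
              [36000, 2.2, 1500, 8.0]], float)
benefit = np.array([False, True, False, False])  # higher-is-better flags
w = np.array([0.4, 0.3, 0.2, 0.1])               # relative weights (sum to 1)

# Simple additive weighting: min-max normalize each criterion, invert the
# cost criteria, then take the weighted sum as the option score.
lo, hi = X.min(axis=0), X.max(axis=0)
norm = (X - lo) / (hi - lo)
norm[:, ~benefit] = 1 - norm[:, ~benefit]
scores = norm @ w
print(options[int(np.argmax(scores))])     # -> Forklift B
```

The weights are exactly the "relative weights that form the input data" the abstract mentions; in practice they would come from the requirements analysis rather than being set by hand.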
A comparative assessment of texture analysis techniques applied to bone tool use-wear
Watson, Adam S.; Gleason, Matthew A.
2016-06-01
The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
Goltz, S M
2000-01-01
Decision fiascoes such as escalation of commitment, the tendency of decision makers to "throw good money after bad," can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis.
Progress and prospect of multiple-point geostatistics
Institute of Scientific and Technical Information of China (English)
尹艳树; 张昌民; 李玖勇; 石书缘
2011-01-01
After a brief review of the origin of multiple-point geostatistics, three main multiple-point geostatistical methods are introduced and the research progress of the field is summarized. In applications, modeling has developed from fluvial to fan environments, from reservoir architecture to the distribution of reservoir petrophysical properties, from the prediction of macroscopic geologic bodies to the modeling of microscopic pore-throat distributions, and from geological studies to geostatistical inversion. For integrating information in modeling, three methods of incorporating seismic attributes are given. On the algorithmic side, the PRTT real-time processing method has been proposed, improving multiple-point geostatistical modeling, and a new multiple-point statistical growth algorithm (Growthsim) has been developed. Looking ahead, further research is needed on training images, data integration, and the coupling of modeling methods.
Directory of Open Access Journals (Sweden)
Varun Kumar Ojha
2012-08-01
The article presents a performance analysis of a real-valued neuro-genetic algorithm applied to the detection of the proportions of the gases found in a manhole gas mixture. A neural network (NN) trained using a genetic algorithm (GA) leads to the concept of a neuro-genetic algorithm, which is used for implementing an intelligent sensory system for the detection of the component gases present in a manhole gas mixture. Usually a manhole gas mixture contains several toxic gases such as hydrogen sulfide, ammonia, methane, carbon dioxide, nitrogen oxide, and carbon monoxide. A semiconductor-based gas sensor array used for sensing the manhole gas components is an integral part of the proposed intelligent system. It consists of many sensor elements, where each sensor element is responsible for sensing a particular gas component. Using multiple sensors of different gases to detect a mixture of multiple gases results in cross-sensitivity. Cross-sensitivity is a major issue, and the problem is viewed as a pattern recognition problem. The objective of this article is to present a performance analysis of the real-valued neuro-genetic algorithm applied to multiple gas detection.
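The core idea of a real-valued neuro-genetic algorithm, a GA evolving the weights of a small neural network, can be sketched on a synthetic sensor-mixture task (the network size, GA settings and data are all illustrative, not the article's):

```python
import numpy as np

rng = np.random.default_rng(11)
# Toy sensor-array task (synthetic): 3 sensor readings -> 1 gas proportion.
X = rng.random((64, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2])[:, None]

def forward(w, X):
    """Tiny 3-4-1 network; the 21 weights come from one flat GA chromosome."""
    W1, b1 = w[:12].reshape(3, 4), w[12:16]
    W2, b2 = w[16:20].reshape(4, 1), w[20]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)   # negative MSE

# Real-valued GA: elitism, midpoint crossover, Gaussian mutation.
pop = rng.normal(0, 0.5, (60, 21))
for _ in range(150):
    elite = pop[np.argsort([fitness(w) for w in pop])[::-1][:10]]
    children = [0.5 * (a + b) + rng.normal(0, 0.05, 21)
                for a, b in (elite[rng.integers(0, 10, 2)] for _ in range(50))]
    pop = np.vstack([elite, children])
best_mse = -max(fitness(w) for w in pop)
print(round(float(best_mse), 4))
```

No gradients are used: the GA treats the flattened weight vector as a chromosome, which is what lets the same scheme cope with the non-smooth cross-sensitivity patterns described in the abstract.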
Adding value in oil and gas by applying decision analysis methodologies: case history
Energy Technology Data Exchange (ETDEWEB)
Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)
2008-07-01
Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM™) staged approach allowed the company to visualize the associated value and risk of the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action.
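A stochastic valuation of the kind described can be sketched with a simple Monte Carlo discounted-cash-flow model; every distribution and figure below is invented for illustration and unrelated to the actual project:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Hypothetical uncertain inputs (all distributions are illustrative):
production = rng.lognormal(np.log(1.0e6), 0.3, n)    # bbl/yr
price = np.clip(rng.normal(60, 15, n), 10, None)     # $/bbl
opex = np.clip(rng.normal(25, 5, n), 5, None)        # $/bbl
capex, r, years = 40e6, 0.10, 10                     # upfront $, rate, horizon

# Flat-profile discounted cash flow per trial.
annuity = sum(1 / (1 + r) ** t for t in range(1, years + 1))
npv = production * (price - opex) * annuity - capex
p10, p50, p90 = np.percentile(npv, [10, 50, 90])
print(f"P10 {p10/1e6:.0f}M  P50 {p50/1e6:.0f}M  P90 {p90/1e6:.0f}M")
```

The P10/P50/P90 spread is the kind of "value and risk" picture the abstract refers to: the same model run per strategy lets alternatives be compared on distributions rather than single-point NPVs.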
Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.
Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic
2017-02-16
Recent developments in applied mathematics are bringing new tools capable of synthesizing knowledge across disciplines and helping to find hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sports teams to cancer treatments. For the first time, this technique has been applied in soil science to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, offering the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes.
Yoo, Doo Han; Lee, Jae Shin
2016-07-01
[Purpose] This study examined the clinical usefulness of a clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis, along with the mini-mental state examination as the cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] The comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total clock drawing test score of 10.5, selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
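The cutoff-selection step, choosing the score that maximizes the Youden index (sensitivity + specificity − 1), can be sketched on synthetic test scores (the distributions are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic clock-drawing-style scores (illustrative distributions):
impaired = rng.normal(8, 2, 100)                   # cognitively impaired group
intact = rng.normal(13, 2, 100)                    # unimpaired group
scores = np.concatenate([impaired, intact])
y = np.concatenate([np.ones(100), np.zeros(100)])  # 1 = impaired

# Classify "impaired" when score <= cutoff; pick the cutoff maximizing
# the Youden index J = sensitivity + specificity - 1.
def youden(c):
    sens = (scores[y == 1] <= c).mean()
    spec = (scores[y == 0] > c).mean()
    return sens + spec - 1

cutoff = max(np.unique(scores), key=youden)
print(round(youden(cutoff), 2), round(float(cutoff), 1))
```

Sweeping every observed score as a candidate cutoff is exactly the ROC-analysis procedure; the reported 10.5 cutoff with J = 0.8 corresponds to the maximizer of this function on the study's data.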
Directory of Open Access Journals (Sweden)
Giuseppe Bonazzi
2014-01-01
Cooperatives are one of the most important types of companies in the agricultural sector. Cooperatives allow overcoming the limitations of the fragmentation of agricultural property, increasing the production level of small farms and selling the product so that it reaches a sufficient critical mass. Moreover, cooperatives are often characterized by undercapitalization and even difficult access to credit, because banks conduct their analysis applying rating systems that do not take into account the typicality of the cooperative budget. To address this topic, an analysis was conducted on a sample of 100 cooperatives, making adjustments to the annual budget in order to account for the typicality of their annual accounts. The results of the analysis show that the suggested adjustments allow a more correct expression of the economic results and capital adequacy of the cooperatives, and that the results, expressed in terms of scoring, are higher than those achieved by a traditional analysis. This methodology could improve credit access for agricultural cooperatives and thereby reduce financial constraints, particularly in developing countries.
Weighted correlation network analysis (WGCNA) applied to the tomato fruit metabolome.
Directory of Open Access Journals (Sweden)
Matthew V DiLeo
BACKGROUND: Advances in "omics" technologies have revolutionized the collection of biological data. A matching revolution in our understanding of biological systems, however, will only be realized when similar advances are made in the informatic analysis of the resulting "big data." Here, we compare the capabilities of three conventional and novel statistical approaches to summarize and decipher the tomato metabolome. METHODOLOGY: Principal component analysis (PCA), batch learning self-organizing maps (BL-SOM) and weighted gene co-expression network analysis (WGCNA) were applied to a multivariate NMR dataset collected from developmentally staged tomato fruits belonging to several genotypes. While PCA and BL-SOM are appropriate and commonly used methods, WGCNA holds several advantages in the analysis of highly multivariate, complex data. CONCLUSIONS: PCA separated the two major genetic backgrounds (AC and NC), but provided little further information. Both BL-SOM and WGCNA clustered metabolites by expression, but WGCNA additionally defined "modules" of co-expressed metabolites explicitly and provided additional network statistics that described the systems properties of the tomato metabolic network. Our first application of WGCNA to tomato metabolomics data identified three major modules of metabolites that were associated with ripening-related traits and genetic background.
DEFF Research Database (Denmark)
Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus
2012-01-01
We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear......) Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional...
Common reduced spaces of representation applied to multispectral texture analysis in cosmetology
Corvo, Joris; Angulo, Jesus; Breugnot, Josselin; Borbes, Sylvie; Closs, Brigitte
2016-03-01
Principal Component Analysis (PCA) is a technique of multivariate data analysis widely used in fields like biology, ecology and economics to reduce data dimensionality while retaining the most important information. It is becoming standard practice in multispectral/hyperspectral imaging, since such multivariate data generally suffer from a high level of redundancy. Nevertheless, by definition, PCA is meant to be applied to a single multispectral/hyperspectral image at a time. When several images have to be treated, running a PCA on each image would generate image-specific reduced spaces, which is not suitable for comparison between results. Thus, we focus on two PCA-based algorithms that define common reduced spaces of representation. The first method comes from the literature and is computed with the barycenter covariance matrix. In contrast, we designed the second algorithm with the idea of correcting standard PCA using permutations and inversions of eigenvectors. These dimensionality reduction methods are used within the context of a cosmetological study of a foundation make-up. The available data are in-vivo multispectral images of skin acquired on different volunteers in time series. The main purpose of this study is to characterize the make-up degradation, especially in terms of texture analysis. Results are validated by statistical prediction of the time since application of the product. The PCA algorithms produce eigenimages that separately enhance skin components (pores, radiance, vessels...). From these eigenimages, we extract morphological texture descriptors and attempt a time prediction. The accuracy of the common reduced spaces outperforms that of classical PCA. In this paper, we detail how PCA is extended to the multiple-group case and explain the advantages of common reduced spaces when studying several multispectral images.
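A sketch of the barycenter-covariance route to a common reduced space, with synthetic data standing in for the multispectral skin images: average the per-group covariance matrices, diagonalize once, and project every group onto the same eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(5)
# Three "multispectral images" flattened to (pixels x bands); each group
# gets its own covariance structure (all data synthetic).
groups = [rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6)) for _ in range(3)]

# Barycenter covariance: the mean of the per-group covariances.
covs = [np.cov(g, rowvar=False) for g in groups]
bary = np.mean(covs, axis=0)

# Diagonalize once; the eigenvectors define ONE shared set of principal axes.
vals, vecs = np.linalg.eigh(bary)
vecs = vecs[:, np.argsort(vals)[::-1]]          # sort by descending variance
reduced = [g @ vecs[:, :3] for g in groups]     # same 3-D space for all groups
print([r.shape for r in reduced])
```

Because all groups are projected onto the same axes, scores and texture descriptors extracted from the reduced images are directly comparable across volunteers and time points, which per-image PCA does not allow.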
Energy Technology Data Exchange (ETDEWEB)
Naranjo, Alberto R.; Otero, Maria Elena M.; Poveda, Aylin G. [Higher Institute of Technologies and Applied Sciences, Havana City (Cuba)]. E-mails: rolo@instec.cu; mmontesi@instec.cu; Guerra, Alexeis C. [University of Informatic Sciences, Havana City (Cuba)]. E-mail: alexeis@uci.cu
2007-07-01
The application of non-linear dynamic methods in many scientific fields has demonstrated great potential for the early detection of significant dynamic singularities. The introduction of these methods for the surveillance of anomalies and failures of nuclear reactors and their fundamental equipment has been demonstrated in recent years. Specifically, the Recurrence Plot and its Quantification Analysis are methods currently used in many scientific fields. The paper focuses on the estimation of Recurrence Plots and their Quantification Analysis applied to signal samples obtained from different types of reactors: the TRIGA MARK-III research reactor, a BWR/5 and a PHWR. Different behaviors are compared in order to look for a pattern for the characterization of power instability events in the nuclear reactor. These outputs are of great importance for application in surveillance and monitoring systems in Nuclear Power Plants. For its introduction into a real-time monitoring system, the authors propose some useful approaches. The results indicate the potential of the method for implementation in a surveillance and monitoring system in Nuclear Power Plants. All the calculations were performed with two computational tools: the Cross Recurrence Plot Toolbox for Matlab (Version 5.7, Release 22), developed by Marwan, and Visual Recurrence Analysis (Version 4.8).
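A recurrence plot and the simplest recurrence-quantification measure (the recurrence rate) can be computed in a few lines; the signal, embedding parameters and threshold below are illustrative, not the reactor data:

```python
import numpy as np

# Recurrence plot: R[i, j] = 1 when states i and j are closer than a
# threshold eps. Deterministic signals produce diagonal line structures.
t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t)                          # stand-in for a reactor signal

# Time-delay embedding (dimension 2, delay 10 samples).
tau = 10
emb = np.column_stack([x[:-tau], x[tau:]])
d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
eps = 0.1 * d.max()                    # threshold: 10% of max state distance
R = (d < eps).astype(int)

# A basic recurrence-quantification measure: recurrence rate (RR).
rr = R.mean()
print(R.shape, round(float(rr), 3))
```

Quantification analysis builds further statistics on R (determinism, laminarity, line-length distributions); a change in those statistics over time is the kind of dynamic singularity the surveillance application looks for.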
Global Appearance Applied to Visual Map Building and Path Estimation Using Multiscale Analysis
Directory of Open Access Journals (Sweden)
Francisco Amorós
2014-01-01
In this work we present a topological map building and localization system for mobile robots based on the global appearance of visual information. We include a comparison and analysis of global-appearance techniques applied to wide-angle scenes in retrieval tasks. Next, we define multiscale analysis, which permits improving the association between images and extracting topological distances. Then, a topological map-building algorithm is proposed. At first, the algorithm has information only on some isolated positions of the navigation area, in the form of nodes. Each node is composed of a collection of images that covers the complete field of view from a certain position. The algorithm solves the node retrieval and estimates their spatial arrangement. To these ends, it uses the visual information captured along some routes that cover the navigation area. As a result, the algorithm builds a graph that reflects the distribution and adjacency relations between nodes (the map). After the map building, we also propose a route path estimation system. This algorithm takes advantage of the multiscale analysis. The accuracy of the pose estimation is not limited to the node locations but extends to intermediate positions between them. The algorithms have been tested using two different databases captured in real indoor environments under dynamic conditions.
Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.
de Viguerie, Laurence; Sole, V Armando; Walter, Philippe
2009-12-01
X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatment, even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: indeed, the absorption of X-rays through different layers results in a modification of the intensity ratio between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. The method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and of 10% for concentrations. On a rather inhomogeneous painting test sample, the XRF analysis provides an average value. The method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
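The differential-attenuation idea can be sketched with a single Beer-Lambert layer: two characteristic lines of a buried pigment are attenuated differently by the covering layer, so their measured intensity ratio encodes its thickness. All constants below are invented for illustration, not fundamental-parameter values:

```python
import numpy as np

mu_ka, mu_kb = 120.0, 90.0   # cm^2/g, mass attenuation at the two lines
rho = 2.0                    # g/cm^3, density of the covering layer
ratio0 = 0.20                # emitted Kb/Ka ratio with no covering layer

def measured_ratio(thickness_cm):
    """Kb/Ka intensity ratio after both lines cross the covering layer."""
    return ratio0 * np.exp(-(mu_kb - mu_ka) * rho * thickness_cm)

# Forward-simulate a 20 um cover, then invert the observed ratio.
obs = measured_ratio(20e-4)
thick = np.log(obs / ratio0) / (-(mu_kb - mu_ka) * rho)
print(round(float(thick * 1e4), 1))   # -> 20.0 (um)
```

The fundamental parameters method generalizes this one-equation inversion to full spectra and multiple layers, fitting thicknesses and concentrations simultaneously.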
Stelzenmüller, Vanessa; Maynou, Francesc; Ehrich, Siegfried; Zauke, Gerd-Peter
2004-09-01
This study aims to evaluate the suitability of non-linear geostatistics and indicator kriging (IK) as a tool in environmental impact assessment and nature conservation, in particular to search for potential Special Areas of Conservation (SAC) for the endangered fish species twaite shad, Alosa fallax (Lacepède, 1803) within the German Exclusive Economical Zone (EEZ) of the North Sea. To analyse the spatial distribution of this fish species, data on standardised biomass index (catch per unit effort, c.p.u.e., kg per 30-min haul) from 1996 to 2001 were used, regarding the third and fourth quarters of each year, respectively. Thereby we assume that the spatial distribution can be described as a time-invariant process. This assumption is supported by information on annual sampling effort, allocation of hauls and spatial distribution of the positive catches. All indicator variograms obtained for different c.p.u.e. cut-off values displayed distinct spatial structures, clearly indicating that the indicator variables were spatially autocorrelated. Gaussian models were fitted by least-squares methods and were evaluated with a goodness-of-fit statistic. Subsequently, IK was employed to estimate the probability of exceeding the c.p.u.e. cut-off values for the twaite shad in the investigation area. These were highest in the Weser- and Elbe-estuary, probably because of migrations of twaite shad to and from estuaries at the time of investigation due to spawning, while within the German EEZ of the North Sea no such areas with increased probabilities could be discerned. Thus, although available data did not allow to identify and implement any SAC in the German EEZ, the methods employed here can be regarded as a promising management tool in biological conservation issues.
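The indicator coding that underlies IK can be sketched on synthetic zero-inflated catch data; kriging the 0/1 indicators at a cut-off then estimates, at unsampled locations, the probability of exceeding that cut-off:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic zero-inflated c.p.u.e. values (kg per 30-min haul), illustrative:
# ~60% empty hauls, the rest lognormally distributed.
cpue = np.where(rng.random(500) < 0.6, 0.0, rng.lognormal(0.0, 1.0, 500))

# Indicator coding at a cut-off z: i(x) = 1 if cpue(x) > z, else 0.
# Indicator kriging interpolates these 0/1 values; the kriged estimate at an
# unsampled location approximates Prob{cpue > z} there.
for z in (0.0, 0.5, 2.0):
    ind = (cpue > z).astype(int)
    print(f"cut-off {z}: global exceedance frequency = {ind.mean():.2f}")
```

An indicator variogram is then fitted to each 0/1 field separately, which is why distinct spatial structures per cut-off (as reported in the study) matter: they justify kriging each indicator with its own model.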
Lucier, Amie Marie
The role of geomechanical analysis in characterizing the feasibility of CO2 sequestration in deep saline aquifers is addressed in two investigations. The first investigation was completed as part of the Ohio River Valley CO2 Storage Project. We completed a geomechanical analysis of the Rose Run Sandstone, a potential injection zone, and its adjacent formations at the American Electric Power's 1.3 GW Mountaineer Power Plant in New Haven, West Virginia. The results of this analysis were then used to evaluate the feasibility of anthropogenic CO2 sequestration in the potential injection zone. First, we incorporated the results of the geomechanical analysis with a geostatistical aquifer model in CO2 injection flow simulations to test the effects of introducing a hydraulic fracture to increase injectivity. Then, we determined that horizontal injection wells at the Mountaineer site are feasible because the high rock strength ensures that such wells would be stable in the local stress state. Finally, we evaluated the potential for injection-induced seismicity. The second investigation concerning CO2 sequestration was motivated by the modeling and fluid flow simulation results from the first study. The geomechanics-based assessment workflow follows a bottom-up approach for evaluating regional deep saline aquifer CO2 injection and storage feasibility. The CO2 storage capacity of an aquifer is a function of its porous volume as well as its CO2 injectivity. For a saline aquifer to be considered feasible in this assessment it must be able to store a specified amount of CO2 at a reasonable cost per ton of CO 2. The proposed assessment workflow has seven steps. The workflow was applied to a case study of the Rose Run sandstone in the eastern Ohio River Valley. We found that it is feasible in this region to inject and store 113 Mt CO2/yr for 30 years at an associated well cost of less than 1.31 US$/t CO2, but only if injectivity enhancement techniques such as hydraulic fracturing
The Process of Laying Concrete and Analysis of Operations Applying the Lean Approach
Directory of Open Access Journals (Sweden)
Vidmantas Gramauskas
2012-11-01
The paper considers the Lean philosophy of ‘Just in Time’, the value stream map and total quality management principles, applying them to construction. In order to follow these principles, a case study was performed, observing and recording the process of laying concrete in three houses where a lower ground floor was cast using fiber concrete. The collected data were used to fragment the process of laying concrete into smaller operations and examine each operation independently. The examination of operations was expressed in certain units of measurement: time, the number of workers, cubic meters of concrete used, space area, etc. This information helped to distinguish useful operations from useless actions that bring no value to the product. The previously mentioned methods can be applied to useless operations to reduce their duration or even eliminate them. The main focus is the splitting of the concrete-laying process into smaller operations, the analysis of each operation, and the adaptation of Lean principles. The implementation of the Lean system can reduce waste and increase the value of the final product.
Król, Małgorzata; Karoly, Agnes; Kościelniak, Paweł
2014-09-01
Forensic laboratories are increasingly engaged in the examination of fraudulent documents and, importantly, in many cases these are inkjet-printed documents. That is why systematic approaches to inkjet printer ink comparison and identification have been carried out by both non-destructive and destructive methods. In this study, micro-Raman spectroscopy and capillary electrophoresis (CE) were applied to the analysis of colour inkjet printer inks. Micro-Raman spectroscopy was used to study the chemical composition of colour inks in situ on the paper surface. It helps to characterize and differentiate inkjet inks, and can be used to create a spectral database of inks taken from different cartridge brands and cartridge numbers. Capillary electrophoresis in micellar electrokinetic capillary chromatography mode was applied to separate the colour and colourless components of the inks, enabling group identification of those components which occur in sufficient concentration (giving intensive peaks). Finally, on the basis of the obtained results, differentiation of the analysed inks was performed. Twenty-three samples of inkjet printer inks were examined, and the discriminating power (DP) values for both presented methods were established in the routine work of experts during the result interpretation step. DP was found to be 94.0% (Raman) and 95.6% (CE) when all the analysed ink samples were taken into account, and 96.7% (Raman) and 98.4% (CE) when only cartridges with different index numbers were considered.
Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey
2015-10-10
Electronic tongue technology based on arrays of cross-sensitive chemical sensors and chemometric data processing has attracted considerable attention from researchers over recent years. The applications reported so far for pharmaceutical-related tasks employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, and one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrine, saccharin sodium and citric acid in various combinations. To provide a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to assess the similarity of the e-tongue responses to the samples: principal component analysis (PCA), RV′ matrix correlation coefficients and Tucker's congruence coefficients.
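As a hedged sketch of the PCA step named in the abstract above, the snippet below runs a PCA (via SVD) on a hypothetical sample-by-sensor response matrix; the array sizes, latent "taste factors" and noise level are invented for the example and are not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical e-tongue data: 9 samples x 12 sensors, driven by two latent
# factors plus sensor noise (invented for illustration, not the study's data)
latent = rng.normal(size=(9, 2))
loadings = rng.normal(size=(2, 12))
responses = latent @ loadings + 0.1 * rng.normal(size=(9, 12))

# PCA via SVD of the column-centred response matrix
X = responses - responses.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                            # sample coordinates in PC space
explained = s ** 2 / np.sum(s ** 2)       # variance share per component

print(np.round(explained[:2], 2))         # the two latent factors dominate
```

Because the synthetic responses are rank-2 plus small noise, the first two principal components capture nearly all the variance; comparing `scores` across instruments is the kind of configuration-similarity check that RV-type coefficients formalize.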
Directory of Open Access Journals (Sweden)
Biook Behnam
2014-09-01
Full Text Available In recent years, genre studies have attracted the attention of many researchers. The aim of the present study was to observe the differences in the generic structure of abstracts written by English native and non-native (Iranian) students in two disciplines, mathematics and applied linguistics. To this end, twenty native English students' abstract texts from each discipline and the same number of non-native (Iranian) ones were selected. In this study, Hyland's (2000) five-move model was used to identify the rhetorical structure of the four sets of texts. After analyzing each text, the main moves were extracted and the frequencies of each were calculated and compared. The cross-disciplinary and cross-linguistic analyses reveal that linguistics abstracts follow a conventional scheme, but mathematics abstracts in these two languages do not exhibit the usual norms in terms of moves. Besides, a greater difference in move structure across languages is seen in mathematics. The findings of the study have some pedagogical implications for academic writing courses for graduate students, especially students from non-English backgrounds, in order to facilitate their successful acculturation into these disciplinary communities. Keywords: Genre Analysis, mathematics, applied linguistics
[Risk Analysis applied to food safety in Brazil: prospects and challenges].
Figueiredo, Ana Virgínia de Almeida; Miranda, Maria Spínola
2011-04-01
The scope of this case study is to discuss the ideas of the Brazilian Codex Alimentarius Committee (CCAB) coordinated by National Institute of Metrology, Standardization and Industrial Quality (Inmetro), with respect to the Codex Alimentarius norm on Risk Analysis (RA) applied to Food Safety. The objectives of this investigation were to identify and analyze the opinion of CCAB members on RA and to register their proposals for the application of this norm in Brazil, highlighting the local limitations and potential detected. CCAB members were found to be in favor of the Codex Alimentarius initiative of instituting an RA norm to promote the health safety of foods that circulate on the international market. There was a consensus that the Brazilian government should incorporate RA as official policy to improve the country's system of food control and leverage Brazilian food exports. They acknowledge that Brazil has the technical-scientific capacity to apply this norm, though they stressed several political and institutional limitations. The members consider RA to be a valid initiative for tackling risks in food, due to its ability to improve food safety control measures adopted by the government.
Directory of Open Access Journals (Sweden)
Wensheng Dai
2014-01-01
Full Text Available Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating feature extraction method and prediction tool, such as support vector regression (SVR), is a useful method for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., temporal ICA model) was applied to sale forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of IT chain store. Experimental results from a real sales data show that the sales forecasting scheme by integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data and the extracted features can improve the prediction performance of SVR for sales forecasting.
Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie
2014-01-01
Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating feature extraction method and prediction tool, such as support vector regression (SVR), is a useful method for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., temporal ICA model) was applied to sale forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of IT chain store. Experimental results from a real sales data show that the sales forecasting scheme by integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data and the extracted features can improve the prediction performance of SVR for sales forecasting.
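As a rough, self-contained sketch of the ICA feature-extraction idea (not the authors' pipeline), the snippet below implements a one-unit FastICA in plain NumPy and applies it to two synthetic branch-sales series; the SVR prediction stage is omitted, and the series, mixing weights and seeds are invented for illustration.

```python
import numpy as np

def fastica_one_unit(X, n_iter=200, seed=0):
    """Minimal one-unit FastICA (tanh nonlinearity); X is signals x samples."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # whiten via eigendecomposition of the covariance matrix
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    Xw = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ X
    w = rng.normal(size=X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = w @ Xw
        g = np.tanh(wx)
        # FastICA fixed-point update, then renormalize
        w_new = (Xw * g).mean(axis=1) - (1.0 - g ** 2).mean() * w
        w = w_new / np.linalg.norm(w_new)
    return w @ Xw   # one estimated independent component

t = np.arange(200)
s1 = np.sin(2 * np.pi * t / 7)    # hidden weekly seasonality
s2 = 0.01 * t                     # hidden growth trend
# two branches observe different mixtures of the two hidden factors
sales = np.vstack([0.7 * s1 + 0.3 * s2,
                   0.2 * s1 + 0.8 * s2])

ic = fastica_one_unit(sales)      # "temporal ICA": rows are branch series
corr = max(abs(np.corrcoef(ic, s1)[0, 1]),
           abs(np.corrcoef(ic, s2)[0, 1]))
print(round(corr, 2))
```

The extracted component lines up (up to sign and scale) with one of the hidden sources; in the paper's scheme such components, rather than the raw branch series, would feed the SVR forecaster. Transposing the data matrix before extraction is what distinguishes spatial from temporal ICA.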
Thuring, Ann; Brännström, K Jonas; Jansson, Tomas; Maršál, Karel
2014-12-01
Analysis of umbilical artery flow velocity waveforms characterized by pulsatility index (PI) is used to evaluate fetoplacental circulation in high-risk pregnancies. However, an experienced sonographer may be able to further differentiate between various timbres of Doppler audio signals. Recently, we have developed a method for objective audio signal characterization; the method has been tested in an animal model. In the present pilot study, the method was for the first time applied to human pregnancies. Doppler umbilical artery velocimetry was performed in 13 preterm fetuses before and after two doses of 12 mg betamethasone. The auditory measure defined by the frequency band where the spectral energy had dropped 15 dB from its maximum level (MAXpeak-15 dB ), increased two days after betamethasone administration (p = 0.001) parallel with a less pronounced decrease in PI (p = 0.04). The new auditory parameter MAXpeak-15 dB reflected the changes more sensitively than the PI did.
CONTROL AND STABILITY ANALYSIS OF THE GMC ALGORITHM APPLIED TO pH SYSTEMS
Directory of Open Access Journals (Sweden)
Manzi J.T.
1998-01-01
Full Text Available This paper deals with the control of the neutralization processes of strong acid-strong base and weak acid-strong base systems using the Generic Model Control (GMC) algorithm. The control strategy is applied to a pilot plant where hydrochloric acid-sodium hydroxide and acetic acid-sodium hydroxide systems are neutralized. The GMC algorithm includes a nonlinear model of the process in the controller structure. The paper also provides a stability analysis of the controller for some of the uncertainties involved in the system. The results indicate that the controller stabilizes the system for a large range of uncertainties, but the performance may deteriorate when the system is subjected to large disturbances.
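To make the GMC idea concrete, here is a minimal sketch (not the paper's pilot-plant pH model): for a first-order process dy/dt = -a*y + b*u, the GMC law inverts the process model so the closed loop follows the reference trajectory dy/dt = K1*e + K2*∫e dt. All numerical values are illustrative.

```python
# Generic Model Control sketch for a first-order process model
# dy/dt = -a*y + b*u; a, b, K1, K2 and the setpoint are illustrative.
a, b = 0.5, 1.0
K1, K2 = 2.0, 1.0      # GMC tuning constants (critically damped choice)
ysp = 7.0              # setpoint (e.g. a target pH reading)

y, ie, dt = 0.0, 0.0, 0.01
for _ in range(3000):  # simulate 30 s with explicit Euler steps
    e = ysp - y
    ie += e * dt
    # GMC law: invert the process model so the closed loop follows
    # the reference trajectory dy/dt = K1*e + K2*integral(e)
    u = (K1 * e + K2 * ie + a * y) / b
    y += (-a * y + b * u) * dt

print(round(y, 2))  # prints 7.0
```

Substituting the control law back into the process model gives the linear closed loop y'' + K1 y' + K2 y = K2 ysp, which is why GMC tuning reduces to picking a desired second-order response; for a real pH loop the inverted model would be the nonlinear titration curve, which is where the paper's stability analysis comes in.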
A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics
DEFF Research Database (Denmark)
Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist
2015-01-01
The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration exercise, thereby bypassing the challenging task of model structure determination and identification. Parameter identification problems can thus lead to ill-calibrated models with low predictive power and large model uncertainty. Every calibration exercise should therefore be preceded by a proper model ... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring ...
Matsuo, Miyuki; Yokoyama, Misao; Umemura, Kenji; Gril, Joseph; Yano, Ken'ichiro; Kawai, Shuichi
2010-04-01
This paper deals with the kinetics of the color properties of hinoki (Chamaecyparis obtusa Endl.) wood. Specimens cut from the wood were heated at 90-180°C as an accelerated aging treatment. The specimens, completely dried and heated in the presence of oxygen, allowed us to evaluate the effects of thermal oxidation on wood color change. Color properties measured by a spectrophotometer showed similar behavior irrespective of the treatment temperature, each on its own time scale. Kinetic analysis using the time-temperature superposition principle, which uses the whole data set, was successfully applied to the color changes. The calculated values of the apparent activation energy in terms of L*, a*, b*, and ΔE*ab were 117, 95, 114, and 113 kJ/mol, respectively, which are similar to the literature values obtained for other properties, such as the physical and mechanical properties of wood.
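The activation energies quoted above follow from Arrhenius-type kinetics, where time-temperature superposition reduces to a straight-line fit of ln k against 1/T. The sketch below uses synthetic rate constants generated with an assumed Ea of 113 kJ/mol (close to the reported ΔE*ab value) and an invented pre-exponential factor, not the paper's colour data.

```python
import numpy as np

R = 8.314          # gas constant, J/(mol K)
Ea = 113e3         # assumed activation energy, J/mol (illustrative)
A = 5e12           # assumed pre-exponential factor (illustrative)
T = np.array([90.0, 120.0, 150.0, 180.0]) + 273.15   # treatment temps in K
k = A * np.exp(-Ea / (R * T))                        # synthetic rate constants

# Arrhenius fit: ln k = ln A - (Ea/R) * (1/T), so the slope gives Ea
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea_fit = -slope * R
print(int(round(Ea_fit / 1e3)))   # prints 113 (kJ/mol)
```

With real colour-change data, one rate constant per treatment temperature would be estimated from the measured L*, a*, b* trajectories before this fit; the shift factors of the superposition master curve follow the same Arrhenius slope.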
Afeyan, Bedros; Starck, Jean Luc; Cuneo, Michael
2012-01-01
We introduce wavelets, curvelets and multiresolution analysis techniques to assess the symmetry of X-ray driven imploding shells in ICF targets. After denoising X-ray backlighting produced images, we determine the Shell Thickness Averaged Radius (STAR) of maximum density, r*(N, θ), where N is the percentage of the shell thickness over which to average. The non-uniformities of r*(N, θ) are quantified by a Legendre polynomial decomposition in angle, θ. Undecimated wavelet decompositions outperform decimated ones in denoising, and both are surpassed by the curvelet transform. In each case, hard thresholding based on noise modeling is used. We have also applied combined wavelet and curvelet filter techniques with variational minimization as a way to select the significant coefficients. Gains are minimal over curvelets alone in the images we have analyzed.
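As a toy illustration of hard thresholding based on noise modelling, the sketch below uses a single-level Haar transform in plain NumPy (far simpler than the undecimated wavelet and curvelet transforms used above); the 1-D signal, noise level and seed are synthetic.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def inv_haar_step(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

rng = np.random.default_rng(1)
n = 1024
t = np.linspace(0.0, 1.0, n)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + rng.normal(0.0, 0.3, n)

a, d = haar_step(noisy)
sigma = np.median(np.abs(d)) / 0.6745        # noise level from detail coeffs (MAD)
thr = sigma * np.sqrt(2 * np.log(n))         # universal threshold
d_hard = np.where(np.abs(d) > thr, d, 0.0)   # hard thresholding: keep-or-kill
denoised = inv_haar_step(a, d_hard)

err_noisy = np.mean((noisy - clean) ** 2)
err_den = np.mean((denoised - clean) ** 2)
print(round(err_noisy, 3), round(err_den, 3))
```

For a smooth signal, the detail band is noise-dominated, so the MAD estimate of sigma is reliable and killing sub-threshold coefficients roughly halves the mean squared error at a single level; multiscale and curvelet variants repeat this keep-or-kill rule per subband.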
Time Series Analysis Methods Applied to the Super-Kamiokande I Data
Ranucci, G
2005-01-01
The need to unravel modulations hidden in noisy time series of experimental data is a well-known problem, traditionally attacked through a variety of methods, among which a popular tool is the so-called Lomb-Scargle periodogram. Recently, for a class of problems in the solar neutrino field, an alternative maximum-likelihood-based approach has been proposed, intended to overcome some intrinsic limitations affecting the Lomb-Scargle implementation. This work focuses on highlighting the features of the likelihood methodology, introducing in particular an analytical approach to assess the quantitative significance of the potential modulation signals. As an example, the proposed method is applied to the time series of the measured values of the 8B neutrino flux released by the Super-Kamiokande collaboration, and the results are compared with those of previous analyses performed on the same data sets. In the appendix, for completeness, the relationship between the Lomb-Scargle and the likel...
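For reference, the classical Lomb-Scargle periodogram mentioned above can be written compactly; below is a plain-NumPy sketch applied to a synthetic, irregularly sampled series (not the Super-Kamiokande data), recovering an injected 10-day modulation.

```python
import numpy as np

def lomb_scargle(t, x, freqs):
    """Classical Lomb-Scargle periodogram for unevenly sampled data."""
    x = x - x.mean()
    power = np.empty(freqs.size)
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        # time offset tau makes the periodogram invariant to time shifts
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((x @ c) ** 2 / (c @ c) + (x @ s) ** 2 / (s @ s))
    return power

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))    # irregular sampling times (days)
x = np.sin(2 * np.pi * t / 10.0) + rng.normal(0.0, 0.5, 300)

freqs = np.linspace(0.01, 0.5, 500)          # trial frequencies (1/days)
power = lomb_scargle(t, x, freqs)
best_period = 1.0 / freqs[np.argmax(power)]
print(round(best_period, 1))
```

The peak of `power` lands at the injected 10-day period despite the uneven sampling; assessing whether such a peak is significant against noise is exactly the step the likelihood methodology above addresses analytically.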
Lasmar, O; Zanetti, R; dos Santos, A; Fernandes, B V
2012-08-01
One of the fundamental steps in pest sampling is the assessment of the population distribution in the field. Several studies have investigated the distribution and appropriate sampling methods for leaf-cutting ants; however, more reliable methods are still required, such as those that use geostatistics. The objective of this study was to determine the spatial distribution and infestation rate of leaf-cutting ant nests in eucalyptus plantations by using geostatistics. The study was carried out in 2008 in two eucalyptus stands in Paraopeba, Minas Gerais, Brazil. All of the nests in the studied area were located and used for the generation of GIS maps, and the spatial pattern of distribution was determined considering the number and size of nests. Each analysis and map was made using the R statistics program and the geoR package. The nest spatial distribution in a savanna area of Minas Gerais was clustered to a certain extent. The models generated allowed the production of kriging maps of areas infested with leaf-cutting ants, where chemical intervention would be necessary, reducing the control costs, impact on humans, and the environment.
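A core geostatistical step in such a study is the empirical semivariogram, which underlies the kriging maps mentioned above. The paper itself used R with the geoR package; the NumPy sketch below estimates γ(h) by binning point pairs by lag distance, using synthetic clustered "nest" locations and sizes invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic, clustered nest locations: cluster centres plus local scatter,
# standing in for surveyed leaf-cutting ant nests (illustrative only)
centres = rng.uniform(0, 100, (8, 2))
pts = np.vstack([c + rng.normal(0, 3, (12, 2)) for c in centres])
sizes = rng.gamma(2.0, 1.5, pts.shape[0])        # hypothetical nest sizes (m^2)

def empirical_semivariogram(xy, z, bins):
    """gamma(h): mean of 0.5*(z_i - z_j)^2 over pairs whose lag falls in each bin."""
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)            # count each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(bins[:-1], bins[1:])])

bins = np.linspace(0, 60, 7)                     # six 10 m lag classes
gamma = empirical_semivariogram(pts, sizes, bins)
print(np.round(gamma, 2))
```

Fitting a model (spherical, exponential, ...) to these binned γ(h) values is what supplies the weights for the kriging maps that delimit areas needing chemical intervention.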
Messier, Kyle P.; Akita, Yasuyuki; Serre, Marc L.
2012-01-01
Geographic Information Systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques t...
Challenges in the implementation of a quality management system applied to radiometric analysis
Energy Technology Data Exchange (ETDEWEB)
Dias, Danila C.S.; Bonifacio, Rodrigo L.; Nascimento, Marcos R.L.; Silva, Nivaldo C. da; Taddei, Maria Helena T., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN-MG), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas
2015-07-01
The concept of quality in laboratories is well established as an essential factor in the search for reliable results. Since its first version was published (1999), ISO/IEC 17025 has been applied in the industrial and research fields to a wide range of laboratory analyses. However, the implementation of a Quality Management System (QMS) still poses great challenges to institutions and companies. The purpose of this work is to expose the constraints related to the implementation of ISO/IEC 17025 applied to analytical assays of radionuclides, by studying the case of the Pocos de Caldas Laboratory of the Brazilian Commission for Nuclear Energy. In this lab, a project for accreditation of techniques involving the determination of radionuclides in water, soil, sediment and food samples has been conducted since 2011. The challenges presented by this project arise from the administrative side, where the governmental nature of the institution translates into uneven availability of resources, and from the organizational side, where a QMS requires inevitable changes in the organizational culture. It is important to point out that when it comes to accreditation of analyses involving radioactive elements, many aspects must be treated carefully due to their very particular nature. Among these concerns are the determination of analysis uncertainties, access to international proficiency studies, transportation of international radioactive samples and CRMs, the study of parameters in the validation of analytical methods, and the lack of documentation and specialized personnel regarding quality in radiometric measurements. Through an effective management system, the institution is overcoming these challenges, moving toward ISO/IEC 17025 accreditation. (author)
Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.
2015-12-01
This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin, and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after the experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which thin sections were observed. Later, the methodological and experimental processes of the investigation focused on i) adjustment and calibration of OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of the polarizers), using 7 images of the same mineral scene (6 under crossed polarizers and 1 under parallel polarizers); and iii) automated identification and segmentation of pores in 2D mineral images, generating applications as executable macros. Finally, once the procedure protocols had been established, the compiled data were interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) corroborates the descriptions obtained by SEM and optical microscopy, which showed an increase in porosity after the CO2 treatment. Automated image identification and quantification of minerals, pores and textures together with petrographic analysis can be applied to improve pore system characterization in sedimentary rocks. This research offers numerical
Energy Technology Data Exchange (ETDEWEB)
Overbey, W.K. Jr.; Reeves, T.K.; Salamy, S.P.; Locke, C.D.; Johnson, H.R.; Brunk, R.; Hawkins, L. (BDM Engineering Services Co., Morgantown, WV (United States))
1991-05-01
This research program has been designed to develop and verify a unique geostatistical approach for finding natural gas resources. The research has been conducted by Beckley College, Inc. (Beckley) and BDM Engineering Services Company (BDMESC) under contract to the US Department of Energy (DOE), Morgantown Energy Technology Center. Phase 1 of the project consisted of compiling and analyzing relevant geological and gas production information in selected areas of Raleigh County, West Virginia, ultimately narrowed to the Eccles, West Virginia, 7.5-minute Quadrangle. The Phase 1 analysis identified key parameters contributing to the accumulation and production of natural gas in Raleigh County, developed analog models relating geological factors to gas production, and identified specific sites to test and verify the analysis methodologies by drilling. Based on the Phase 1 analysis, five sites have been identified with high potential for economic gas production. Phase 2 will consist of drilling, completing, and producing one or more wells at the sites identified in the Phase 1 analyses. The initial well is scheduled to be drilled in April 1991. This report summarizes the results of the Phase 1 investigations. For clarity, the report has been prepared in two volumes. Volume 1 presents the Phase 1 overview; Volume 2 contains the detailed geological and production information collected and analyzed for this study.
Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.
2016-04-01
Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools, which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently-established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the
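The optimisation loop described above can be sketched in miniature; this toy uses inverse-distance weighting in place of Ordinary Kriging with the Spartan variogram, a 20-well synthetic network instead of the 70 real wells, and a small size penalty so the genetic algorithm actually removes wells. Every number here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic groundwater levels at 20 wells on a sloping water table
wells = rng.uniform(0, 10, (20, 2))
levels = 50.0 - 2.0 * wells[:, 0] + rng.normal(0, 0.3, 20)
grid = np.array([[x, y] for x in np.linspace(0, 10, 12)
                        for y in np.linspace(0, 10, 12)])

def idw_map(keep):
    """Inverse-distance map from the kept wells (stand-in for Ordinary Kriging)."""
    d = np.linalg.norm(grid[:, None] - wells[keep][None], axis=2) + 1e-9
    w = 1.0 / d ** 2
    return (w @ levels[keep]) / w.sum(axis=1)

full = idw_map(np.ones(20, dtype=bool))          # map from the full network

def fitness(mask):
    if mask.sum() < 5:                           # keep a minimum network size
        return -np.inf
    err = np.linalg.norm(idw_map(mask) - full)   # 2-norm vs full-network map
    return -(err + 0.1 * mask.sum())             # penalty rewards smaller networks

# minimal genetic algorithm over keep/drop bitmasks
pop = rng.random((30, 20)) < 0.7
for _ in range(40):
    fit = np.array([fitness(m) for m in pop])
    pop = pop[np.argsort(fit)[::-1]]             # sort by fitness, best first
    children = []
    for _ in range(15):
        p1, p2 = pop[rng.integers(0, 10, 2)]     # parents from the elite
        cut = rng.integers(1, 19)
        child = np.concatenate([p1[:cut], p2[cut:]])   # one-point crossover
        flip = rng.random(20) < 0.05                   # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([pop[:15], children])

best = pop[np.argmax([fitness(m) for m in pop])]
print(int(best.sum()))
```

The real study's fitness is the mapping error alone under a fixed network-size constraint, and each evaluation re-runs kriging with the fitted Spartan variogram; the chromosome encoding and crossover/mutation loop carry over unchanged.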
Directory of Open Access Journals (Sweden)
Alexandra Ziemann
Full Text Available Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed if the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. We suggest to consider these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
2010-08-01
Validation of the Geostatistical Temporal-Spatial Algorithm (GTS) for Optimization of Long-Term Monitoring (LTM) of Groundwater at Military and Government Sites. The primary objective of this ESTCP project was to demonstrate and validate use of the Geostatistical Temporal-Spatial (GTS) groundwater
Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.
1998-01-01
The spatial and temporal distribution of near-surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains unclear. The integration of geographic information systems (GIS) and geostatistics promises a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid in tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that the variance associated with soil moisture was higher for vegetated plots than for non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.
A geostatistical methodology to assess the accuracy of unsaturated flow models
Energy Technology Data Exchange (ETDEWEB)
Smoot, J.L.; Williams, R.E.
1996-04-01
The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
Jha, Sanjeev Kumar
2015-07-21
A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here, the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 km and 10 km resolution for a twenty-year period from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is a daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference dataset indicates that both the seasonal average climate response and the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction, because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool for statistical downscaling to obtain local-scale estimates of precipitation and temperature from General Circulation Models.
Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation
Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.
2012-12-01
The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to the incomplete understanding of various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation would present a challenge when applying to the atmosphere which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single particle analytical methods to relate particle physical and chemical properties with observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single particle characterization is provided by computer controlled scanning electron microscopy with energy dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor controlled cooling-stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation including immersion freezing and deposition ice nucleation as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. by the presence of insoluble particles. For T cloud formation. Initial results applying single particle IN analysis using CCSEM/EDX and STXM/NEXAFS reveal that a significant amount of IN are coated by organics and, thus, are similar to the
Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich
2015-01-01
Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12) and persistent AF (persAF, n=9), RPs were constructed using a phase space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
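The RP/RQA pipeline described above can be sketched for a single signal. This is a toy illustration on a synthetic sine wave, not the epicardial electrogram data; the embedding dimension, delay and recurrence threshold are arbitrary choices, not the study's settings:

```python
import numpy as np

def recurrence_plot(x, dim=3, tau=1, eps=0.2):
    """Binary recurrence matrix from a 1-D signal via time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < eps * dist.max()).astype(int)

def determinism(rp, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = rp.shape[0]
    lengths = []
    for k in range(-(n - 1), n):
        run = 0
        for v in np.diagonal(rp, k):
            if v:
                run += 1
            else:
                if run:
                    lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    total = sum(lengths)
    return sum(l for l in lengths if l >= lmin) / total if total else 0.0

t = np.linspace(0, 8 * np.pi, 400)
rp = recurrence_plot(np.sin(t))
print(rp.shape, round(determinism(rp), 2))  # periodic signal -> DET close to 1
```

For a strongly periodic signal the recurrent points organize into long diagonals, so DET is high; irregular fibrillation-like signals would fragment the diagonals and lower DET.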
Bossew, P; Žunić, Z S; Stojanovska, Z; Tollefsen, T; Carpentieri, C; Veselinović, N; Komatina, S; Vaupotič, J; Simović, R D; Antignani, S; Bochicchio, F
2014-01-01
Between 2008 and 2011 a survey of radon ((222)Rn) was performed in schools of several districts of Southern Serbia. Some results have been published previously (Žunić et al., 2010; Carpentieri et al., 2011; Žunić et al., 2013). This article concentrates on the geographical distribution of the measured Rn concentrations. Applying geostatistical methods, we generate "school radon maps" of expected concentrations and of estimated probabilities that a concentration threshold is exceeded. The resulting maps show a clearly structured spatial pattern which appears related to the geological background. In particular, in areas with vulcanite and granitoid rocks, elevated radon (Rn) concentrations can be expected. The "school radon map" can therefore be considered a proxy for a map of the geogenic radon potential, and allows identification of radon-prone areas, i.e. areas in which higher Rn concentrations can be expected for natural reasons. It must be stressed that the "radon hazard", or potential risk, estimated this way has to be distinguished from the actual radon risk, which is a function of exposure. This in turn may require (depending on the target variable which is supposed to measure risk) considering demographic and sociological reality, i.e. population density, distribution of building styles and living habits.
Jahjah, Munzer; Ulivieri, Carlo
2004-02-01
Understanding the dynamics of land cover change has increasingly been recognized as one of the key research imperatives in global environmental change research. Scientists have developed and applied various methods in order to find and propose solutions for many environmental problems worldwide. From 1986 to 1995, changes in the Kenyan coastal zone land cover, derived from post-classification analysis of TM images, were significant: arid areas grew from 3% to 10%, woody areas decreased from 4% to 2%, herbaceous areas decreased from 25% to 20%, and developed land increased from 2% to 3%. In order to generate the change probability map as a continuous surface using geostatistical methods in ArcGIS, we used the Generalized Linear Model (GLM) probability result as input. The results reveal the efficiency of the Probability-of-Change (POC) map, especially when reference data are lacking, in indicating the possibility of a change, and its type, in a given area, taking advantage of the layer transparency of GIS systems. Thus, the derived information supplies a good tool for interpreting the magnitude of land cover changes and guides the final user directly to the areas of change, to understand and derive the possible interactions of human or natural processes.
DEFF Research Database (Denmark)
He, Xin; Sonnenborg, Torben; Jørgensen, F.
2014-01-01
Multiple-point geostatistical simulation (MPS) has recently become popular in stochastic hydrogeology, primarily because of its capability to derive multivariate distributions from a training image (TI). However, its application in three-dimensional (3-D) simulations has been constrained by the difficulty of constructing a 3-D TI. The object-based unconditional simulation program TiGenerator may be a useful tool in this regard; yet the applicability of such parametric training images has not been documented in detail. Another issue in MPS is the integration of multiple geophysical data. The proper way to retrieve and incorporate information from high-resolution geophysical data is still under discussion. In this study, MPS simulation was applied to different scenarios regarding the TI and soft conditioning. Comparing their output from simulations of groundwater flow and probabilistic capture zones shows that soft conditioning is a convenient and efficient way of integrating secondary data such as 3-D airborne electromagnetic data (SkyTEM), but over-conditioning has to be avoided.
Latour, Robert A
2015-03-01
The Langmuir adsorption isotherm provides one of the simplest and most direct methods to quantify an adsorption process. Because isotherm data from protein adsorption studies often appear to be fit well by the Langmuir isotherm model, estimates of protein binding affinity have often been made from its use despite the fact that none of the conditions required for a Langmuir adsorption process may be satisfied for this type of application. The physical events that cause protein adsorption isotherms to often provide a Langmuir-shaped isotherm can be explained as being due to changes in adsorption-induced spreading, reorientation, clustering, and aggregation of the protein on a surface as a function of solution concentration, in contrast to being due to a dynamic equilibrium adsorption process, which is required for Langmuir adsorption. Unless the requirements of the Langmuir adsorption process can be confirmed, fitting of the Langmuir model to protein adsorption isotherm data to obtain thermodynamic properties, such as the equilibrium constant for adsorption and adsorption free energy, may provide erroneous values that have little to do with the actual protein adsorption process, and should be avoided. In this article, a detailed analysis of the Langmuir isotherm model is presented along with a quantitative analysis of the level of error that can arise in derived parameters when the Langmuir isotherm is inappropriately applied to characterize a protein adsorption process.
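A double-reciprocal linearization is one common way such fits are performed. The sketch below uses synthetic, noise-free data with hypothetical units; per the article's caution, the fitted K is only a true binding affinity if the Langmuir assumptions actually hold for the system:

```python
import numpy as np

def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed amount vs. solution concentration."""
    return q_max * k * c / (1.0 + k * c)

# synthetic "measurements" (hypothetical units, e.g. mg/m^2 vs mg/mL)
c = np.array([0.02, 0.05, 0.1, 0.5, 1.0, 2.0, 5.0])
q = langmuir(c, 2.0, 3.0)

# double-reciprocal linearization: 1/q = (1/(Qmax*K)) * (1/c) + 1/Qmax
slope, intercept = np.polyfit(1.0 / c, 1.0 / q, 1)
q_max_fit, k_fit = 1.0 / intercept, intercept / slope
print(round(q_max_fit, 3), round(k_fit, 3))  # → 2.0 3.0
```

The fit recovers the generating parameters exactly here only because the data were generated from the model; with real protein data a good numerical fit says nothing about whether the underlying process is Langmuirian.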
Selection of clinical features for pattern recognition applied to gait analysis.
Altilio, Rosa; Paoloni, Marco; Panella, Massimo
2017-04-01
This paper deals with the opportunity of extracting useful information from medical data retrieved directly from a stereophotogrammetric system applied to gait analysis. A feature selection method that exhaustively evaluates all possible combinations of the gait parameters is presented, in order to find the best subset able to classify between diseased and healthy subjects. This procedure is used to estimate the performance of widely used classification algorithms, whose behaviour has been ascertained in many real-world problems with respect to well-known classification benchmarks, both in terms of the number of selected features and classification accuracy. Specifically, support vector machine, naive Bayes and K-nearest-neighbour classifiers obtain the lowest classification error, with an accuracy greater than 97%. For the considered classification problem, the whole set of features proves to be redundant and can be significantly pruned. Namely, groups of only 3 or 5 features are able to preserve high accuracy when the aim is to check the anomaly of a gait. The step length and the swing speed are the most informative features for gait analysis, but cadence and stride may also add useful information for movement evaluation.
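An exhaustive subset search of the kind described can be sketched with a leave-one-out 1-nearest-neighbour classifier on synthetic data. The feature meanings and class separations here are invented for illustration, not the study's gait parameters or its SVM/naive Bayes classifiers:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
# synthetic "gait" data: 40 subjects x 5 features; only features 0 and 2 are informative
y = np.repeat([0, 1], 20)
X = rng.normal(0.0, 1.0, (40, 5))
X[:, 0] += 2.0 * y   # stand-in for, e.g., step length
X[:, 2] += 2.0 * y   # stand-in for, e.g., swing speed

def loo_1nn_accuracy(Xs):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    d = np.linalg.norm(Xs[:, None] - Xs[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point may not be its own neighbour
    return np.mean(y[d.argmin(axis=1)] == y)

# exhaustive search over all subsets of up to 3 features
best = max((loo_1nn_accuracy(X[:, list(s)]), s)
           for r in (1, 2, 3) for s in combinations(range(5), r))
print(best)  # (best accuracy, best feature subset)
```

Exhaustive evaluation is only feasible because the feature count is tiny; with many features a greedy or heuristic selection would replace the `combinations` loop.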
Bayesian flux balance analysis applied to a skeletal muscle metabolic model.
Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki
2007-09-01
In this article, the steady state condition for the multi-compartment models for cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to the flux balance analysis (FBA). We show how the bound constraints and optimality conditions such as maximizing the oxidative phosphorylation flux can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models.
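The idea of encoding bound constraints and optimality preferences in a prior and exploring the posterior with MCMC can be illustrated on a deliberately tiny toy network (one metabolite, three fluxes), not the article's two-compartment skeletal muscle model; the bounds and the preference weight below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy steady state: v_in - v_ox - v_gly = 0, so v_in is fixed by the two free fluxes
def log_prior(v_ox, v_gly):
    v_in = v_ox + v_gly
    if not (0.0 <= v_ox <= 10.0 and 0.0 <= v_gly <= 10.0 and v_in <= 10.0):
        return -np.inf               # bound constraints enter as a hard prior
    return 0.5 * v_ox                # soft preference for the "oxidative" flux

v = np.array([2.0, 2.0])             # start inside the feasible polytope
samples = []
for _ in range(20000):               # random-walk Metropolis over (v_ox, v_gly)
    prop = v + rng.normal(0.0, 0.5, 2)
    if np.log(rng.uniform()) < log_prior(*prop) - log_prior(*v):
        v = prop
    samples.append(v.copy())
samples = np.array(samples[5000:])   # discard burn-in
print(np.round(samples.mean(axis=0), 1))  # posterior mean: v_ox pulled toward its bound
```

Unlike a pure linear-programming FBA solution, the chain returns a distribution over feasible flux vectors, so the spread of the samples quantifies how tightly the constraints and the optimality preference pin down each flux.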
Applying a resources framework to analysis of the Force and Motion Conceptual Evaluation
Smith, Trevor I.; Wittmann, Michael C.
2008-12-01
We suggest one redefinition of common clusters of questions used to analyze student responses on the Force and Motion Conceptual Evaluation. Our goal is to propose a methodology that moves beyond an analysis of student learning defined by correct responses, either on the overall test or on clusters of questions defined solely by content. We use the resources framework theory of learning to define clusters within this experimental test, which was designed without the resources framework in mind. We take special note of the contextual and representational dependence of questions with seemingly similar physics content. We analyze clusters in ways that allow the most common incorrect answers to give as much, or more, information as the correctness of responses in that cluster. We show that false positives can be found, especially on questions dealing with Newton's third law. We apply our clustering to a small set of data to illustrate the value of comparing students' incorrect responses, which a simple correct-or-incorrect analysis would treat as identical. Our work provides a connection between theory and experiment in the area of survey design and the resources framework.
Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.
Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B
2008-07-01
Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.
Institute of Scientific and Technical Information of China (English)
WANG Zheng; ZHU Dianxiang
2007-01-01
This paper took the upper-lower wide belt sander B229 with its four-foot-wide belts, manufactured in China, as the study target. By means of framework dynamic design, we studied its vibration characteristics, starting from the locations with horizontal defects, and used experimental modal analysis (EMA) and power spectral density (PSD) analysis to observe the sanding parts and the whole machine, respectively. In the modal test, we mainly adopted the cross-point testing method to obtain the frequency response function from the fixed point to every excitation point, then applied the SISO frequency response function and frequency-response-function fitting methods to identify and complete parameter recognition. The typical frequency response function charts of the whole machine and its sanding parts, as well as the second-order mode charts of the contact roller, were obtained. Through PSD analysis, we obtained the amplitude-frequency spectrum and the drive frequency.
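A standard way to estimate a frequency response function from excitation/response records is the H1 estimator (averaged cross-spectrum divided by the input auto-spectrum). The sketch below simulates a single-mode structure with an assumed 100 Hz resonance rather than using any sander measurements; the sampling rate, damping and segment sizes are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, seg, n_seg = 1024, 256, 64
x = rng.normal(0.0, 1.0, n_seg * seg)        # broadband random excitation

# assumed structure: a single-degree-of-freedom resonance near 100 Hz
freqs = np.fft.rfftfreq(seg, 1.0 / fs)
f0, zeta = 100.0, 0.05
H_true = 1.0 / (1.0 - (freqs / f0) ** 2 + 2j * zeta * freqs / f0)

# Welch-style averaging of cross- and auto-spectra over windowed segments
Sxy = np.zeros(freqs.size, dtype=complex)
Sxx = np.zeros(freqs.size)
win = np.hanning(seg)
for s in x.reshape(n_seg, seg):
    X = np.fft.rfft(s * win)
    Y = X * H_true                           # simulated response (frequency-domain shortcut)
    Sxy += np.conj(X) * Y
    Sxx += np.abs(X) ** 2

H1 = Sxy / Sxx                               # H1 frequency-response-function estimator
peak = freqs[np.argmax(np.abs(H1))]
print(peak)  # → 100.0
```

In a real modal test the response would come from accelerometers and the averaging would suppress measurement noise; here the resonance simply reappears at the frequency bin nearest the assumed mode.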
Analysis of Phoenix Anomalies and IV & V Findings Applied to the GRAIL Mission
Larson, Steve
2012-01-01
NASA IV&V was established in 1993 to improve the safety and cost-effectiveness of mission-critical software. Since its inception, the tools and strategies employed by IV&V have evolved. This paper examines how lessons learned from the Phoenix project were developed and applied to the GRAIL project. Shortly after selection, the GRAIL project initiated a review of the issues documented by IV&V for Phoenix. The motivation was twofold: to learn as much as possible about the types of issues that arose from the flight software product line slated for use on GRAIL, and to identify opportunities for improving the effectiveness of IV&V on GRAIL. The IV&V Facility provided a database dump containing 893 issues. These were categorized into 16 bins, and then analyzed according to whether the project responded by changing the affected artifacts or using them as-is. The results of this analysis were compared to a similar assessment of post-launch anomalies documented by the project. Results of the analysis were discussed with the IV&V team assigned to GRAIL. These discussions led to changes in the way both the project and IV&V approached the IV&V task, and improved the efficiency of the activity.
Non-Linear Non Stationary Analysis of Two-Dimensional Time-Series Applied to GRACE Data Project
National Aeronautics and Space Administration — The proposed innovative two-dimensional (2D) empirical mode decomposition (EMD) analysis was applied to NASA's Gravity Recovery and Climate Experiment (GRACE)...
Directory of Open Access Journals (Sweden)
Derric B. Jacobs
2017-03-01
Resilient communities promote trust, have well-developed networks, and can adapt to change. For rural communities in fire-prone landscapes, current resilience strategies may prove insufficient in light of increasing wildfire risks due to climate change. It is argued that, given the complexity of climate change, adaptations are best addressed at local levels where specific social, cultural, political, and economic conditions are matched with local risks and opportunities. Despite the importance of social networks as key attributes of community resilience, research using social network analysis on coupled human and natural systems is scarce. Furthermore, the extent to which local communities in fire-prone areas understand climate change risks, accept the likelihood of potential changes, and have the capacity to develop collaborative mitigation strategies is underexamined, yet these factors are imperative to community resiliency. We apply a social network framework to examine information networks that affect perceptions of wildfire and climate change in Central Oregon. Data were collected using a mailed questionnaire. Analysis focused on the residents' information networks that are used to gain awareness of governmental activities and measures of community social capital. A two-mode network analysis was used to uncover information exchanges. Results suggest that the general public develops perceptions about climate change based on complex social and cultural systems rather than as patrons of scientific inquiry and understanding. It appears that perceptions about climate change itself may not be the limiting factor in these communities' adaptive capacity, but rather how they perceive local risks. We provide a novel methodological approach in understanding rural community adaptation and resilience in fire-prone landscapes and offer a framework for future studies.
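A two-mode (resident × information source) network and its one-mode projection can be sketched in a few lines. The residents and sources below are hypothetical placeholders, not the survey data:

```python
from itertools import combinations
from collections import Counter

# two-mode (bipartite) data: residents linked to the information sources they use
ties = {
    "res1": {"extension_office", "neighbors"},
    "res2": {"extension_office", "news"},
    "res3": {"neighbors", "news"},
    "res4": {"extension_office", "neighbors", "news"},
}

# one-mode projection: residents are linked when they share an information source,
# weighted by the number of sources they have in common
cooccur = Counter()
for a, b in combinations(sorted(ties), 2):
    shared = len(ties[a] & ties[b])
    if shared:
        cooccur[(a, b)] = shared

print(cooccur.most_common(3))  # strongest resident-resident ties
```

The projection turns shared information channels into tie weights, which is the basic step behind detecting clusters of residents who draw on the same sources.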
Duarte, F; Calvo, M V; Borges, A; Scatoni, I B
2015-08-01
The oriental fruit moth, Grapholita molesta (Busck), is the most serious pest of peach, and several insecticide applications are required to reduce crop damage to acceptable levels. Geostatistics and Geographic Information Systems (GIS) were employed to measure the range of spatial correlation of G. molesta in order to define the optimum sampling distance for performing spatial analysis and to determine the current distribution of the pest in peach orchards of southern Uruguay. From 2007 to 2010, 135 pheromone traps per season were installed and georeferenced in peach orchards distributed over 50,000 ha. Male adult captures were recorded weekly from September to April. Structural analysis of the captures was performed, yielding 14 semivariograms for the accumulated captures analyzed by generation and growing season. Two sets of maps were constructed to describe the pest distribution. Nine significant models were obtained in the 14 evaluated periods. The range estimated for the correlation was from 908 to 6884 m. Three hot spots of high population level and some areas with comparatively low populations were constant over the 3-year period, while there was greater variation in population size among generations and years in other areas.
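The structural analysis above rests on the empirical semivariogram. A minimal sketch, assuming synthetic trap coordinates and captures rather than the Uruguayan data, with arbitrary lag bins:

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical trap coordinates (m) and accumulated captures with spatial structure
xy = rng.uniform(0.0, 7000.0, (135, 2))
z = np.sin(xy[:, 0] / 2000.0) + 0.3 * rng.normal(size=135)

def empirical_semivariogram(xy, z, lags):
    """gamma(h): mean of 0.5*(z_i - z_j)^2 over pairs whose separation falls in each lag bin."""
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), 1)          # each pair counted once
    d, sq = d[iu], sq[iu]
    return [sq[(d >= lo) & (d < hi)].mean() for lo, hi in zip(lags[:-1], lags[1:])]

lags = np.arange(0, 4000, 500)
gamma = empirical_semivariogram(xy, z, lags)
print([round(g, 3) for g in gamma])  # semivariance typically rises with lag up to the range
```

Fitting a model (spherical, exponential, ...) to these bin estimates is what yields the range parameter reported in the abstract; the lag at which gamma levels off is the distance beyond which captures are spatially uncorrelated.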
Karagiannis-Voules, Dimitrios-Alexios; Odermatt, Peter; Biedermann, Patricia; Khieu, Virak; Schär, Fabian; Muth, Sinuon; Utzinger, Jürg; Vounatsou, Penelope
2015-01-01
Soil-transmitted helminth infections are intimately connected with poverty. Yet, there is a paucity of using socioeconomic proxies in spatially explicit risk profiling. We compiled household-level socioeconomic data pertaining to sanitation, drinking-water, education and nutrition from readily available Demographic and Health Surveys, Multiple Indicator Cluster Surveys and World Health Surveys for Cambodia and aggregated the data at village level. We conducted a systematic review to identify parasitological surveys and made every effort possible to extract, georeference and upload the data in the open source Global Neglected Tropical Diseases database. Bayesian geostatistical models were employed to spatially align the village-aggregated socioeconomic predictors with the soil-transmitted helminth infection data. The risk of soil-transmitted helminth infection was predicted at a grid of 1×1km covering Cambodia. Additionally, two separate individual-level spatial analyses were carried out, for Takeo and Preah Vihear provinces, to assess and quantify the association between soil-transmitted helminth infection and socioeconomic indicators at an individual level. Overall, we obtained socioeconomic proxies from 1624 locations across the country. Surveys focussing on soil-transmitted helminth infections were extracted from 16 sources reporting data from 238 unique locations. We found that the risk of soil-transmitted helminth infection from 2000 onwards was considerably lower than in surveys conducted earlier. Population-adjusted prevalences for school-aged children from 2000 onwards were 28.7% for hookworm, 1.5% for Ascaris lumbricoides and 0.9% for Trichuris trichiura. Surprisingly, at the country-wide analyses, we did not find any significant association between soil-transmitted helminth infection and village-aggregated socioeconomic proxies. Based also on the individual-level analyses we conclude that socioeconomic proxies might not be good predictors at an
Directory of Open Access Journals (Sweden)
Daniela Gonçalves Rando
2010-06-01
Leishmaniasis is an important health and social problem for which there is limited effective therapy. Chalcones and N-acylhydrazones have been studied as promising antileishmanial agents, both in in vitro assays and in assays of the inhibition of cysteine proteases important to the parasite. Since these chemical classes of compounds also resemble each other structurally, it would be useful to investigate whether they share direct analogy. Exploratory data analysis was applied to a library of chalcones and nitrated N-acylhydrazones assayed against Leishmania donovani to investigate their similarity. Under the conditions applied in the present study, the two classes did not present functional and structural analogy simultaneously, although they do show some two-dimensional structural similarity.
Abe, Kazuhiro; Takahashi, Toshimitsu; Takikawa, Yoriko; Arai, Hajime; Kitazawa, Shigeru
2011-10-01
Independent component analysis (ICA) can be usefully applied to functional imaging studies to evaluate the spatial extent and temporal profile of task-related brain activity. It requires no a priori assumptions about the anatomical areas that are activated or the temporal profile of the activity. We applied spatial ICA to detect a voluntary but hidden response of silent speech. To validate the method against a standard model-based approach, we used the silent speech of a tongue twister as a 'Yes' response to single questions that were delivered at given times. In the first task, we attempted to estimate one number that was chosen by a participant from 10 possibilities. In the second task, we increased the possibilities to 1000. In both tasks, spatial ICA was as effective as the model-based method for determining the number in the subject's mind (80-90% correct per digit), but spatial ICA outperformed the model-based method in terms of time, especially in the 1000-possibility task. In the model-based method, calculation time increased by 30-fold, to 15 h, because of the necessity of testing 1000 possibilities. In contrast, the calculation time for spatial ICA remained as short as 30 min. In addition, spatial ICA detected an unexpected response that occurred by mistake. This advantage was validated in a third task, with 13 500 possibilities, in which participants had the freedom to choose when to make one of four responses. We conclude that spatial ICA is effective for detecting the onset of silent speech, especially when it occurs unexpectedly.
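Although the study applies spatial ICA to fMRI volumes, the core unmixing step can be illustrated on two synthetic mixed time courses. This is a compact kurtosis-based FastICA sketch; the mixing matrix, source waveforms and iteration count are invented for the example, not the study's data or software:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 2000)
S = np.vstack([np.sign(np.sin(2 * np.pi * 7 * t)),   # hidden "task" square wave
               np.sin(2 * np.pi * 13 * t)])          # ongoing background rhythm
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S           # two observed mixtures

# whitening: rotate/scale the data to unit covariance
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = np.diag(d ** -0.5) @ E.T @ Xc

# symmetric FastICA with the kurtosis (cubic) contrast
W = rng.normal(size=(2, 2))
for _ in range(100):
    W_new = ((W @ Z) ** 3) @ Z.T / Z.shape[1] - 3 * W  # E[u^3 z^T] - E[3u^2] W
    s, V = np.linalg.eigh(W_new @ W_new.T)             # symmetric decorrelation
    W = V @ np.diag(s ** -0.5) @ V.T @ W_new
rec = W @ Z                                            # recovered components

corr = np.corrcoef(np.vstack([rec, S]))[:2, 2:]
print(np.round(np.abs(corr), 2))  # each row has one entry near 1 (up to order and sign)
```

The recovered components match the sources only up to permutation and sign, which is why ICA analyses of imaging data must still identify which component corresponds to the task.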
Bayesian Information-Gap Decision Analysis Applied to a CO2 Leakage Problem
O'Malley, D.; Vesselinov, V. V.
2014-12-01
We describe a decision analysis in the presence of uncertainty that combines a non-probabilistic approach (information-gap decision theory) with a probabilistic approach (Bayes' theorem). Bayes' theorem is one of the most popular techniques for probabilistic uncertainty quantification (UQ). It is effective in many situations, because it updates our understanding of the uncertainties by conditioning on real data using a mathematically rigorous technique. However, the application of Bayes' theorem in science and engineering is not always rigorous. There are two reasons for this: (1) We can enumerate the possible outcomes of dice-rolling, but not the possible outcomes of real-world contamination remediation; (2) We can precisely determine conditional probabilities for coin-tossing, but substantial uncertainty surrounds the conditional probabilities for real-world contamination remediation. Of course, Bayes' theorem is rigorously applicable beyond dice-rolling and coin-tossing, but even in cases that are constructed to be simple with ostensibly good probabilistic models, applying Bayes' theorem to the real world may not work as well as one might expect. Bayes' theorem is rigorously applicable only if all possible events can be described, and their conditional probabilities can be derived rigorously. Outside of this domain, it may still be useful, but its use lacks at least some rigor. The information-gap approach allows us to circumvent some of the highlighted shortcomings of Bayes' theorem. In particular, it provides a way to account for possibilities beyond those described by our models, and a way to deal with uncertainty in the conditional distribution that forms the core of Bayesian analysis. We have developed a three-tiered technique that enables one to make scientifically defensible decisions in the face of the severe uncertainty found in many geologic problems. To demonstrate its applicability, we apply the technique to a CO2 leakage problem. The goal is to
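The dice-and-coins regime where Bayes' theorem is fully rigorous — all outcomes enumerable, all conditional probabilities known — can be made concrete with a toy leak-alarm calculation; the prior and sensor rates below are hypothetical numbers, not values from the study:

```python
# Bayes' theorem in the "enumerable" regime: a monitoring sensor for CO2 leakage.
# Every probability below is assumed exactly known -- precisely the situation the
# abstract contrasts with real-world remediation, where these numbers are uncertain.
p_leak = 0.01                  # prior P(leak)                      (hypothetical)
p_alarm_leak = 0.95            # P(alarm | leak): sensitivity       (hypothetical)
p_alarm_clear = 0.02           # P(alarm | no leak): false alarms   (hypothetical)

# total probability of an alarm, then Bayes' rule
p_alarm = p_alarm_leak * p_leak + p_alarm_clear * (1 - p_leak)
posterior = p_alarm_leak * p_leak / p_alarm
print(round(posterior, 3))  # → 0.324
```

Even with a sensitive sensor, the low prior keeps the posterior modest; the information-gap layer in the article addresses what happens when the three input probabilities themselves cannot be trusted.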
Institute of Scientific and Technical Information of China (English)
J.MOHAMMADI; M.H.MOTAGHIAN
2011-01-01
The association of organic carbon with secondary particles (aggregates) results in its storage and retention in soil. A study was carried out in a catchment covering about 92 km2 to predict the spatial variability of soil water-stable aggregates (WSA), mean weight diameter (MWD) of aggregates and organic carbon (OC) content in macro- (> 2 mm), meso- (1-2 mm), and micro-aggregate (< 1 mm) fractions, using geostatistical methods. One hundred and eleven soil samples were collected at the 0-10 cm depth and fractionated into macro-, meso-, and micro-aggregates by wet sieving. The OC content was determined for each fraction. A greater percentage of water-stable aggregates was found for micro-aggregates, followed by meso-aggregates. Aggregate OC content was greatest in meso-aggregates (9 g kg-1), followed by micro-aggregates (7 g kg-1), while the least OC content was found in macro-aggregates (3 g kg-1). Although a significant effect (P = 0.000) of aggregate size on aggregate OC content was found, our findings did not support the model of aggregate hierarchy. Land use had a significant effect (P = 0.073) on aggregate OC content. The coefficients of variation (CVs) for OC contents associated with each aggregate fraction indicated macro-aggregates as the most variable (CV = 71%). Among the aggregate fractions, the micro-aggregate fraction had the lowest CV value, 27%. The mean content of WSA ranged from 15% for macro-aggregates to 84% for micro-aggregates. Geostatistical analysis showed that the measured soil variables exhibited differences in their spatial patterns in both magnitude and space at each aggregate size fraction. The relative nugget variance for most aggregate-associated properties was lower than 45%. The range value for the variogram of water-stable aggregates was almost the same (about 3 km) for the three studied aggregate size classes. The range value for the variogram of aggregate-associated OC contents ranged from about 3 km for macro
Emadi, Mostafa; Baghernejad, Majid; Pakparvar, Mojtaba; Kowsar, Sayyed Ahang
2010-05-01
This study was undertaken to incorporate geostatistics, remote sensing, and geographic information system (GIS) technologies to improve the qualitative land suitability assessment in arid and semiarid ecosystems of the Arsanjan plain, southern Iran. The primary data were obtained from 85 soil samples collected from three depths (0-30, 30-60, and 60-90 cm); the secondary information was acquired from the remotely sensed data from the linear imaging self-scanner (LISS-III) receiver of the IRS-P6 satellite. Ordinary kriging and simple kriging with varying local means (SKVLM) methods were used to identify the spatial dependency of important soil parameters. It was observed that using the spectral values of band 1 of the LISS-III receiver as the secondary variable with the SKVLM method resulted in the lowest mean square error for mapping the pH and electrical conductivity (ECe) in the 0-30-cm depth. On the other hand, the ordinary kriging method gave reliable accuracy for the other soil properties with moderate to strong spatial dependency in the study area for interpolation at the unsampled points. The parametric land suitability evaluation method was applied to the dense grid of points (150 x 150 m2) obtained by the kriging or SKVLM methods, instead of only to the limited representative profiles used conventionally. The information layers were then overlaid in the GIS to prepare the final land suitability evaluation. Therefore, changes in land characteristics could be identified within the same uniform soil mapping units over a very short distance. In general, this new method can easily present the areas and limiting factors of the different land suitability classes with considerable accuracy in arbitrary land indices.
Namysłowska-Wilczyńska, Barbara
2016-04-01
These data were subjected to spatial analyses using statistical and geostatistical methods. The evaluation of basic statistics of the investigated quality parameters, including histograms of their distributions, scatter diagrams between these parameters and correlation coefficients r, is presented in this article. The directional semivariogram function and the ordinary (block) kriging procedure were used to build the 3D geostatistical model. The geostatistical parameters of the theoretical models of directional semivariograms of the studied water quality parameters, calculated along the time interval and along the well depth (taking into account the terrain elevation), were used in the ordinary (block) kriging estimation. The obtained estimation results, i.e. block diagrams, allowed us to determine the levels of increased values Z* of the studied underground water quality parameters. The analysis of the variability in the selected quality parameters of underground water for the Kłodzko water intake area was enriched by referring to the results of geostatistical studies carried out for underground water quality parameters and for treated water in the Kłodzko water supply system (iron Fe, manganese Mn, ammonium ion NH4+ contents), discussed in earlier works. Spatial and time variation in the latter parameters was analysed on the basis of data from 2007-2011 and 2008-2011. Generally, the behaviour of the underground water quality parameters has been found to vary in space and time. Thanks to the spatial analyses of the variation in the quality parameters in the Kłodzko underground water intake area, some regularities (trends) in the variation in water quality have been identified.
Applying comparative fractal analysis to infer origin and process in channels on Earth and Mars
Balakrishnan, A.; Rice-Snow, S.; Hampton, B. A.
2010-12-01
Recently there has been a large amount of interest in identifying the nature of channels on extraterrestrial bodies. These studies are closely linked to the search for water (and ultimately signs of life) and are unarguably important. Current efforts in this direction rely on identifying geomorphic characteristics of these channels through painstaking analysis of multiple high-resolution images. Here we present a new and simple technique that shows significant potential in its ability to distinguish between lava and water channels. Channels formed by water or lava on Earth (as depicted in map view) display sinuosity over a large range of scales. Their geometries often point to the fluid dynamics, channel gradient and type of sediments in river channels; for lava channels, it has been suggested that they are indicative of the thermal characteristics of the flow. The degree of sinuosity can be measured using the divider method and represented by fractal dimension (D) values: a higher D value corresponds to a higher degree of sinuosity and channel irregularity, and vice versa. Here we apply this fractal analysis to compare channels on Earth and Mars using D values extracted from satellite images. The fractal dimensions computed in this work range from 1.04-1.38 for terrestrial river channels, 1.01-1.10 for terrestrial lava channels and 1.01-1.18 for Martian channels. For terrestrial channels, preliminary results show that river networks attain a fractal dimension greater than or equal to 1.1 while lava channels have a fractal dimension less than or equal to 1.1. This analysis demonstrates the higher degree of irregularity present in rivers as opposed to lava channels and supports the utility of using fractal dimension to identify the source of channels on Earth and, by extension, extraterrestrial bodies. Initial estimates of the fractal dimension from Mars fall within the same ranges as the lava channels on Earth. Based on what has
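The divider (ruler) method mentioned above can be sketched in a few lines: walk dividers of a fixed span along the digitized channel trace, count the spans needed for several span sizes, and estimate D as the slope of log N versus log(1/span). The sketch below uses a synthetic straight channel, for which D should be close to 1; the path and step sizes are illustrative assumptions, not the satellite-derived traces used in the study.

```python
import math

def ruler_length(path, step):
    """Divider-method walk: repeatedly lay a divider of span `step` along
    the polyline and sum the spans; the leftover tail is dropped."""
    cur = path[0]
    total = 0.0
    i = 1
    while i < len(path):
        d = math.dist(cur, path[i])
        if d >= step:
            t = step / d                      # move exactly `step` toward the vertex
            cur = (cur[0] + t * (path[i][0] - cur[0]),
                   cur[1] + t * (path[i][1] - cur[1]))
            total += step
        else:
            i += 1                            # vertex too close; look further along
    return total

def divider_dimension(path, steps):
    """Estimate D as the slope of log N(step) versus log(1/step),
    where N is the number of divider spans at each step size."""
    xs, ys = [], []
    for s in steps:
        length = ruler_length(path, s)
        if length > 0:
            xs.append(math.log(1.0 / s))
            ys.append(math.log(length / s))   # N = measured length / span
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A straight channel: D should come out close to 1.
straight = [(0.1 * i, 0.0) for i in range(201)]
D = divider_dimension(straight, steps=[0.7, 1.3, 2.1, 3.9])
```

A strongly meandering trace measured the same way yields a steeper slope, i.e. a larger D, which is the discriminant the abstract exploits.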
Directory of Open Access Journals (Sweden)
Lash Timothy L
2007-11-01
Full Text Available Abstract Background The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a
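The draw-and-adjust loop described above can be sketched as a small Monte Carlo simulation. The bias model below (a single multiplicative confounding factor with a lognormal prior) and all numeric parameters other than the published HR of 2.6 (95% CI 0.7-9.4) are illustrative assumptions, not the bias parameters actually used in the study.

```python
import math
import random

random.seed(42)

# Conventional result from the abstract: HR 2.6, 95% CI 0.7-9.4.
hr, ci_lo, ci_hi = 2.6, 0.7, 9.4
se_log = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)  # SE on the log scale

def one_iteration():
    """Draw a bias parameter, remove the bias, then re-add random error."""
    # Hypothetical bias model: confounding inflates the HR by a lognormal factor.
    bias_factor = math.exp(random.gauss(math.log(1.7), 0.3))
    adjusted_log_hr = math.log(hr) - math.log(bias_factor)
    return math.exp(adjusted_log_hr + random.gauss(0.0, se_log))

draws = sorted(one_iteration() for _ in range(20000))
median = draws[len(draws) // 2]
sim_lo = draws[int(0.025 * len(draws))]
sim_hi = draws[int(0.975 * len(draws))]
```

Because each iteration stacks bias uncertainty on top of random error, the simulation interval is wider than the conventional confidence interval, which is exactly the qualitative finding the abstract reports.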
Crist, Courtney A; Duncan, Susan E; Gallagher, Daniel L
2016-08-26
We demonstrate a method for capturing emotional response to beverages and liquefied foods in a sensory evaluation laboratory using automated facial expression analysis (AFEA) software. Additionally, we demonstrate a method for extracting relevant emotional data output and plotting the emotional response of a population over a specified time frame. By time pairing each participant's treatment response to a control stimulus (baseline), the overall emotional response over time and across multiple participants can be quantified. AFEA is a prospective analytical tool for assessing unbiased response to food and beverages. At present, most research has mainly focused on beverages. Methodologies and analyses have not yet been standardized for the application of AFEA to beverages and foods; however, a consistent standard methodology is needed. Optimizing video capture procedures and resulting video quality aids in a successful collection of emotional response to foods. Furthermore, the methodology of data analysis is novel for extracting the pertinent data relevant to the emotional response. The combinations of video capture optimization and data analysis will aid in standardizing the protocol for automated facial expression analysis and interpretation of emotional response data.
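The time-pairing step can be sketched as follows: subtract the control (baseline) score from the treatment score at each timestamp, then average the adjusted responses across participants. The data structures and 'joy' scores below are hypothetical; AFEA software output formats differ from package to package.

```python
from statistics import mean

def baseline_adjust(treatment, control):
    """Time-pair treatment and control (baseline) emotion scores and
    subtract: positive values mean a stronger response than baseline."""
    return {t: treatment[t] - control[t] for t in treatment if t in control}

def population_curve(participants):
    """Average the baseline-adjusted responses across participants
    at every timestamp they share."""
    shared = sorted(set.intersection(*(set(p) for p in participants)))
    return {t: mean(p[t] for p in participants) for t in shared}

# Two hypothetical participants; 'joy' intensity sampled every 0.5 s.
p1 = baseline_adjust({0.0: 0.2, 0.5: 0.6, 1.0: 0.8},
                     {0.0: 0.1, 0.5: 0.1, 1.0: 0.1})
p2 = baseline_adjust({0.0: 0.0, 0.5: 0.4, 1.0: 0.6},
                     {0.0: 0.1, 0.5: 0.1, 1.0: 0.1})
curve = population_curve([p1, p2])
```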
Multivariate analysis applied to monthly rainfall over Rio de Janeiro state, Brazil
Brito, Thábata T.; Oliveira-Júnior, José F.; Lyra, Gustavo B.; Gois, Givanildo; Zeri, Marcelo
2016-10-01
Spatial and temporal patterns of rainfall were identified over the state of Rio de Janeiro, southeast Brazil. The proximity to the coast and the complex topography create great diversity of rainfall over space and time. The dataset consisted of time series (1967-2013) of monthly rainfall over 100 meteorological stations. Clustering analysis made it possible to divide the stations into six groups (G1, G2, G3, G4, G5 and G6) with similar rainfall spatio-temporal patterns. A linear regression model was applied between each station's time series and a reference series. The reference series was calculated from the average rainfall within a group, using nearby stations with higher correlation (Pearson). Based on a t-test (p < 0.05) all stations had a linear spatiotemporal trend. According to the clustering analysis, the first group (G1) contains stations located over the coastal lowlands and also over the ocean facing area of Serra do Mar (Sea ridge), a 1500 km long mountain range over the coastal Southeastern Brazil. The second group (G2) contains stations over all the state, from Serra da Mantiqueira (Mantiqueira Mountains) and Costa Verde (Green coast), to the south, up to stations in the Northern parts of the state. Group 3 (G3) contains stations in the highlands over the state (Serrana region), while group 4 (G4) has stations over the northern areas and the continent-facing side of Serra do Mar. The last two groups were formed with stations around Paraíba River (G5) and the metropolitan area of the city of Rio de Janeiro (G6). The driest months in all regions were June, July and August, while November, December and January were the rainiest months. Sharp transitions occurred when considering monthly accumulated rainfall: from January to February, and from February to March, likely associated with episodes of "veranicos", i.e., periods of 4-15 days of duration with no rainfall.
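Grouping stations by the similarity of their rainfall regimes can be sketched with a plain k-means on each station's 12-month climatology. This is a generic illustration on synthetic wet-summer and wet-winter stations, not the clustering algorithm or data used in the study.

```python
import math

def kmeans(vectors, k, iters=20):
    """Plain k-means; centers seeded from evenly spaced input vectors."""
    centers = [vectors[i * len(vectors) // k] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k), key=lambda c: math.dist(v, centers[c]))
            groups[j].append(v)
        # Recompute each center as the mean of its group (keep it if empty).
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[j]
                   for j, g in enumerate(groups)]
    return [min(range(k), key=lambda c: math.dist(v, centers[c]))
            for v in vectors]

# Synthetic 12-month climatologies (mm/month): wet-summer vs wet-winter stations.
wet_summer = [[10.0] * 6 + [100.0] * 6 for _ in range(5)]
wet_winter = [[100.0] * 6 + [10.0] * 6 for _ in range(5)]
labels = kmeans(wet_summer + wet_winter, k=2)
```

With real station data the month-by-month climatology vectors would first be standardized so that wet and dry stations with the same seasonal shape cluster together.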
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties; possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input dataset (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties in the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions of the minimum DEM resolution required for flood simulation and of the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and the monetary transfer functions required for a damage risk analysis show high uncertainty. This study therefore helps to analyse the weak points of the flood risk and damage risk assessment procedure.
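The Monte Carlo treatment of DEM uncertainty can be sketched as follows: perturb each cell with an error term drawn from the DEM quality statistics, re-run the flooding rule, and accumulate per-cell flooding probabilities. The Gaussian error model, the bathtub-style flooding rule and the toy DEM are illustrative assumptions, far simpler than FLOODMAP.

```python
import random

def flood_mask(dem, water_level):
    """Bathtub rule: a cell floods when its elevation is below the level."""
    return [[z < water_level for z in row] for row in dem]

def flooding_probability(dem, water_level, dem_sigma, n_sim=1000, seed=1):
    """Perturb the DEM with Gaussian error (sigma from the DEM quality
    statistics) and count how often each cell floods across the runs."""
    rng = random.Random(seed)
    rows, cols = len(dem), len(dem[0])
    hits = [[0] * cols for _ in range(rows)]
    for _ in range(n_sim):
        noisy = [[z + rng.gauss(0.0, dem_sigma) for z in row] for row in dem]
        for r, row in enumerate(flood_mask(noisy, water_level)):
            for c, wet in enumerate(row):
                hits[r][c] += wet
    return [[h / n_sim for h in row] for row in hits]

# Toy DEM (m): a low cell, a marginal cell, a high cell; water level 5 m.
dem = [[3.0, 5.0, 8.0]]
prob = flooding_probability(dem, water_level=5.0, dem_sigma=0.5)
```

Cells well below the water level come out with probability near 1, cells well above near 0, and the marginal cell near 0.5, which is how DEM error translates into a probabilistic flood map.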
[Disinfection of water: on the need for analysis and solution of fundamental and applied problems].
Mokienko, A V
2014-01-01
The paper presents an analysis of the hygienic, medical and environmental aspects of water disinfection, exemplified by chlorine and chlorine dioxide (CD). It proposes the concept of persistent multivariate risk for aquatic pathogens, an original view of the mechanism by which bacteria develop chlorine resistance under the influence of biocides, based on a two-step process of informational and spatial interaction between receptor and substrate, and the hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens. The growing significance of halogen-containing compounds (HCC) as byproducts of water chlorination is substantiated in terms of their potential danger as toxicants and carcinogens. Analysis of the hygienic, medical and environmental aspects of using chlorine dioxide as a means of water disinfection made it possible to describe the chemistry of its biocidal effect and the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal and algicidal actions, removal of biofilms, and formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal and mycocidal action, and to be toxicologically harmless with respect to laboratory animals as well as to aquatic organisms receiving discharged disinfected wastewater. The necessity of a close relationship between fundamental and applied research is demonstrated: the former involving in-depth study of the microbiological, molecular genetic and epidemiological problems of water disinfection (chlorination), the latter implemented through the introduction of alternative, including combined, technologies for water treatment and disinfection.
Energy Technology Data Exchange (ETDEWEB)
Moeglein, W. A.; Griswold, R.; Mehdi, B. L.; Browning, N. D.; Teuton, J.
2017-01-03
In-situ (scanning) transmission electron microscopy (S/TEM) is being developed for numerous applications in the study of nucleation and growth under electrochemical driving forces. For this type of experiment, one of the key parameters is identifying when nucleation initiates. Typically, identifying the moment that crystals begin to form is a manual process requiring the user to make an observation and respond accordingly (adjust focus or magnification, translate the stage, etc.). However, as the speed of the cameras used to perform these observations increases, the ability of a user to "catch" the important initial stage of nucleation decreases (there is more information available in the first few milliseconds of the process). Here we show that video shot boundary detection (SBD) can automatically detect frames where a change in the image occurs. We show that this method can be applied to quickly and accurately identify points of change during crystal growth. This technique allows automated segmentation of a digital stream for further analysis and the assignment of arbitrary time stamps for the initiation of processes that are independent of the user's ability to observe and react.
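A minimal form of shot boundary detection, flagging frames whose mean absolute difference from the previous frame exceeds a threshold, can be sketched as below. The 4x4 synthetic frames and the threshold value are illustrative; production SBD on S/TEM video would use more robust change metrics (histogram comparisons, edge-change ratios) and adaptive thresholds.

```python
def frame_difference(a, b):
    """Mean absolute pixel difference between two grayscale frames."""
    total = sum(abs(x - y) for row_a, row_b in zip(a, b)
                for x, y in zip(row_a, row_b))
    return total / (len(a) * len(a[0]))

def detect_boundaries(frames, threshold):
    """Indices where the change from the previous frame exceeds the
    threshold -- the cue used here to catch nucleation onset."""
    return [i for i in range(1, len(frames))
            if frame_difference(frames[i - 1], frames[i]) > threshold]

# Synthetic 4x4 stream: flat background, bright "crystals" appear at frame 3.
blank = [[10] * 4 for _ in range(4)]
crystal = [[10] * 4 for _ in range(4)]
crystal[1][1] = crystal[2][2] = 200
frames = [blank, blank, blank, crystal, crystal]
events = detect_boundaries(frames, threshold=5.0)
```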
The evolution of the Journal of Applied Oral Science: a bibliometric analysis.
Ferraz, Valéria Cristina Trindade; Amadei, José Roberto Plácido; Santos, Carlos Ferreira
2008-01-01
The purpose of this study was to make a brief diagnosis of the evolution of the Journal of Applied Oral Science (JAOS) between 2005 and 2007, by reviewing quantitative and qualitative aspects of the articles published in the JAOS within this period. All articles published in the JAOS in the time span established for this survey were analyzed retrospectively, with a discussion of the data referring to the main bibliometric indexes of production, authorship, bibliographic sources of the published articles, and the most frequently cited scientific journals in the main dental research fields. A total of 247 papers authored and co-authored by 1,139 contributors were reviewed, most of them original research articles. The number of authors per article was 4.61 on average. Regarding geographic distribution, the authors represented almost all of the Brazilian states. Most published articles belonged to the following dental research fields: Endodontics, Restorative Dentistry, Dental Materials and Prosthodontics. The ranking of the most frequently cited scientific journals included the most reputable publications in these dental research fields. In conclusion, between 2005 and 2007, the JAOS maintained or considerably improved its bibliometric indexes. The analysis of the data retrieved in this study made it possible to evaluate the journal's current management strategies and to identify important issues that will help outline future directions for the internationalization of the journal.
Applying a sociolinguistic model to the analysis of informed consent documents.
Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel
2009-11-01
Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document, because they may sign it with a lack of understanding and information, and consequently feel deprived of their freedom to make choices about different treatments or surgery. This article discusses findings from a documentary analysis using the sociolinguistic SPEAKING model, applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective was to identify flaws in the way information is provided, as well as issues of readability, voluntariness, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.
Directory of Open Access Journals (Sweden)
Brigitte JUANALS
2013-07-01
Full Text Available This paper describes the added value of an interdisciplinary and experimental approach applied to an analysis of inter-organizational communication of influence. The field analyzed is the international industrial standardization of societal security. A communicational problem was investigated with an experimental method based on natural language processing and knowledge management tools. The purpose of the methodological framework is to clarify the way international standards are designed and the policies that are supported by these standards. Furthermore, the strategies of influence of the public and private stakeholders involved in the NGOs that produce these texts have also been studied. The means of inter-organizational communication between organizations (companies or governmental authorities) and NGOs can be compared to the lobbying developed in the context of the construction of Europe and globalization. Understanding the prescriptive process has become a crucial issue for states, organizations and citizens. This research contributes to the critical assessment of the new industrial policies currently being developed, from the point of view of their characteristics and the way they have been designed.
Discriminant Analysis Applied to the Time—Frequency Energy Vector in Noisy Environment
Institute of Scientific and Technical Information of China (English)
TIAN Ye
2003-01-01
Robust speech detection in noisy environments is an important front-end for speech processing tasks such as speech recognition, speech enhancement and speech coding. Parameters frequently used for speech detection, such as the energy in the time domain and the zero-crossing rate, exploit the properties of speech alone, and thus show poor robustness to background noise. Speech detection in noisy environments should instead exploit parameters for which speech and noise are maximally separable. In this paper, we propose a robust speech detection algorithm with heteroscedastic discriminant analysis (HDA) applied to the time-frequency energy vector (TFEV). The TFEV consists of the log energy in the time domain, the log energy in the fixed band 250-3500 Hz, and the log Mel-scale frequency band energies. Moreover, a bottom-up algorithm with automatic threshold adjustment is used for accurate word boundary detection. Compared to algorithms based on the energy in the time domain, the ATF parameter, the energy, and the LDA-MFCC parameter, the proposed algorithm shows better performance under different types of noise.
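Two of the three TFEV components can be sketched directly: the log energy in the time domain and the log energy restricted to the 250-3500 Hz band (the Mel-scale band energies are omitted for brevity). The direct DFT, frame length and test signals below are illustrative assumptions, not the paper's exact front-end.

```python
import math

def dft_power(frame):
    """Power spectrum of a real frame via a direct DFT (fine for short frames)."""
    N = len(frame)
    spec = []
    for k in range(N // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * n / N) for n, x in enumerate(frame))
        im = sum(x * math.sin(2 * math.pi * k * n / N) for n, x in enumerate(frame))
        spec.append(re * re + im * im)
    return spec

def tfev_partial(frame, fs, band=(250.0, 3500.0)):
    """Two TFEV components: log time-domain energy and log energy in a
    fixed band (250-3500 Hz). The Mel band energies are omitted here."""
    eps = 1e-12                       # avoid log(0) on silent frames
    e_time = math.log(sum(x * x for x in frame) + eps)
    spec = dft_power(frame)
    hz_per_bin = fs / len(frame)
    e_band = math.log(sum(p for k, p in enumerate(spec)
                          if band[0] <= k * hz_per_bin <= band[1]) + eps)
    return e_time, e_band

fs = 8000
tone = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(256)]  # in-band
hum = [math.sin(2 * math.pi * 60 * n / fs) for n in range(256)]     # out-of-band
tone_feats = tfev_partial(tone, fs)
hum_feats = tfev_partial(hum, fs)
```

The two signals have nearly identical time-domain energy, but only the in-band tone carries energy in the 250-3500 Hz component; it is this extra separability that HDA can exploit.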
Energy Technology Data Exchange (ETDEWEB)
Soares, Breno Almeida; Firme, Caio Lima, E-mail: firme.caio@gmail.com, E-mail: caiofirme@quimica.ufrn.br [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil). Instituto de Quimica; Maciel, Maria Aparecida Medeiros [Universidade Potiguar, Natal, RN (Brazil). Programa de Pos-graduacao em Biotecnologia; Kaiser, Carlos R. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Instituto de Quimica; Schilling, Eduardo; Bortoluzzi, Adailton J. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Departamento de Quimica
2014-04-15
trans-Dehydrocrotonin (t-DCTN), a bioactive 19-nor-clerodane diterpenoid isolated from Croton cajucara Benth, is one of the most investigated clerodanes in the current literature. In this work, a new approach combining X-ray diffraction data, nuclear magnetic resonance (NMR) data and theoretical calculations was applied to the thorough characterization of t-DCTN. The geometry of t-DCTN was reevaluated by X-ray diffraction as well as by ¹H and ¹³C NMR, and its geometrical parameters were compared to those obtained at the B3LYP/6-311G++(d,p) level of theory. From the evaluation of both calculated and experimental ¹H and ¹³C NMR chemical shifts and spin-spin coupling constants, very good correlations were found between the theoretical and experimental magnetic properties of t-DCTN. Additionally, the delocalization indexes between hydrogen atoms correlated accurately with theoretical and experimental spin-spin coupling constants. A further topological analysis within the quantum theory of atoms in molecules (QTAIM) revealed intramolecular interactions in t-DCTN. (author)
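The theory-versus-experiment correlations reported here reduce, in the simplest case, to a Pearson coefficient over paired chemical shifts. The sketch below uses hypothetical ¹³C shift values, not the published t-DCTN data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between calculated and experimental shifts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical 13C chemical shifts (ppm): DFT-calculated vs measured.
calc = [170.2, 135.6, 128.4, 77.1, 40.3, 21.8]
expt = [172.0, 134.9, 129.0, 76.5, 41.1, 22.4]
r = pearson_r(calc, expt)
```

In practice a linear fit (slope and intercept) is also reported alongside r, since systematic scaling errors of the level of theory show up in the slope rather than the correlation.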
Terrill, Philip Ian; Wilson, Stephen James; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn
2010-05-01
Breathing patterns are characteristically different between infant active sleep (AS) and quiet sleep (QS), and statistical quantifications of interbreath interval (IBI) data have previously been used to discriminate between infant sleep states. It has also been identified that breathing patterns are governed by a nonlinear controller. This study aims to investigate whether nonlinear quantifications of infant IBI data are characteristically different between AS and QS, and whether they may be used to discriminate between these infant sleep states. Polysomnograms were obtained from 24 healthy infants at six months of age. Periods of AS and QS were identified, and IBI data extracted. Recurrence quantification analysis (RQA) was applied to each period, and recurrence calculated for a fixed radius in the range of 0-8 in steps of 0.02, and embedding dimensions of 4, 6, 8, and 16. When a threshold classifier was trained, the RQA variable recurrence was able to correctly classify 94.3% of periods in a test dataset. It was concluded that RQA of IBI data is able to accurately discriminate between infant sleep states. This is a promising step toward development of a minimal-channel automatic sleep state classification system.
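The core RQA quantity used here, recurrence, can be sketched as the fraction of embedded-vector pairs that fall within a fixed radius. The embedding parameters and the synthetic regular/erratic IBI series below are illustrative; the study scanned radii from 0 to 8 and embedding dimensions of 4, 6, 8 and 16.

```python
import math
import random

def recurrence_rate(series, dim, delay, radius):
    """RQA 'recurrence': fraction of embedded-vector pairs whose distance
    is within `radius`, computed on a time-delay embedding of the series."""
    vectors = [series[i:i + dim * delay:delay]
               for i in range(len(series) - (dim - 1) * delay)]
    n = len(vectors)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if math.dist(vectors[i], vectors[j]) <= radius)
    return 2 * close / (n * (n - 1))

# Regular breathing (constant IBI) recurs far more than an erratic pattern,
# mimicking the QS-versus-AS contrast the classifier exploits.
rng = random.Random(3)
regular = [1.0] * 60
erratic = [1.0 + rng.uniform(-0.4, 0.4) for _ in range(60)]
rr_regular = recurrence_rate(regular, dim=4, delay=1, radius=0.1)
rr_erratic = recurrence_rate(erratic, dim=4, delay=1, radius=0.1)
```

A threshold on this recurrence value is the kind of one-dimensional classifier the abstract describes training.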
New Methods for Timing Analysis of Transient Events, Applied to Fermi/GBM Magnetar Bursts
Huppenkothen, Daniela; Uttley, Phil; van der Horst, Alexander J; van der Klis, Michiel; Kouveliotou, Chryssa; Gogus, Ersin; Granot, Jonathan; Vaughan, Simon; Finger, Mark H
2013-01-01
In order to discern the physical nature of many gamma-ray sources in the sky, we must look not only in the spectral and spatial dimensions, but also understand their temporal variability. However, timing analysis of sources with a highly transient nature, such as magnetar bursts, is difficult: standard Fourier techniques developed for the long-term variability generally observed, for example, in AGN often do not apply. Here, we present newly developed timing methods applicable to transient events of all kinds, and show their successful application to magnetar bursts observed with Fermi/GBM. Magnetars are a prime subject for timing studies, thanks to the detection of quasi-periodicities in magnetar Giant Flares and their potential to help shed light on the structure of neutron stars. Using state-of-the-art statistical techniques, we search for quasi-periodic oscillations (QPOs) in a sample of bursts from the Soft Gamma Repeater SGR J0501+4516 observed with Fermi/GBM and provide upper limits for potential QPO detections. Additio...
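Any QPO search ultimately rests on locating excess power in a periodogram. The sketch below builds a discrete periodogram of a synthetic burst light curve with an injected 40 Hz oscillation; the normalization, bin size and signal parameters are illustrative assumptions, not the methods of the paper (which are specifically designed for the non-stationarity of transient events that defeats this naive approach).

```python
import math
import random

def periodogram(counts, dt):
    """Discrete periodogram of an evenly binned light curve; a QPO shows
    up as excess power concentrated near its frequency."""
    N = len(counts)
    mean = sum(counts) / N
    out = []
    for k in range(1, N // 2):
        re = sum((c - mean) * math.cos(2 * math.pi * k * n / N)
                 for n, c in enumerate(counts))
        im = sum((c - mean) * math.sin(2 * math.pi * k * n / N)
                 for n, c in enumerate(counts))
        out.append((k / (N * dt), re * re + im * im))
    return out

# Synthetic burst light curve: ~10 counts/bin baseline plus a 40 Hz
# oscillation; Gaussian scatter stands in for Poisson counting noise.
rng = random.Random(7)
dt, N = 1.0 / 512, 512                      # 1 s of data binned at 512 Hz
counts = [10 + 5 * math.sin(2 * math.pi * 40 * n * dt) + rng.gauss(0, 1)
          for n in range(N)]
pgram = periodogram(counts, dt)
peak_freq = max(pgram, key=lambda p: p[1])[0]
```

For a real magnetar burst the challenge is that the burst envelope itself injects broadband power, so candidate peaks must be assessed against a non-flat noise model, which is what the paper's statistical machinery addresses.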