WorldWideScience

Sample records for applying geostatistical analysis

  1. Applying Geostatistical Analysis to Crime Data: Car-Related Thefts in the Baltic States

    OpenAIRE

    Kerry, Ruth; Goovaerts, Pierre; Haining, Robert P; Ceccato, Vania

    2010-01-01

    Geostatistical methods have rarely been applied to area-level offense data. This article demonstrates their potential for improving the interpretation and understanding of crime patterns using previously analyzed data about car-related thefts for Estonia, Latvia, and Lithuania in 2000. The variogram is used to inform about the scales of variation in offense, social, and economic data. Area-to-area and area-to-point Poisson kriging are used to filter the noise caused by the small number proble...

  2. Applying Geostatistical Analysis to Crime Data: Car-Related Thefts in the Baltic States.

    Science.gov (United States)

    Kerry, Ruth; Goovaerts, Pierre; Haining, Robert P; Ceccato, Vania

    2010-01-01

    Geostatistical methods have rarely been applied to area-level offense data. This article demonstrates their potential for improving the interpretation and understanding of crime patterns using previously analyzed data about car-related thefts for Estonia, Latvia, and Lithuania in 2000. The variogram is used to inform about the scales of variation in offense, social, and economic data. Area-to-area and area-to-point Poisson kriging are used to filter the noise caused by the small number problem. The latter is also used to produce continuous maps of the estimated crime risk (expected number of crimes per 10,000 inhabitants), thereby reducing the visual bias of large spatial units. In seeking to detect the most likely crime clusters, the uncertainty attached to crime risk estimates is handled through a local cluster analysis using stochastic simulation. Factorial kriging analysis is used to estimate the local- and regional-scale spatial components of the crime risk and explanatory variables. Then regression modeling is used to determine which factors are associated with the risk of car-related theft at different scales. PMID:22190762
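
The variogram diagnostic this record leans on can be sketched in a few lines. This is a hedged, minimal illustration with synthetic area centroids and rates, not the article's Baltic data or its Poisson-kriging machinery:

```python
import numpy as np

# Synthetic stand-in for area-level rates (the article's data are not public here).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))            # area centroids (km)
values = np.sin(coords[:, 0] / 20) + rng.normal(0, 0.2, 200)

def empirical_variogram(coords, values, lags):
    """Average 0.5*(z_i - z_j)^2 for point pairs binned by separation distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)             # count each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(lags[:-1], lags[1:])])

gamma = empirical_variogram(coords, values, np.linspace(0, 50, 11))
print(gamma)  # semivariance rises with lag where spatial correlation decays
```

The shape of this curve (nugget, sill, range) is what "informs about the scales of variation" in the abstract above.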

  3. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    Science.gov (United States)

    Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.

    1999-05-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines.
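
One of the simpler modules listed above, inverse-distance gridding, is easy to sketch. This is not UNCERT's own code, just an illustration of the technique with made-up well coordinates and heads:

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse-distance-weighted estimate at one unsampled point."""
    d = np.linalg.norm(xy_obs - xy_new, axis=1)
    if np.any(d == 0):                     # exact hit: return the datum itself
        return z_obs[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * z_obs) / np.sum(w)

obs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
head = np.array([5.0, 7.0, 6.0])           # e.g. groundwater heads (m)
print(idw(obs, head, np.array([1.0, 1.0])))  # dominated by the nearest well
```

The `power` exponent controls how quickly the influence of distant observations falls off; a higher power makes the surface more local.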

  4. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    The geomagnetic field varies on a variety of time- and length-scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered an estimate of the data uncertainty (which consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based on 5 years of Ørsted and CHAMP data, and includes secular variation and acceleration, as well as low-degree external (magnetospheric) and induced fields. The analysis is done in order to find the statistical behaviour of the space-time structure of the residuals, as a proxy for the data covariances...

  5. Geostatistics and spatial analysis in biological anthropology.

    Science.gov (United States)

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology.

  6. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, in contrast to the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristics of quick prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.

  7. Geostatistics and Analysis of Spatial Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2007-01-01

    This note deals with geostatistical measures for spatial correlation, namely the auto-covariance function and the semi-variogram, as well as deterministic and geostatistical methods for spatial interpolation, namely inverse distance weighting and kriging. Some semi-variogram models are mentioned...
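
The kriging interpolation the note covers can be condensed into a small ordinary-kriging system. The exponential variogram parameters and four data points below are made up purely for illustration, not taken from the note:

```python
import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, a=10.0):
    """Exponential semivariogram model (illustrative parameters)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-np.asarray(h, float) / a))

def ordinary_kriging(xy, z, x0):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    G = exp_variogram(d)
    np.fill_diagonal(G, 0.0)              # gamma(0) = 0 by definition
    A = np.ones((n + 1, n + 1))           # [[Gamma, 1], [1^T, 0]]
    A[:n, :n] = G
    A[n, n] = 0.0                         # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)             # n weights plus the multiplier
    est = w[:n] @ z
    var = w[:n] @ b[:n] + w[n]            # ordinary-kriging variance
    return est, var

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 1.5, 2.5])
est, var = ordinary_kriging(xy, z, np.array([0.5, 0.5]))
print(est, var)  # symmetric layout: the estimate equals the data mean, 1.75
```

The Lagrange multiplier enforces that the weights sum to one, which is what distinguishes ordinary kriging from simple kriging.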

  8. Geostatistics applied to estimation of uranium bearing ore reserves

    International Nuclear Information System (INIS)

    A computer-assisted method for assessing uranium-bearing ore deposit reserves is analyzed. Determinations of quality-thickness, namely quality multiplied by thickness of mineralization, were obtained for each drill-hole layer by means of a mathematical method known as the theory of regionalized variables. Geostatistical results were derived with a Fortran computer program on a DEC 20/40 system. (author)

  9. The extension of geostatistical spatial analysis model and its application to datum land appraisal

    Science.gov (United States)

    Fu, Feihong; Li, Xuefei; Zou, Rong

    2007-06-01

    Geostatistical methods can quantitatively describe the spatial distribution characteristics of a variable and, through different theoretical models, quantitatively represent attributes that are uncertain because of a lack of data. However, geostatistics is still a young discipline, and there is scope for extending it. The extension of ordinary geostatistics mainly includes three aspects: the treatment of outliers in geostatistical spatial data, fitting the variogram, and selecting the kriging estimation neighborhood. The paper then introduces the basic approach of applying the geostatistical spatial analysis model to datum land price appraisal, based on an analysis of its feasibility.

  10. Diagnostic techniques applied in geostatistics for agricultural data analysis

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    Full Text Available The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, applied to the interpolation of values at unsampled points by kriging techniques. However, parameter estimation can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.

  11. Geostatistical Soil Data Analysis II. Optimal interpolation with kriging

    Directory of Open Access Journals (Sweden)

    Boško Miloš

    2001-06-01

    Full Text Available The optimal interpolation by using the technique of ordinary kriging, based on the regionalized variable theory, is described and illustrated by a case study of the top-soil (depth 0-30 cm) CaCO3 and humus content in 136 pedons at Petrovo polje, located in Dalmatinska Zagora. The spatial variability of the CaCO3 and humus content, shown by three-dimensional diagrams of the kriged estimates and the associated standard-error diagrams, may be a consequence of the different geology, topography and hydrology of the area. Reliability diagrams, or maps of the kriged estimation errors, might be useful in locating areas of large errors and in defining an optimal sampling strategy for different levels of estimation precision. The results of the investigation showed that, by using anisotropic semivariogram models, applied geostatistical analysis can improve the local estimation precision of the selected soil properties. For this reason, kriging can be recommended as a convenient tool for quantitative estimation and mapping of individual soil properties at unsampled locations.
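
The anisotropic semivariogram models this study credits for its precision gains usually rest on geometric anisotropy: the separation vector is rotated into the principal axes and rescaled so that a single isotropic model applies. The angle, ratio and spherical-model parameters below are illustrative, not fitted to the Petrovo polje data:

```python
import numpy as np

def effective_lag(dx, dy, angle_deg, ratio):
    """Effective lag under geometric anisotropy (ratio = minor/major range)."""
    a = np.deg2rad(angle_deg)
    u = np.cos(a) * dx + np.sin(a) * dy    # component along the major axis
    v = -np.sin(a) * dx + np.cos(a) * dy   # component along the minor axis
    return np.hypot(u, v / ratio)

def spherical(h, nugget=0.05, sill=0.6, a=25.0):
    """Spherical semivariogram model (illustrative parameters)."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

# Same 10 m geographic lag in two directions: the minor-axis direction
# "sees" a longer effective lag and hence a higher semivariance.
print(spherical(effective_lag(10.0, 0.0, 0.0, 0.5)))  # along the major axis
print(spherical(effective_lag(0.0, 10.0, 0.0, 0.5)))  # along the minor axis
```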

  12. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    Science.gov (United States)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e. small household furnaces). Small sources contribute significantly to the total emission of mercury. Official and statistical analyses, including those prepared for international purposes [2], did not provide data about the spatial distribution of the mercury emitted to air; however, a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economic and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized at two independent levels: an individual, bottom-up approach derived from the national emission reporting system [5; 6], and top-down regional data calculated on the basis of official statistics [7]. The analysis to be presented will include a comparison of spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation covers three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http

  13. Geostatistical techniques applied to mapping limnological variables and quantify the uncertainty associated with estimates

    Directory of Open Access Journals (Sweden)

    Cristiano Cigagna

    2015-12-01

    Full Text Available Abstract Aim: This study aimed to map the concentrations of limnological variables in a reservoir employing semivariogram geostatistical techniques and kriging estimates for unsampled locations, as well as the calculation of the uncertainty associated with the estimates. Methods: We established twenty-seven points distributed in a regular mesh for sampling. The concentrations of chlorophyll-a, total nitrogen and total phosphorus were then determined. Subsequently, a spatial variability analysis was performed, the semivariogram function was modeled for all variables and the variographic mathematical models were established. The main geostatistical estimation technique was ordinary kriging. The work was developed by estimating a dense grid of points for each variable, which formed the basis of the interpolated maps. Results: Through the semivariogram analysis it was possible to identify the random component as not significant for the estimation process of chlorophyll-a, and as significant for total nitrogen and total phosphorus. Geostatistical maps were produced from the kriging for each variable, and the respective standard deviations of the estimates were calculated. These measurements allowed us to map the concentrations of limnological variables throughout the reservoir. The calculation of standard deviations provided the quality of the estimates and, consequently, the reliability of the final product. Conclusions: The use of the kriging technique to estimate a dense mesh of points, associated with the error dispersion (standard deviation of the estimate), made it possible to produce quality, reliable maps of the estimated variables. Concentrations of limnological variables were in general higher in the lacustrine zone and decreased towards the riverine zone. Chlorophyll-a and total nitrogen were correlated when comparing the grids generated by kriging. Although the use of kriging is more laborious compared to other interpolation methods, this
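
The "variographic mathematical models" step above amounts to fitting a parametric model to empirical semivariogram points. A minimal brute-force least-squares sketch, with synthetic lag/semivariance numbers (the study does not publish its fitted parameters here):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical semivariogram model."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

# Illustrative empirical semivariogram (lag distance, semivariance).
lags = np.array([5, 10, 15, 20, 25, 30], float)
gamma_hat = np.array([0.22, 0.38, 0.51, 0.58, 0.60, 0.61])

# Grid search over sill and range with the nugget fixed at 0.1.
best = min(((np.sum((spherical(lags, 0.1, s, a) - gamma_hat) ** 2), s, a)
            for s in np.linspace(0.4, 0.8, 41)
            for a in np.linspace(10, 40, 61)), key=lambda t: t[0])
sse, sill, a = best
print(f"sill={sill:.2f} range={a:.1f} sse={sse:.4f}")
```

In practice weighted least squares (weighting bins by pair count) or maximum likelihood is preferred; the grid search just keeps the idea visible.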

  14. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    Science.gov (United States)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    ), applying different interpolation methods/parameters are shown. RMS (mm) error values obtained from the independent test set (20% of the samples) follow, in this order: IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK.
    LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
    RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
    EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
    A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7
    To study these results, a geostatistical analysis of uncertainty has been done. Main results: variogram analysis of the error (using the test set) shows that the total sill is reduced (50% EU, 60% A3D) when using the two new approaches, while the spatialized standard deviation model calculated from the OK shows significantly lower values when compared to the RS. In conclusion, A3D and EU highly improve on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that a non-random split increases the power of both fitting and test steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.

  15. Geostatistics applied to the study of the spatial distribution of Tibraca limbativentris in flooded rice fields

    Directory of Open Access Journals (Sweden)

    Juliano de Bastos Pazini

    2015-06-01

    Full Text Available Tibraca limbativentris (rice stem bug) is an insect highly injurious to the rice crop in Brazil. The aim of this research was to define the spatial distribution of T. limbativentris and improve the sampling process by means of geostatistical techniques and the construction of prediction maps in a flooded rice field located in the "Planalto da Campanha" region, Rio Grande do Sul (RS), Brazil. The experiments were conducted in a rice crop in the municipality of Itaqui, RS, in the crop years 2009/10, 2010/11 and 2011/12, counting fortnightly the number of nymphs and adults in a georeferenced grid with points spaced at 50 m in the first year and at 10 m in the following years. A geostatistical analysis was performed by adjusting semivariograms and interpolating the numeric data by kriging, in order to verify the spatial dependence and subsequently map the population. The results obtained indicated that the rice stem bug, T. limbativentris, has a strong spatial dependence. The prediction maps allow estimating the population density of the pest and visualizing its spatial distribution in flooded rice fields, enabling the improvement of the traditional sampling method for the rice stem bug.
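
"Strong spatial dependence" in studies like this is commonly judged by the nugget-to-sill ratio of the fitted semivariogram (the Cambardella et al. convention: below 25% strong, 25-75% moderate, above 75% weak). The thresholds are that convention, not this paper's own numbers, and the example values are made up:

```python
def dependence_class(nugget, sill):
    """Classify spatial dependence by the nugget-to-sill ratio."""
    r = nugget / sill
    if r <= 0.25:
        return "strong"
    if r <= 0.75:
        return "moderate"
    return "weak"

print(dependence_class(0.1, 0.8))  # 0.125 -> "strong"
```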

  16. A geostatistical method applied to the geochemical study of the Chichinautzin Volcanic Field in Mexico

    Science.gov (United States)

    Robidoux, P.; Roberge, J.; Urbina Oviedo, C. A.

    2011-12-01

    The origin of magmatism and the role of the subducted Cocos Plate in the Chichinautzin volcanic field (CVF), Mexico, is still a subject of debate. It has been established that mafic magmas of calc-alkali type (subduction) and alkali type (OIB) are produced in the CVF, and the two groups cannot be related by simple fractional crystallization. Therefore, many geochemical studies have been done and many models have been proposed. The main goal of the work presented here is to provide a new tool for the visualization and interpretation of geochemical data using geostatistics and geospatial analysis techniques. It contains a complete geodatabase built from referenced samples over the 2500 km2 area of the CVF and its neighbouring stratovolcanoes (Popocatepetl, Iztaccihuatl and Nevado de Toluca). From this database, maps of different geochemical markers were produced to visualize the geochemical signature in a geographical manner, to test the statistical distribution with a cartographic technique and to highlight any spatial correlations. The distribution and regionalization of the geochemical signatures can be viewed in two-dimensional space using specific spatial analysis tools from a Geographic Information System (GIS). The model of spatial distribution is tested with Linear Decrease (LD) and Inverse Distance Weight (IDW) interpolation techniques because they best represent the geostatistical characteristics of the geodatabase. We found that the ratios Ba/Nb, Nb/Ta and Th/Nb show a first-order tendency, which means visible spatial variation over a large-scale area. Monogenetic volcanoes in the center of the CVF have distinct values compared to those of the Popocatepetl-Iztaccihuatl polygenetic complex, which are spatially well defined. Inside the Valley of Mexico, a large number of monogenetic cones in the eastern portion of the CVF have ratios similar to the Iztaccihuatl and Popocatepetl complex. Other ratios, like alkalis vs SiO2, V/Ti, La/Yb and Zr/Y, show different spatial tendencies. In that case, second

  17. Bayesian Analysis of Geostatistical Models With an Auxiliary Lattice

    KAUST Repository

    Park, Jincheol

    2012-04-01

    The Gaussian geostatistical model has been widely used for modeling spatial data. However, this model suffers from a severe difficulty in computation: it requires users to invert a large covariance matrix. This is infeasible when the number of observations is large. In this article, we propose an auxiliary lattice-based approach for tackling this difficulty. By introducing an auxiliary lattice to the space of observations and defining a Gaussian Markov random field on the auxiliary lattice, our model completely avoids the requirement of matrix inversion. It is remarkable that the computational complexity of our method is only O(n), where n is the number of observations. Hence, our method can be applied to very large datasets with reasonable computational (CPU) times. The numerical results indicate that our model can approximate Gaussian random fields very well in terms of predictions, even for those with long correlation lengths. For real data examples, our model can generally outperform conventional Gaussian random field models in both prediction errors and CPU times. Supplemental materials for the article are available online. © 2012 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
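
The computational point of the abstract — a Gaussian Markov random field on a lattice has a sparse precision matrix, so the expensive dense covariance inversion is replaced by cheap sparse solves — can be illustrated directly. The grid size and precision parameters below are illustrative, not the paper's model:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

m = 30                                         # 30 x 30 auxiliary lattice
I = sp.identity(m, format="csr")
D = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(m, m), format="csr")
# 2D lattice precision: graph Laplacian plus a small diagonal shift (SPD).
Q = sp.kron(I, D) + sp.kron(D, I) + 0.1 * sp.identity(m * m)
b = np.ones(m * m)
x = spsolve(Q.tocsr(), b)                      # sparse solve, no dense inverse
print(x.shape, Q.nnz)                          # Q stores only O(n) nonzeros
```

Each lattice site interacts only with its neighbours, so `Q` has a handful of nonzeros per row regardless of `n`; that sparsity is what makes the lattice-based approach scale where a dense covariance matrix does not.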

  18. Geostatistical analysis of prevailing groundwater conditions and potential solute migration at Elstow, Bedfordshire

    International Nuclear Information System (INIS)

    A geostatistical approach is applied in a study of the potential migration of contaminants from a hypothetical waste disposal facility near Elstow, Bedfordshire. A deterministic numerical model of groundwater flow in the Kellaways Sands formation and adjacent layers is coupled with geostatistical simulation of the heterogeneous transmissivity field of this principal formation. A particle tracking technique is used to predict the migration pathways for alternative realisations of flow. Alternative statistical descriptions of the spatial structure of the transmissivity field are implemented and the temporal and spatial distributions of escape of contaminants to the biosphere are investigated. (author)

  19. Combining Geostatistics with Moran’s I Analysis for Mapping Soil Heavy Metals in Beijing, China

    Directory of Open Access Journals (Sweden)

    Bao-Guo Li

    2012-03-01

    Full Text Available Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. According to the Moran's I analysis, four characteristic distances were obtained and used as the active lag distance to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using the two distances at which Moran's I and the standardized Moran's I, Z(I), reached a maximum as the active lag distance can improve the fitting accuracy of the semivariance. Spatial interpolation was then produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran's I analysis was better than traditional geostatistics. Thus, Moran's I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals.
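
Global Moran's I, the statistic this paper uses to pick its characteristic lag distances, is a weighted cross-product of deviations from the mean. A minimal sketch on a toy 1-D transect with simple contiguity weights (not the paper's weighting scheme):

```python
import numpy as np

def morans_i(z, W):
    """Global Moran's I: (n / sum(W)) * (z'Wz) / (z'z), z mean-centered."""
    z = z - z.mean()
    n = len(z)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0   # rook contiguity on a line
print(morans_i(z, W))  # positive: neighbouring values are similar
```

Values near +1 indicate strong positive spatial autocorrelation, near 0 randomness, and negative values indicate a checkerboard pattern.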

  20. GEOSTATISTICS APPLIED TO THE STUDY OF SOIL PHYSIOCHEMICAL CHARACTERISTICS IN SEASONAL DECIDUOUS FOREST AREAS

    OpenAIRE

    Eleandro J. Brun; Carlos R. S. da Silva; Sandro Vaccaro; Rubens M. Rondon Neto

    2010-01-01

    Methods of geostatistics were used to identify the magnitude and structure of the spatial variability of some physiochemical attributes of soils under seasonal deciduous forest areas, which were called mature forest, secondary forest and "capoeirão". The areas, located in Santa Tereza, RS, were sampled during the period 2002-2003, comprising the soil classes Argiluvic Chernosol, Cambisol Ta and Litholic Neosol. Systematic sampling was performed with a regular grid with point spacing varyin...

  1. A Practical Primer on Geostatistics

    Science.gov (United States)

    Olea, Ricardo A.

    2009-01-01

    significant methodological implications. HISTORICAL REMARKS As a discipline, geostatistics was firmly established in the 1960s by the French engineer Georges Matheron, who was interested in the appraisal of ore reserves in mining. Geostatistics did not develop overnight. Like other disciplines, it has built on previous results, many of which were formulated with different objectives in various fields. PIONEERS Seminal ideas conceptually related to what today we call geostatistics or spatial statistics are found in the work of several pioneers, including: 1940s: A.N. Kolmogorov in turbulent flow and N. Wiener in stochastic processes; 1950s: D. Krige in mining; 1960s: B. Matérn in forestry and L.S. Gandin in meteorology. CALCULATIONS Serious applications of geostatistics require the use of digital computers. Although rudimentary implementation of most geostatistical techniques from scratch is fairly straightforward, coding programs from scratch is recommended only as a practice that may help users gain a better grasp of the formulations. SOFTWARE For professional work, the reader should employ software packages that have been thoroughly tested to handle any sampling scheme, that run as efficiently as possible, and that offer graphic capabilities for the analysis and display of results. This primer primarily employs the package Stanford Geostatistical Modeling Software (SGeMS) - recently developed at the Energy Resources Engineering Department at Stanford University - as a way to show how to obtain results practically. This applied side of the primer should not be interpreted as the notes being a manual for the use of SGeMS. The main objective of the primer is to help the reader gain an understanding of the fundamental concepts and tools in geostatistics. ORGANIZATION OF THE PRIMER The chapters of greatest importance are those covering kriging and simulation. All other materials are peripheral and are included for better comprehension of th

  2. Incorporating temporal variability to improve geostatistical analysis of satellite-observed CO2 in China

    Institute of Scientific and Technical Information of China (English)

    ZENG ZhaoCheng; LEI LiPing; GUO LiJie; ZHANG Li; ZHANG Bing

    2013-01-01

    Observations of atmospheric carbon dioxide (CO2) from satellites offer new data sources to understand global carbon cycling. The correlation structure of satellite-observed CO2 can be analyzed and modeled by geostatistical methods, and CO2 values at unsampled locations can be predicted with a correlation model. Conventional geostatistical analysis only investigates the spatial correlation of CO2, and does not consider temporal variation in the satellite-observed CO2 data. In this paper, a spatiotemporal geostatistical method that incorporates temporal variability is implemented and assessed for analyzing the spatiotemporal correlation structure and prediction of monthly CO2 in China. The spatiotemporal correlation is estimated and modeled by a product-sum variogram model with a global nugget component. The variogram result indicates a significant degree of temporal correlation within satellite-observed CO2 data sets in China. Prediction of monthly CO2 using the spatiotemporal variogram model and space-time kriging procedure is implemented. The prediction is compared with a spatial-only geostatistical prediction approach using a cross-validation technique. The spatiotemporal approach gives better results, with higher correlation coefficient (r2), and less mean absolute prediction error and root mean square error. Moreover, the monthly mapping result generated from the spatiotemporal approach has less prediction uncertainty and more detailed spatial variation of CO2 than those from the spatial-only approach.
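
The product-sum space-time variogram the study fits has the form gamma_st(h, u) = gamma_s(h) + gamma_t(u) - k * gamma_s(h) * gamma_t(u). A sketch with exponential marginals; all sills, ranges and the interaction parameter k are illustrative, not the paper's fitted values:

```python
import numpy as np

def gamma_exp(h, sill, a):
    """Exponential marginal semivariogram."""
    return sill * (1.0 - np.exp(-np.asarray(h, float) / a))

def gamma_st(h, u, k=0.5):
    """Product-sum space-time semivariogram (k must not exceed 1/max sill)."""
    gs = gamma_exp(h, sill=1.0, a=300.0)   # spatial lag, e.g. km
    gt = gamma_exp(u, sill=0.8, a=2.0)     # temporal lag, e.g. months
    return gs + gt - k * gs * gt

print(gamma_st(0.0, 0.0))                            # zero at zero lag
print(gamma_st(1000.0, 6.0) > gamma_st(100.0, 1.0))  # grows with both lags
```

The cross term is what lets the model express interaction between spatial and temporal correlation, which a purely additive (sum) model cannot.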

  3. Estimation of historical groundwater contaminant distribution using the adjoint state method applied to geostatistical inverse modeling

    Science.gov (United States)

    Michalak, Anna M.; Kitanidis, Peter K.

    2004-08-01

    As the incidence of groundwater contamination continues to grow, a number of inverse modeling methods have been developed to address forensic groundwater problems. In this work, the geostatistical approach to inverse modeling is extended to allow for the recovery of the antecedent distribution of a contaminant at a given point back in time, which is critical to the assessment of historical exposure to contamination. Such problems are typically strongly underdetermined, with a large number of points at which the distribution is to be estimated. To address this challenge, the computational efficiency of the new method is increased through the application of the adjoint state method. In addition, the adjoint problem is presented in a format that allows for the reuse of existing groundwater flow and transport codes as modules in the inverse modeling algorithm. As demonstrated in the presented applications, the geostatistical approach combined with the adjoint state method allows a historical multidimensional contaminant distribution to be recovered even in heterogeneous media, where a numerical solution is required for the forward problem.

  4. Multivariate analysis and geostatistics of the fertility of a humic rhodic hapludox under coffee cultivation

    Directory of Open Access Journals (Sweden)

    Samuel de Assis Silva

    2012-04-01

    Full Text Available The spatial variability of soil and plant properties exerts great influence on the yield of agricultural crops. This study analyzed the spatial variability of the fertility of a Humic Rhodic Hapludox under Arabica coffee, using principal component analysis, cluster analysis and geostatistics in combination. The experiment was carried out in an area under Coffea arabica L., variety Catucai 20/15 - 479. The soil was sampled to a depth of 0.20 m, at 50 points of a sampling grid. The following chemical properties were determined: P, K+, Ca2+, Mg2+, Na+, S, Al3+, pH, H + Al, SB, t, T, V, m, OM, Na saturation index (SSI), remaining phosphorus (P-rem), and micronutrients (Zn, Fe, Mn, Cu and B). The data were analyzed with descriptive statistics, followed by principal component and cluster analyses. Geostatistics was used to check and quantify the degree of spatial dependence of properties represented by principal components. The principal component analysis allowed a dimensional reduction of the problem, providing interpretable components with little information loss. Despite the information loss characteristic of principal component analysis, the combination of this technique with geostatistical analysis was efficient for quantifying and determining the structure of spatial dependence of soil fertility. In general, the availability of soil mineral nutrients was low and the levels of acidity and exchangeable Al were high.
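
The dimensional-reduction step described above can be sketched with a standardized PCA via SVD; the component scores at each sampling point can then be treated as regionalized variables for the variogram analysis. This is a generic illustration, not the authors' code:

```python
import numpy as np

def pca_scores(X):
    """PCA of a samples-by-properties matrix via SVD of standardized data.
    Returns the component scores and the explained-variance ratio."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each soil property
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U * s, s ** 2 / np.sum(s ** 2)
```

The first few columns of the score matrix capture most of the fertility variation and are mutually uncorrelated, which is what makes them convenient inputs for separate variogram fits.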

  5. Imprecise (fuzzy) information in geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  6. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

    The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a demonstration of a methodology that would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given.

  7. Geostatistical analysis of potentiometric data in Wolfcamp aquifer of the Palo Duro Basin, Texas

    International Nuclear Information System (INIS)

    This report details a geostatistical analysis of potentiometric data from the Wolfcamp aquifer in the Palo Duro Basin, Texas. Such an analysis is a part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Wolfcamp data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data
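
Analyses of this kind rest on a spherical semivariogram model and kriging of the potentiometric surface. The following is a minimal numpy sketch of those two building blocks (the linear-trend removal is omitted, and all parameter values are illustrative, not from the report):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical semivariogram with nugget; gamma(0) = 0 by convention."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a, nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3),
                 nugget + sill)
    return np.where(h == 0.0, 0.0, g)

def ordinary_kriging(xy, z, x0, model):
    """Solve the ordinary-kriging system (variogram form) for one target x0.
    Returns the estimate and the kriging (error) variance."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = model(d)
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = model(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z, w @ b            # estimate, kriging variance
```

With a zero nugget the estimator is exact at data locations and the kriging variance there is zero, which is the property that makes the point-by-point cross-validation check in the report meaningful.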

  8. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

    Basic text for graduate and advanced undergraduate students deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  9. Geostatistical analysis of field hydraulic conductivity in compacted clay

    Energy Technology Data Exchange (ETDEWEB)

    Rogowski, A.S.; Simmons, D.E.

    1988-05-01

    Hydraulic conductivity (K) of fractured or porous materials is associated intimately with water flow and chemical transport. Basic concepts imply uniform flux through a homogeneous cross-sectional area. If flow were to occur only through part of the area, actual rates could be considerably different. Because laboratory values of K in compacted clays seldom agree with field estimates, questions arise as to what the true values of K are and how they should be estimated. Hydraulic conductivity values were measured on a 10 x 25 m elevated bridge-like platform. A constant water level was maintained for 1 yr over a 0.3-m thick layer of compacted clay, and inflow and outflow rates were monitored using 10 x 25 grids of 0.3-m diameter infiltration rings and outflow drains subtending approximately 1 x 1 m blocks of compacted clay. Variography of inflow and outflow data established relationships between cores and blocks of clay, respectively. Because outflow rates were much lower than, and their distributions bore little resemblance to, the breakthrough rates based on tracer studies, the presence of macropores and preferential flow through the macropores was suspected. Subsequently, probability kriging was applied to reevaluate the distribution of flux rates and the possible location of macropores. Sites exceeding a threshold outflow of 100 x 10/sup -9/ m/s were classified as outliers and were assumed to probably contain a significant population of macropores. Different sampling schemes were examined. Variogram analysis of outflows with and without outliers suggested the adequacy of sampling the site at 50 randomly chosen locations. Because of the potential contribution of macropores to pollutant transport and the practical necessity of extrapolating small plot values to larger areas, conditional simulations with and without outliers were carried out.
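
Probability/indicator kriging starts from an indicator coding of the data against a threshold such as the 100 x 10/sup -9/ m/s outflow used above. A generic numpy sketch of an empirical indicator semivariogram follows; the function names and distance binning are illustrative, not taken from the study:

```python
import numpy as np

def indicator_semivariogram(xy, z, threshold, lag_edges):
    """Empirical semivariogram of the indicator I(z > threshold),
    averaged within distance bins given by lag_edges."""
    ind = (np.asarray(z) > threshold).astype(float)      # 1 = exceeds threshold
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    i, j = np.triu_indices(len(ind), k=1)                # all point pairs once
    dist, sq = d[i, j], (ind[i] - ind[j]) ** 2
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        m = (dist >= lo) & (dist < hi)
        gamma.append(0.5 * sq[m].mean() if m.any() else np.nan)
    return np.array(gamma)
```

Because indicator differences are 0 or 1, the semivariogram is bounded by 0.5; its structure describes how spatially clustered the threshold exceedances (here, suspected macropore sites) are.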

  10. Geostatistical methods for radiological evaluation and risk analysis of contaminated premises

    International Nuclear Information System (INIS)

    Full text: At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires the radiological assessment of residual activity levels of building structures. As stated by the IAEA, 'Segregation and characterization of contaminated materials are the key elements of waste minimization'. From this point of view, setting up an appropriate evaluation methodology is of prime importance. The radiological characterization of contaminated premises can be divided into three steps. First, a facility analysis, as exhaustive as possible, provides historical, functional and qualitative information. Then, a systematic (exhaustive or not) control of the emergent signal is performed by means of in situ measurement methods such as a surface control device combined with in situ gamma spectrometry. In addition, in order to assess the contamination depth, samples can be collected from boreholes at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data improve and reinforce the preliminary waste zoning. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. In this case, geostatistics provides reliable methods for activity estimation, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. The ability of this geostatistical framework to provide answers to several key issues that generally occur during the clean-up preparation phase is also discussed: How to optimise the investigation costs? How to deal with data quality issues? How to consistently take into account auxiliary information such as historical

  11. Geostatistical analysis of heavy metal distributions in soils; Geostatistische Analyse von Schwermetallverteilungen in Boeden

    Energy Technology Data Exchange (ETDEWEB)

    Geiler, H.; Dengel, H.S.; Donsbach, A.; Maurer, W. [Siegen Univ. (Gesamthochschule) (Germany). Analytische Chemie II; Knoblich, K.; Aschenbrenner, F. [Giessen Univ. (Germany). Inst. fuer Angewandte Geowissenschaften; Ostermann, R. [Siegen Univ. (Gesamthochschule) (Germany). Hochschulrechenzentrum

    1998-11-01

    Soil samples were taken from a test area of 1 km{sup 2} in a regular pattern at specified depths. The samples were analyzed in respect of heavy metal concentrations. The geologically homogeneous area is located close to the urban area of Siegen and covered with different forest vegetations and meadows. There is no direct influence of industrial emissions. The soil samples were taken at three different depths up to 1 m below surface and digested with aqua regia. The analysis comprised cadmium, chromium, copper, nickel, lead and zinc. Geostatistics was applied in order to ascertain the spatial distribution of heavy metal concentrations in the soil. This was accomplished by fitting the data to a suitable semivariogram and using kriging as an interpolation algorithm. As a result of the investigations, a minimum number of soil samples can be derived that is sufficient to meet a practical degree of accuracy in reproducing the concentration pattern of heavy metal concentrations in the soil. (orig.) [Translated from the German original] The soil of a 1 km{sup 2} model area was sampled on a grid and by depth and analyzed for its heavy metal content. The site, on the outskirts of Siegen, is geologically uniform; the vegetation consists of deciduous and coniferous forest as well as meadow, and the area is not under direct industrial influence. From three sampling depths down to 1 m, 100 soil samples each were analyzed, after aqua regia digestion, for cadmium, chromium, copper, nickel, lead and zinc. Geostatistical methods (fitting of semivariograms, estimation by kriging) were used to capture the spatial correlation and the distribution patterns of the measured heavy metal concentrations. This made it possible to determine, for practical accuracy requirements of area-wide soil assessment, a minimum number of sampling points that suffice for a representative statement about the

  12. Geostatistical analysis of potentiometric data in the Pennsylvanian aquifer of the Palo Duro Basin, Texas

    International Nuclear Information System (INIS)

    This report details a geostatistical analysis of potentiometric data from the Pennsylvanian aquifer in the Palo Duro Basin, Texas. Such an analysis is a part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Pennsylvanian data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data. The analysis is restricted to that portion of the Pennsylvanian aquifer that lies to the southwest of the Amarillo Uplift. The Pennsylvanian is absent in some areas across the uplift, and data to the northeast were not used in this analysis. The surfaces produced in that analysis are included for comparison. 9 refs., 15 figs

  13. Geostatistical Analysis of County-Level Lung Cancer Mortality Rates in the Southeastern United States.

    Science.gov (United States)

    Goovaerts, Pierre

    2010-01-01

    The analysis of health data and putative covariates, such as environmental, socioeconomic, demographic, behavioral, or occupational factors, is a promising application for geostatistics. Transferring methods originally developed for the analysis of earth properties to health science, however, presents several methodological and technical challenges. These arise because health data are typically aggregated over irregular spatial supports (e.g., counties) and consist of a numerator and a denominator (i.e., rates). This article provides an overview of geostatistical methods tailored specifically to the characteristics of areal health data, with an application to lung cancer mortality rates in 688 U.S. counties of the southeast (1970-1994). Factorial Poisson kriging can filter short-scale variation and noise, which can be large in sparsely populated counties, to reveal similar regional patterns for male and female cancer mortality that correlate well with proximity to shipyards. Rate uncertainty was transferred through local cluster analysis using stochastic simulation, allowing the computation of the likelihood of clusters of low or high cancer mortality. Accounting for population size and rate uncertainty led to the detection of new clusters of high mortality around Oak Ridge National Laboratory for both sexes, in counties with high concentrations of pig farms and paper mill industries for males (occupational exposure) and in the vicinity of Atlanta for females. PMID:20445829

  14. Geostatistical Analysis on the Temporal Patterns of the Yellow Rice Borer, Tryporyza incertulas

    Institute of Scientific and Technical Information of China (English)

    YUAN Zhe-ming; WANG Zhi; HU Xiang-yue

    2005-01-01

    In order to characterize the temporal pattern of the larval population of the yellow rice borer, Tryporyza incertulas, and provide information for its forecast model, data series for each generation and for the over-wintered larvae from 1960 to 1990 in Dingcheng District, Changde City, Hunan Province, were analyzed with geostatistics. The year-to-year series of the total population, the 1st generation, the 3rd generation and the over-wintered larvae showed relatively strong autocorrelation and predictability. The generation-to-generation series and the year-to-year series of the 2nd and 4th generations, however, showed poor autocorrelation; for the 4th generation the degree of autocorrelation was zero. The population dynamics of the yellow rice borer were clearly intermittent. A remarkable cycle of four generations, i.e. one year, was observed in the generation-to-generation series. Omitting a given generation, or interposing the over-wintered larvae, resulted in only a slight change in the autocorrelation of the whole generation-to-generation series. Cropping system, food, climate and natural enemies, therefore, played more important roles in regulating the population dynamics than the base number of larvae. The basic techniques of geostatistics applied to analyzing temporal population dynamics are outlined.
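
The temporal structure described above can be summarized with a one-dimensional empirical semivariogram; a strictly periodic series reproduces the reported four-generation cycle as a hole effect at lag 4. A generic sketch with synthetic data (not the study's series):

```python
import numpy as np

def temporal_semivariogram(series, max_lag):
    """Empirical semivariogram of a 1-D (e.g., generation-to-generation) series."""
    s = np.asarray(series, dtype=float)
    return np.array([0.5 * np.mean((s[lag:] - s[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])
```

For a series with an exact period of four generations, the semivariogram nearly vanishes at lags 4, 8, ... and peaks at the out-of-phase lags in between, which is the signature the abstract describes.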

  15. Geostatistical analysis of Landsat-TM lossy compression images in a high-performance computing environment

    Science.gov (United States)

    Pesquer, Lluís; Cortés, Ana; Serral, Ivette; Pons, Xavier

    2011-11-01

    The main goal of this study is to characterize the effects of lossy image compression procedures on the spatial patterns of remotely sensed images, as well as to test the performance of job distribution tools specifically designed for obtaining geostatistical parameters (variogram) in a High Performance Computing (HPC) environment. To this purpose, radiometrically and geometrically corrected Landsat-5 TM images from April, July, August and September 2006 were compressed using two different methods: Band-Independent Fixed-Rate (BIFR) and three-dimensional Discrete Wavelet Transform (3d-DWT) applied to the JPEG 2000 standard. For both methods, a wide range of compression ratios (2.5:1, 5:1, 10:1, 50:1, 100:1, 200:1 and 400:1, from soft to hard compression) were compared. Variogram analyses conclude that all compression ratios maintain the variogram shapes and that the higher ratios (more than 100:1) reduce the variance (sill parameter) by about 5%. Moreover, the parallel solution in a distributed environment demonstrates that HPC offers a suitable scientific test bed for time-demanding execution processes, as in geostatistical analyses of remote sensing images.

  16. A software tool for geostatistical analysis of thermal response test data: GA-TRT

    Science.gov (United States)

    Focaccia, Sara; Tinti, Francesco; Bruno, Roberto

    2013-09-01

    In this paper we present a new method (DCE - Drift and Conditional Estimation), coupling Infinite Line Source (ILS) theory with geostatistics, to interpret thermal response test (TRT) data, and the related user-friendly software implementation (GA-TRT). Many methods (analytical and numerical) currently exist to analyze TRT data. The innovation derives from the use of a probabilistic approach, able to overcome, without excessively complicated calculations, many interpretation problems that cannot be solved otherwise (choice of the guess value of ground volumetric heat capacity, identification of fluctuations in the recorded data, inability to provide a measure of the precision of the estimates obtained). The new procedure is based on a geostatistical drift analysis of temperature records, which leads to a precise estimate of the equivalent ground thermal conductivity (λg), confirmed by the calculation of its estimation variance. Afterwards, based on λg, a monovariate regression on the original data allows the identification of the theoretical relationship between ground volumetric heat capacity (cg) and borehole thermal resistance (Rb). By assuming a monovariate Probability Distribution Function (PDF) for each variable, the joint PDF conditional on the cg-Rb relationship is found; finally, the conditional expectation identifies the correct and optimal couple of estimated cg-Rb values.
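
For orientation, the classical ILS interpretation that the DCE method refines estimates the effective ground thermal conductivity from the slope of late-time fluid temperature against log time. A minimal sketch with synthetic data; the symbols `Q` (heat injection rate) and `H` (borehole length) and all numbers are illustrative assumptions:

```python
import numpy as np

def ils_lambda(t_hours, T_mean, Q, H):
    """Late-time infinite-line-source fit: T(t) ~ m*ln(t) + c, so the
    effective ground thermal conductivity is lambda = Q / (4*pi*H*m)."""
    m, _c = np.polyfit(np.log(t_hours), np.asarray(T_mean, dtype=float), 1)
    return Q / (4.0 * np.pi * H * m)
```

The geostatistical drift analysis in the paper replaces this single least-squares slope with a drift estimate whose estimation variance quantifies the precision of λg.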

  17. A geostatistical analysis of IBTS data for age 2 North Sea haddock ( Melanogrammus aeglefinus ) considering daylight effects

    DEFF Research Database (Denmark)

    Wieland, Kai; Rivoirard, J.

    2001-01-01

    A geostatistical analysis of age 2 North Sea haddock catches from the 1st quarter IBTS (International Bottom Trawl Survey) 1983-1997 is presented. IBTS standard abundance indices are routinely calculated in a way that does not account explicitly for the spatial distribution and night hauls...

  18. 4th International Geostatistics Congress

    CERN Document Server

    1993-01-01

    The contributions in this book were presented at the Fourth International Geostatistics Congress held in Tróia, Portugal, in September 1992. They provide a comprehensive account of the current state of the art of geostatistics, including recent theoretical developments and new applications. In particular, readers will find descriptions and applications of the more recent methods of stochastic simulation together with data integration techniques applied to the modelling of hydrocarbon reservoirs. In other fields there are stationary and non-stationary geostatistical applications to geology, climatology, pollution control, soil science, hydrology and human sciences. The papers also provide an insight into new trends in geostatistics, particularly the increasing interaction with many other scientific disciplines. This book is a significant reference work for practitioners of geostatistics both in academia and industry.

  19. Geostatistical stability analysis of co-depositional sand-thickened tailings embankments

    Energy Technology Data Exchange (ETDEWEB)

    Elkateb, T. [Thurber Engineering Ltd., Edmonton, AB (Canada); Chalaturnyk, R.; Robertson, P.K. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering

    2003-07-01

    Co-deposition is a novel technique for the disposal of thickened tailings, in which pockets of tailings are randomly distributed within a larger mass of sand. The oil sands industry of Alberta is currently considering using this technique. This paper describes the attempt that was made to assess the engineering behaviour of this tailings disposal system in a probabilistic analysis framework. Several realizations of co-depositional embankments were generated using geostatistical theories. In turn, the stability of the disposal system, expressed in terms of factors of safety against shear failure and the associated vertical deformations, was assessed using these realizations and FLAC software. Failure probabilities and vertical displacements proved sensitive to embankment characteristics, such as embankment height and side slopes, and to the undrained shear strength of the thickened tailings. The authors proposed an allowable failure probability of 17 per cent for these embankments to avoid irreparable excessive deformations. 11 refs., 1 tab., 8 figs.

  20. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  1. Thin sand modeling based on geostatistic, uncertainty and risk analysis in Zuata Principal field, Orinoco oil belt

    Energy Technology Data Exchange (ETDEWEB)

    Cardona, W.; Aranaga, R.; Siu, P.; Perez, L. [PDVSA Petroleos de Venezuela SA, Caracas (Venezuela, Bolivarian Republic of)

    2009-07-01

    The geological modelling of the Zuata Principal field in Venezuela, particularly the Junin Block 2 belonging to the Orinoco oil belt, is a challenge because of the presence of thin sand bodies in an unexploited zone. This paper presented the results obtained from a horizontal well that contacted 96 per cent of pay count sand in the field. Geostatistical modelling and sensitivity analysis were used for planning the well. The model was generated by processing and interpreting information from production and exploratory fishbones. Information provided by nearby wildcat wells suggested that the proposed area was not prospective. However, information provided by several exploratory fishbones offered some possibility of draining additional reserves. From the available information, facies models and an uncertainty analysis were built to determine statistically the best option: drill additional stratigraphic wells to obtain a more accurate characterization, or apply the already obtained model to drill a production well in the investigated area. The study showed that geological uncertainty does not only depend on how much information is available, but also on how this information can be processed and interpreted. Decision analysis provides a rational basis for dealing with risk and uncertainties. 4 refs., 7 tabs., 7 figs., 1 appendix.

  2. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    Energy Technology Data Exchange (ETDEWEB)

    Wahid, Ali, E-mail: ali.wahid@live.com; Salim, Ahmed Mohamed Ahmed, E-mail: mohamed.salim@petronas.com.my; Yusoff, Wan Ismail Wan, E-mail: wanismail-wanyusoff@petronas.com.my [Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 32610 Tronoh, Perak (Malaysia); Gaafar, Gamal Ragab, E-mail: gaafargr@gmail.com [Petroleum Engineering Division, PETRONAS Carigali Sdn Bhd, Kuala Lumpur (Malaysia)

    2016-02-01

    Geostatistics, or the statistical approach, is based on the study of temporal and spatial trends, and depends upon spatial relationships to model known information of variable(s) at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique, which facilitates the generation of a best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, a tectonostratigraphic chart with a (short-term) sea-level curve and the regional tectonics of the study area to establish the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built to estimate different parameters such as horizons, facies and porosities in the study area. Variograms were used to identify the spatial relationships between data, which helped to establish the depositional history of the North West (NW) Bonaparte Basin.

  4. Geostatistical analysis of variations in soil salinity in a typical irrigation area in Xinjiang, northwest China

    Institute of Scientific and Technical Information of China (English)

    Mamattursun Eziz; Mihrigul Anwar; XinGuo Li

    2016-01-01

    Characterizing spatial and temporal variability of soil salinity is tremendously important for a variety of agronomic and environmental concerns in arid irrigation areas. This paper reviews the characteristics and spatial and temporal variations of soil salinization in the Ili River Irrigation Area by applying a geostatistical approach. Results showed that: (1) the soil salinity varied widely, with maximum value of 28.10 g/kg and minimum value of 0.10 g/kg, and was distributed mainly at the surface soil layer. Anions were mainly SO42− and Cl−, while cations were mainly Na+ and Ca2+; (2) the abundance of salinity of the root zone soil layer for different land use types was in the following order: grassland > cropland > forestland. The abundance of salinity of root zone soil layers for different periods was in the following order: March > June > September; (3) the spherical model was the most suitable variogram model to describe the salinity of the 0–3 cm and 3–20 cm soil layers in March and June, and the 3–20 cm soil layer in September, while the exponential model was the most suitable variogram model to describe the salinity of the 0–3 cm soil layer in September. Relatively strong spatial and temporal structure existed for soil salinity due to lower nugget effects; and (4) the maps of kriged soil salinity showed that higher soil salinity was distributed in the central parts of the study area and lower soil salinity was distributed in the marginal parts. Soil salinity tended to increase from the marginal parts to the central parts across the study area. Applying the kriging method is very helpful in detecting the problematic areas and is a good tool for soil resources management. Managing efforts on the appropriate use of soil and water resources in such areas is very important for sustainable agriculture, and more attention should be paid to these areas to prevent future problems.
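
Choosing between spherical and exponential variogram models, as done above per soil layer and month, amounts to comparing the least-squares fit of each candidate model to the empirical semivariogram. A generic numpy illustration using a simple grid-search fit (all values synthetic, not from the study):

```python
import numpy as np

def spherical(h, sill, a):
    h = np.asarray(h, dtype=float)
    return np.where(h < a, sill * (1.5 * h / a - 0.5 * (h / a) ** 3), sill)

def exponential(h, sill, a):
    """Effective range a: about 95% of the sill is reached at h = a."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / a))

def best_fit(h, gamma_emp, model, sills, ranges):
    """Grid-search least-squares fit; returns (sill, range, sse)."""
    fits = [(s, a, float(np.sum((model(h, s, a) - gamma_emp) ** 2)))
            for s in sills for a in ranges]
    return min(fits, key=lambda f: f[2])
```

Whichever model attains the lower sum of squared errors against the binned empirical values would be retained, mirroring the per-layer model selection reported in the abstract.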

  5. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields, and is suitable for those working in nonlinear analysis.

  6. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    Science.gov (United States)

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary kriging interpolation, simple kriging interpolation and universal kriging interpolation) were used to interpolate the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R²) were applied to evaluate the accuracy of the different methods. The results show that the simple kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects increased from 2001 to 2013, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the top and bottom areas. Owing to changes in land use, the groundwater level also shows temporal variation: the average decline rate of the groundwater level in 2007-2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas, so the decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the shrinking farmland area and the development of water-saving irrigation have reduced agricultural water use, so the decline rate of the groundwater level in agricultural areas is not significant.
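
The cross-validation step used above to rank the interpolators can be illustrated with a minimal leave-one-out sketch. Here a simple IDW interpolator stands in for the seven methods, and the well coordinates and levels are synthetic, not the Beijing data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical observation wells: (x, y) in km and groundwater level in m.
xy = rng.uniform(0, 50, size=(30, 2))
z = 20 + 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + rng.normal(0, 0.5, 30)

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse-distance-weighted prediction at one location."""
    d = np.linalg.norm(xy_obs - xy_new, axis=1)
    if np.any(d == 0):
        return z_obs[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * z_obs) / np.sum(w)

# Leave-one-out cross-validation: predict each well from the remaining 29,
# then summarize accuracy with RMSE and mean error (ME).
errors = []
for i in range(len(z)):
    mask = np.arange(len(z)) != i
    zhat = idw(xy[mask], z[mask], xy[i])
    errors.append(zhat - z[i])
errors = np.array(errors)
rmse = np.sqrt(np.mean(errors ** 2))
me = np.mean(errors)
print(f"LOO RMSE = {rmse:.2f} m, ME = {me:.2f} m")
```

Running the same loop with each candidate interpolator and comparing the RMSE values is exactly the selection logic the abstract describes.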

  7. Geostatistical upscaling of rain gauge data to support uncertainty analysis of lumped urban hydrological models

    OpenAIRE

    Muthusamy, Manoranjan; Schellart, Alma; TAIT, Simon; B. M. Heuvelink, Gerard

    2016-01-01

    In this study we develop a method to estimate the spatially averaged rainfall intensity, together with the associated level of uncertainty, using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 m × 200 m urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of...

  8. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics and emphasizes on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  9. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of longitudinal data.

  10. IN SITU NON-INVASIVE SOIL CARBON ANALYSIS: SAMPLE SIZE AND GEOSTATISTICAL CONSIDERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2005-04-01

    I discuss a new approach for quantitative carbon analysis in soil based on inelastic neutron scattering (INS). Although this INS method is not simple, it offers critical advantages not available with other newly emerging modalities. The key advantages of the INS system include the following: (1) It is a non-destructive method, i.e., no samples of any kind are taken. A neutron generator placed above the ground irradiates the soil, stimulating carbon characteristic gamma-ray emission that is counted by a detection system also placed above the ground. (2) The INS system can undertake multielemental analysis, expanding its usefulness. (3) It can be used in either static or scanning modes. (4) The volume sampled by the INS method is large, with a large footprint; when operating in a scanning mode, the sampled volume is continuous. (5) Except for a moderate initial cost of about $100,000 for the system, no additional expenses are required for its operation over two to three years, after which the neutron generator has to be replenished with a new tube at an approximate cost of $10,000, regardless of the number of sites analyzed. In light of these characteristics, the INS system appears invaluable for monitoring changes in the carbon content in the field. For this purpose no calibration is required; by establishing a carbon index, changes in carbon yield can be followed over time in exactly the same location, thus giving a percent change. On the other hand, with calibration, it can be used to determine the carbon stock in the ground, thus estimating the soil's carbon inventory. However, this requires revising the standard practices for deciding upon the number of sites required to attain a given confidence level, in particular for the purposes of upward scaling. Then, geostatistical considerations should be incorporated to properly account for the averaging effects of the large volumes sampled by the INS system, which would require revising standard practices in the field for determining the number of spots to

  11. Analysis of Geostatistical and Deterministic Techniques in the Spatial Variation of Groundwater Depth in the North-western part of Bangladesh

    Directory of Open Access Journals (Sweden)

    Ibrahim Hassan

    2016-06-01

    Full Text Available Various geostatistical and deterministic techniques were used to analyse the spatial variation of groundwater depths. Two geostatistical methods, ordinary kriging and co-kriging, with four semivariogram models (spherical, exponential, circular and Gaussian), and four deterministic methods, inverse distance weighting (IDW), global polynomial interpolation (GPI), local polynomial interpolation (LPI) and radial basis functions (RBF), were used for the estimation of groundwater depths. The study area is in the three northwestern districts of Bangladesh. Groundwater depth data recorded from 132 observation wells in the study area over a period of 6 years (2004 to 2009) were considered for the analysis. The spatial interpolation of groundwater depths was then performed using the best-fit model, the geostatistical model selected by comparing the observed RMSE values predicted by the geostatistical and deterministic models and the empirical semivariogram models. Of the four semivariogram models, the spherical semivariogram with the co-kriging model was considered the best-fitted model for the study area. Results of a sensitivity analysis conducted on the input parameters show that the inputs have a strong influence on groundwater levels, and the statistical indicators RMSE and ME suggest that co-kriging with percolation works best in predicting the average groundwater table of the study area.
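
The ordinary kriging estimate mentioned above reduces to solving a small linear system. A minimal sketch with invented observations and an assumed exponential covariance model (sill and range are made-up numbers standing in for a fitted semivariogram):

```python
import numpy as np

# Exponential covariance consistent with a fitted semivariogram:
# C(h) = sill * exp(-3h / range), i.e. C(h) = sill - gamma(h) for a no-nugget model.
def cov(h, sill=2.0, rng_=300.0):
    return sill * np.exp(-3.0 * np.asarray(h, dtype=float) / rng_)

# Hypothetical groundwater-depth observations (x, y in metres; depth in metres).
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [120.0, 130.0], [60.0, 40.0]])
vals = np.array([12.1, 13.4, 11.8, 14.0, 12.9])
target = np.array([50.0, 50.0])

# Ordinary kriging system: data-to-data covariances augmented with a Lagrange
# multiplier row/column that forces the weights to sum to one (unbiasedness).
n = len(pts)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = cov(np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2))
A[n, :n] = A[:n, n] = 1.0
b = np.ones(n + 1)
b[:n] = cov(np.linalg.norm(pts - target, axis=1))

sol = np.linalg.solve(A, b)
w, mu = sol[:n], sol[n]
estimate = w @ vals
variance = cov(0.0) - w @ b[:n] - mu  # kriging (error) variance at the target
print(f"OK estimate = {estimate:.2f} m, kriging variance = {variance:.3f}")
```

Co-kriging extends the same system with cross-covariances to a secondary variable (percolation in this study).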

  12. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  13. Geostatistical Analysis of Spatial Variability of Mineral Abundance and Kd in Frenchman Flat, NTS, Alluvium

    Energy Technology Data Exchange (ETDEWEB)

    Carle, S F; Zavarin, M; Pawloski, G A

    2002-11-01

    LLNL hydrologic source term modeling at the Cambric site (Pawloski et al., 2000) showed that retardation of radionuclide transport is sensitive to the distribution and amount of radionuclide-sorbing minerals. While all mineralogic information available near the Cambric site was used in these early simulations (11 mineral abundance analyses from UE-5n and 9 from RNM-1), these older data sets were qualitative in nature, with detection limits too high to accurately measure many of the important radionuclide-sorbing minerals (e.g. iron oxide). Also, the sparse nature of the mineral abundance data permitted only a hypothetical description of the spatial distribution of radionuclide-sorbing minerals. Yet the modeling results predicted that the spatial distribution of sorbing minerals would strongly affect radionuclide transport. Clearly, additional data are needed to improve understanding of mineral abundances and their spatial distributions if model predictions in Frenchman Flat are to be defensible. This report evaluates new high-resolution quantitative X-ray diffraction (XRD) data on mineral distributions and their abundances from core samples recently collected from drill hole ER-5-4. A total of 94 samples from ER-5-4 were collected at various spacings to enable evaluation of spatial variability at a variety of spatial scales, as small as 0.3 meters and up to hundreds of meters. Additional XRD analyses obtained from drill holes UE-5n, ER-5-3, and U-11g-1 are used to augment evaluation of vertical spatial variability and permit some evaluation of lateral spatial variability. A total of 163 samples are evaluated. The overall goal of this study is to understand and characterize the spatial variation of sorbing minerals in Frenchman Flat alluvium using geostatistical techniques, with consideration for the potential impact on reactive transport of radionuclides. To achieve this goal requires an effort to ensure that plausible geostatistical models are used to

  14. Factorial kriging analysis - a geostatistical approach to improve reservoir characterization with seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Evaldo Cesario; Johann, Paulo R. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Remacre, Armando Zaupa [Universidade Estadual de Campinas, SP (Brazil)

    1999-07-01

    In this work, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way analogous to Spectral Analysis in the frequency domain. The incorporation of filtered attributes as a secondary variable in the Kriging system is discussed. Results prove that Factorial Kriging is an efficient technique for the filtering of seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and hence the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)

  15. A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data

    KAUST Repository

    Liang, Faming

    2013-03-01

    The Gaussian geostatistical model has been widely used in the modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
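
The subsampling idea can be sketched as follows. This is a deliberately simplified illustration, not the authors' exact algorithm: it estimates only the range parameter of an exponential covariance, uses a finite-difference gradient, and the gain sequence is untuned; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a "large" Gaussian spatial dataset with unit-sill exponential
# covariance and a true range of 15 (all numbers are made up for illustration).
N, true_range = 1000, 15.0
xy = rng.uniform(0.0, 100.0, size=(N, 2))
d_full = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
z = np.linalg.cholesky(np.exp(-d_full / true_range) + 1e-6 * np.eye(N)) @ rng.standard_normal(N)

def subsample_loglik(log_range, idx):
    """Gaussian log-likelihood on a small subsample only: a cheap m x m solve."""
    a = np.exp(log_range)
    d = np.linalg.norm(xy[idx][:, None] - xy[idx][None, :], axis=2)
    C = np.exp(-d / a) + 1e-8 * np.eye(len(idx))
    _, logdet = np.linalg.slogdet(C)
    zi = z[idx]
    return -0.5 * (logdet + zi @ np.linalg.solve(C, zi))

# Stochastic approximation: each iteration draws a fresh subsample of m = 50
# points, takes a finite-difference gradient, and applies a decreasing gain,
# so the full N x N covariance matrix is never formed during estimation.
log_range, m, eps = np.log(5.0), 50, 1e-3
for t in range(1, 301):
    idx = rng.choice(N, size=m, replace=False)
    grad = (subsample_loglik(log_range + eps, idx)
            - subsample_loglik(log_range - eps, idx)) / (2.0 * eps)
    log_range += np.clip(grad, -5.0, 5.0) * 0.02 / t ** 0.6

print(f"estimated range: {np.exp(log_range):.1f} (true value {true_range})")
```

Each iteration costs O(m³) instead of O(N³), which is the scalability argument the abstract makes.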

  16. Geostatistical Analysis of Population Density and the Change of Land Cover and Land Use in the Komadugu-Yobe River Basin in Nigeria

    Science.gov (United States)

    Tobar, I.; Lee, J.; Black, F. W.; Babamaaji, R. A.

    2014-12-01

    The Komadugu-Yobe River Basin in northeastern Nigeria is an important tributary of Lake Chad and has experienced significant changes in population density and land cover in recent decades. The present study focuses on the application of geostatistical methods to examine the land cover and population density dynamics in the river basin. The geostatistical methods include spatial autocorrelation, overlapping neighborhood statistics with Pearson's correlation coefficient, Moran's I analysis, and indicator variogram analysis with rose diagrams. The land cover and land use maps were constructed from USGS Landsat images and Globcover images from the European Space Agency. The target years of the analysis are 1970, 1986, 2000, 2005, and 2009. The calculation of net changes in land cover indicates significant variation in the changes of rainfed cropland, mosaic cropland, and grassland. Spatial autocorrelation and Moran's I analyses showed that the distribution of land cover is highly clustered. A new GIS geostatistical tool was designed to calculate the overlapping neighborhood statistics with Pearson's correlation coefficient between the land use/land cover and population density datasets. The 10 × 10 neighborhood cell unit showed a clear correlation between the variables in certain zones of the study area. The ranges calculated from the indicator variograms of land use/land cover and population density showed that cropland and sparse vegetation are most closely related to the spatial change of population density.
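
Moran's I, used above to test for clustering, is straightforward to compute directly. A sketch on a hypothetical 5 × 5 grid with rook-contiguity weights (the study's actual rasters are not reproduced here); a smooth gradient yields a strongly positive I, while a random shuffle of the same values does not:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I for values on n spatial units with weight matrix W (zero diagonal)."""
    z = values - values.mean()
    n = len(values)
    return n * np.sum(W * np.outer(z, z)) / (W.sum() * np.sum(z ** 2))

# Rook-contiguity weights on a hypothetical 5x5 grid of cells.
side = 5
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        k = i * side + j
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < side and 0 <= jj < side:
                W[k, ii * side + jj] = 1.0

# A smooth north-south gradient (clustered) versus a random shuffle of it.
clustered = np.repeat([1.0, 1.0, 5.0, 9.0, 9.0], side)
rng = np.random.default_rng(0)
random_vals = rng.permutation(clustered)

print("clustered I =", round(morans_i(clustered, W), 3))
print("shuffled  I =", round(morans_i(random_vals, W), 3))
```

Values of I near +1 indicate clustering, near 0 spatial randomness, and negative values dispersion.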

  17. Geostatistical Modeling of Pore Velocity

    Energy Technology Data Exchange (ETDEWEB)

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
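
The product error propagation mentioned above can be sketched with a first-order (delta-method) approximation. All values below are invented, and the relation v = K·i/θ (Darcy flux over effective porosity) is a common textbook form rather than the report's exact formulation:

```python
import math

# First-order (delta-method) propagation of kriging estimation errors through
# the pore-velocity relation v = K * i / theta.
K, sd_K = 3.0e-4, 0.8e-4      # hydraulic conductivity (m/s) and its kriging std. error
i, sd_i = 2.0e-3, 0.4e-3      # hydraulic gradient (-) and its kriging std. error
theta = 0.25                  # effective porosity, treated as known here

v = K * i / theta
# Assuming the K and i estimation errors are uncorrelated:
var_v = (i / theta) ** 2 * sd_K ** 2 + (K / theta) ** 2 * sd_i ** 2
sd_v = math.sqrt(var_v)
print(f"pore velocity = {v:.3e} m/s +/- {sd_v:.3e} (1 sigma)")
```

In the full geostatistical treatment, sd_K and sd_i are the kriging standard errors at each location, so the velocity uncertainty is itself mapped over the site.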

  18. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  19. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  20. Geostatistical Analysis of Tritium, 3H/3He Age and Noble Gas Derived Parameters in California Groundwater

    Science.gov (United States)

    Visser, A.; Singleton, M. J.; Moran, J. E.; Fram, M. S.; Kulongoski, J. T.; Esser, B. K.

    2014-12-01

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of the data set of tritium, dissolved noble gas and helium isotope analyses collected for the California State Water Resources Control Board's Groundwater Ambient Monitoring and Assessment (GAMA) and California Aquifer Susceptibility (CAS) programs. Over 4,000 tritium and noble gas analyses are available from wells across California. 25% of the analyzed samples contained less than 1 pCi/L of tritium, indicating that recharge occurred before 1950. The correlation length of tritium concentration is 120 km. Nearly 50% of the wells show a significant component of terrigenic helium. Over 50% of these samples show a terrigenic helium isotope ratio (Rter) that is significantly higher than the radiogenic helium isotope ratio (Rrad = 2×10^-8). Rter values of more than three times the atmospheric isotope ratio (Ra = 1.384×10^-6) are associated with known faults and volcanic provinces in Northern California. In the Central Valley, Rter varies from radiogenic to 2.25 Ra, complicating 3H/3He dating. Rter was mapped by kriging, showing a correlation length of less than 50 km. The local predicted Rter was used to separate tritiogenic from atmospheric and terrigenic 3He. Regional groundwater recharge areas, indicated by young groundwater ages, are located in the southern Santa Clara Basin, in the upper LA Basin, in the eastern San Joaquin Valley, and along unlined canals carrying Colorado River water. Recharge in California is dominated by agricultural return flows, river recharge and managed aquifer recharge rather than precipitation excess. Combined application of noble gases and other groundwater tracers reveals the impact of engineered groundwater recharge and proves invaluable for the study of complex groundwater systems. This work was performed under the

  1. Satellite Magnetic Residuals Investigated With Geostatistical Methods

    DEFF Research Database (Denmark)

    Fox Maule, Chaterine; Mosegaard, Klaus; Olsen, Nils

    2005-01-01

    The geomagnetic field varies on a variety of time- and length scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered as an estimate of the data uncertainty (which consists of measurement errors and unmodeled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyze the residuals of the Oersted (09d/04) field model (www.dsri.dk/Oersted/Field models/IGRF 2005 candidates/), which is based on 5 years of Ørsted and CHAMP data and includes secular variation and acceleration, as well as low-degree external (magnetospheric) and induced fields. The analysis is done in order to find the statistical behavior of the space-time structure of the residuals, as a proxy for the data covariances. Once...

  2. Geostatistical Methods in R

    Directory of Open Access Journals (Sweden)

    Adéla Volfová

    2012-10-01

    Full Text Available Geostatistics is a scientific field which provides methods for processing spatial data.  In our project, geostatistics is used as a tool for describing spatial continuity and making predictions of some natural phenomena. An open source statistical project called R is used for all calculations. Listeners will be provided with a brief introduction to R and its geostatistical packages and basic principles of kriging and cokriging methods. Heavy mathematical background is omitted due to its complexity. In the second part of the presentation, several examples are shown of how to make a prediction in the whole area of interest where observations were made in just a few points. Results of these methods are compared.

  3. Three-Dimensional Geostatistical Analysis of Rock Fracture Roughness and Its Degradation with Shearing

    Directory of Open Access Journals (Sweden)

    Nima Babanouri

    2013-12-01

    Full Text Available Three-dimensional surface geometry of rock discontinuities and its evolution with shearing are of great importance in understanding the deformability and hydro-mechanical behavior of rock masses. In the present research, surfaces of three natural rock fractures were digitized and studied before and after the direct shear test. The variography analysis of the surfaces indicated a strong non-linear trend in the data. Therefore, the spatial variability of rock fracture surfaces was decomposed to one deterministic component characterized by a base polynomial function, and one stochastic component described by the variogram of residuals. By using an image-processing technique, 343 damaged zones with different sizes, shapes, initial roughness characteristics, local stress fields, and asperity strength values were spatially located and clustered. In order to characterize the overall spatial structure of the degraded zones, the concept of ‘pseudo-zonal variogram’ was introduced. The results showed that the spatial continuity at the damage locations increased due to asperity degradation. The increase in the variogram range was anisotropic and tended to be higher in the shear direction; thus, the direction of maximum continuity rotated towards the shear direction. Finally, the regression-kriging method was used to reconstruct the morphology of the intact surfaces and degraded areas. The cross-validation error of interpolation for the damaged zones was found smaller than that obtained for the intact surface.
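
The decomposition above, a deterministic polynomial trend plus a stochastic residual characterized by its variogram, can be sketched on a synthetic surface (the fracture data themselves are not reproduced; the trend coefficients and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical digitized surface: heights z(x, y) on a grid, with a strong
# polynomial trend plus short-range stochastic roughness.
n = 40
x, y = np.meshgrid(np.arange(n, dtype=float), np.arange(n, dtype=float))
z = 0.05 * x - 0.03 * y + 0.002 * x * y + rng.normal(0, 0.2, (n, n))

# Deterministic component: fit a base polynomial (order-2 terms in x, y) by
# least squares; the stochastic component is the residual field.
X = np.column_stack([np.ones(n * n), x.ravel(), y.ravel(),
                     (x * x).ravel(), (y * y).ravel(), (x * y).ravel()])
coef, *_ = np.linalg.lstsq(X, z.ravel(), rcond=None)
resid = z.ravel() - X @ coef

# Empirical semivariogram along the x direction.
def semivariogram_x(field2d, max_lag):
    out = []
    for h in range(1, max_lag + 1):
        diffs = field2d[:, h:] - field2d[:, :-h]
        out.append(0.5 * np.mean(diffs ** 2))
    return np.array(out)

gamma_raw = semivariogram_x(z, 10)
gamma_res = semivariogram_x(resid.reshape(n, n), 10)
print("raw gamma at lag 10:", round(float(gamma_raw[-1]), 3))
print("residual gamma at lag 10:", round(float(gamma_res[-1]), 3))
```

The raw variogram keeps climbing because of the trend, while the residual variogram levels off near the noise variance, which is why the trend must be removed before variography and regression-kriging.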

  4. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    Science.gov (United States)

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account

  5. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned ... been driven by applied work. After laying out CA's standard practices of data treatment and analysis, this article takes up the role of comparison as a fundamental analytical strategy and reviews recent developments into cross-linguistic and cross-cultural directions. The remaining article focuses on learning and development. In conclusion, we address some emerging themes in the relationship of CA and applied linguistics, including the role of multilingualism, standard social science methods as research objects, CA's potential for direct social intervention, and increasing efforts to complement CA ...

  6. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  7. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  8. Geostatistical analysis of disease data: accounting for spatial support and population density in the isopleth mapping of cancer mortality risk using area-to-point Poisson kriging

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2006-11-01

    Full Text Available Abstract Background Geostatistical techniques that account for spatially varying population sizes and spatial patterns in the filtering of choropleth maps of cancer mortality were recently developed. Their implementation was facilitated by the initial assumption that all geographical units are the same size and shape, which allowed the use of geographic centroids in semivariogram estimation and kriging. Another implicit assumption was that the population at risk is uniformly distributed within each unit. This paper presents a generalization of Poisson kriging whereby the size and shape of administrative units, as well as the population density, are incorporated into the filtering of noisy mortality rates and the creation of isopleth risk maps. An innovative procedure to infer the point-support semivariogram of the risk from aggregated rates (i.e. areal data) is also proposed. Results The novel methodology is applied to age-adjusted lung and cervix cancer mortality rates recorded for white females in two contrasting county geographies: (1) the state of Indiana, which consists of 92 counties of fairly similar size and shape, and (2) four states in the Western US (Arizona, California, Nevada and Utah) forming a set of 118 counties that are vastly different geographical units. Area-to-point (ATP) Poisson kriging produces risk surfaces that are less smooth than the maps created by a naïve point kriging of empirical Bayesian smoothed rates. The coherence constraint of ATP kriging also ensures that the population-weighted average of risk estimates within each geographical unit equals the areal datum for this unit. Simulation studies showed that the new approach yields more accurate predictions and confidence intervals than point kriging of areal data where all counties are simply collapsed into their respective polygon centroids. Its benefit over point kriging increases as the county geography becomes more heterogeneous. Conclusion A major limitation of choropleth
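
The core of Poisson kriging, filtering noisy rates by adding a population-dependent error variance to the diagonal of the kriging system, can be sketched at point support. This is a simplification: full area-to-area/area-to-point kriging as in this paper additionally integrates the covariance over the unit geographies and population surfaces. All counts, populations and model parameters below are invented.

```python
import numpy as np

# Hypothetical county-level data: centroids (km), death counts, population at risk.
xy = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [50.0, 50.0], [25.0, 20.0]])
deaths = np.array([4.0, 50.0, 2.0, 120.0, 18.0])
pop = np.array([2000.0, 25000.0, 900.0, 60000.0, 9000.0])
rates = deaths / pop * 1e5                 # observed mortality rates per 100,000
m_star = deaths.sum() / pop.sum() * 1e5    # population-weighted mean rate

# Covariance of the underlying risk (an exponential model assumed fitted elsewhere).
def cov(h, sill=2000.0, rng_=35.0):
    return sill * np.exp(-3.0 * np.asarray(h, dtype=float) / rng_)

n = len(rates)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = cov(np.linalg.norm(xy[:, None] - xy[None, :], axis=2))
# Poisson kriging: an error variance m*/n_i (here in per-100,000^2 units) is
# added to the diagonal, so rates based on small populations are treated as
# noisier and are smoothed more strongly toward their neighbours.
A[:n, :n] += np.diag(m_star * 1e5 / pop)
A[n, :n] = A[:n, n] = 1.0  # Lagrange row/column: weights sum to one

b = np.ones(n + 1)
b[:n] = cov(np.linalg.norm(xy - xy[2], axis=1))  # estimate risk for small county 2

w = np.linalg.solve(A, b)[:n]
risk2 = w @ rates
print(f"raw rate, county 2: {rates[2]:.0f} per 100k; filtered risk: {risk2:.0f} per 100k")
```

Because county 2 has only 900 people at risk, its own rate receives little weight and the estimated risk is borrowed largely from neighbouring counties, which is exactly the small-number filtering the abstract describes.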

  9. 7th International Geostatistics Congress

    CERN Document Server

    Deutsch, Clayton

    2005-01-01

    The conference proceedings consist of approximately 120 technical papers presented at the Seventh International Geostatistics Congress held in Banff, Alberta, Canada in 2004. All the papers were reviewed by an international panel of leading geostatisticians. The five major sections are: theory, mining, petroleum, environmental and other applications. The first section showcases new and innovative ideas in the theoretical development of geostatistics as a whole; these ideas will have large impact on (1) the directions of future geostatistical research, and (2) the conventional approaches to heterogeneity modelling in a wide range of natural resource industries. The next four sections are focused on applications and innovations relating to the use of geostatistics in specific industries. Historically, mining, petroleum and environmental industries have embraced the use of geostatistics for uncertainty characterization, so these three industries are identified as major application areas. The last section is open...

  10. rasterEngine: an easy-to-use R function for applying complex geostatistical models to raster datasets in a parallel computing environment

    Science.gov (United States)

    Greenberg, J. A.

    2013-12-01

    As geospatial analyses progress in tandem with increasing availability of large complex geographic data sets and high performance computing (HPC), there is an increasing gap in the ability of end-user tools to take advantage of these advances. Specifically, the practical implementation of complex statistical models on large gridded geographic datasets (e.g. remote sensing analysis, species distribution mapping, topographic transformations, and local neighborhood analyses) currently requires a significant knowledge base. A user must be proficient in the chosen model as well as the nuances of scientific programming, raster data models, memory management, parallel computing, and system design. This is further complicated by the fact that many of the cutting-edge analytical tools were developed for non-geospatial datasets and are not part of standard GIS packages, but are available in scientific computing languages such as R and MATLAB. We present a computing function 'rasterEngine' written in the R scientific computing language and part of the CRAN package 'spatial.tools' with these challenges in mind. The goal of rasterEngine is to allow a user to quickly develop and apply analytical models within the R computing environment to arbitrarily large gridded datasets, taking advantage of available parallel computing resources, and without requiring a deep understanding of HPC and raster data models. We provide several examples of rasterEngine being used to solve common grid based analyses, including remote sensing image analyses, topographic transformations, and species distribution modeling. With each example, the parallel processing performance results are presented.
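
    The gap described above can be illustrated with a language-neutral toy (this is not the rasterEngine/spatial.tools code itself): a user-written chunk function is applied to row blocks of a grid in parallel, and the engine hides the splitting and reassembly. All names are invented; a real engine must also buffer chunk edges for 2-D focal windows, which is omitted here because the toy model operates within rows only.

```python
from concurrent.futures import ThreadPoolExecutor

def focal_mean3(chunk):
    """User-supplied model: 3-cell horizontal moving average within each row."""
    out = []
    for row in chunk:
        n = len(row)
        out.append([sum(row[max(0, j - 1):min(n, j + 2)]) /
                    len(row[max(0, j - 1):min(n, j + 2)]) for j in range(n)])
    return out

def raster_engine(grid, fn, n_chunks=4):
    """Split a grid into row chunks, apply `fn` to each chunk in parallel,
    and reassemble the result -- the user never touches the parallel code."""
    step = max(1, len(grid) // n_chunks)
    chunks = [grid[i:i + step] for i in range(0, len(grid), step)]
    with ThreadPoolExecutor() as pool:
        results = pool.map(fn, chunks)   # map preserves chunk order
    return [row for part in results for row in part]

grid = [[float(r * 10 + c) for c in range(10)] for r in range(8)]
out = raster_engine(grid, focal_mean3)
print(out[0][:3])
```

    The point of the design, as in the abstract, is that the analyst writes only `focal_mean3`; chunking, dispatch and reassembly are generic and reusable.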

  11. Modelling infiltration and geostatistical analysis of spatial variability of sorptivity and transmissivity in a flood spreading area

    Energy Technology Data Exchange (ETDEWEB)

    Haghighi-Fashi, F.; Sharifi, F.; Kamali, K.

    2014-06-01

    Knowledge of infiltration characteristics is useful in hydrological studies of agricultural soils. Soil hydraulic parameters such as the steady infiltration rate, sorptivity, and transmissivity can exhibit appreciable spatial variability. The main objectives of this study were to examine several mathematical models of infiltration and to analyze the spatial variability of the observed final infiltration rate and the estimated sorptivity and transmissivity in flood-spreading and control areas in Ilam province, Iran. The suitability of geostatistics for describing such spatial variability was assessed using data from 30 infiltration measurements sampled along three lines. The Horton model provided the most accurate simulation of infiltration considering all measurements, while the Philip two-term model provided a less accurate simulation. A comparison of the measured values and the estimated final infiltration rates showed that the Kostiakov-Lewis, Kostiakov, and SCS models could not estimate the final infiltration rate as well as the Horton model. The estimated sorptivity and transmissivity parameters of the Philip two-term model and the final infiltration rate had spatial structure, and were considered to be structural variables over the transect pattern. The Gaussian model provided the best-fit theoretical variogram for these three parameters. Variogram ranges varied from 99 m (sorptivity) and 88 m (final infiltration rate) to 686 m (spherical) and 384 m (Gaussian) for transmissivity. The sorptivity, transmissivity and final infiltration attributes showed a high degree of spatial dependence, being 0.99, 0.81 and 1, respectively. The results showed that kriging can be used to predict the studied parameters in the study area. (Author)
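
    For reference, the competing infiltration-rate equations compared in the study can be written compactly. The sketch below encodes the Horton, Philip two-term and Kostiakov models with invented parameter values, purely to show their contrasting behaviour (Horton decays to a steady rate fc, Philip to the constant A, Kostiakov to zero):

```python
import math

def horton(t, f0=30.0, fc=5.0, k=0.8):
    """Horton: f(t) = fc + (f0 - fc) * exp(-k t); decays to a steady rate fc."""
    return fc + (f0 - fc) * math.exp(-k * t)

def philip(t, S=12.0, A=4.0):
    """Philip two-term: f(t) = S / (2 sqrt(t)) + A, with sorptivity S and a
    transmissivity-related constant A."""
    return S / (2.0 * math.sqrt(t)) + A

def kostiakov(t, a=15.0, b=0.4):
    """Kostiakov: cumulative I(t) = a t**b, so f(t) = a b t**(b - 1);
    the rate has no finite steady-state value and decays to zero."""
    return a * b * t ** (b - 1.0)

# Rates (e.g. mm/h) at a few times (h); parameter values are illustrative only.
for t in (0.25, 1.0, 4.0):
    print(t, round(horton(t), 2), round(philip(t), 2), round(kostiakov(t), 2))
```

    The lack of a finite steady rate in the Kostiakov form is one reason such models can misestimate the final infiltration rate, consistent with the comparison reported above.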

  12. Catchments as space-time filters – a joint spatio-temporal geostatistical analysis of runoff and precipitation

    Directory of Open Access Journals (Sweden)

    J. O. Skøien

    2006-06-01

    Full Text Available In this paper catchments are conceptualised as linear space-time filters. Catchment area A is interpreted as the spatial support and the catchment response time T is interpreted as the temporal support of the runoff measurements. These two supports are related by T ~ A^κ, which embodies the space-time connections of the rainfall-runoff process from a geostatistical perspective. To test the framework, spatio-temporal variograms are estimated from about 30 years of quarter-hourly precipitation and runoff data from about 500 catchments in Austria. In a first step, spatio-temporal variogram models are fitted to the sample variograms for three catchment size classes independently. In a second step, variograms are fitted to all three catchment size classes jointly by estimating the parameters of a point/instantaneous spatio-temporal variogram model and aggregating (regularising) it to the spatial and temporal scales of the catchments. The exponential, Cressie-Huang and product-sum variogram models give good fits to the sample variograms of runoff, with dimensionless errors ranging from 0.02 to 0.03, and the model parameters are plausible. This indicates that the first-order effects of the spatio-temporal variability of runoff are indeed captured by conceptualising catchments as linear space-time filters. The scaling exponent κ is found to vary between 0.3 and 0.4 for the different variogram models.
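
    The regularisation step, aggregating a point/instantaneous variogram to a finite temporal support T derived from T ~ A^κ, can be sketched as follows. The variogram model and all parameter values are invented; only the mechanics (support averaging with the within-support correction) follow the standard geostatistical definition for equal supports.

```python
import math

def gamma_pt(h, sill=1.0, rng=24.0):
    """Point/instantaneous exponential variogram, temporal lag h in hours."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def avg_gamma(support_a, support_b, n=50):
    """Mean point variogram between two temporal supports (intervals),
    discretised with n points each."""
    (a0, a1), (b0, b1) = support_a, support_b
    pts_a = [a0 + (a1 - a0) * (i + 0.5) / n for i in range(n)]
    pts_b = [b0 + (b1 - b0) * (i + 0.5) / n for i in range(n)]
    return sum(gamma_pt(abs(p - q)) for p in pts_a for q in pts_b) / (n * n)

def regularised_gamma(lag, T):
    """Variogram of T-hour averages at a given lag: for two equal supports,
    the within-support average is subtracted from the between-support one."""
    v1, v2 = (0.0, T), (lag, lag + T)
    return avg_gamma(v1, v2) - avg_gamma(v1, v1)

def response_time(area_km2, c=1.0, kappa=0.35):
    """Catchment response time from area via T ~ c * A^kappa (values invented)."""
    return c * area_km2 ** kappa

for A in (10.0, 100.0, 1000.0):
    T = response_time(A)
    print(round(T, 2), round(regularised_gamma(12.0, T), 3))
```

    Larger catchments imply longer temporal supports, and averaging over a longer support removes more variability, so the regularised variogram flattens with catchment size, which is the filtering behaviour the paper exploits.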

  14. Sediment distribution pattern mapped from the combination of objective analysis and geostatistics in the large shallow Taihu Lake, China

    Institute of Scientific and Technical Information of China (English)

    LUO Lian-cong; QIN Bo-qiang; ZHU Guang-wei

    2004-01-01

    Investigation was made into sediment depth at 723 irregularly scattered measurement points covering all regions of Taihu Lake, China. A combination of a successive correction scheme and geostatistical methods was used to obtain values of recent sediment thickness at 69×69 grid points over the whole lake. The results showed a significant difference in sediment depth between the eastern and western areas: most of the sediments are located along the western shoreline and in the northern regions, with only a little in the central and eastern parts. A notable exception is the patch between the lake centre and Xishan Island, where the maximum sediment depth is more than 4.0 m. This sediment distribution pattern is most likely related to the circulation pattern induced by the prevailing wind forcing in Taihu Lake; numerical simulation of the hydrodynamics strongly supports this conclusion. Sediment effects on water quality were also studied, and the results showed that the concentrations of TP, TN and SS in the western part are markedly higher than those in the eastern region, which suggests that more nutrients can be released from areas with thicker sediments.

  15. Geostatistics for fracture characterization

    International Nuclear Information System (INIS)

    As the critical role of fractures in fluid flow and contaminant transport studies has become more apparent, the characterization of fracture networks has received considerable attention in a wide variety of applications, such as nuclear waste repository design. The application of geostatistics to fracture characterization has traditionally involved modelling fractures as thin disks; assumptions about the frequency, orientation, length and width of these disks allow the construction of a 3D model of the fracture network. This paper examines alternatives whose statistical parameters are more relevant for contaminant transport studies and are also easier to infer and validate. A new algorithm for conditional simulation is presented, one that is able to honor multipoint statistics through annealing. By honoring statistics that cannot be captured by two-point spatial covariances, this algorithm offers an important new tool not only for the specific problem of fracture characterization but also for the more general problem of spatial simulation.

  16. Geostatistical analysis of the spatial pattern of the citrus red mite, Panonychus citri (McGregor) (Acarina: Tetranychidae), in a citrus orchard

    Institute of Scientific and Technical Information of China (English)

    李志强; 梁广文; 岑伊静

    2008-01-01

    The citrus red mite, Panonychus citri (McGregor), is a key pest of citrus. Geostatistical methods were applied to study the spatial pattern of the citrus red mite population in a citrus orchard, using the spatial analysis software Variowin 2.1. The results indicated that the spatial pattern of the citrus red mite population can be described by geostatistical methods: the population showed spatial correlation, and its semivariograms mainly fitted Gaussian models, with ranges of 1.1-21.0 m. The population showed an aggregated distribution, and the aggregation intensities were relatively strong in March, August and September. The spatial pattern dynamics showed two occurrence peaks of the population, in April and October; in October especially, the population diffused rapidly. March and September are therefore two crucial stages for monitoring and treatment of the citrus red mite.

  17. Geostatistical analysis to identify hydrogeochemical processes in complex aquifers: a case study (Aguadulce unit, Almeria, SE Spain).

    Science.gov (United States)

    Daniele, Linda; Pulido Bosch, Antonio; Vallejos, Angela; Molina, Luis

    2008-06-01

    The Aguadulce aquifer unit in southeastern Spain is a complex hydrogeological system because of the varied lithology of the aquifer strata and the variability of the processes that can take place within the unit. Factorial analysis of the data allowed the number of variables to be reduced to 3 factors, which were found to be related to such physico-chemical processes as marine intrusion and leaching of saline deposits. Variographic analysis was applied to these factors, culminating in a study of spatial distribution using ordinary kriging. Mapping of the factors allowed rapid differentiation of some of the processes that affect the waters of the Gador carbonate aquifer within the Aguadulce unit, without the need to resort to purely hydrogeochemical techniques. The results indicate the existence of several factors related to salinity: marine intrusion, paleowaters, and/or leaching of marls and evaporitic deposits. The techniques employed are effective, and the results conform to those obtained using hydrogeochemical methods (vertical records of conductivity and temperature, ion ratios, and others). The findings of this study confirm that the application of such analytical methods can provide a useful assessment of factors affecting groundwater composition. PMID:18686503

  18. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    International Nuclear Information System (INIS)

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A ''Geostatistics Test Problem'' is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1

  19. Conversation Analysis and Applied Linguistics.

    Science.gov (United States)

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographic guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  20. Analysis of the spatial variability of soil nutrients in a karst region based on the Geostatistical Analyst module

    Institute of Scientific and Technical Information of China (English)

    陆锦; 任晓冬; 刘洪云

    2015-01-01

    The spatial variability of soil nutrients in a karst region was studied with the Geostatistical Analyst module of the ArcGIS platform, taking Weizhai village, Qianxi County, Guizhou Province as the study area. Soil samples of cultivated land were taken at a depth of 0-20 cm; the study area was divided into two sections, sampled on 25 m × 25 m and 50 m × 50 m grids respectively. Total nitrogen, available phosphorus, available potassium, organic matter content and pH were determined from the samples. GIS-based geostatistical analysis and conventional statistics were used to analyze the spatial variability of soil pH and nutrients (total nitrogen, available phosphorus, available potassium, organic matter) and to determine a reasonable number of samples. The results showed that total nitrogen, available phosphorus, available potassium, pH and organic matter all followed log-normal distributions and all exhibited moderate variability; the optimal sampling scheme in the study area was the 50 m × 50 m grid.

  1. Spatial Downscaling of TRMM Precipitation Using Geostatistics and Fine Scale Environmental Variables

    Directory of Open Access Journals (Sweden)

    No-Wook Park

    2013-01-01

    Full Text Available A geostatistical downscaling scheme is presented that can generate fine-scale precipitation information from coarse-scale Tropical Rainfall Measuring Mission (TRMM) data by incorporating auxiliary fine-scale environmental variables. Within the geostatistical framework, the TRMM precipitation data are first decomposed into trend and residual components. Quantitative relationships between coarse-scale TRMM data and environmental variables are then estimated via regression analysis and used to derive trend components at a fine scale. Next, the residual components, which are the differences between the trend components and the original TRMM data, are downscaled to a target fine scale via area-to-point kriging. The trend and residual components are finally added to generate fine-scale precipitation estimates. Stochastic simulation is also applied to the residual components in order to generate multiple alternative realizations and to compute uncertainty measures. In an experiment using a digital elevation model (DEM) and the normalized difference vegetation index (NDVI), the geostatistical downscaling scheme generated downscaling results that reflected detailed characteristics and showed better predictive performance than downscaling without the environmental variables. Multiple realizations and uncertainty measures from simulation also provided useful information for interpretation and further environmental modeling.
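
    The three-step scheme (regression trend, residual interpolation, recombination) can be sketched in simplified form. The toy below substitutes inverse-distance weighting for area-to-point kriging and uses a single invented auxiliary variable (elevation), so it illustrates the workflow rather than the paper's exact method; all data are invented.

```python
def ols(xs, ys):
    """Least-squares slope/intercept for the coarse-scale trend model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def idw(coords, values, target, power=2.0):
    """Inverse-distance interpolation (simplified stand-in for
    area-to-point kriging of the residuals)."""
    num = den = 0.0
    for (x, y), v in zip(coords, values):
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v          # exact hit on a coarse-cell centre
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Coarse cells: centre coordinates, TRMM-like precipitation, mean elevation.
coarse_xy = [(0, 0), (0, 1), (1, 0), (1, 1)]
precip = [120.0, 95.0, 140.0, 105.0]
elev = [300.0, 800.0, 100.0, 600.0]

a, b = ols(elev, precip)                                  # step 1: trend
resid = [p - (a + b * e) for p, e in zip(precip, elev)]   # step 2: residuals

def downscale(fine_xy, fine_elev):
    """Step 3: fine-scale trend plus interpolated residual."""
    return a + b * fine_elev + idw(coarse_xy, resid, fine_xy)

print(round(downscale((0.5, 0.5), 450.0), 1))
```

    At a coarse-cell centre the trend plus interpolated residual reproduces the coarse value exactly; this is a crude analogue of the coherence property that area-to-point kriging provides by construction.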

  2. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of the literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of the literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  3. Geostatistical and stratigraphic analysis of deltaic reservoirs from the Reconcavo Basin, Brazil; Analise estratigrafica e geoestatistica de reservatorios deltaicos da Bacia do Reconcavo (BA)

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Carlos Moreira

    1997-07-01

    This study presents the characterization of the external geometry of deltaic oil reservoirs, including the description of their areal distribution using geostatistical tools such as variography and kriging. A high-resolution stratigraphic study was developed over a 25 km{sup 2} area, using data from 276 closely-spaced wells of an oil-producing field from the Reconcavo Basin, northeastern Brazil. The studied succession records the progressive lacustrine transgression of a deltaic environment. Core data and stratigraphic cross-sections suggest that the oil reservoirs are mostly amalgamated delta-front lobes and, subordinately, crevasse deposits. Some important geometrical elements were recognized by the detailed variographic analysis developed for each stratigraphic unit (zone). The average width of the groups of deltaic lobes of one zone was measured from the variographic feature informally named the 'hole effect'. This procedure was not possible for the other zones due to the intense lateral amalgamation of sandstones, indicated by many nested variographic structures. Kriged net-sand maps for the main zones suggest a NNW-SSE orientation for the deltaic lobes, as well as their common amalgamation and compensation arrangements. High-resolution stratigraphic analyses should include a more regional characterization of the depositional system that comprises the studied succession. On the other hand, geostatistical studies should be developed only after recognition of the depositional processes acting in the study area and of the geological meaning of the variable to be treated, including its spatial variability scales as a function of sand-body thickness, orientation and amalgamation. (author)

  4. Multivariate Geostatistical Analysis of Uncertainty for the Hydrodynamic Model of a Geological Trap for Carbon Dioxide Storage. Case study: Multilayered Geological Structure Vest Valcele, ROMANIA

    Science.gov (United States)

    Scradeanu, D.; Pagnejer, M.

    2012-04-01

    The purpose of this work is to evaluate the uncertainty of the hydrodynamic model for a multilayered geological structure, a potential trap for carbon dioxide storage. The hydrodynamic model is based on a conceptual model of the multilayered hydrostructure with three components: 1) a spatial model; 2) a parametric model; and 3) an energy model. The data needed for the three components of the conceptual model were obtained from 240 boreholes explored by geophysical logging and seismic investigation (for the first two components) and from an experimental water-injection test (for the last one). The hydrodynamic model is a finite-difference numerical model based on a 3D stratigraphic model with nine stratigraphic units (Badenian and Oligocene) and a 3D multiparameter model (porosity, permeability, hydraulic conductivity, storage coefficient, leakage etc.). The uncertainty of the two 3D models was evaluated using multivariate geostatistical tools: a) the cross-semivariogram for structural analysis, especially the study of anisotropy, and b) cokriging, to reduce estimation variances in the specific situation where a variable is cross-correlated with one or more undersampled variables. Important differences were identified between univariate and bivariate anisotropy. The minimised uncertainty of the parametric model (by cokriging) was transferred to the hydrodynamic model. The uncertainty distribution of the pressures generated by the water-injection test was additionally filtered by the sensitivity of the numerical model. The resulting relative errors of the pressure distribution in the hydrodynamic model are 15-20%. The scientific research was performed in the frame of the European FP7 project "A multiple space and time scale approach for the quantification of deep saline formation for CO2 storage (MUSTANG)".

  5. Geostatistical and Statistical Classification of Sea-Ice Properties and Provinces from SAR Data

    Directory of Open Access Journals (Sweden)

    Ute C. Herzfeld

    2016-07-01

    Full Text Available Recent drastic reductions in the Arctic sea-ice cover have raised an interest in understanding the role of sea ice in the global system as well as pointed out a need to understand the physical processes that lead to such changes. Satellite remote-sensing data provide important information about remote ice areas, and Synthetic Aperture Radar (SAR) data have the advantages of penetration of the omnipresent cloud cover and of high spatial resolution. A challenge addressed in this paper is how to extract information on sea-ice types and sea-ice processes from SAR data. We introduce, validate and apply geostatistical and statistical approaches to automated classification of sea ice from SAR data, to be used as individual tools for mapping sea-ice properties and provinces or in combination. A key concept of the geostatistical classification method is the analysis of spatial surface structures and their anisotropies, more generally, of spatial surface roughness, at variable, intermediate-sized scales. The geostatistical approach utilizes vario parameters extracted from directional vario functions; the parameters can be mapped or combined into feature vectors for classification. The method is flexible with respect to window sizes and parameter types and detects anisotropies. In two applications to RADARSAT and ERS-2 SAR data from the area near Point Barrow, Alaska, it is demonstrated that vario-parameter maps may be utilized to distinguish regions of different sea-ice characteristics in the Beaufort Sea, the Chukchi Sea and in Elson Lagoon. In a third and fourth case study the analysis is taken further by utilizing multi-parameter feature vectors as inputs for unsupervised and supervised statistical classification. Field measurements and high-resolution aerial observations serve as the basis for validation of the geostatistical-statistical classification methods. A combination of supervised classification and vario-parameter mapping yields best results
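
    The core of the geostatistical classifier, directional vario functions, reduces to a simple computation. The sketch below estimates an empirical semivariogram at one lag along two directions of a synthetic anisotropic grid (invented data standing in for SAR backscatter); the strong row/column contrast is the kind of anisotropy signal the vario parameters capture.

```python
def dir_semivariogram(grid, lag, direction):
    """Empirical semivariogram at one lag along 'x' (within rows) or 'y'
    (across rows): half the mean squared difference over all pixel pairs."""
    rows, cols = len(grid), len(grid[0])
    diffs = []
    if direction == "x":
        for r in range(rows):
            for c in range(cols - lag):
                diffs.append((grid[r][c] - grid[r][c + lag]) ** 2)
    else:
        for r in range(rows - lag):
            for c in range(cols):
                diffs.append((grid[r][c] - grid[r + lag][c]) ** 2)
    return 0.5 * sum(diffs) / len(diffs)

# Synthetic "ridged" surface: smooth along rows, striped across rows,
# mimicking anisotropic surface roughness.
grid = [[(r % 2) * 2.0 + 0.01 * c for c in range(20)] for r in range(20)]

gx = dir_semivariogram(grid, 1, "x")
gy = dir_semivariogram(grid, 1, "y")
print(round(gx, 5), round(gy, 5))  # gy >> gx reveals the anisotropy
```

    Evaluating such directional semivariograms in moving windows and extracting parameters (sill, range, direction of maximum variability) yields exactly the kind of per-window feature vectors the abstract describes.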

  6. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate the understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Special topics of interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  7. Application of factor analysis and geostatistics to geochemical data analysis

    Institute of Scientific and Technical Information of China (English)

    祁轶宏; 李晓晖; 霍立新

    2013-01-01

    Based on soil geochemical data from the Tongling mining camp, this paper extracted a number of major factors from the geochemical data using factor analysis, and carried out spatial variability analysis and interpolation for each major factor using geostatistical methods. The results indicate that each major factor obtained from factor analysis corresponds to different ore-forming information, and that the combined use of factor analysis, geostatistical analysis and interpolation can better display the spatial distribution trend of each major factor score and its correlation with known mineralization information, thereby serving metallogenic prediction and mineral exploration.

  8. GEOSTATISTICAL ANALYSIS OF SURFACE TEMPERATURE AND IN-SITU SOIL MOISTURE USING LST TIME-SERIES FROM MODIS

    OpenAIRE

    Sohrabinia, M.; W. Rack; P. Zawar-Reza

    2012-01-01

    The objective of this analysis is to provide a quantitative estimate of the fluctuations of land surface temperature (LST) with varying near surface soil moisture (SM) on different land-cover (LC) types. The study area is located in the Canterbury Plains in the South Island of New Zealand. Time series of LST from the MODerate resolution Imaging Spectro-radiometer (MODIS) have been analysed statistically to study the relationship between the surface skin temperature and near-surface S...

  9. Geostatistical enhancement of european hydrological predictions

    Science.gov (United States)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP deals with the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream-network, which has attracted increasing scientific attention in the last decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream-gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a

  10. Geostatistical noise filtering of geophysical images : application to unexploded ordnance (UXO) sites.

    Energy Technology Data Exchange (ETDEWEB)

    Saito, Hirotaka; McKenna, Sean Andrew; Coburn, Timothy C. (Abilene Christian University, Abilene, TX,)

    2004-07-01

    Geostatistical and non-geostatistical noise filtering methodologies, factorial kriging and a low-pass filter, and a region growing method are applied to analytic signal magnetometer images at two UXO contaminated sites to delineate UXO target areas. Overall delineation performance is improved by removing background noise. Factorial kriging slightly outperforms the low-pass filter but there is no distinct difference between them in terms of finding anomalies of interest.
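
    Of the two filters compared, the non-geostatistical baseline is easy to reproduce: a 3×3 moving-average low-pass filter applied to a synthetic magnetometer-like grid with background noise and one compact anomaly (all data invented; factorial kriging, which instead filters a chosen nested variogram component, is not reproduced here).

```python
import random

def low_pass3(grid):
    """3x3 moving-average filter; edge cells use the available neighbours."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [grid[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out

random.seed(7)
# Background noise plus one compact, UXO-like anomaly centred at (10, 10).
grid = [[random.gauss(0.0, 1.0) for _ in range(21)] for _ in range(21)]
for r in range(9, 12):
    for c in range(9, 12):
        grid[r][c] += 8.0

smooth = low_pass3(grid)
print(round(smooth[10][10], 2), round(smooth[0][0], 2))
```

    After smoothing, the background variance drops while the compact anomaly survives, which is why removing background noise improves target-area delineation in both filtering approaches compared above.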

  11. Concept analysis of culture applied to nursing.

    Science.gov (United States)

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  12. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  13. Book review: Spatial statistics and geostatistics

    OpenAIRE

    Clift, Hamish

    2013-01-01

    "Spatial Statistics and Geostatistics." Yongwan Chun and Daniel A. Griffith. SAGE. January 2013. --- This book aims to explain and demonstrate techniques in spatial sampling, local statistics, and advanced topics including Bayesian methods, Monte Carlo simulation, error and uncertainty. Spatial Statistics and Geostatistics is highly recommended to researchers in geography, environmental science, health and epidemiology, population and demography, and planning, writes Hamish Clift.

  14. The role of geostatistics in medical geology

    Science.gov (United States)

    Goovaerts, Pierre

    2014-05-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, and economic or socio-demographic factors. Medical geology, on the other hand, is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies have, however, assessed the risks associated with exposure to low levels of arsenic (say prostate cancer mortality, including a study of populations residing in Utah. The information available for the present ecological study (i.e. analysis of aggregated health outcomes) consists of: 1) 9,188 arsenic concentrations measured at 8,212 different private wells that were sampled between 1993 and 2002, 2) prostate cancer incidence recorded at the township level over the period 1985-2002, and 3) block-group population density that served as proxy for

  15. Source Apportionment of Heavy Metals in Soils Using Multivariate Statistics and Geostatistics

    Institute of Scientific and Technical Information of China (English)

    QU Ming-Kai; LI Wei-Dong; ZHANG Chuan-Rong; WANG Shan-Qin; YANG Yong; HE Li-Yuan

    2013-01-01

    The main objectives of this study were to introduce an integrated method for effectively identifying soil heavy metal pollution sources and apportioning their contributions, and to apply it to a case study. The method combines the principal component analysis/absolute principal component scores (PCA/APCS) receptor model and geostatistics. The case study was conducted in an area of 31 km2 in the urban-rural transition zone of Wuhan, a metropolis of central China. 124 topsoil samples were collected for measuring the concentrations of eight heavy metal elements (Mn, Cu, Zn, Pb, Cd, Cr, Ni and Co). PCA results revealed that three major factors were responsible for soil heavy metal pollution, initially identified as "steel production", "agronomic input" and "coal consumption". The APCS technique, combined with multiple linear regression analysis, was then applied for source apportionment. Steel production appeared to be the main source for Ni, Co, Cd, Zn and Mn; agronomic input for Cu; and coal consumption for Pb and Cr. Geostatistical interpolation using ordinary kriging was finally used to map the spatial distributions of the pollution-source contributions and to further confirm the interpretation of the results. The introduced method appears to be an effective tool for soil pollution source apportionment and identification, and might provide valuable reference information for pollution control and environmental management.
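
    The PCA/APCS receptor-model chain described above can be sketched on simulated data; the three latent sources, their loadings, and the sample counts below are hypothetical, and the final kriging step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Hypothetical concentrations of 8 metals at 124 sites, driven by 3 latent sources.
sources = rng.gamma(2.0, 1.0, size=(124, 3))
loadings = rng.uniform(0.1, 1.0, size=(3, 8))
X = sources @ loadings + rng.normal(0.0, 0.05, size=(124, 8))

# Standardize, then PCA to recover the major factors.
Z = (X - X.mean(0)) / X.std(0)
pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)

# APCS: rescore an artificial zero-concentration sample and subtract it,
# turning centred PC scores into absolute source contributions.
z0 = (np.zeros(8) - X.mean(0)) / X.std(0)
apcs = scores - pca.transform(z0[None, :])

# Regress each element's concentration on the APCS to apportion sources.
model = LinearRegression().fit(apcs, X)
share = model.coef_ * apcs.mean(0)   # mean contribution of each source per element
print(np.round(pca.explained_variance_ratio_.sum(), 2))
```

    The regression coefficients times the mean APCS give each source's average contribution per element, which is the quantity the kriging step would then map spatially.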

  16. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  17. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  18. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  19. Applied Data Analysis in Energy Monitoring System

    Directory of Open Access Journals (Sweden)

    Kychkin А.V.

    2016-08-01

    Full Text Available A software and hardware system is presented as an example of building energy monitoring for multi-sectional lighting and climate control/conditioning needs. The system's key feature is applied analysis of office energy data, which allows the work mode of each type of localized hardware to be recognized. Recognition is based on the general energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis comprises a primary data processing block, a smoothing filter, a time stamp identification block, clusterization and classification blocks, a state change detection block, and a statistical data calculation block. The energy consumed in a time slot and the slot's time stamp are taken as the main parameters for work mode classification. Experimental results of the applied analysis of energy data over a chosen time period are provided, using HIL and the OpenJEVis visualization system. Energy consumption, workload calculation and identification of eight different states were carried out for two lighting sections and one emulated climate control/conditioning system, based on the integral energy consumption profile. The research was supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».
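
    The processing chain listed above (smoothing filter, slot-energy classification, state-change detection) can be illustrated with a toy sketch; the signal, slot length, power levels and two-mode classifier are hypothetical simplifications of the blocks the abstract describes.

```python
import numpy as np

# Hypothetical 1-minute power readings (W): idle vs. active lighting section.
signal = np.concatenate([np.full(60, 20.0), np.full(120, 180.0), np.full(60, 20.0)])
signal += np.random.default_rng(2).normal(0.0, 5.0, signal.size)

# Smoothing filter: centred moving average to suppress sensor noise.
kernel = np.ones(5) / 5
smooth = np.convolve(signal, kernel, mode="same")

# Work-mode classification by energy per 10-minute slot
# (slot energy and slot time stamp are the abstract's main parameters).
slot_energy = smooth[: smooth.size // 10 * 10].reshape(-1, 10).sum(axis=1)
modes = np.where(slot_energy > slot_energy.mean(), "active", "idle")

# State-change detection: slot indices where the classified mode flips.
changes = np.flatnonzero(modes[1:] != modes[:-1]) + 1
print(list(modes), list(changes))
```

    A real system would replace the mean-energy threshold with the clusterization and classification blocks, but the slot-wise structure is the same.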

  20. 2nd European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Soares, Amílcar; Froidevaux, Roland

    1999-01-01

    The Second European Conference on Geostatistics for Environmental Applications took place in Valencia, November 18-20, 1998. Two years have passed since the first meeting in Lisbon and the geostatistical community has kept active in the environmental field. In these days of congress inflation, we feel that continuity can only be achieved by ensuring quality in the papers. For this reason, all papers in the book have been reviewed by at least two referees, and care has been taken to ensure that the reviewer comments have been incorporated in the final version of the manuscript. We are thankful to the members of the scientific committee for their timely review of the scripts. All in all, there are three keynote papers from experts in soil science, climatology and ecology and 43 contributed papers providing a good indication of the status of geostatistics as applied in the environmental field all over the world. We feel now confident that the geoENV conference series, seeded around a coffee table almost six...

  1. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  2. Evaluating factorial kriging for seismic attributes filtering: a geostatistical filter applied to reservoir characterization; Avaliacao da krigagem fatorial na filtragem de atributos sismicos: um filtro geoestatistico aplicado a caracterizacao de reservatorios

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Evaldo Cesario

    1999-02-01

    In this dissertation, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way similar to Spectral Analysis in the frequency domain. The incorporation of filtered attributes via External Drift Kriging and Collocated Cokriging into reservoir characterization estimates is discussed. Their relevance for the calculation of the reservoir porous volume is also evaluated, based on a comparative analysis of the volume risk curves derived from stochastic conditional simulations with a collocated variable and from stochastic conditional simulations with external drift. The results prove Factorial Kriging to be an efficient technique for the filtering of seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, as well as the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)

  3. Wavelet analysis applied to the IRAS cirrus

    Science.gov (United States)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
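
    The Laplacian-pyramid decomposition used above can be sketched as follows; this is a simplified Gaussian-smooth/subtract/downsample scheme on a random image, not the authors' implementation, and it omits the upsampling bookkeeping needed for exact reconstruction in the Burt-Adelson formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def laplacian_pyramid(image, levels=3, sigma=1.0):
    """Decompose an image into band-pass layers plus a low-pass residual."""
    layers, current = [], image.astype(float)
    for _ in range(levels):
        low = gaussian_filter(current, sigma)
        layers.append(current - low)   # band-pass detail at this scale
        current = low[::2, ::2]        # downsample for the next octave
    layers.append(current)             # low-pass residual
    return layers

img = np.random.default_rng(3).random((64, 64))
pyr = laplacian_pyramid(img)
print([layer.shape for layer in pyr])
```

    Each layer isolates structure at one spatial scale, which is what lets filaments, fragments and clumps be identified separately before measuring their scaling behaviour.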

  4. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  5. Application of Geostatistics to the resolution of structural problems in homogeneous rocky massifs

    International Nuclear Information System (INIS)

    The nature of intrinsic functions and the possibilities of applying them to structural research and to the delimitation of the areas of influence in an ore deposit are briefly described. The main models to which the different distributions may be assimilated are shown: 'logarithmic' and 'linear' among those with no sill value; and 'spherical', 'exponential' and 'gaussian' among those having a sill level, which allows the establishment of a range value liable to separate the field of independent samples from that of non-independent ones. Thereafter, as an original contribution to applied geostatistics, the author postulates 1) the application of the 'fracturing rank' as a regionalized variable, after verifying its validity through strict probabilistic methodologies, and 2) a methodological extension of the conventional criterion of 'rock quality designation' to the analysis of the quality and degree of structural discontinuity in the rock surface. Finally, some examples of these applications are given. (M.E.L.)
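
    The three sill-type variogram models named above have standard closed forms; the sketch below writes them in their common nugget-free parameterization (with the exponential and gaussian models using the practical range, where ~95% of the sill is reached). The sill and range values are arbitrary illustrations.

```python
import numpy as np

def spherical(h, sill, a):
    """Spherical model: rises as a cubic and flattens exactly at the range a."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def exponential(h, sill, a):
    """Exponential model: reaches ~95% of the sill at the practical range a."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / a))

def gaussian(h, sill, a):
    """Gaussian model: parabolic near the origin, ~95% of the sill at a."""
    return sill * (1.0 - np.exp(-3.0 * (np.asarray(h, dtype=float) / a) ** 2))

h = np.array([0.0, 15.0, 30.0, 60.0, 90.0])
print(np.round(spherical(h, 1.0, 60.0), 3))
```

    The unbounded 'linear' and 'logarithmic' models mentioned in the abstract have no sill, so no range can be read off them; the bounded models above are the ones that separate independent from non-independent samples.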

  6. Delineation of Management Zones in Precision Agriculture by Integration of Proximal Sensing with Multivariate Geostatistics. Examples of Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Annamaria Castrignanò

    2015-07-01

    Full Text Available Fundamental to the philosophy of Precision Agriculture (PA) is the concept of matching inputs to needs. Recent research in PA has focused on the use of Management Zones (MZ): field areas characterised by homogeneous attributes in landscape and soil conditions. Proximal sensing (such as Electromagnetic Induction (EMI), Ground Penetrating Radar (GPR) and X-ray fluorescence) can complement direct sampling, and a multisensor platform can enable us to map soil features unambiguously. Several methods of multi-sensor data analysis have been developed to determine the location of subfield areas. Modern geostatistical techniques, treating variables as continua in a joint attribute and geographic space, offer the potential to analyse such data effectively. The objective of the paper is to show the potential of multivariate geostatistics to create MZ in the perspective of PA by integrating field data from different types of sensors, describing two case studies. In the first case study, cokriging and factorial cokriging were employed to produce thematic maps of soil trace elements and to delineate homogeneous zones, respectively. In the second, a multivariate geostatistical data-fusion technique (multi-collocated cokriging) was applied to data from different geophysical sensors (GPR and EMI) for stationary estimation of soil water content and for delineating within-field zones with different degrees of wetting. The results have shown that linking sensors of different types improves the overall assessment of soil, and that sensor data fusion could be effectively applied to delineate MZs in Precision Agriculture. However, techniques of data integration are urgently required as a result of the proliferation of data from different sources.

  7. Delineation of Management Zones in Precision Agriculture by Integration of Proximal Sensing with Multivariate Geostatistics. Examples of Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Annamaria Castrignanò

    2016-01-01

    Full Text Available Fundamental to the philosophy of Precision Agriculture (PA) is the concept of matching inputs to needs. Recent research in PA has focused on the use of Management Zones (MZ): field areas characterised by homogeneous attributes in landscape and soil conditions. Proximal sensing (such as Electromagnetic Induction (EMI), Ground Penetrating Radar (GPR) and X-ray fluorescence) can complement direct sampling, and a multisensor platform can enable us to map soil features unambiguously. Several methods of multi-sensor data analysis have been developed to determine the location of subfield areas. Modern geostatistical techniques, treating variables as continua in a joint attribute and geographic space, offer the potential to analyse such data effectively. The objective of the paper is to show the potential of multivariate geostatistics to create MZ in the perspective of PA by integrating field data from different types of sensors, describing two case studies. In the first case study, cokriging and factorial cokriging were employed to produce thematic maps of soil trace elements and to delineate homogeneous zones, respectively. In the second, a multivariate geostatistical data-fusion technique (multi-collocated cokriging) was applied to data from different geophysical sensors (GPR and EMI) for stationary estimation of soil water content and for delineating within-field zones with different degrees of wetting. The results have shown that linking sensors of different types improves the overall assessment of soil, and that sensor data fusion could be effectively applied to delineate MZs in Precision Agriculture. However, techniques of data integration are urgently required as a result of the proliferation of data from different sources.

  8. Geostatistical Study of Precipitation on the Island of Crete

    Science.gov (United States)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    Understanding and predicting the spatiotemporal patterns of precipitation in the Mediterranean islands is an important topic of research, which is emphasized by alarming long-term predictions of increased drought conditions [4]. The analysis of records from drought-prone areas around the world has demonstrated that precipitation data are non-Gaussian. Typically, such data are fitted to the gamma distribution function and then transformed into a normalized index, the so-called Standardized Precipitation Index (SPI) [5]. The SPI can be defined for different time scales and has been applied to data from various regions [2]. Precipitation maps can be constructed using the stochastic method of Ordinary Kriging [1]. Such mathematical tools help to better understand the space-time variability and to plan water resources management. We present preliminary results of an ongoing investigation of the space-time precipitation distribution on the island of Crete (Greece). The study spans the time period from 1948 to 2012 and extends over an area of 8 336 km2. The data comprise monthly precipitation measured at 56 stations. Analysis of the data showed that the most severe drought occurred in 1950 followed by 1989, whereas the wettest year was 2002 followed by 1977. A spatial trend was observed, with the spatially averaged annual precipitation in the West about 450 mm higher than in the East. Analysis of the data also revealed strong correlations between the precipitation in the western and eastern parts of the island. In addition to longitude, elevation (masl) was determined to be an important factor that exhibits strong linear correlation with precipitation. The precipitation data exhibit wet and dry periods, with strong variability even during the wet period. Thus, fitting the data to specific probability distribution models has proved challenging. Different time scales, e.g. monthly, biannual, and annual, have been investigated. Herein we focus on annual
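
    The gamma-fit-then-normalize construction of the SPI described above can be sketched as follows; the station record is simulated (the gamma parameters and record length are hypothetical), and the mixed-distribution correction needed for months with zero precipitation is omitted.

```python
import numpy as np
from scipy import stats

# Hypothetical monthly precipitation totals (mm) for one station and calendar month.
rng = np.random.default_rng(4)
precip = rng.gamma(shape=2.0, scale=30.0, size=65)   # 65 years of records

# SPI: fit a gamma distribution, then map each value through its CDF
# to the corresponding standard normal quantile.
shape, loc, scale = stats.gamma.fit(precip, floc=0)
cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)

print(round(float(spi.mean()), 2), round(float(spi.std()), 2))
```

    By construction the index is approximately standard normal, so thresholds such as SPI < -1.5 mean the same severity at every station, which is what makes droughts comparable across the island.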

  9. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  10. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Lloyd A. [Leading Solutions, LLC.; Paresol, Bernard [U.S. Department of Agriculture, Forest Service, Pacific Northwest Research Station, Portland, OR.

    2014-09-01

    This report of the geostatistical analysis results for the fire-fuels response variables custom reaction intensity and total dead fuels is but one part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).

  11. Geostatistical modeling of topography using auxiliary maps

    NARCIS (Netherlands)

    T. Hengl; B. Bajat; D. Blagojević; H.I. Reuter

    2008-01-01

    This paper recommends computational procedures for employing auxiliary maps, such as maps of drainage patterns, land cover and remote-sensing-based indices, directly in the geostatistical modeling of topography. The methodology is based on the regression-kriging technique, as implemented in the R pa

  12. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in the quality-control routine of drinking, swimming-pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli is confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, comparing the results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®; these included pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods presented a better insertion in the tray, the colour of the wells was lighter, and the UV reading was easier. In the second stage, the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method and the counts were similar to the ones obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  13. Geospatial Interpolation and Mapping of Tropospheric Ozone Pollution Using Geostatistics

    Directory of Open Access Journals (Sweden)

    Swatantra R. Kethireddy

    2014-01-01

    Full Text Available Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.

  14. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    Science.gov (United States)

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594
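
    The three cross-validation diagnostics quoted above (mean prediction error near 0, root-mean-standardized error near 1, small RMSE) can be computed from leave-one-out kriging output as sketched below; the observed values, predictions and kriging standard errors are simulated stand-ins, not the study's data.

```python
import numpy as np

def cross_validation_diagnostics(observed, predicted, kriging_std):
    """Kriging cross-validation diagnostics from leave-one-out results."""
    err = predicted - observed
    return {
        "mean_error": err.mean(),              # unbiasedness: close to 0
        "rmse": np.sqrt((err ** 2).mean()),    # accuracy: small
        # variance calibration: close to 1 when the kriging
        # standard errors match the actual prediction errors
        "rms_standardized": np.sqrt(((err / kriging_std) ** 2).mean()),
    }

rng = np.random.default_rng(5)
obs = rng.normal(40.0, 8.0, 100)          # hypothetical ozone values (ppb)
sd = np.full(100, 3.0)                    # reported kriging standard errors
pred = obs + rng.normal(0.0, 3.0, 100)    # unbiased predictions, matching spread
d = cross_validation_diagnostics(obs, pred, sd)
print({k: round(float(v), 2) for k, v in d.items()})
```

    A root-mean-standardized error well above 1 would mean the kriging variances understate the true uncertainty; well below 1, that they overstate it.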

  15. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  16. Spatial analysis methodology applied to rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)

    2006-08-15

    The use of geographical information systems (GISs) in studies of regional integration of renewable energies provides advantages such as speed, amount of information, analysis capacity and others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS. This makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method is proposed with renewable energies for rural electrification structured in three stages, with the aim of finding out the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)

  17. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well known by Ramsay & Silverman's monograph in 1997. In functional data analysis, the data are curves instead of data points. Each curve is measured at discrete points along a continuum, for example, time or frequency. It is assumed that the underlying process generating the curves is smooth... the worlds of statistics and chemometrics. We want to provide a glimpse of the essential and complex data pre-processing that is well known to chemometricians, but is generally unknown to statisticians. Pre-processing can potentially have a strong influence on the results of consequent data analysis. Our...

  18. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  19. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    Next generation sequencing (NGS) has revolutionized the field of genomics, and its wide range of applications has resulted in the genome-wide analysis of hundreds of species and the development of thousands of computational tools. This thesis represents my work on NGS analysis of four species: Lotus...... japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts: genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... agricultural and biological importance. Its capacity to form symbiotic relationships with rhizobia and mycorrhizal fungi has fascinated researchers for years. Lotus has a small genome of approximately 470 Mb and a short life cycle of 2 to 3 months, which has made Lotus a model legume plant for many molecular...

  20. Geostatistical Evaluation of Fracture Frequency and Crushing

    OpenAIRE

    Séguret, Serge Antoine; Guajardo Moreno, Cristian; Freire Rivera, Ramon

    2014-01-01

    This work details how to estimate the Fracture Frequency (FF), the ratio of the number of fractures to a sample length. The difficulty is that often a part of the sample cannot be analyzed by the geologist because it is crushed, a characteristic of the rock strength that must also be considered for the Rock Mass Rating. After analyzing the usual practices, the paper describes the (geo)statistical link between fracturing and crushing and the resulting method to ob...

  1. MULTIVARIATE GEOSTATISTICAL METHODS FOR MAPPING SOIL SALINITY

    OpenAIRE

    BİLGİLİ, A.V.; ÇULLU, M.A.; AYDEMİR, A.; Turan, V; SÖNMEZ, O.; AYDEMİR, S.; Kaya, C.

    2012-01-01

    Degradation of land by salinity under arid climate and poor drainage conditions can be inevitable. In the Harran plain, salt-affected areas cover 10% of the total irrigated area and are mainly located in the low-lying parts of the plain, where elevation ranges from 350 to 400 m. Soil salinity shows high spatial variability, which requires intensive sampling and laboratory analyses. Geostatistical techniques such as simple or ordinary kriging can be used in explaining this spatial var...

  2. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  3. Applying centrality measures to impact analysis: A coauthorship network analysis

    CERN Document Server

    Yan, Erjia

    2010-01-01

    Many studies on coauthorship networks focus on network topology and network statistical mechanics. This article takes a different approach by studying micro-level network properties, with the aim of applying centrality measures to impact analysis. Using coauthorship data from 16 journals in the field of library and information science (LIS) with a time span of twenty years (1988-2007), we construct an evolving coauthorship network and calculate four centrality measures (closeness, betweenness, degree and PageRank) for authors in this network. We find that the four centrality measures are significantly correlated with citation counts. We also discuss the usability of centrality measures in author ranking, and suggest that centrality measures can be useful indicators for impact analysis.
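The centrality measures named in this record are standard graph computations. As an illustration, a minimal pure-Python sketch of two of them, degree centrality and PageRank, on a small hypothetical coauthorship graph (not the LIS dataset analysed in the article) might look like:

```python
def degree_centrality(adj):
    """Fraction of the other nodes each author is connected to."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def pagerank(adj, damping=0.85, iters=100):
    """Basic power-iteration PageRank on an undirected graph."""
    n = len(adj)
    rank = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        new = {}
        for v in adj:
            # Rank flows in from each neighbour u, split over u's degree.
            incoming = sum(rank[u] / len(adj[u]) for u in adj[v])
            new[v] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

# Hypothetical coauthorship network: an edge means two authors co-wrote.
adj = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}
dc = degree_centrality(adj)
pr = pagerank(adj)
```

Here author "A", being connected to every other author, receives both the highest degree centrality and the highest PageRank, which is the kind of ranking the study correlates with citation counts.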

  4. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, which is of great importance for technological applications. Ground propolis samples were {sup 60}Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, through differential scanning calorimetry, a coincidence of the melting points of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  5. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  6. Combined assimilation of streamflow and satellite soil moisture with the particle filter and geostatistical modeling

    Science.gov (United States)

    Yan, Hongxiang; Moradkhani, Hamid

    2016-08-01

    Assimilation of satellite soil moisture and streamflow data into a distributed hydrologic model has received increasing attention over the past few years. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. Performance is assessed over the Salt River Watershed in Arizona, one of the watersheds without anthropogenic effects in the Model Parameter Estimation Experiment (MOPEX). A total of five data assimilation (DA) scenarios are designed, and the effects of the locations of streamflow gauges and of the ASCAT soil moisture on the predictions of soil moisture and streamflow are assessed. In addition, a geostatistical model is introduced to overcome the significant bias and discontinuity issues of the satellite soil moisture. The results indicate that: (1) solely assimilating outlet streamflow can lead to biased soil moisture estimation; (2) when the study area can only be partially covered by the satellite data, the geostatistical approach can estimate the soil moisture for the uncovered grid cells; (3) joint assimilation of streamflow and soil moisture from geostatistical modeling can further improve the surface soil moisture prediction. This study recommends the geostatistical model as a helpful tool to aid the remote sensing technique and the hydrologic DA study.
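The particle filter at the core of the PF-MCMC method cycles through predict, weight, and resample steps. A heavily simplified bootstrap-filter sketch on a hypothetical scalar state model (not the SAC-SMA model or ASCAT data of the study) can illustrate that cycle:

```python
import math
import random

random.seed(0)

def particle_filter(observations, n_particles=500, proc_sd=0.5, obs_sd=1.0):
    """Bootstrap particle filter for a toy AR(1) state observed with noise."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Predict: propagate each particle through the (toy) state model.
        particles = [0.9 * x + random.gauss(0.0, proc_sd) for x in particles]
        # Weight: likelihood of the observation given each particle.
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Posterior-mean state estimate at this time step.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Resample: draw a new particle set proportional to the weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

obs = [1.0, 1.2, 0.8, 1.1, 0.9]   # hypothetical noisy observations
est = particle_filter(obs)
```

The real scheme replaces the scalar model with the distributed hydrologic model and adds an MCMC move step after resampling to fight particle degeneracy.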

  7. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with Ansys is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life, hands-on experience.

  8. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research in environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  9. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  10. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    Science.gov (United States)

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  11. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Dimensional Analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform its partial differential equations into ordinary ones, and also to obtain Fick's second law in a dimensionless form. (Author)

  12. Applying Discourse Analysis in ELT: a Five Cs Model

    Institute of Scientific and Technical Information of China (English)

    肖巧慧

    2009-01-01

    Based on a discussion of definitions of discourse analysis, discourse is regarded as layered, consisting of five elements: cohesion, coherence, culture, critique and context. Moreover, we focus on applying DA in ELT.

  13. Geostatistical sampling optimization and waste characterization of contaminated premises

    International Nuclear Information System (INIS)

    At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires a radiological assessment of the building structure residual activity. From this point of view, the set up of an appropriate evaluation methodology is of crucial importance. The radiological characterization of contaminated premises can be divided into three steps. First, the most exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive) control of the emergent signal is commonly performed using in situ measurement methods such as surface controls combined with in situ gamma spectrometry. Finally, in order to assess the contamination depth, samples are collected at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data allow the definition of a preliminary waste zoning. The exhaustive control of the emergent signal with surface measurements usually leads to inaccurate estimates, because of several factors: varying position of the measuring device, subtraction of an estimate of the background signal, etc. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The initial activity usually presents a spatial continuity within the premises, with preferential contamination of specific areas or existence of activity gradients. Taking into account this spatial continuity is essential to avoid bias while setting up the sampling plan. In such a case, Geostatistics provides methods that integrate the contamination spatial structure. After the characterization of this spatial structure, most probable estimates of the surface activity at un-sampled locations can be derived using kriging techniques. 
Variants of these techniques also give access to estimates of the uncertainty associated to the spatial

  14. Geostatistical sampling optimization and waste characterization of contaminated premises

    Energy Technology Data Exchange (ETDEWEB)

    Desnoyers, Y.; Jeannee, N. [GEOVARIANCES, 49bis avenue Franklin Roosevelt, BP91, Avon, 77212 (France); Chiles, J.P. [Centre de geostatistique, Ecole des Mines de Paris (France); Dubot, D. [CEA DSV/FAR/USLT/SPRE/SAS (France); Lamadie, F. [CEA DEN/VRH/DTEC/SDTC/LTM (France)

    2009-06-15

    At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires a radiological assessment of the building structure residual activity. From this point of view, the set up of an appropriate evaluation methodology is of crucial importance. The radiological characterization of contaminated premises can be divided into three steps. First, the most exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive) control of the emergent signal is commonly performed using in situ measurement methods such as surface controls combined with in situ gamma spectrometry. Finally, in order to assess the contamination depth, samples are collected at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data allow the definition of a preliminary waste zoning. The exhaustive control of the emergent signal with surface measurements usually leads to inaccurate estimates, because of several factors: varying position of the measuring device, subtraction of an estimate of the background signal, etc. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The initial activity usually presents a spatial continuity within the premises, with preferential contamination of specific areas or existence of activity gradients. Taking into account this spatial continuity is essential to avoid bias while setting up the sampling plan. In such a case, Geostatistics provides methods that integrate the contamination spatial structure. After the characterization of this spatial structure, most probable estimates of the surface activity at un-sampled locations can be derived using kriging techniques. 
Variants of these techniques also give access to estimates of the uncertainty associated to the spatial
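The kriging estimation mentioned in the two records above amounts to solving a small linear system whose weights are constrained to sum to one. A minimal one-dimensional ordinary-kriging sketch, with an assumed exponential covariance model and made-up sample values, could look like:

```python
import math

def cov(h, sill=1.0, rng=3.0):
    """Assumed exponential covariance model: C(h) = sill * exp(-|h| / range)."""
    return sill * math.exp(-abs(h) / rng)

def solve(a, b):
    """Plain Gaussian elimination with partial pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(xs, zs, x0):
    """Estimate z(x0) from samples (xs, zs); weights are forced to sum to 1."""
    n = len(xs)
    # Kriging system: sample-to-sample covariances, bordered by a row and
    # column of ones for the Lagrange multiplier (unbiasedness constraint).
    a = [[cov(xs[i] - xs[j]) for j in range(n)] + [1.0] for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [cov(xs[i] - x0) for i in range(n)] + [1.0]
    w = solve(a, b)[:n]
    return sum(wi * zi for wi, zi in zip(w, zs))

xs = [0.0, 1.0, 4.0]   # hypothetical sample locations along a wall section
zs = [2.0, 2.5, 1.0]   # hypothetical measured surface activities
est = ordinary_kriging(xs, zs, 0.5)
```

With no nugget term, the estimator is exact at the sample points themselves; a real characterization would first fit the covariance model to the observed spatial structure.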

  15. Treatment integrity in applied behavior analysis with children.

    OpenAIRE

    F. M. Gresham; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Beha...

  16. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    Science.gov (United States)

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours and the authorities therefore ask for support in developing an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition that accounts for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by using external covariates. A priori, co-kriging is not guaranteed to significantly improve on the results obtained by applying common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to an extent which is deemed satisfying. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative definition of radon-prone areas than the one previously made by lognormal kriging.

  17. A methodological approach for the geostatistical characterization of radiological contaminations in nuclear facilities

    International Nuclear Information System (INIS)

    The decommissioning of nuclear sites represents huge industrial and financial challenges. At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires the radiological assessment of residual activity levels of the building structures (waste zoning). From this point of view, the set up of an appropriate evaluation methodology is of prime importance. The developed radiological characterization is divided into three steps: historical and functional analysis, radiation mapping and in-depth investigations. Combined with relevant data analysis and processing tools, this methodology aims at optimizing the investigation costs and the radiological waste volumes. In a former CEA nuclear facility, a substantial radiation survey was performed at the beginning of the thesis in order to try and test the methodology. The relevance of geo-statistics was also illustrated: exploratory data analysis, spatial structure analysis through the variogram, mapping, risk analysis results, iso-factorial modeling to take the structuring of extreme values into account. Destructive concrete samples are then integrated to complete the characterization in addition to radiation data through the geostatistical multivariate approach. Finally, waste segregation results from the risk of exceeding given radiological thresholds. An inventory of the spatial structures of radiological contaminations in nuclear facilities is provided within the geostatistical framework. It leads to sampling recommendations in order to improve the overall characterization strategy and optimize the investigation costs. (author)

  18. Geochemical and geostatistical evaluation, Arkansas Canyon Planning Unit, Fremont and Custer Counties, Colorado

    Science.gov (United States)

    Weiland, E.F.; Connors, R.A.; Robinson, M.L.; Lindemann, J.W.; Meyer, W.T.

    1982-01-01

    A mineral assessment of the Arkansas Canyon Planning Unit was undertaken by Barringer Resources Inc., under the terms of contract YA-553-CTO-100 with the Bureau of Land Management, Colorado State Office. The study was based on a geochemical-geostatistical survey in which 700 stream sediment samples were collected and analyzed for 25 elements. Geochemical results were interpreted by statistical processing which included factor, discriminant, multiple regression and characteristic analysis. The major deposit types evaluated were massive sulfide-base metal, sedimentary and magmatic uranium, thorium vein, magmatic segregation, and carbonatite related deposits. Results of the single element data and multivariate geostatistical analysis indicate that limited potential exists for base metal mineralization near the Horseshoe, El Plomo, and Green Mountain Mines. Thirty areas are considered to be anomalous with regard to one or more of the geochemical parameters evaluated during this study. The evaluation of carbonatite related mineralization was restricted due to the lack of geochemical data specific to this environment.

  19. Ore reserve evaluation through geostatistical methods in sector C-09, Pocos de Caldas, MG, Brazil

    International Nuclear Information System (INIS)

    In sector C-09, Pocos de Caldas, in the state of Minas Gerais, geostatistical techniques have been used to evaluate the tonnage of U3O8 and associated minerals and to delimit ore from sterile areas. The calculation of reserves was based on borehole information, including the results of chemical and/or radiometric analysis. Two- and three-dimensional evaluations were made following the existing geological models. Initially, the evaluation was based on chemical analysis using the classical geostatistical technique of kriging. This was followed by a second evaluation using the more recent technique of co-kriging, which permitted the incorporation of radiometric information in the calculations. The correlation between ore grade and radiometry was studied using the cross-covariance method. Following restrictions imposed by mining considerations, a probabilistic selection was made of blocks of appropriate dimensions so as to evaluate the grade-tonnage curve for each panel. (Author)

  20. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    Energy Technology Data Exchange (ETDEWEB)

    Watson, William D. [US Department of Energy, Energy Information Administration, 950 L' Enfant Plaza South Building EI-52, 1000 Independence Avenue SW, 20585 Washington, DC (United States); Ruppert, Leslie F.; Bragg, Linda J.; Tewalt, Susan J. [US Geological Survey, National Center, MS 956, 20192 Reston, VA (United States)

    2001-12-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of 'similar' data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours.

  1. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    "Applied Data Analysis and Modeling for Energy Engineers and Scientists" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  2. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  3. Geostatistical analysis on spatial variability of suspended solids concentration in the coastal waters of Hong Kong

    Institute of Scientific and Technical Information of China (English)

    吕君伟; 刘湘南

    2014-01-01

    This article studied the spatial distribution and variability of suspended solids concentration (SSC) in the coastal waters of Hong Kong using HJ-1A/B CCD data from 2009 to 2012 together with measured data. A remote sensing inversion model was established, and its results were analysed with geostatistical methods. SSC in the northern part of the area (near Shenzhen Bay) was higher than in other areas all year round; SSC in the northeastern part (the estuary of the Pearl River) fluctuated remarkably from season to season; SSC south of Lantau Island was relatively low all year round, with marked seasonal variation. The data from 2009 to 2012 indicated that the overall SSC in the area was low: the average SSC was 12.25 mg/L, higher in the north and decreasing towards the south. SSC showed marked spatial and temporal variation and strong spatial autocorrelation, with the nugget/sill ratio lowest in March (about 0.0005).

  4. ANIMAL RESEARCH IN THE JOURNAL OF APPLIED BEHAVIOR ANALYSIS

    OpenAIRE

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say–do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by...

  5. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  6. Applied Methods for Analysis of Economic Structure and Change

    OpenAIRE

    Anderstig, Christer

    1988-01-01

    The thesis comprises five papers and an introductory overview of applied models and methods. The papers concern interdependences and interrelations in models applied to empirical analyses of various problems related to production, consumption, location and trade. Among different definitions of 'structural analysis', one refers to the study of the properties of economic models on the assumption of invariant structural relations; this definition is close to what is aimed at in the present case....

  7. Geostatistical inference using crosshole ground-penetrating radar

    DEFF Research Database (Denmark)

    Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou;

    2010-01-01

    High-resolution tomographic images obtained from crosshole geophysical measurements have the potential to provide valuable information about the geostatistical properties of unsaturated-zone hydrologic-state variables such as moisture content. Under drained or quasi-steady-state conditions...... wave velocity structure, which may diminish the utility of these images for geostatistical inference. We have used a linearized stochastic inversion technique to infer the geostatistical properties of the subsurface radar wave velocity distribution using crosshole GPR traveltimes directly. Expanding...... of the subsurface are used to evaluate the uncertainty of the inversion estimate. We have explored the full potential of the geostatistical inference method using several synthetic models of varying correlation structures and have tested the influence of different assumptions concerning the choice of covariance...
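A first step in this kind of geostatistical inference is characterizing the spatial structure, classically via an experimental semivariogram. A minimal sketch with hypothetical one-dimensional data (unrelated to the GPR traveltimes of the study):

```python
def experimental_variogram(xs, zs, lag_width=1.0, n_lags=5):
    """Average 0.5*(z_i - z_j)^2 over point pairs binned by separation distance."""
    sums = [0.0] * n_lags
    counts = [0] * n_lags
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            h = abs(xs[i] - xs[j])
            k = int(h / lag_width)          # lag bin for this pair
            if k < n_lags:
                sums[k] += 0.5 * (zs[i] - zs[j]) ** 2
                counts[k] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Hypothetical 1-D transect: positions and measured values.
xs = [0.0, 0.4, 1.1, 1.9, 2.8, 3.5, 4.2]
zs = [1.0, 1.1, 1.4, 1.9, 2.6, 2.9, 3.4]
gamma = experimental_variogram(xs, zs)
```

For spatially continuous data, the semivariance rises with lag distance; fitting a covariance or variogram model to this curve is what feeds the subsequent stochastic inversion.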

  8. Simultaneous inversion of petrophysical parameters based on geostatistical a priori information

    Institute of Scientific and Technical Information of China (English)

    Yin Xing-Yao; Sun Rui-Ying; Wang Bao-Li; Zhang Guang-Zhi

    2014-01-01

    The high-resolution nonlinear simultaneous inversion of petrophysical parameters is based on Bayesian statistics and combines petrophysics with geostatistical a priori information. We used the fast Fourier transform-moving average (FFT-MA) and gradual deformation method (GDM) to obtain a reasonable variogram by using structural analysis and geostatistical a priori information of petrophysical parameters. Subsequently, we constructed the likelihood function according to the statistical petrophysical model. Finally, we used the Metropolis algorithm to sample the posterior probability density and complete the inversion of the petrophysical parameters. We used the proposed method to process data from an oil field in China and found a good, high-resolution match between the inversion results and real data. In addition, the direct inversion of petrophysical parameters avoids error accumulation, decreases uncertainty, and increases computational efficiency.
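The Metropolis sampling step mentioned above can be sketched in a few lines; the one-dimensional Gaussian target used here is a hypothetical stand-in for the petrophysical posterior of the paper:

```python
import math
import random

random.seed(1)

def metropolis(log_target, x0, step=0.5, n=20000):
    """Random-walk Metropolis sampler for an unnormalized log-density."""
    samples, x = [], x0
    lp = log_target(x)
    for _ in range(n):
        cand = x + random.gauss(0.0, step)
        lp_cand = log_target(cand)
        # Accept with probability min(1, target(cand) / target(x)).
        if math.log(random.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy target: N(2, 1), known only up to a normalizing constant.
draws = metropolis(lambda x: -0.5 * (x - 2.0) ** 2, x0=0.0)
mean = sum(draws) / len(draws)
```

The sample mean converges to the target mean of 2; in the paper the same accept/reject machinery explores the joint posterior of the petrophysical parameters instead of a scalar.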

  9. Interannual Changes in Biomass Affect the Spatial Aggregations of Anchovy and Sardine as Evidenced by Geostatistical and Spatial Indicators.

    Directory of Open Access Journals (Sweden)

    Marco Barra

    Full Text Available Geostatistical techniques were applied and a series of spatial indicators were calculated (occupation, aggregation, location, dispersion, spatial autocorrelation and overlap) to characterize the spatial distributions of European anchovy and sardine during summer. Two ecosystems were compared for this purpose, both located in the Mediterranean Sea: the Strait of Sicily (upwelling area) and the North Aegean Sea (continental shelf area, influenced by freshwater). Although the biomass of anchovy and sardine presented high interannual variability in both areas, the location of the centres of gravity and the main spatial patches of their populations were very similar between years. The size of the patches representing the dominant part of the abundance (80%) was mostly ecosystem- and species-specific. Occupation (area of presence) appears to be shaped by the extent of suitable habitats in each ecosystem, whereas aggregation patterns (how the populations are distributed within the area of presence) were species-specific and related to levels of population biomass. In the upwelling area, both species showed consistently higher occupation values compared to the continental shelf area. Certain characteristics of the spatial distribution of sardine (e.g. spreading area, overlap with anchovy) differed substantially between the two ecosystems. Principal component analysis of geostatistical and spatial indicators revealed that biomass was significantly related to a suite of spatial indicators rather than a single one. At the spatial scale of our study, strong correlations emerged between biomass and the first principal component axis, with highly positive loadings for occupation, aggregation and patchiness, independently of species and ecosystem. Overlap between anchovy and sardine increased with increasing sardine biomass but decreased with increasing anchovy biomass. This contrasting pattern was attributed to the location of the respective major patches
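
    Two of the spatial indicators used above, the centre of gravity (a location measure) and inertia (a dispersion measure), can be sketched as biomass-weighted statistics. The coordinates and biomass values below are invented for illustration; they are not the survey data.

```python
import numpy as np

# Assumed toy survey: sample positions (x, y) with a biomass value z each.
x = np.array([10.0, 12.0, 15.0, 20.0, 22.0])  # longitude-like coordinate
y = np.array([ 5.0,  6.0,  4.0,  8.0,  9.0])  # latitude-like coordinate
z = np.array([ 2.0,  8.0,  5.0,  1.0,  4.0])  # biomass per sample

w = z / z.sum()                               # biomass weights
cg = (float(np.sum(w * x)), float(np.sum(w * y)))   # centre of gravity
# inertia: biomass-weighted mean squared distance from the centre of gravity
inertia = float(np.sum(w * ((x - cg[0]) ** 2 + (y - cg[1]) ** 2)))
```

    Tracking how `cg` and `inertia` change between years is exactly the kind of comparison the abstract reports: stable centres of gravity despite variable total biomass.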

  10. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    International Nuclear Information System (INIS)

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  11. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  12. Optimizing Groundwater Monitoring Networks Using Integrated Statistical and Geostatistical Approaches

    Directory of Open Access Journals (Sweden)

    Jay Krishna Thakur

    2015-08-01

    Full Text Available The aim of this work is to investigate new approaches using methods based on statistics and geostatistics for the spatio-temporal optimization of groundwater monitoring networks. The formulated and integrated methods were tested with the groundwater quality data set of Bitterfeld/Wolfen, Germany. Spatially, the monitoring network was optimized using geostatistical methods. Temporal optimization of the monitoring network was carried out using Sen's method (1968). For geostatistical network optimization, a geostatistical spatio-temporal algorithm was used to identify redundant wells in 2- and 2.5-D Quaternary and Tertiary aquifers. The influences of interpolation block width, dimension, contaminant association, groundwater flow direction and aquifer homogeneity on statistical and geostatistical methods for monitoring network optimization were analysed. The integrated approach shows 37% and 28% redundancy in the monitoring networks of the Quaternary and Tertiary aquifers, respectively. The geostatistical method also recommends 41 and 22 new monitoring wells in the Quaternary and Tertiary aquifers, respectively. In temporal optimization, an overall optimized sampling interval was recommended in terms of the lower quartile (238 days), median (317 days) and upper quartile (401 days) in the research area of Bitterfeld/Wolfen. The demonstrated methods for improving a groundwater monitoring network can be used in real monitoring network optimization, with due consideration given to influencing factors.
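
    The temporal-optimization step rests on Sen's (1968) slope estimator: the median of all pairwise slopes in a time series, which is robust to outliers. A minimal sketch on an invented concentration record (times in days):

```python
import itertools
import statistics

# Assumed toy time series: sampling times (days) and concentrations.
t = [0, 100, 200, 300, 400, 500]
c = [10.0, 9.5, 9.1, 8.4, 8.2, 7.6]

# Sen's estimator: median of slopes over all pairs i < j.
slopes = [(c[j] - c[i]) / (t[j] - t[i])
          for i, j in itertools.combinations(range(len(t)), 2)]
sen_slope = statistics.median(slopes)   # units: concentration per day
```

    A significant nonzero `sen_slope` indicates a monotonic trend; the sampling interval can then be stretched or tightened depending on how fast the quantity drifts, which is the basis of the quartile-based intervals reported above.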

  13. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  14. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  15. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  16. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling

    Directory of Open Access Journals (Sweden)

    Laura Grisotto

    2016-04-01

    Full Text Available In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling, with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling location intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take preferential sampling into account.

  17. Computer tomography data on soil structural and hydraulic parameters assessed for spatial continuity by semivariance geostatistics

    International Nuclear Information System (INIS)

    Visual observations on the spatial distribution, at 1-cm intervals, of bulk density (ρ), porosity (ε), fractal dimension (D), water content (θ), and unsaturated hydraulic conductivity (Kus) in uniformly packed soil columns showed randomness. We explore the use of semivariance geostatistics to clarify the issue of randomness and continuity in the spatial distribution of ρ, ε, D, θ, and Kus data obtained using a custom-built gamma scanner and computed tomography technique. Semivariance increased with increasing lag distance, and plots of semivariance versus lag distance produced spherical semivariograms for most of the soil parameters investigated. This indicated that even though randomness existed in the spatial distribution of the soil parameters, there were specific trends in their spatial continuity. Higher spatial continuity, in water-stable aggregates, was characterised by smaller values of semi-, sill-, and nugget-variances and larger values of span. Opposite trends were observed for unstable aggregates. Wetting in unstable aggregates produced further reductions in span and increases in the other geostatistical parameters, indicating that wetting decreased spatial continuity. The results indicate that geostatistical analysis is useful for clarifying the issue of randomness at very small scales and for quantifying and discriminating the influence of differences in structural stability and wetting-induced changes in the spatial continuity of soil parameters, particularly ε. Copyright (1998) CSIRO Australia
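
    The semivariance-versus-lag computation described above can be sketched for a 1-D profile sampled at regular intervals. The profile below is a synthetic spatially correlated series, not the scanner data; for such a series the semivariance grows with lag, as the abstract reports.

```python
import numpy as np

# Synthetic stand-in for a 1-cm-interval transect: a random walk is
# spatially correlated, so its semivariance increases with lag distance.
rng = np.random.default_rng(0)
z = np.cumsum(rng.normal(0.0, 1.0, 200))

def semivariogram(z, max_lag):
    """Empirical semivariance gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]."""
    gamma = []
    for h in range(1, max_lag + 1):
        d = z[h:] - z[:-h]
        gamma.append(0.5 * np.mean(d ** 2))
    return np.array(gamma)

gamma = semivariogram(z, 20)
```

    Fitting a spherical model to `gamma` would then yield the nugget, sill, and span (range) parameters discussed above.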

  18. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    Science.gov (United States)

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

    Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  19. The Split-Apply-Combine Strategy for Data Analysis

    Directory of Open Access Journals (Sweden)

    Hadley Wickham

    2011-04-01

    Full Text Available Many data analysis problems involve the application of a split-apply-combine strategy, where you break up a big problem into manageable pieces, operate on each piece independently and then put all the pieces back together. This insight gives rise to a new R package that allows you to smoothly apply this strategy, without having to worry about the type of structure in which your data is stored. The paper includes two case studies showing how these insights make it easier to work with batting records for veteran baseball players and a large 3D array of spatio-temporal ozone measurements.
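
    A minimal pure-Python sketch of the split-apply-combine idea (the paper itself implements it as the R package plyr; the batting records below are invented):

```python
from collections import defaultdict

# Assumed toy batting records: (player, year, hits).
records = [
    ("player_a", 2001, 30), ("player_a", 2002, 42),
    ("player_b", 2001, 12), ("player_b", 2002, 20),
]

# Split: break the table into one piece per player.
groups = defaultdict(list)
for player, year, hits in records:
    groups[player].append(hits)

# Apply + combine: summarise each piece independently, then reassemble.
totals = {player: sum(hits) for player, hits in groups.items()}
```

    The strategy scales from this dictionary version to data frames and multidimensional arrays; only the split keys and the applied summary function change.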

  20. Geostatistical estimation of the transmissivity in a highly fractured metamorphic and crystalline aquifer (Man-Danane Region, Western Ivory Coast)

    Science.gov (United States)

    Razack, Moumtaz; Lasm, Théophile

    2006-06-01

    This work is aimed at estimating the transmissivity of highly fractured hard-rock aquifers using a geostatistical approach. The studied aquifer is formed by the crystalline and metamorphic rocks of the Western Ivory Coast (West Africa), in the Man-Danané area. The study area covers 7290 km² (90 km × 81 km). The fracturing network is dense and well connected, without a marked fracture direction. A database comprising 118 transmissivity (T) values and 154 specific capacity (Q/s) values was compiled. A significant empirical relationship between T and Q/s was found, which enabled the transmissivity data to be supplemented. The variographic analysis of the two variables showed that the variograms of T and Q/s (which are lognormal variables) are much more structured than those of log T and log Q/s (which are normal variables). This result is contrary to what was previously published and raises the question of whether normality is necessary in geostatistical analysis. Several inputs and geostatistical estimations of the transmissivity were tested using the cross-validation procedure: measured transmissivity data; supplemented transmissivity data; kriging; cokriging. The cross-validation results showed that the best estimation is provided using the kriging procedure, with the transmissivity field represented by the whole data sample (measured + estimated using specific capacity) and the structural model evaluated solely on the measured transmissivity. The geostatistical approach ultimately provided a reliable estimate of the transmissivity of the Man-Danané aquifer, which will be used as an input in forthcoming modelling.
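
    The kriging estimation that won the cross-validation can be sketched for a single target location. This is a hedged illustration, not the study's model: the four well positions, the transmissivity values, and the exponential variogram parameters are all invented.

```python
import numpy as np

# Assumed data: four wells (coordinates in km) with transmissivity in m^2/s,
# and one unsampled target location.
xy = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
t = np.array([3.0e-4, 1.0e-4, 2.0e-4, 5.0e-5])
x0 = np.array([2.0, 2.0])

sill, a_range = 1.0, 10.0                 # assumed variogram parameters

def gam(h):
    """Exponential variogram model (practical range a_range)."""
    return sill * (1.0 - np.exp(-3.0 * h / a_range))

# Ordinary kriging system in variogram form, with a Lagrange multiplier
# enforcing that the weights sum to one.
n = len(xy)
h = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
A = np.ones((n + 1, n + 1))
A[:n, :n] = gam(h)
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = gam(np.linalg.norm(xy - x0, axis=1))

w = np.linalg.solve(A, b)[:n]             # kriging weights
t_est = float(w @ t)                      # kriged transmissivity at x0
```

    Cross-validation, as used in the study, repeats this solve with each well left out in turn and compares the estimate against the held-out measurement.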

  1. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    Science.gov (United States)

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  2. Numerical continuation applied to landing gear mechanism analysis

    OpenAIRE

    Knowles, J.; Krauskopf, B; Lowenberg, MH

    2010-01-01

    A method of investigating quasi-static mechanisms is presented and applied to an overcentre mechanism and to a nose landing gear mechanism. The method uses static equilibrium equations along with equations describing the geometric constraints in the mechanism. In the spirit of bifurcation analysis, solutions to these steady-state equations are then continued numerically in parameters of interest. Results obtained from the bifurcation method agree with the equivalent results obtained from two ...

  3. Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling

    OpenAIRE

    Herrmann, Robert A.

    2003-01-01

    This is a Research and Instructional Development Project from the U.S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed, and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes int...

  4. An applied ethics analysis of best practice tourism entrepreneurs

    OpenAIRE

    Power, Susann

    2015-01-01

    Ethical entrepreneurship and, by extension, wider best practice are noble goals for the future of tourism. However, questions arise as to which concepts, such as values, motivations, actions and challenges, underpin these goals. This thesis seeks to answer these questions and, in so doing, to develop an applied ethics analysis for best practice entrepreneurs in tourism. The research is situated in sustainable tourism, which is ethically very complex and has thus far been dominated by the economic, social a...

  5. Recent reinforcement-schedule research and applied behavior analysis

    OpenAIRE

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule pe...

  6. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages over classic methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique as applied in food analysis.

  7. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as An overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group An introduction ...

  8. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    Science.gov (United States)

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines. PMID:21175361

  9. Geostatistical description of geological heterogeneity in clayey till as input for improved characterization of contaminated sites

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Klint, K.E.S.; Renard, P.;

    2010-01-01

    In low-permeability clay tills subsurface transport is governed by preferential flow in sand lenses and fractures. A proper geological model requires the integration of these features, i.e. the spatial distribution of the geological heterogeneities. Detailed mapping of sand lenses has been done...... at a clay till outcrop in Denmark to characterise the shapes and the spatial variability. Further, geostatistics were applied to simulate the distribution and to develop a heterogeneity model that can be incorporated into an existing geological model of, for example, a contaminated site....

  10. Spatial Pattern of Great Lakes Estuary Processes from Water Quality Sensing and Geostatistical Methods

    Science.gov (United States)

    Xu, W.; Minsker, B. S.; Bailey, B.; Collingsworth, P.

    2014-12-01

    Mixing of river and lake water can alter water temperature, conductivity, and other properties that influence ecological processes in freshwater estuaries of the Great Lakes. This study uses geostatistical methods to rapidly visualize and understand water quality sampling results and to enable adaptive sampling that removes anomalies and explores interesting phenomena in more detail. Triaxus, a towed undulating sensor package, was used to collect various physical and biological water quality variables in three estuary areas of Lake Michigan in summer 2011. Given the particular sampling pattern, data quality assurance and quality control (QA/QC) processes, including sensor synchronization, upcast and downcast separation, and spatial outlier removal, are first applied. An automated kriging interpolation approach that considers trend and anisotropy is then proposed to estimate data on a gridded map for direct visualization. Other methods are explored with the data to gain more insight into water quality processes. Local G statistics serve as a supplementary tool to direct visualization. The method identifies statistically high-value zones (hot spots) and low-value zones (cold spots) in water chemistry across the estuaries, including locations of water sources and intrusions. In addition, chlorophyll concentration distributions differ among sites. To further understand the interactions and differences between river and lake water, the K-means clustering algorithm is used to spatially cluster the water based on temperature and specific conductivity. Statistical analysis indicates that clusters with significant river water can be identified from higher turbidity, specific conductivity, and chlorophyll concentrations. Different ratios between zooplankton biomass and density indicate different zooplankton community structure across clusters. All of these methods can contribute to improved near-real-time analysis of future sampling activity.
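
    The K-means step described above can be sketched on synthetic two-variable data. The "river" and "lake" distributions (warm, high specific conductivity versus cold, low specific conductivity) and the choice of k = 2 are assumptions for illustration, not the Triaxus measurements.

```python
import numpy as np

# Assumed synthetic samples: columns are (temperature C, specific
# conductivity uS/cm). River water is warmer with higher conductivity.
rng = np.random.default_rng(1)
river = rng.normal([18.0, 600.0], [0.5, 30.0], size=(50, 2))
lake  = rng.normal([10.0, 280.0], [0.5, 20.0], size=(50, 2))
X = np.vstack([river, lake])

def kmeans(X, k=2, iters=50):
    """Plain Lloyd's algorithm: assign to nearest centre, recompute means."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
```

    In practice the variables would be standardized first, since conductivity dominates the Euclidean distance at these scales; the separation here is wide enough that the sketch recovers the two water masses anyway.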

  11. Cladistic analysis applied to the classification of volcanoes

    Science.gov (United States)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of shared characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time, the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus, when characters are slightly changed, the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive-product and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. The uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic

  12. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for standard analytical methods, and to prepare and validate standard operating procedures for NAA through practical testing of different analytical items. The R&D implementation of an analytical quality system using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, comprises the following: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrial applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  13. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for standard analytical methods, and to prepare and validate standard operating procedures for NAA through practical testing of different analytical items. The R&D implementation of an analytical quality system using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, comprises the following: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrial applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  14. Empirical modal decomposition applied to cardiac signals analysis

    Science.gov (United States)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the empirical modal decomposition (EMD) method applied to the analysis and denoising of electrocardiogram and phonocardiogram signals. The objective of this work is to detect cardiac anomalies of a patient automatically. As these anomalies are localized in time, the localization of all events should be preserved precisely. Methods based on the Fourier transform (TFD) lose this localization property [13]; the wavelet transform (WT) makes it possible to overcome the localization problem, but interpretation remains difficult and the signal is still hard to characterize precisely. In this work we propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals, together with their analysis and interpretation. Finally, we introduce an adaptation of the EMD algorithm which seems to be very efficient for denoising.
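    The core of EMD is the sifting step: subtract the mean of the upper and lower extrema envelopes until an intrinsic mode function (IMF) remains. Below is a minimal illustrative sketch in Python with NumPy, using linear envelopes and a fixed iteration count for brevity (real implementations, including the one this record describes, use cubic-spline envelopes and a proper stopping criterion); the test signal is synthetic, not an ECG:

```python
import numpy as np

def sift_once(x):
    """One EMD sifting pass: subtract the mean of the upper and lower
    envelopes, interpolated here linearly between local extrema."""
    t = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return x                      # too few extrema to define envelopes
    upper = np.interp(t, maxima, x[maxima])
    lower = np.interp(t, minima, x[minima])
    return x - (upper + lower) / 2.0

t = np.linspace(0.0, 1.0, 500)
fast = np.sin(2 * np.pi * 25 * t)     # fast oscillation (first IMF candidate)
slow = 0.5 * np.sin(2 * np.pi * 3 * t)
x = fast + slow
h = x
for _ in range(10):                   # repeated sifting sharpens the IMF
    h = sift_once(h)
```

On this two-tone signal the repeated sift isolates the fast oscillation as the first IMF candidate, which is how EMD separates localized high-frequency cardiac events from slower baseline components.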

  15. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  16. Using Multivariate Statistical and Geostatistical Methods to Identify Spatial Variability of Trace Elements in Agricultural Soils in Dongguan City,Guangdong,China

    Institute of Scientific and Technical Information of China (English)

    Dou Lei; Zhou Yongzhang; Ma Jin; Li Yong; Cheng Qiuming; Xie Shuyun; Du Haiyan; You Yuanhang; Wan Hongfu

    2008-01-01

    Dongguan (东莞) City, located in the Pearl River Delta, South China, is famous for its rapid industrialization over the past 30 years. A total of 90 topsoil samples were collected from agricultural fields in the city, including vegetable and orchard soils, and eight heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb, and Zn) and other items (pH values and organic matter) were analyzed to evaluate the influence of anthropic activities on the environmental quality of agricultural soils and to identify the spatial distribution and possible sources of trace elements. The elements Hg, Pb, and Cd have accumulated remarkably here, in comparison with the soil background content of elements in Guangdong (广东) Province. Pollution is more serious in the western plain and the central region, which are heavily distributed with industries and rivers. Multivariate and geostatistical methods have been applied to differentiate the influences of natural processes and human activities on heavy-metal pollution of topsoils in the study area. The results of cluster analysis (CA) and factor analysis (FA) show that Ni, Cr, Cu, Zn, and As are grouped in factor F1, Pb in F2, and Cd and Hg in F3, respectively. The spatial pattern of the three factors may be well demonstrated by geostatistical analysis. It is shown that the first factor could be considered a natural source controlled by parent rocks. The second factor could be referred to as "industrial and traffic pollution sources". The source of the third factor is mainly controlled by long-term anthropic activities, as a consequence of agricultural fossil fuel consumption and atmospheric deposition.
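    The CA/FA grouping step can be made concrete with a small synthetic sketch (Python/NumPy). The "element" columns below are entirely hypothetical stand-ins driven by two independent sources; a real study would use the measured concentrations of the eight metals:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 90                              # one row per topsoil sample, as in the study
parent = rng.normal(size=n)         # lithogenic control (parent rock)
traffic = rng.normal(size=n)        # anthropogenic control (industry/traffic)
# hypothetical element concentrations: two elements per source, plus noise
data = np.column_stack([
    parent + 0.2 * rng.normal(size=n),    # "Ni"-like
    parent + 0.2 * rng.normal(size=n),    # "Cr"-like
    traffic + 0.2 * rng.normal(size=n),   # "Pb"-like
    traffic + 0.2 * rng.normal(size=n),   # "Hg"-like
])
corr = np.corrcoef(data, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]          # descending eigenvalues
explained = eigvals[:2].sum() / eigvals.sum()     # variance held by two factors
```

Elements sharing a driver correlate strongly, so two factors capture nearly all the variance; this mirrors how F1-F3 separate lithogenic from anthropogenic metals in the study.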

  17. Geostatistical Analysis on Spatial Distribution of Macropterous Whitebacked Planthopper, Sogatella furcifera (Horváth)

    Institute of Scientific and Technical Information of China (English)

    Yan Xianghui; Huang Yan

    2012-01-01

    The whitebacked planthopper, Sogatella furcifera (Horváth), is a major migratory pest of rice in China that causes serious damage to rice production. To understand the dynamic process of aggregation and diffusion after immigration and the spatial distribution pattern of the macropterous adults, and to provide a theoretical basis for integrated control, semivariogram analysis from geostatistics was applied to systematic field survey data collected in Xiushan County in 2008. Spatial variation curve models were established in the east-west and north-south directions for the period from immigration to emigration, and isoline maps of the pest at each stage were produced with the geostatistical software Surfer 8.0 using kriging interpolation. The variograms showed that the higher the density of macropterous S. furcifera, the larger the range of variation of the spatial variable. On average, 37.6% of the spatial variation was caused by random factors and 62.4% by spatial autocorrelation, and the random share of the variation grew as the rice developed. The spatially correlated range in the east-west direction was smaller than that in the north-south direction at every survey date, averaging 12.86 m and 28.85 m respectively. Spatial interpolation showed that the aggregation patches of the macropterous population were longer in the north-south direction than in the east-west direction, i.e., north-south is the main direction of aggregation and diffusion of macropterous S. furcifera.
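    The semivariogram at the heart of this analysis is straightforward to compute from point observations. A minimal NumPy sketch with hypothetical coordinates and a synthetic, spatially smooth field standing in for planthopper counts:

```python
import numpy as np

def semivariogram(coords, values, lags, tol):
    """Empirical semivariogram: gamma(h) is the mean of 0.5*(z_i - z_j)**2
    over all point pairs whose separation distance is within tol of h."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        pairs = np.triu(np.abs(d - h) <= tol, k=1)   # count each pair once
        gamma.append(sq[pairs].mean())
    return np.array(gamma)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))   # hypothetical sample sites (m)
# synthetic field: smooth east-west trend plus a small nugget effect
values = np.sin(coords[:, 0] / 20.0) + 0.1 * rng.normal(size=200)
g = semivariogram(coords, values, lags=[5.0, 15.0, 30.0], tol=5.0)
# semivariance rises with lag distance for a spatially autocorrelated field
```

Directional variograms, as used in the study, simply restrict the pairs to those aligned (within a tolerance angle) east-west or north-south.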

  18. Geostatistical inference using crosshole ground-penetrating radar

    DEFF Research Database (Denmark)

    Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou;

    2010-01-01

    High-resolution tomographic images obtained from crosshole geophysical measurements have the potential to provide valuable information about the geostatistical properties of unsaturated-zone hydrologic-state variables such as moisture content. Under drained or quasi-steady-state conditions...

  19. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  20. Downside Risk analysis applied to Hedge Funds universe

    CERN Document Server

    Perello, J

    2006-01-01

    Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires high-precision risk evaluation and appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index data.
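    The difference between total and downside risk is easy to make concrete. The sketch below (Python/NumPy, synthetic returns) uses the downside deviation and a Sortino-style ratio as one common pair of Downside Risk indicators; the paper's exact indicator set may differ:

```python
import numpy as np

def downside_deviation(returns, target=0.0):
    """Root mean square of shortfalls below the investor's target return."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target=0.0):
    """Sharpe-like ratio that penalizes only 'bad' (below-target) returns."""
    return (returns.mean() - target) / downside_deviation(returns, target)

rng = np.random.default_rng(7)
gauss = rng.normal(0.01, 0.05, 10_000)                     # symmetric returns
skewed = np.where(rng.random(10_000) < 0.05, -0.20, 0.02)  # rare large losses
```

The skewed series has a comparable mean to the Gaussian one, yet its downside deviation is markedly larger, which is exactly the kind of non-Gaussian effect a plain Sharpe ratio misses.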

  1. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
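    The per-pixel calibration idea can be sketched in a few lines of NumPy. Here the "images" are synthetic: each pixel has its own hypothetical gain and dark level, a stack of calibration images at known dye concentrations is fit pixel-wise, and the fitted map is inverted for a new image:

```python
import numpy as np

rng = np.random.default_rng(3)
h, w = 4, 5                                         # toy image size
conc_levels = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # known dye concentrations
gain = rng.uniform(20.0, 40.0, (h, w))    # per-pixel sensitivity (lighting varies)
offset = rng.uniform(5.0, 15.0, (h, w))   # per-pixel dark level
grey = offset[None] + gain[None] * conc_levels[:, None, None]  # calibration stack

# least-squares slope/intercept per pixel, vectorized over the whole image
x, xm = conc_levels, conc_levels.mean()
gm = grey.mean(axis=0)
slope = ((x[:, None, None] - xm) * (grey - gm)).sum(0) / ((x - xm) ** 2).sum()
intercept = gm - slope * xm

# invert the per-pixel map to recover concentration from a new image
new_grey = offset + gain * 1.2            # image of a uniform 1.2 concentration
recovered = (new_grey - intercept) / slope
```

In this noise-free sketch the recovery is exact; in the experiment, mass conservation within the tank is used to evaluate and correct the calibrated fields.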

  2. Evaluating the effectiveness of teacher training in Applied Behaviour Analysis.

    Science.gov (United States)

    Grey, Ian M; Honan, Rita; McClean, Brian; Daly, Michael

    2005-09-01

    Interventions for children with autism based upon Applied Behaviour Analysis (ABA) have been repeatedly shown to be related both to educational gains and to reductions in challenging behaviours. However, to date, comprehensive training in ABA for teachers and others has been limited. Over 7 months, 11 teachers undertook 90 hours of classroom instruction and supervision in ABA. Each teacher conducted a comprehensive functional assessment and designed a behaviour support plan targeting one behaviour for one child with an autistic disorder. Target behaviours included aggression, non-compliance and specific educational skills. Teachers recorded observational data for the target behaviour for both baseline and intervention sessions. Support plans produced an average 80 percent change in frequency of occurrence of target behaviours. Questionnaires completed by parents and teachers at the end of the course indicated a beneficial effect for the children and the educational environment. The potential benefits of teacher-implemented behavioural intervention are discussed. PMID:16144826

  3. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development as follows: improvement of the neutron irradiation facilities and counting system, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; applied research and development in environment, industry and human health, and its standardization. For identification and standardization of the analytical method, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter chosen as an environmental indicator, trace element concentrations of samples collected at urban and rural sites were determined, and statistics and factor analysis were then carried out to investigate emission sources. An international cooperation research project was carried out on the utilization of nuclear techniques.

  4. Geostatistical joint inversion of seismic and potential field methods

    Science.gov (United States)

    Shamsipour, Pejman; Chouteau, Michel; Giroux, Bernard

    2016-04-01

    Interpretation of geophysical data needs to integrate different types of information to make the proposed model geologically realistic. Multiple data sets can reduce the uncertainty and non-uniqueness present in separate geophysical data inversions. Seismic data can play an important role in mineral exploration; however, processing and interpretation of seismic data are difficult due to the complexity of hard-rock geology. On the other hand, the model recovered from potential field methods is affected by an inherent non-uniqueness caused by the nature of the physics and by the underdetermination of the problem. Joint inversion of seismic and potential field data can mitigate the weaknesses of separate inversions of these methods. A stochastic joint inversion method based on geostatistical techniques is applied to estimate density and velocity distributions from gravity and travel-time data. The method fully integrates the physical relations between density and gravity, on one hand, and slowness and travel time, on the other. As a consequence, when the data are considered noise-free, the responses from the inverted slowness and density exactly reproduce the observed data. The required density and velocity auto- and cross-covariances are assumed to follow a linear model of coregionalization (LCM); the recent development of nonlinear models of coregionalization could also be applied if needed. The kernel function for the gravity method is obtained in closed form. For ray tracing, we use the shortest-path method (SPM) to calculate the operation matrix. The joint inversion is performed on a structured grid; however, it is possible to extend it to unstructured grids. The method is tested on two synthetic models: a model consisting of two objects buried in a homogeneous background, and a model with a stochastic distribution of parameters. The results illustrate the capability of the method to improve the inverted model compared to the separately inverted models with either gravity

  5. Multivariate Statistical Analysis Applied in Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Jieling Zou

    2015-08-01

    Full Text Available This study applies multivariate statistical approaches to wine quality evaluation. With 27 red wine samples, four factors were identified out of 12 parameters by principal component analysis, explaining 89.06% of the total variance of the data. As iterative weights calculated by the BP neural network revealed little difference from weights determined by the information entropy method, the latter was chosen to measure the importance of indicators. Weighted cluster analysis performs well in classifying the sample group further into two sub-clusters. The second cluster of red wine samples, compared with the first, was lighter in color, tasted thinner and had a fainter bouquet. The weighted TOPSIS method was used to evaluate the quality of wine in each sub-cluster. With the scores obtained, each sub-cluster was divided into three grades. On the whole, the quality of the lighter red wine was slightly better than that of the darker category. This study shows the necessity and usefulness of multivariate statistical techniques in both wine quality evaluation and parameter selection.
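    The entropy-weighting and TOPSIS steps can be sketched compactly. The Python/NumPy code below uses hypothetical indicator values (three benefit-type indicators for four wine samples), not the study's 12 measured parameters:

```python
import numpy as np

def entropy_weights(X):
    """Information-entropy weights: indicators varying more across samples
    carry more information and receive larger weights."""
    P = X / X.sum(axis=0)                       # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    e = -k * (P * np.log(P)).sum(axis=0)        # entropy per indicator
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()

def topsis(X, w):
    """Rank samples by relative closeness to the ideal (best) solution,
    treating every indicator as benefit-type for simplicity."""
    V = (X / np.sqrt((X ** 2).sum(axis=0))) * w  # weighted, vector-normalized
    best, worst = V.max(axis=0), V.min(axis=0)
    d_best = np.sqrt(((V - best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)

# hypothetical wine indicators: [color depth, acidity score, bouquet score]
X = np.array([[7.0, 0.30, 3.1],
              [6.0, 0.28, 3.0],
              [9.0, 0.35, 4.0],
              [5.0, 0.29, 2.8]])
w = entropy_weights(X)
scores = topsis(X, w)
```

Sample 2 dominates every indicator, so its TOPSIS closeness is exactly 1; ranking the remaining scores grades the rest of the cluster.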

  6. Applying Conjoint Analysis to Study Attitudes of Thai Government Organisations

    Directory of Open Access Journals (Sweden)

    Natee Suriyanon

    2012-11-01

    Full Text Available This article presents the application of choice-based conjoint analysis to analyse the attitude of Thai government organisations towards the restriction of the contractor's right to claim compensation for unfavourable effects from undesirable events. The analysis reveals that the organisations want to restrict only 6 out of the 14 types of claiming rights that were studied. The right that they want to restrict most is the right to claim for additional direct costs due to force majeure. They are willing to pay between 0.087% - 0.210% of the total project direct cost for restricting each type of contractor right. The total additional cost for restricting all six types of rights that the organisations are willing to pay is 0.882%. The last section of this article applies the knowledge gained from a choice-based conjoint analysis experiment to the analysis of the standard contract of the Thai government. The analysis reveals three types of rights where Thai government organisations are willing to forego restrictions, but the present standard contract does not grant such rights.

  7. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    CERN Document Server

    von Hippel, Ted

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants a...

  8. Accuracy evaluation of different statistical and geostatistical censored data imputation approaches (Case study: Sari Gunay gold deposit

    Directory of Open Access Journals (Sweden)

    Babak Ghane

    2016-06-01

    Full Text Available Most geochemical datasets include missing data in varying proportions, and this may cause significant problems in geostatistical modeling or multivariate analysis of the data. It is therefore common to impute the missing data in geochemical studies. In this study, three approaches called half detection (HD), multiple imputation (MI), and cosimulation based on Markov model 2 (MM2) are used to impute the censored data. Because any imputed dataset has to preserve the underlying structure of the original data, the multidimensional scaling (MDS) approach has been used to explore the validity of the different imputation methods. Additive log-ratio (alr) transformation was performed to open the closed compositional data prior to applying the MDS method. Experiments showed that, based on the MDS approach, MI and MM2 could not preserve the original underlying structure of the dataset as well as the HD approach, because these two approaches produced values higher than the detection limit of the variables.
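    The half detection (HD) approach mentioned above is the simplest of the three: every censored value is replaced by half the detection limit, which by construction never exceeds the detection limit. A minimal NumPy sketch with hypothetical assay values:

```python
import numpy as np

def half_detection(values, detection_limit):
    """Half-detection (HD) imputation: censored results reported as
    '< detection limit' (coded as NaN here) are replaced by DL / 2."""
    out = np.asarray(values, dtype=float).copy()
    out[np.isnan(out)] = detection_limit / 2.0
    return out

# hypothetical Au assays (ppb) with a 5 ppb detection limit
assays = np.array([12.0, np.nan, 7.5, np.nan, 30.0])
imputed = half_detection(assays, detection_limit=5.0)
```

MI and MM2, by contrast, draw imputed values from fitted models, which is how they can end up producing values above the detection limit.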

  9. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    Energy Technology Data Exchange (ETDEWEB)

    Kara G. Eby

    2010-08-01

    At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.
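    The kriging machinery behind such prediction maps fits in a short sketch. Below is ordinary kriging in Python/NumPy with an assumed exponential covariance model and hypothetical sample locations (the study itself layers a Bayesian treatment with secondary environmental data on top of this basic idea):

```python
import numpy as np

def exp_cov(h, sill=1.0, range_=10.0):
    """Exponential covariance model: C(h) = sill * exp(-3h / range)."""
    return sill * np.exp(-3.0 * h / range_)

def ordinary_kriging(coords, values, target):
    """Ordinary kriging: best linear unbiased estimate at `target`, with a
    Lagrange-multiplier row enforcing that the weights sum to one."""
    n = len(values)
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_cov(np.sqrt(((coords - target) ** 2).sum(-1)))
    w = np.linalg.solve(A, b)[:n]
    return w @ values, w

# hypothetical soil-sampling locations and concentration-like values
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
vals = np.array([1.0, 2.0, 1.5, 4.0])
est, w = ordinary_kriging(coords, vals, np.array([0.5, 0.5]))
```

The weights sum to one (unbiasedness), and predicting at an observed location returns the observed value, which is the exactness property of kriging.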

  10. Accounting for Transport Parameter Uncertainty in Geostatistical Groundwater Contaminant Release History Estimation

    Science.gov (United States)

    Ostrowski, J.; Shlomi, S.; Michalak, A.

    2007-12-01

    The process of estimating the release history of a contaminant in groundwater relies on coupling a limited number of concentration measurements with a groundwater flow and transport model in an inverse modeling framework. The information provided by available measurements is generally not sufficient to fully characterize the unknown release history; therefore, an accurate assessment of the estimation uncertainty is required. The modeler's level of confidence in the transport parameters, expressed as pdfs, can be incorporated into the inverse model to improve the accuracy of the release estimates. In this work, geostatistical inverse modeling is used in conjunction with Monte Carlo sampling of transport parameters to estimate groundwater contaminant release histories. Concentration non-negativity is enforced using a Gibbs sampling algorithm based on a truncated normal distribution. The method is applied to two one-dimensional test cases: a hypothetical dataset commonly used in validating contaminant source identification methods, and data collected from a tetrachloroethylene and trichloroethylene plume at the Dover Air Force Base in Delaware. The estimated release histories and associated uncertainties are compared to results from a geostatistical inverse model where uncertainty in transport parameters is ignored. Results show that the a posteriori uncertainty associated with the model that accounts for parameter uncertainty is higher, but that this model provides a more realistic representation of the release history based on available data. This modified inverse modeling technique has many applications, including assignment of liability in groundwater contamination cases, characterization of groundwater contamination, and model calibration.
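    The non-negativity constraint mentioned above hinges on drawing from a normal distribution truncated at zero. A minimal rejection-sampling sketch in Python/NumPy (a real Gibbs sampler would draw each concentration conditionally on the others, and would use a more efficient truncated-normal sampler):

```python
import numpy as np

def truncated_normal(mean, std, size, rng):
    """Draw from N(mean, std) truncated to [0, inf) by rejection sampling;
    a simple sketch, inefficient when the mean is far below zero."""
    out = np.empty(size)
    filled = 0
    while filled < size:
        draw = rng.normal(mean, std, size)
        keep = draw[draw >= 0.0][: size - filled]
        out[filled:filled + keep.size] = keep
        filled += keep.size
    return out

rng = np.random.default_rng(11)
# e.g., draws of a concentration that must be physically non-negative
samples = truncated_normal(mean=-0.5, std=1.0, size=5000, rng=rng)
```

All draws land on the physical, non-negative side even though the untruncated mean is negative.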

  11. Demonstration of a geostatistical approach to physically consistent downscaling of climate modeling simulations

    KAUST Repository

    Jha, Sanjeev Kumar

    2013-01-01

    A downscaling approach based on multiple-point geostatistics (MPS) is presented. The key concept underlying MPS is to sample spatial patterns from within training images, which can then be used in characterizing the relationship between different variables across multiple scales. The approach is used here to downscale climate variables including skin surface temperature (TSK), soil moisture (SMOIS), and latent heat flux (LH). The performance of the approach is assessed by applying it to data derived from a regional climate model of the Murray-Darling basin in southeast Australia, using model outputs at two spatial resolutions of 50 and 10 km. The data used in this study cover the period from 1985 to 2006, with 1985 to 2005 used for generating the training images that define the relationships of the variables across the different spatial scales. Subsequently, the spatial distributions for the variables in the year 2006 are determined at 10 km resolution using the 50 km resolution data as input. The MPS geostatistical downscaling approach reproduces the spatial distribution of TSK, SMOIS, and LH at 10 km resolution with the correct spatial patterns over different seasons, while providing uncertainty estimates through the use of multiple realizations. The technique has the potential to not only bridge issues of spatial resolution in regional and global climate model simulations but also in feature sharpening in remote sensing applications through image fusion, filling gaps in spatial data, evaluating downscaled variables with available remote sensing images, and aggregating/disaggregating hydrological and groundwater variables for catchment studies.

  12. Correlation network analysis applied to complex biofilm communities.

    Directory of Open Access Journals (Sweden)

    Ana E Duran-Pinedo

    Full Text Available The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm by applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well-known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of
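    The module-finding idea (correlate abundance profiles, threshold into a network, read off connected components) can be sketched with synthetic data. The six "taxa" below are hypothetical, driven by two independent community factors:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 60                                   # samples (e.g., plaque communities)
drv_a = rng.normal(size=n)               # hypothetical driver of module A
drv_b = rng.normal(size=n)               # hypothetical driver of module B
# six hypothetical taxa: abundances 0-2 follow driver A, 3-5 follow driver B
abund = np.column_stack([drv_a + 0.3 * rng.normal(size=n) for _ in range(3)] +
                        [drv_b + 0.3 * rng.normal(size=n) for _ in range(3)])
corr = np.corrcoef(abund, rowvar=False)
adj = (np.abs(corr) >= 0.6) & ~np.eye(6, dtype=bool)   # threshold -> edges

def modules_of(adj):
    """Connected components of the thresholded correlation network."""
    seen, mods = set(), []
    for start in range(adj.shape[0]):
        if start in seen:
            continue
        stack, mod = [start], set()
        while stack:
            u = stack.pop()
            if u in mod:
                continue
            mod.add(u)
            stack.extend(int(v) for v in np.flatnonzero(adj[u]))
        seen |= mod
        mods.append(sorted(mod))
    return mods

modules = modules_of(adj)
```

The two recovered modules correspond to the two drivers; hubs would then be the nodes of highest degree within each module.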

  13. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    Full Text Available We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  14. Geostatistic in Reservoir Characterization: from estimation to simulation methods

    Directory of Open Access Journals (Sweden)

    Mata Lima, H.

    2005-12-01

    Full Text Available This article reviews the different geostatistical methods available to estimate and simulate petrophysical properties (porosity and permeability) of a reservoir. Different geostatistical techniques that allow the combination of hard and soft data are taken into account, and the main reasons for using geostatistical simulation rather than estimation are discussed. Uncertainty in reservoir characterization due to the variogram assumption, a strict mathematical equation that can lead to serious simplification in the description of the natural processes or phenomena under consideration, is treated here. Multiple-point geostatistics methods based on the concept of training images, suggested by Strebelle (2000) and Caers (2003) owing to the limitation of the variogram in capturing complex heterogeneity, are another subject presented. This article intends to provide a review of geostatistical methods to serve the interest of students and researchers.

  15. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of anonymity there, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed features and techniques are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author-group-level based methods.

  16. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper attempts to understand and resolve the tension between these two seemingly contentious perspectives in order to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise. It claims that instead of using generic descriptions as models for the linguistic reproduction of conventional forms in response to recurring social contexts, as is often the case in many communication-based curriculum contexts, they can be used as an analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse. This will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  17. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management that is applicable to the IT universe, starting from the classical theory associated with project management techniques. It applies theoretical analysis to the context of information technology in enterprises as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared which formed the basis for field research with three large companies that develop Information Technology projects. The methodology adopted was a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature carries over to the management environment of IT projects. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects; this depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in its incorporation into the company culture.

  18. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China

    Energy Technology Data Exchange (ETDEWEB)

    Zhong, Buqing; Liang, Tao, E-mail: liangt@igsnrr.ac.cn; Wang, Lingqing; Li, Kexin

    2014-08-15

    An extensive soil survey was conducted to study pollution sources and delineate contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM) were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg were shown to depend mostly on indicators of anthropogenic activities such as industrial type and distance from urban areas, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of modeled background values for Cd, As, Pb, Hg and Cr were close to their background values previously reported in the study area, while contamination by Cd and Hg was widespread in the study area, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. - Highlights: • Conditional inference tree can identify variables controlling metal distribution. • Finite mixture distribution model can partition natural and anthropogenic sources. • Geostatistics with stochastic models
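    A finite mixture distribution model of the kind used here to separate background from anthropogenic concentrations can be fitted with a basic EM loop. The sketch below fits a two-component 1-D Gaussian mixture to synthetic "concentration" data; the component count, initialisation, and data are illustrative assumptions, not the paper's actual FMDM:

```python
import numpy as np

def fit_two_gaussians(x, iters=200):
    """EM for a two-component 1-D Gaussian mixture (background vs. anthropogenic)."""
    x = np.asarray(x, float)
    # crude initialisation from the lower and upper quartiles
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

# synthetic "Cd concentrations": a low background mode plus a high polluted mode
rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(0.3, 0.05, 300), rng.normal(1.5, 0.3, 100)])
w, mu, sd = fit_two_gaussians(sample)
```

    The lower fitted mean plays the role of the modeled background value; samples with high responsibility for the upper component are candidates for anthropogenic input.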

  19. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China

    International Nuclear Information System (INIS)

    An extensive soil survey was conducted to study pollution sources and delineate contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM) were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg were shown to depend mostly on indicators of anthropogenic activities such as industrial type and distance from urban areas, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of modeled background values for Cd, As, Pb, Hg and Cr were close to their background values previously reported in the study area, while contamination by Cd and Hg was widespread in the study area, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. - Highlights: • Conditional inference tree can identify variables controlling metal distribution. • Finite mixture distribution model can partition natural and anthropogenic sources. • Geostatistics with stochastic models

  20. Reservoir Modeling Combining Geostatistics with Markov Chain Monte Carlo Inversion

    DEFF Research Database (Denmark)

    Zunino, Andrea; Lange, Katrine; Melnikova, Yulia;

    2014-01-01

    We present a study on the inversion of seismic reflection data generated from a synthetic reservoir model. Our aim is to invert directly for rock facies and porosity of the target reservoir zone. We solve this inverse problem using a Markov chain Monte Carlo (McMC) method to handle the nonlinear, multi-step forward model (rock physics and seismology) and to provide realistic estimates of uncertainties. To generate realistic models which represent samples of the prior distribution, and to overcome the high computational demand, we reduce the search space utilizing an algorithm drawn from geostatistics. The geostatistical algorithm learns the multiple-point statistics from prototype models, then generates proposal models which are tested by a Metropolis sampler. The solution of the inverse problem is finally represented by a collection of reservoir models in terms of facies and porosity, which...

  1. 4th European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Carrera, Jesus; Gómez-Hernández, José

    2004-01-01

    The fourth edition of the European Conference on Geostatistics for Environmental Applications (geoENV IV) took place in Barcelona, November 27-29, 2002. As proof of the increasing interest in environmental issues within the geostatistical community, the conference attracted over 100 participants, mostly Europeans (up to 10 European countries were represented), but also from other countries around the world. Only 46 contributions, selected out of around 100 submitted papers, were invited to be presented orally during the conference. Additionally, 30 authors were invited to present their work in poster format during a special session. All oral and poster contributors were invited to submit their work to be considered for publication in this Kluwer series. All papers underwent a reviewing process, which consisted of two reviewers for oral presentations and one reviewer for posters. The book opens with a keynote paper by Philippe Naveau. It is followed by 40 papers that correspond to those presented orally d...

  2. Geostatistical Analysis on Plant Diversity of Ecological Forest in Xianju

    Institute of Scientific and Technical Information of China (English)

    王坚娅; 张汝忠; 张骏; 高洪娣; 袁位高

    2015-01-01

    Horizontal spatial patterns of the plant richness and diversity indices of an ecological forest at the source of the Jiaojiang River in Xianju, Zhejiang Province, were analyzed by geostatistics. The results showed that the plant richness and diversity indices of the tree, shrub and herb layers had spatial dependence. The plant richness of the tree and herb layers showed moderate correlation in horizontal space, while that of the shrub layer, and the diversity indices of the tree, shrub and herb layers, showed strong correlation in horizontal space. For the tree layer, the maximum correlation distances of species richness and the diversity index were 31.2249 km and 10.4986 km respectively, and the best theoretical models for the species richness and diversity index semivariance functions were the nested spherical model and the hyperbolic square model respectively. For the shrub layer, the maximum correlation distances of species richness and the diversity index were 16.0977 km and 19.1889 km respectively, and the best theoretical model was the nested exponential model in both cases. For the herb layer, the maximum correlation distances of species richness and the diversity index were 50.3437 km and 109.6485 km respectively, and the best theoretical models were the nested spherical model and the exponential model respectively.
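    The spherical and exponential semivariogram models named as best-fit candidates in this record have simple closed forms. A sketch of single-structure versions (the record's models are nested, i.e. sums of such structures) with hypothetical nugget, sill and range parameters:

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical model: reaches the sill exactly at range a, flat beyond."""
    h = np.asarray(h, float)
    rising = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, rising, sill) * (h > 0)   # gamma(0) = 0 by convention

def exponential(h, nugget, sill, a):
    """Exponential model: approaches the sill asymptotically (practical range a)."""
    h = np.asarray(h, float)
    return (nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / a))) * (h > 0)

# hypothetical parameters: zero nugget, unit sill, 10 km range
g_sph = spherical(np.array([0.0, 5.0, 10.0, 15.0]), 0.0, 1.0, 10.0)
g_exp = exponential(np.array([10.0]), 0.0, 1.0, 10.0)
```

    The contrast is visible at the range: the spherical model sits exactly at the sill there, while the exponential model has only reached about 95% of it.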

  3. Proximal soil sensors and geostatistical tools in precision agriculture applications

    OpenAIRE

    Shaddad, Sameh

    2014-01-01

    Recognition of spatial variability is very important in precision agriculture applications. The use of proximal soil sensors and geostatistical techniques is highly recommended worldwide to detect spatial variation not only between fields but also within fields (micro-scale). This study involves, as a first step, the use of visible and near infrared (vis-NIR) spectroscopy to estimate six key soil properties and obtain high-resolution maps that allow us to model the spatial variability of the soil. ...

  4. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been resolved through the solution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  5. A reservoir skeleton-based multiple point geostatistics method

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Traditional stochastic reservoir modeling, including object-based and pixel-based methods, cannot solve the problem of reproducing continuous and curvilinear reservoir objects. The paper first examines the various stochastic modeling methods and extracts their merits, then proposes skeleton-based multiple-point geostatistics (SMPS) for fluvial reservoirs. The core idea is to use the skeletons of reservoir objects to restrict the selection of data patterns. Skeleton-based multiple-point geostatistics consists of two steps. First, the channel skeleton (namely, the channel centerline) is predicted using a method from object-based modeling; the paper proposes a new search-window method for this prediction. Then the distributions of reservoir objects are forecast using multiple-point geostatistics under the restriction of the channel skeleton. With the restriction of the channel centerline, the selection of data events is more reasonable and the realization is achieved more realistically. Checks against a conceptual model and a real reservoir show that SMPS is much better than Sisim (sequential indicator simulation), Snesim (single normal equation simulation) and Simpat (simulation with patterns) in building fluvial reservoir models. This new method will contribute both to the theoretical research of stochastic modeling and to oilfield development through the construction of highly precise reservoir geological models.

  6. Simulation of Soil Macropore Networks Using Multi-Point Geostatistics

    Science.gov (United States)

    Luo, L.; Lin, H.; Singha, K.

    2006-12-01

    Two-point correlation functions have been used to quantitatively evaluate the spatial variability of soil structure but cannot characterize its specific geometry. Multi-point (MP) geostatistics can be used to consider both spatial correlation and the specific shape of objects. The SNESIM algorithm, developed by Strebelle (2002), was used to simulate complex geological patterns, reflecting different scales of variability and types of heterogeneity. Soil macropores, especially earthworm burrows and root channels, are critical to preferential flow and transport in soils. Accurate simulation of soil macropore networks can allow us to better simulate soil hydraulic properties and address scaling issues of structured soils. However, little work has been done to simulate soil macropore networks using MP geostatistics. The 3D soil macropore network of an agricultural soil in Pennsylvania, a Hagerstown silt loam, was extracted from spatially exhaustive data collected with micro computed tomography. The 3D training image was scanned with a 3D template to obtain the 3D patterns. Permeability of the simulated macropore network was calculated using the lattice-Boltzmann method. SNESIM was able to simulate different types of macropores, the earthworm burrows and inter-aggregate pores. However, SNESIM requires that training images have a stationary character, which may limit its ability to simulate soil macropore networks, and it is still computationally demanding for 3D structure simulation. While computationally expensive, MP geostatistics has great potential for simulating 3D soil macropore networks, which is useful for understanding and predicting the hydraulic behavior of structured soils.
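    The "training image scanned by a template to obtain patterns" step at the heart of SNESIM can be illustrated in 2-D. The sketch below collects pattern counts in a flat Counter rather than SNESIM's search tree, on a toy binary "channel" image invented for illustration:

```python
import numpy as np
from collections import Counter

def scan_patterns(ti, t=3):
    """Collect every t-by-t data event from a 2-D training image (SNESIM-style
    pattern statistics, stored in a flat Counter instead of a search tree)."""
    counts = Counter()
    rows, cols = ti.shape
    for i in range(rows - t + 1):
        for j in range(cols - t + 1):
            counts[ti[i:i + t, j:j + t].tobytes()] += 1
    return counts

# toy binary training image: one horizontal "channel" of facies 1
ti = np.zeros((6, 8), dtype=np.uint8)
ti[2:4, :] = 1
patterns = scan_patterns(ti, t=3)
total = sum(patterns.values())
```

    During simulation, these pattern frequencies supply the conditional probability of a facies value given the already-simulated neighbours in the template.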

  7. Fractal and geostatistical methods for modeling of a fracture network

    International Nuclear Information System (INIS)

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor as a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varying from 2 at the 10-m scale to 1 at the centimeter scale); (b) a parent-daughter model with a regionalized density. The geostatistical study allows a 3-D model to be established in which fractures are assumed to be discs, fractures are grouped in clusters or swarms, and fracturation density is regionalized (with two ranges at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D

  8. Fractal and geostatistical methods for modeling of a fracture network

    Energy Technology Data Exchange (ETDEWEB)

    Chiles, J.P.

    1988-08-01

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor as a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varying from 2 at the 10-m scale to 1 at the centimeter scale); (b) a parent-daughter model with a regionalized density. The geostatistical study allows a 3-D model to be established in which fractures are assumed to be discs, fractures are grouped in clusters or swarms, and fracturation density is regionalized (with two ranges at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D.

  9. Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.

    Science.gov (United States)

    Sukop, Michael C; Cunningham, Kevin J

    2016-03-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. PMID:26174850
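    The variogram-based route that succeeded here (variogram analysis followed by Gaussian simulation) rests on the kriging system. A minimal sketch of ordinary kriging of a single target point, with an assumed exponential covariance model and made-up borehole porosity values:

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, a=5.0):
    """Ordinary kriging of one target point with an exponential covariance model."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    cov = lambda h: sill * np.exp(-3.0 * h / a)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # kriging system, with a Lagrange multiplier for the unbiasedness constraint
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - np.asarray(target, float), axis=1))
    wts = np.linalg.solve(A, b)[:n]    # weights sum to 1
    return float(wts @ values)

# three hypothetical borehole porosity observations; estimate near the first one
obs_xy = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
obs_z = [0.20, 0.30, 0.25]
est = ordinary_kriging(obs_xy, obs_z, (2.0, 2.0))
```

    With no nugget, the estimator honours the data exactly at sampled locations, which is why closely spaced boreholes like those in this study constrain the simulated porosity field so strongly.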

  10. Geostatistics in the evaluation of a granitic formation as a nuclear repository site selection method

    International Nuclear Information System (INIS)

    As a result of several preliminary investigations, an area of about 12 km2 was preselected at Sierra del Medio in northern Patagonia (Chubut Province) for the emplacement of a nuclear spent fuel repository. This detailed research included the following technical items: remote sensing imagery analysis, vertical photointerpretation, geophysical and geological surveys by field crews, petrographic classification of macro and micro samples, regional geomorphological and hydrogeological surveys, medium-scale photogrammetric mapping, large-scale field mapping, preliminary borehole drilling up to 280 meters deep with evaluation of joints and discontinuities, chemical water and rock sampling, and studies of tectonics and modern volcanic activity, with particular attention given to seismological prevention research. A joint census programme was developed through a regular sampling grid with systematically increased spatial density; the obtained values were extrapolated by means of geostatistically supported methods. This paper presents the geostatistical evaluation of surface fracture behaviour of the rock at the preliminarily selected site and the selection of a ''less fractured area'' of about 600x1000 meters. Furthermore, the location of four 700 m deep boreholes has been proposed for final repository-depth data gathering

  11. Geostatistical borehole image-based mapping of karst-carbonate aquifer pores

    Science.gov (United States)

    Sukop, Michael C.; Cunningham, Kevin J.

    2016-01-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.

  12. Can Artificial Neural Networks be Applied in Seismic Prediction? Preliminary Analysis Applying Radial Topology. Case: Mexico

    CERN Document Server

    Mota-Hernandez, Cinthya; Alvarado-Corona, Rafael

    2014-01-01

    Tectonic earthquakes of high magnitude can cause considerable losses in terms of human lives, economy and infrastructure, among others. According to an evaluation published by the U.S. Geological Survey, 30 earthquakes have greatly impacted Mexico from the end of the XIX century to the present. Based upon data from the National Seismological Service, in the period between January 1, 2006 and May 1, 2013 there occurred 5,826 earthquakes whose magnitude was greater than 4.0 on the Richter scale (25.54% of the total of earthquakes registered on the national territory), the Pacific Plate and the Cocos Plate being the most important ones. This document describes the development of an Artificial Neural Network (ANN) based on radial topology which seeks to generate a prediction, with an error margin lower than 20%, of the probability of a future earthquake. One of the main questions is: can artificial neural networks be applied in seismic forecast...
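    A network with "radial topology" is typically a radial-basis-function (RBF) network: Gaussian hidden units plus a linear output layer. The sketch below fits such a network to a synthetic smooth signal by least squares; the data and hyperparameters are illustrative assumptions, not the seismic catalogue used in the paper:

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial units: the 'radial topology' hidden layer."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# toy regression: learn y = sin(x) from noisy samples (invented data)
rng = np.random.default_rng(2)
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.05, 40)
centers = np.linspace(0.0, 2.0 * np.pi, 8)[:, None]
H = rbf_design(X, centers, width=1.0)
wts, *_ = np.linalg.lstsq(H, y, rcond=None)    # output weights by least squares
rmse = np.sqrt(np.mean((H @ wts - np.sin(X[:, 0])) ** 2))
```

    Because only the output layer is trained, fitting reduces to one linear least-squares solve; a seismic application would replace `X` and `y` with catalogue features and the target probability.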

  13. Granger Causality Between Exports, Imports and GDP in France: Evidence from Using Geostatistical Models

    OpenAIRE

    Arshia Amiri; Ulf-G Gerdtham

    2012-01-01

    This paper introduces a new way of investigating linear and nonlinear Granger causality between exports, imports and economic growth in France over the period 1961-2006 using geostatistical models (kriging and inverse distance weighting). Geostatistical methods are the standard methods for spatial prediction and map-making in water engineering, environment, environmental pollution, mining, ecology, geology and geography. However, this is the first time that geostatistics knowledg...
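    Of the two geostatistical models named (kriging and inverse distance weighting), IDW is the simpler: each prediction is a distance-weighted average of the samples. A minimal sketch with hypothetical points:

```python
import numpy as np

def idw(coords, values, targets, power=2.0):
    """Inverse distance weighting: weights fall off as 1/d**power."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    targets = np.asarray(targets, float)
    d = np.linalg.norm(targets[:, None, :] - coords[None, :, :], axis=-1)
    out = np.empty(len(targets))
    for k, row in enumerate(d):
        hit = row < 1e-12
        if hit.any():                     # target coincides with a sample
            out[k] = values[hit][0]
        else:
            wt = 1.0 / row ** power
            out[k] = (wt @ values) / wt.sum()
    return out

# hypothetical samples; the second target is equidistant from all three points
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
vals = [1.0, 2.0, 3.0]
est = idw(pts, vals, [(0.0, 0.0), (0.5, 0.5)])
```

    The equidistant target simply receives the arithmetic mean of the samples; the `power` parameter controls how local the estimate is.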

  14. Multi-Criteria GIS Methodology Focused on the Location of Optimal Places for Small Hydro Power via Hydrological and Geostatistical Applications; Metodologia SIG Multicriterio Enfocada a la Localizacion de Enclaves Optimos para Centrales de Minihidroelectricas mediante Aplicaciones Hidrologicas y Geoestadisticas

    Energy Technology Data Exchange (ETDEWEB)

    Paz, C. de la

    2013-02-01

    The main objective of this research is the development of a location methodology for siting optimization of small hydro power (SHP) plants. To achieve this goal, a Multi-Criteria Evaluation (MCE) methodology was developed, implemented through tools in a GIS environment: spatial analysis, geostatistical analysis, and hydrology. This methodology includes two different models based on the same MCE process. The substantial difference between the two models lies in the input data and the tools applied to estimate the energy resource and the principal factor of the methodology (discharge or accumulated flow). The first model is generated from discharge data obtained in the study area (El Bierzo), and the second one from pluviometric data and a Digital Terrain Model (DTM). Both models produce viability maps showing the areas most suitable for locating SHP facilities. As an additional objective, the study compares the results of the two models to evaluate their similarity. (Author)

  15. Geostatistical regionalization of low-flow indices: PSBI and Top-Kriging

    Directory of Open Access Journals (Sweden)

    S. Castiglioni

    2010-09-01

    Full Text Available Recent studies highlight that geostatistical interpolation, which was originally developed for the spatial interpolation of point data, can be effectively applied to the problem of regionalization of hydrometric information. This study compares two innovative geostatistical approaches for the prediction of low-flows in ungauged basins. The first one, named Physiographic-Space Based Interpolation (PSBI), performs the spatial interpolation of the desired streamflow index (e.g., annual streamflow, low-flow index, flood quantile, etc.) in the space of catchment descriptors. The second technique, named Topological kriging or Top-Kriging, predicts the variable of interest along river networks, taking both the area and the nested nature of catchments into account. PSBI and Top-Kriging are applied for the regionalization of Q355 (i.e., the streamflow that is equalled or exceeded 355 days in a year, on average) over a broad geographical region in central Italy, which contains 51 gauged catchments. Both techniques are cross-validated through a leave-one-out procedure at all available gauges and applied to a subregion to produce a continuous estimation of Q355 along the river network extracted from a 90 m DEM. The results of the study show that Top-Kriging and PSBI present complementary features and have comparable performances (Nash-Sutcliffe efficiencies in cross-validation of 0.89 and 0.83, respectively). Both techniques provide plausible and accurate predictions of Q355 in ungauged basins and represent promising opportunities for regionalization of low-flows.
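    The evaluation protocol used here, leave-one-out cross-validation scored by Nash-Sutcliffe efficiency, is easy to reproduce. The sketch below substitutes a simple IDW interpolator for PSBI/Top-Kriging (an assumption made for illustration) and scores it on a synthetic low-flow field:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - SSE/SST: 1 is a perfect fit, 0 merely matches the mean benchmark."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    return 1.0 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

def loo_idw_nse(coords, q355, power=2.0):
    """Leave-one-out cross-validation of an IDW regionalisation of Q355,
    scored with Nash-Sutcliffe efficiency (IDW stands in for PSBI/Top-Kriging)."""
    coords = np.asarray(coords, float)
    q355 = np.asarray(q355, float)
    preds = []
    for i in range(len(q355)):
        mask = np.arange(len(q355)) != i       # hold out gauge i
        d = np.linalg.norm(coords[mask] - coords[i], axis=1)
        wt = 1.0 / d ** power
        preds.append((wt @ q355[mask]) / wt.sum())
    return nash_sutcliffe(q355, np.array(preds))

# synthetic smooth low-flow field on a 4x4 grid of "gauged catchments"
xy = np.array([[i, j] for i in range(4) for j in range(4)], float)
q = xy[:, 0] + 0.5 * xy[:, 1] + 1.0
score = loo_idw_nse(xy, q)
```

    Swapping the interpolator inside the loop is all it takes to compare methods on the same footing, which is essentially the paper's 0.89 vs. 0.83 comparison.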

  16. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  17. Video analysis applied to volleyball didactics to improve sport skills

    OpenAIRE

    Raiola, Gaetano; Parisi, Fabio; Giugno, Ylenia; Di Tore, Pio Alfredo

    2013-01-01

    The feedback method is increasingly used in learning new skills and improving performance. "Recent research, however, showed that the more objective and quantitative the feedback is, the greater its effect on performance". Video analysis, the analysis of sports performance by watching video, is used primarily to quantify athletes' performance through notational analysis. It may be useful to combine quantitative and qualitative analysis of the single ges...

  18. Biomass changes and geostatistical analysis of Xiao Hinggan region in recent 30 years

    Institute of Scientific and Technical Information of China (English)

    毛学刚; 李明泽; 范文义; 姜欢欢

    2011-01-01

    Based on remote sensing data from three periods (the 1980s, the 1990s, and after 2000) and forest resource inventory plot data from the same periods, the forest biomass of the Xiao Hinggan region was estimated using a remote sensing information model. Combining GIS and geostatistics, this paper studies the temporal changes, spatial autocorrelation and heterogeneity of forest biomass in the Xiao Hinggan region over the three periods. Results indicated that overall biomass fluctuated in the research area from the 1980s to the 2000s. In the 1980s, low-grade biomass with relatively low values was dominant and was distributed contiguously, with a high degree of spatial autocorrelation; however, the random factors affecting medium and higher biomass increased, indicating that the degree of man-made interference continuously strengthened. In the 1990s the dominant class in the study area was medium biomass, which evolved from the dominant low-grade biomass of the 1980s. The changes over these 10 years showed that overall biomass tended to recover. Because the data were mainly concentrated in the late 1990s, after the Natural Forest Protection Project (NFPP) had been launched and moved forest conditions in a positive direction, the overall biomass value increased. After 2000 the spatial autocorrelation of overall biomass in the research area was not high, but medium and higher biomass was similar and changed evenly in every direction. Medium biomass was distributed widely, while high-value biomass occurred in small, fragmented patches; the spatial variability caused by random factors such as man-made disturbance and that caused by spatial autocorrelation were similar, showing a trend toward stability.
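    The spatial autocorrelation statements in this abstract are usually quantified with a global statistic such as Moran's I. A sketch on a toy 1-D transect with rook adjacency (the weight matrix and biomass values are illustrative, not the study's data):

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for a spatial weight matrix W (unnormalised weights)."""
    z = np.asarray(values, float) - np.mean(values)
    W = np.asarray(W, float)
    return (len(z) / W.sum()) * (z @ W @ z) / (z @ z)

# rook adjacency on a 1-D transect of 6 cells
n = 6
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

i_clus = morans_i([1, 1, 1, 5, 5, 5], W)   # similar neighbours: positive I
i_alt = morans_i([1, 5, 1, 5, 1, 5], W)    # dissimilar neighbours: negative I
```

    Contiguous low-value biomass like that described for the 1980s corresponds to the clustered case (positive I); values near zero indicate spatial randomness.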

  19. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  20. An Analysis of the Economy Principle Applied in Cyber Language

    Institute of Scientific and Technical Information of China (English)

    肖钰敏

    2015-01-01

With the development of network technology, cyber language, a new social dialect, is widely used in our life. The author analyzes how the economy principle is applied in cyber language from three aspects: word-formation, syntax and non-linguistic symbols. The author also collects, summarizes and analyzes relevant language materials to demonstrate the economy principle's real existence in chat rooms and to explain why the economy principle is applied so widely in cyberspace.

  1. Integration of dynamical data in a geostatistical model of reservoir; Integration des donnees dynamiques dans un modele geostatistique de reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Costa Reis, L.

    2001-01-01

We have developed in this thesis a methodology of integrated characterization of heterogeneous reservoirs, from geologic modeling to history matching. This methodology is applied to the reservoir PBR, situated in Campos Basin, offshore Brazil, which has been producing since June 1979. This work is an extension of two other theses concerning geologic and geostatistical modeling of the reservoir PBR from well data and seismic information. We extended the geostatistical litho-type model to the whole reservoir by using a particular approach of the non-stationary truncated Gaussian simulation method. This approach facilitated the application of the gradual deformation method to history matching. The main stages of the methodology for dynamic data integration in a geostatistical reservoir model are presented. We constructed a reservoir model, and the initial difficulties in the history matching led us to modify some choices in the geological, geostatistical and flow models. These difficulties show the importance of dynamic data integration in reservoir modeling. The petrophysical property assignment within the litho-types was done by using well test data. We used an inversion procedure to evaluate the petrophysical parameters of the litho-types. Up-scaling is a necessary stage to reduce the flow simulation time. We compared several up-scaling methods and show that the passage from the fine geostatistical model to the coarse flow model should be done very carefully. The choice of the fitting parameter depends on the objective of the study. In the case of the reservoir PBR, where water is injected in order to improve the oil recovery, the water rate of the producing wells is directly related to the reservoir heterogeneity. Thus, the water rate was chosen as the fitting parameter. We obtained significant improvements in the history matching of the reservoir PBR, first by using a method we have proposed, called patchwork. This method allows us to build a coherent

  2. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...

  3. An applied general equilibrium model for Dutch agribusiness policy analysis.

    NARCIS (Netherlands)

    Peerlings, J.H.M.

    1993-01-01

The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest. The model is fairly

  4. System Analysis Applying to Talent Resource Development Research

    Institute of Scientific and Technical Information of China (English)

    WANG Peng-tao; ZHENG Gang

    2001-01-01

In the development research of talent resources, the most important aspects of talent resource forecasting and optimization are the structure of the talent resource, the required numbers, and talent quality. The article establishes a factor reconstruction analysis forecast and a talent quality model based on system reconstruction analysis, which determines the most effective factor levels in the system (a method presented by G. J. Klirti, B. Jonesque), and performs a dynamic analysis of an example.

  5. Geostatistical and Fractal Characteristics of Soil Moisture Patterns from Plot to Catchment Scale Datasets

    Science.gov (United States)

    Korres, Wolfgang; Reichenau, Tim G.; Fiener, Peter; Koyama, Christian N.; Bogena, Heye R.; Cornelissen, Thomas; Baatz, Roland; Herbst, Michael; Diekkrüger, Bernd; Vereecken, Harry; Schneider, Karl

    2016-04-01

Soil moisture and its spatio-temporal pattern are key variables in hydrology, meteorology and agriculture. The aim of the current study is to analyze spatio-temporal soil moisture patterns of 9 datasets from the Rur catchment (Western Germany) with a total area of 2364 km², consisting of a low mountain range (forest and grassland) and a loess plain dominated by arable land. Data were acquired across a variety of land use types, on different spatial scales (plot to mesoscale catchment) and with different methods (field measurements, remote sensing, and modelling). All datasets were analyzed using the same methodology. In a geostatistical analysis, the sill and range of the theoretical variogram were inferred. Based on this analysis, three groups of datasets with similar characteristics in the autocorrelation structure were identified: (i) modelled and measured datasets from a forest sub-catchment (influenced by soil properties and topography), (ii) remotely sensed datasets from the cropped part of the total catchment (influenced by the land-use structure of the cropped area), and (iii) modelled datasets from the cropped part of the Rur catchment (influenced by large scale variability of soil properties). A fractal analysis revealed that soil moisture patterns of all datasets show a multi-fractal behavior (varying fractal dimensions, patterns are only self-similar over certain ranges of scales), with at least one scale break and generally high fractal dimensions (high spatial variability). Corresponding scale breaks were found in various datasets and the factors explaining these scale breaks are consistent with the findings of the geostatistical analysis. The joint analysis of the different datasets showed that small differences in soil moisture dynamics, especially at maximum porosity and wilting point in the soils, can have a large influence on the soil moisture patterns and their autocorrelation structure.
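A variogram analysis like the one described above can be sketched in a few lines. The moving-average field, lag spacing and tolerance below are invented for illustration and are not taken from the Rur catchment datasets:

```python
import numpy as np

# Toy sketch (not the study's code): estimate an empirical semivariogram
# from a 1-D soil-moisture transect and read off sill and range.
rng = np.random.default_rng(0)

x = np.arange(200.0)                      # sample positions (m)
noise = rng.normal(size=260)
# Spatially correlated signal via a 30-point moving average of white noise.
z = np.convolve(noise, np.ones(30) / 30, mode="valid")[:200]

def semivariogram(x, z, lags, tol=0.5):
    """Classical Matheron estimator: gamma(h) = mean((z_i - z_j)^2) / 2
    over all pairs whose separation falls within tol of lag h."""
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    sq = (z[:, None] - z[None, :]) ** 2   # pairwise squared differences
    gam = []
    for h in lags:
        mask = (np.abs(d - h) <= tol) & (d > 0)
        gam.append(sq[mask].mean() / 2.0)
    return np.array(gam)

lags = np.arange(1, 60, 2.0)
gamma = semivariogram(x, z, lags)

# Semivariance rises with lag and flattens near the sill; the lag where it
# levels off approximates the correlation range (about 30 m here).
print(gamma[:3], gamma[-3:])
```

Fitting a theoretical model (spherical, exponential) to `gamma` would then yield the sill and range parameters reported in such analyses.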

  6. Map on predicted deposition of Cs-137 in Spanish soils from geostatistical analyses

    International Nuclear Information System (INIS)

The knowledge of the distribution of 137Cs deposition over Spanish mainland soils, along with the geographical, physical and morphological terrain information, enables us to know the 137Cs background content in soil. This could be useful as a tool in a hypothetical situation of an accident involving a radioactive discharge, or in soil erosion studies. A Geographic Information System (GIS) would allow the gathering of all the mentioned information. In this work, gamma measurements of 137Cs on 34 Spanish mainland soils, rainfall data taken from 778 weather stations, soil types and geographical and physical terrain information were input into a GIS. Geostatistical techniques were applied to interpolate values of 137Cs activity at unsampled places, obtaining prediction maps of 137Cs deposition. Up to now, geostatistical methods have been used to model spatial continuity of data. Through semivariance and cross-covariance functions the spatial correlation of such data can be studied and described. Ordinary and simple kriging techniques were carried out to map spatial patterns of 137Cs deposition, and ordinary and simple co-kriging were used to improve the prediction map obtained through a second related variable: namely the rainfall. To choose the best prediction map of 137Cs deposition, the spatial dependence of the variable, the correlation coefficient and the prediction errors were evaluated using the different models previously mentioned. The best 137Cs deposition map was obtained when applying the co-kriging techniques. - Highlights: ► Implementation of 137Cs activity data, in Spanish soils, in a GIS. ► Prediction maps of 137Cs fallout were produced with kriging techniques. ► More accurate prediction surfaces were obtained using co-kriging techniques. ► Rainfall is the second variable used in the co-kriging technique.
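The ordinary-kriging step that produces such prediction maps solves a small linear system per prediction location. The sketch below is a generic illustration with made-up coordinates, measurements and an assumed exponential covariance model, not the study's 137Cs data:

```python
import numpy as np

# Minimal ordinary-kriging sketch: predict a value at an unsampled point
# from three neighbours, under an assumed covariance C(h) = sill*exp(-h/a).
sill, a = 1.0, 50.0

def cov(h):
    return sill * np.exp(-h / a)

pts = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0]])  # sampled locations
vals = np.array([12.0, 8.0, 10.0])                      # measured values
target = np.array([15.0, 15.0])                         # prediction location

# Ordinary-kriging system:  [C  1; 1' 0] [w; mu] = [c0; 1]
n = len(pts)
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = cov(d)
A[:n, n] = 1.0      # unbiasedness constraint column
A[n, :n] = 1.0      # weights must sum to one
b = np.append(cov(np.linalg.norm(pts - target, axis=1)), 1.0)

sol = np.linalg.solve(A, b)
w = sol[:n]                     # kriging weights (sum to 1)
estimate = w @ vals
print(round(float(estimate), 3), np.round(w, 3))
```

Repeating this solve over a grid of target points yields the prediction surface; co-kriging extends the same system with cross-covariances to a second variable such as rainfall.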

  7. Geostatistical Evaluation of Spring Water Quality in an Urbanizing Carbonate Aquifer

    Science.gov (United States)

    McGinty, A.; Welty, C.

    2003-04-01

    As part of an investigation of the impacts of urbanization on the hydrology and ecology of Valley Creek watershed near Philadelphia, Pennsylvania, we have analyzed the chemical composition of 110 springs to assess the relative influence of geology and anthropogenic activities on water quality. The 60 km^2 watershed is underlain by productive fractured rock aquifers composed of Cambrian and Ordovician carbonate rocks in the central valley and Cambrian crystalline and siliciclastic rocks (quartzite and phyllite) in the north and south hills that border the valley. All tributaries of the surface water system originate in the crystalline and siliciclastic hills. The watershed is covered by 17% impervious area and contains 6 major hazardous waste sites, one active quarrying operation and one golf course; 25% of the area utilizes septic systems for sewage disposal. We identified 172 springs, 110 of which had measurable flow rates ranging from 0.002 to 5 l/s. The mapped surficial geology appears as an anisotropic pattern, with long bands of rock formations paralleling the geographic orientation of the valley. Mapped development appears as a more isotropic pattern, characterized by isolated patches of land use that are not coincident with the evident geologic pattern. Superimposed upon these characteristics is a dense array of depressions and shallow sinkholes in the carbonate rocks, and a system of major faults at several formation contacts. We used indicator geostatistics to quantitatively characterize the spatial extent of the major geologic formations and patterns of land use. Maximum correlation scales for the rock types corresponded with strike direction and ranged from 1000 to 3000 m. Anisotropy ratios ranged from 2 to 4. Land-use correlation scales were generally smaller (200 to 500 m) with anisotropy ratios of around 1.2, i.e., nearly isotropic as predicted. Geostatistical analysis of spring water quality parameters related to geology (pH, specific conductance

  8. Analysis of OFDM Applied to Powerline High Speed Digital Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUANG Jian; YANG Gong-xu

    2003-01-01

The low-voltage powerline is becoming a powerful solution for home networking, building automation, and Internet access as a result of its wide distribution, easy access and low maintenance. The characteristics of the powerline channel are very complicated because it is an open network. This article analyses the characteristics of the powerline channel, introduces the basics of OFDM (Orthogonal Frequency Division Multiplexing), and studies OFDM as applied to powerline high-speed digital communication.
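The core OFDM mechanism the article builds on, multiplexing symbols onto subcarriers via an IFFT, adding a cyclic prefix, and demultiplexing with an FFT, can be shown with a minimal baseband round trip. The 64-subcarrier QPSK setup below is a generic example, not a powerline-specific design:

```python
import numpy as np

# Illustrative OFDM baseband round trip: QPSK on 64 subcarriers,
# IFFT at the transmitter, cyclic prefix, FFT at the receiver.
rng = np.random.default_rng(1)
n_sub, cp_len = 64, 16

bits = rng.integers(0, 2, size=2 * n_sub)
# Map bit pairs to QPSK constellation points (+-1 +-1j) / sqrt(2).
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

tx_time = np.fft.ifft(symbols)                           # multiplex subcarriers
tx_frame = np.concatenate([tx_time[-cp_len:], tx_time])  # prepend cyclic prefix

# An ideal channel just delivers the frame; a real powerline channel would
# add multipath, narrowband interference and impulsive noise.
rx_time = tx_frame[cp_len:]                              # strip cyclic prefix
rx_symbols = np.fft.fft(rx_time)                         # demultiplex

rx_bits = np.empty_like(bits)
rx_bits[0::2] = (rx_symbols.real < 0).astype(int)        # sign decisions
rx_bits[1::2] = (rx_symbols.imag < 0).astype(int)
print("bit errors:", int(np.sum(rx_bits != bits)))
```

The cyclic prefix is what lets the receiver treat a multipath channel as a simple per-subcarrier complex gain, which is why OFDM suits the hostile powerline medium.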

  9. An applied general equilibrium model for Dutch agribusiness policy analysis.

    OpenAIRE

    Peerlings, J.H.M.

    1993-01-01

The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest. The model is fairly general and could be used to analyse a great variety of agricultural policy changes. However, generality requires that the model should be adapted and extended for special research questions. This...

  10. Multisectorial models applied to the environment: an analysis for catalonia

    OpenAIRE

    Pié Dols, Laia

    2010-01-01

The objective of this doctoral thesis is to apply the different multisectorial models available to analyse the impact that the introduction of policies designed to reduce greenhouse gas emissions and save energy would have on the Catalan economy, while at the same time improving the environmental competitiveness of both individual companies and the economy as a whole. For the purposes of this thesis I have analysed the six greenhouse gases that are regulated by the K...

  11. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 as scenarios that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 as having the lowest priority for further analysis.

  12. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik;

    2010-01-01

    the geology of e.g. a contaminated site, it is not always possible to gather enough information to build a representative geological model. Mapping in analogue geological settings and applying geostatistical tools to simulate spatial variability of heterogeneities can improve ordinary geological models...... (TPROGS) of alternating geological facies. The second method, multiple-point statistics, uses training images to estimate the conditional probability of sand-lenses at a certain location. Both methods respect field observations such as local stratigraphy, however, only the multiple-point statistics can...

  13. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

The article deals with data analysis in information systems. The author discusses the possibility of using Galois compliance to identify characteristics of the information system structure, and describes the specifics of applying Galois compliance to the analysis of information system content using invariants of graph theory. Aspects of introducing the mathematical apparatus of Galois compliance for studying the interrelations between elements of an adaptive training information system for individual testing are analyzed.

  14. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  15. Spatial continuity measures for probabilistic and deterministic geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Isaaks, E.H.; Srivastava, R.M.

    1988-05-01

Geostatistics has traditionally used a probabilistic framework, one in which expected values or ensemble averages are of primary importance. The less familiar deterministic framework views geostatistical problems in terms of spatial integrals. This paper outlines the two frameworks and examines the issue of which spatial continuity measure, the covariance C(h) or the variogram γ(h), is appropriate for each framework. Although C(h) and γ(h) were defined originally in terms of spatial integrals, the convenience of probabilistic notation made the expected value definitions more common. These now classical expected value definitions entail a linear relationship between C(h) and γ(h); the spatial integral definitions do not. In a probabilistic framework, where available sample information is extrapolated to domains other than the one which was sampled, the expected value definitions are appropriate; furthermore, within a probabilistic framework, reasons exist for preferring the variogram to the covariance function. In a deterministic framework, where available sample information is interpolated within the same domain, the spatial integral definitions are appropriate and no reasons are known for preferring the variogram. A case study on a Wiener-Levy process demonstrates differences between the two frameworks and shows that, for most estimation problems, the deterministic viewpoint is more appropriate. Several case studies on real data sets reveal that the sample covariance function reflects the character of spatial continuity better than the sample variogram. From both theoretical and practical considerations, clearly for most geostatistical problems, direct estimation of the covariance is better than the traditional variogram approach.
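The two continuity measures under discussion can be compared directly on a synthetic profile. This sketch applies the classical sample estimators to an invented random-walk-like series; it only illustrates the non-stationary behaviour the paper discusses and does not reproduce its case studies:

```python
import numpy as np

# Sample covariance C(h) vs. sample variogram gamma(h) on a 1-D series.
rng = np.random.default_rng(2)
z = np.cumsum(rng.normal(size=500)) * 0.1   # random-walk-like profile
z = z - z.mean()

def sample_cov(z, h):
    """C(h): covariance of all pairs separated by lag h."""
    head, tail = z[:-h], z[h:]
    return np.mean((head - head.mean()) * (tail - tail.mean()))

def sample_variogram(z, h):
    """gamma(h): half the mean squared increment at lag h."""
    return np.mean((z[h:] - z[:-h]) ** 2) / 2.0

for h in (1, 5, 20):
    print(h, round(sample_cov(z, h), 4), round(sample_variogram(z, h), 4))

# Under second-order stationarity the two are linked by
# gamma(h) = C(0) - C(h); for a non-stationary walk the variogram keeps
# growing with lag while the covariance estimate depends on the arbitrary
# sample mean -- the practical tension behind the paper's comparison.
```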

  16. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    International Nuclear Information System (INIS)

    Facies models try to explain facies architectures which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions; the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo) statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. A sequential indicator simulation (SIS) and a truncated Gaussian simulation (TGS) represented two-point geostatistical methods, and a single normal equation simulation (SNESIM) selected as an MPS simulation representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TI) in which all possible facies architectures that are present in the area are included. The TI model was founded on the data acquired from modern occurrences. These analogies delivered vital information about the possible channel geometries and facies classes that are typically present in those similar environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow. In this workflow, seismic-derived attributes were implemented as the input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data

  17. Applying real options analysis to assess cleaner energy development strategies

    International Nuclear Information System (INIS)

The energy industry, which accounts for the largest portion of CO2 emissions, is facing the issue of compliance with the national clean energy policy. The methodology for evaluating the energy mix policy is crucial because of the lead times embedded in power generation facility investments and the uncertainty of future electricity demand. In this paper, a modified binomial model based on sequential compound options, which accounts for the lead time and uncertainty as a whole, is established, and a numerical example evaluating the optional strategies and the strategic value of the cleaner energy policy is also presented. It is found that the optimal decision at some nodes in the binomial tree is path dependent, which differs from the standard sequential compound option model with a lead time or time lag concept. The proposed modified binomial sequential compound real options model can be generalized and extensively applied to solve general decision problems that deal with the long lead times of many government policies as well as capital-intensive investments. - Highlights: → Introducing a flexible strategic management approach for government policy making. → Developing a modified binomial real options model based on sequential compound options. → Proposing an innovative model for managing long-term policy with lead time. → Applying the model to evaluate options under various scenarios of cleaner energy strategies.

  18. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data of the Portuguese Plant Breeding Board: a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interaction (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.
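The AMMI decomposition itself is straightforward to sketch: fit additive main effects from row and column means, then take the SVD of the doubly centred residual (interaction) matrix. The data below are simulated, not the Portuguese oat yields, and the code is a generic illustration rather than the agricolae implementation:

```python
import numpy as np

# Minimal AMMI sketch: additive main effects plus multiplicative
# interaction terms from the SVD of the double-centred residuals.
rng = np.random.default_rng(3)
G, E = 6, 4                                  # genotypes x environments
Y = (rng.normal(size=(G, E)) +               # noise / interaction
     rng.normal(size=(G, 1)) * 2 +           # genotype main effects
     rng.normal(size=(1, E)) * 2)            # environment main effects

grand = Y.mean()
g_eff = Y.mean(axis=1, keepdims=True) - grand
e_eff = Y.mean(axis=0, keepdims=True) - grand
resid = Y - grand - g_eff - e_eff            # double-centred interaction matrix

U, s, Vt = np.linalg.svd(resid, full_matrices=False)
share = s[0] ** 2 / np.sum(s ** 2)           # interaction SS on the first IPCA axis

# AMMI1 reconstruction: main effects + first multiplicative term.
ammi1 = grand + g_eff + e_eff + s[0] * np.outer(U[:, 0], Vt[0, :])
print(round(float(share), 3), round(float(np.abs(Y - ammi1).mean()), 3))
```

Plotting the first columns of `U` and `Vt` against the main effects gives the familiar AMMI1 biplot used in stability evaluation.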

  19. The colour analysis method applied to homogeneous rocks

    Science.gov (United States)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  20. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; it is, in fact, a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to study basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  1. Applying an Activity System to Online Collaborative Group Work Analysis

    Science.gov (United States)

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  2. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    Science.gov (United States)

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  3. Applying Adult Learning Theory through a Character Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  4. Thermal Analysis Applied to Verapamil Hydrochloride Characterization in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    Maria Irene Yoshida

    2010-04-01

Thermogravimetry (TG) and differential scanning calorimetry (DSC) are useful techniques that have been successfully applied in the pharmaceutical industry to reveal important information regarding the physicochemical properties of drug and excipient molecules, such as polymorphism, stability, purity, and formulation compatibility, among others. Verapamil hydrochloride shows thermal stability up to 180 °C and melts at 146 °C, followed by total degradation. The drug is compatible with all the excipients evaluated. The drug showed degradation when subjected to oxidizing conditions, suggesting that the degradation product is 3,4-dimethoxybenzoic acid derived from alkyl side-chain oxidation. Verapamil hydrochloride does not present polymorphism under the conditions evaluated. Assessing the degradation kinetics, the drug had a shelf life (t90) of 56.7 years, and a pharmaceutical formulation showed a t90 of 6.8 years, demonstrating their high stability.
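Shelf-life figures of this kind typically follow from first-order degradation kinetics, C(t) = C0·exp(-kt), for which t90 = ln(1/0.9)/k ≈ 0.105/k. The sketch below simply back-calculates illustrative rate constants from the reported t90 values; the kinetic order is an assumption for illustration, not taken from the paper:

```python
import math

def t90(k_per_year):
    # Time for the content to fall to 90% under first-order kinetics:
    # 0.9 = exp(-k * t90)  =>  t90 = ln(1/0.9) / k.
    return math.log(1.0 / 0.9) / k_per_year

# Hypothetical rate constants back-calculated from the reported shelf lives.
k_drug = math.log(1.0 / 0.9) / 56.7          # per year, implies t90 = 56.7 y
k_formulation = math.log(1.0 / 0.9) / 6.8    # per year, implies t90 = 6.8 y
print(round(t90(k_drug), 1), round(t90(k_formulation), 1))
```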

  5. Geostatistical Characteristics of the Structure of Spatial Variation of Electrical Power in the National 110 KV Network Including Results of Variogram Model Components Filtering

    OpenAIRE

    Barbara Namysłowska-Wilczyńska; Artur Wilczyński

    2015-01-01

    The paper provides results of analysing the superficial variability of electrical power using two geostatistical methods – lognormal kriging and ordinary kriging. The research work was to provide detailed characterization and identification of the electrical load variability structure at nodes of a 110 kV network over the whole territory of Poland having been analyzed on the basis of results from kriging techniques applied. The paper proposes the methodology using two techniques of mode...

  6. Geostatistical modelling of carbon monoxide levels in Khartoum State (Sudan) - GIS pilot based study

    International Nuclear Information System (INIS)

The objective of this study is to develop a digital GIS model that can evaluate, predict and visualize carbon monoxide (CO) levels in Khartoum state. To achieve this aim, sample data were collected, processed and managed to generate a dynamic GIS model of CO levels in the study area. Parametric data collected from the field and the analyses carried out throughout this study show that CO emissions were lower than the allowable ambient air quality standards released by the National Environment Protection Council (NEPC-USA) for 1998. However, this pilot study found that CO emissions in Omdurman city were the highest. The study shows that GIS and geostatistical modeling can be used as a powerful tool to produce exposure maps. (authors)

  7. LAMQS analysis applied to ancient Egyptian bronze coins

    International Nuclear Information System (INIS)

Some Egyptian bronze coins, dated to the VI-VII centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations have been performed by using micro-invasive analysis, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins have been produced in two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  8. Nested sampling applied in Bayesian room-acoustics decay analysis.

    Science.gov (United States)

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
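
    The Bayesian evidence at the heart of this model-selection approach can be illustrated with a minimal nested-sampling loop (Skilling's algorithm). Everything below is an illustrative assumption rather than the paper's room-acoustics model: a 1-D uniform prior and a Gaussian toy likelihood are used because the evidence integral is then known in closed form, and the constrained prior is a simple interval that can be sampled directly.

```python
import math
import random

random.seed(7)

S = 0.05                       # toy likelihood width (assumption)
def like(x):                   # Gaussian likelihood centred at 0.5
    return math.exp(-(x - 0.5) ** 2 / (2 * S * S))

N = 300                        # number of live points
live = [(like(x), x) for x in (random.random() for _ in range(N))]
Z, x_prev = 0.0, 1.0           # accumulated evidence, remaining prior mass

for i in range(1, 50 * N):
    l_min, x_worst = min(live)             # worst live point sets the constraint
    x_i = math.exp(-i / N)                 # expected prior-mass shrinkage
    Z += (x_prev - x_i) * l_min            # evidence increment (X_{i-1} - X_i) * L
    x_prev = x_i
    # For this unimodal toy likelihood, the constrained prior {x : L(x) > l_min}
    # is just an interval around 0.5, so we can sample it directly.
    d = S * math.sqrt(-2.0 * math.log(l_min))
    x_new = random.uniform(max(0.0, 0.5 - d), min(1.0, 0.5 + d))
    live[live.index((l_min, x_worst))] = (like(x_new), x_new)
    if Z > 0 and x_i < 1e-4 * Z:           # remaining prior mass is negligible
        break

Z += x_prev * sum(l for l, _ in live) / N  # contribution of the final live set
print(round(Z, 4))                         # analytic value is about 0.1253
```

    In the paper's setting the same loop would run once per candidate decay order, with the order of highest evidence selected; here the estimate can simply be checked against the analytic integral.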

  9. LAMQS analysis applied to ancient Egyptian bronze coins

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: lorenzo.torrisi@unime.i [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caridi, F.; Giuffrida, L.; Torrisi, A. [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Mondio, G.; Serafino, T. [Dipartimento di Fisica della Materia ed Ingegneria Elettronica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caltabiano, M.; Castrizio, E.D. [Dipartimento di Lettere e Filosofia dell' Universita di Messina, Polo Universitario dell' Annunziata, 98168 Messina (Italy); Paniz, E.; Salici, A. [Carabinieri, Reparto Investigazioni Scientifiche, S.S. 114, Km. 6, 400 Tremestieri, Messina (Italy)

    2010-05-15

    Some Egyptian bronze coins, dated VI-VII sec A.D. are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analysis, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins have been produced in two different Egypt sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  10. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    Science.gov (United States)

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression Kriging (RK) and ordinary Kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models tended to fit better more often and worse less often. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their
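
    Ordinary kriging, one of the two interpolators compared here, predicts an unsampled location as a weighted average of the observations, with weights solved from a variogram-based linear system plus a Lagrange multiplier enforcing unbiasedness. The sketch below is a minimal 1-D illustration with a made-up exponential variogram, not the study's lichen model:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for the small kriging system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gamma(h, sill=1.0, rng=10.0):
    """Exponential variogram; sill and range are illustrative assumptions."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def ordinary_kriging(xs, zs, x0):
    n = len(xs)
    A = [[gamma(abs(xs[i] - xs[j])) for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])            # unbiasedness: weights sum to 1
    b = [gamma(abs(xi - x0)) for xi in xs] + [1.0]
    w = solve(A, b)
    est = sum(w[i] * zs[i] for i in range(n))
    var = sum(w[i] * b[i] for i in range(n)) + w[n]   # kriging variance
    return est, var

xs, zs = [0.0, 10.0, 14.0], [1.0, 3.0, 2.5]
print(ordinary_kriging(xs, zs, 0.0))  # exact interpolation at a datum: estimate 1.0, variance ~0
```

    Regression kriging follows the same pattern but first removes a trend fitted to auxiliary covariates (here, land-use data) and kriges the residuals.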


  12. Applying ABC analysis to the Navy's inventory management system

    OpenAIRE

    May, Benjamin

    2014-01-01

    Approved for public release; distribution is unlimited ABC Analysis is an inventory categorization technique used to classify and prioritize inventory items in an effort to better allocate business resources. A items are defined as the inventory items considered extremely important to the business, requiring strict oversight and control. B items are important to the business, but don’t require the tight controls and oversight required of the A items. C items are marginally important to the...
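
    The A/B/C split described above is typically driven by cumulative share of annual dollar volume. A minimal sketch; the 80%/95% cut-offs and the item data are conventional illustrations, not figures from the thesis:

```python
def abc_classify(items, a_cut=0.80, b_cut=0.95):
    """items: {name: annual dollar volume}. Returns {name: 'A' | 'B' | 'C'}."""
    total = sum(items.values())
    ranked = sorted(items, key=items.get, reverse=True)
    classes, running = {}, 0.0
    for name in ranked:
        running += items[name] / total     # cumulative share of dollar volume
        classes[name] = "A" if running <= a_cut else "B" if running <= b_cut else "C"
    return classes

demo = {"bearing": 7000, "gasket": 2000, "o-ring": 600, "washer": 300, "label": 100}
print(abc_classify(demo))  # bearing -> A, gasket -> B, the rest -> C
```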

  13. Consumers' Demand for Pork Quality: Applying Semantic Network Analysis

    OpenAIRE

    Carola Grebitus; Maike Bruhn

    2006-01-01

    Consideration of consumers' demand for food quality entails several aspects. Quality itself is a complex and dynamic concept, and constantly evolving technical progress may cause changes in consumers' judgment of quality. To improve our understanding of the factors influencing the demand for quality, food quality must be defined and measured from the consumer's perspective (Cardello, 1995). The present analysis addresses the issue of food quality, focusing on pork—the food that respondents ...

  14. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  15. Weighted gene coexpression network analysis strategies applied to mouse weight

    OpenAIRE

    Fuller, Tova F; Ghazalpour, Anatole; Aten, Jason E.; Drake, Thomas A; Lusis, Aldons J.; Horvath, Steve

    2007-01-01

    Systems-oriented genetic approaches that incorporate gene expression and genotype data are valuable in the quest for genetic regulatory loci underlying complex traits. Gene coexpression network analysis lends itself to identification of entire groups of differentially regulated genes—a highly relevant endeavor in finding the underpinnings of complex traits that are, by definition, polygenic in nature. Here we describe one such approach based on liver gene expression and genotype data from an ...

  16. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    In credit card scoring and loan management, prediction of the applicant’s future behavior is an important decision-support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecard modeling through textual data analysis. This study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world credit scoring and loan management situation. The sample contains a set of Arabic textual attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression is proposed and evaluated through comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding textual attribute analysis achieves higher classification effectiveness and outperforms the traditional numerical-data-only techniques.
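
    The core of such a model, bag-of-words features feeding a logistic regression, can be sketched in a few lines. Everything below (the tokens, labels and learning rate) is a toy assumption; the paper's Arabic pre-processing and real loan data are not reproduced:

```python
import math

docs = ["paid late missed payment", "late late arrears",
        "paid on time", "steady income on time"]
labels = [1, 1, 0, 0]           # 1 = defaulted (toy labels)

vocab = sorted({w for d in docs for w in d.split()})
X = [[d.split().count(w) for w in vocab] for d in docs]  # term-count features

w, b, lr = [0.0] * len(vocab), 0.0, 0.5

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(500):            # stochastic gradient descent on log-loss
    for xi, yi in zip(X, labels):
        p = sigmoid(sum(wi * f for wi, f in zip(w, xi)) + b)
        g = p - yi              # gradient of the log-loss w.r.t. the logit
        w = [wi - lr * g * f for wi, f in zip(w, xi)]
        b -= lr * g

preds = [int(sigmoid(sum(wi * f for wi, f in zip(w, xi)) + b) > 0.5) for xi in X]
print(preds)                    # → [1, 1, 0, 0]
```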

  17. The colour analysis method applied to homogeneous rocks

    Directory of Open Access Journals (Sweden)

    Halász Amadé

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation, which is the most suitable formation in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here can be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  18. Cross-covariance functions for multivariate geostatistics

    KAUST Repository

    Genton, Marc G.

    2015-05-01

    Continuously indexed datasets with multiple variables have become ubiquitous in the geophysical, ecological, environmental and climate sciences, and pose substantial analysis challenges to scientists and statisticians. For many years, scientists developed models that aimed at capturing the spatial behavior for an individual process; only within the last few decades has it become commonplace to model multiple processes jointly. The key difficulty is in specifying the cross-covariance function, that is, the function responsible for the relationship between distinct variables. Indeed, these cross-covariance functions must be chosen to be consistent with marginal covariance functions in such a way that the second-order structure always yields a nonnegative definite covariance matrix. We review the main approaches to building cross-covariance models, including the linear model of coregionalization, convolution methods, the multivariate Matérn and nonstationary and space-time extensions of these among others. We additionally cover specialized constructions, including those designed for asymmetry, compact support and spherical domains, with a review of physics-constrained models. We illustrate select models on a bivariate regional climate model output example for temperature and pressure, along with a bivariate minimum and maximum temperature observational dataset; we compare models by likelihood value as well as via cross-validation co-kriging studies. The article closes with a discussion of unsolved problems. © Institute of Mathematical Statistics, 2015.
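
    The consistency requirement described here, cross-covariances that always yield a nonnegative definite covariance matrix, is what the linear model of coregionalization (LMC) guarantees by construction: each coregionalization matrix must be positive semi-definite. A small self-check sketch; the matrices and correlation ranges below are invented for illustration:

```python
import math

def rho(h, rng):
    """Exponential correlation for one spatial structure."""
    return math.exp(-3.0 * h / rng)

# Coregionalization matrices: each 2x2 matrix must be PSD for a valid model
B = [((1.0, 0.6), (0.6, 0.8)),      # structure 1 (short range)
     ((0.5, -0.2), (-0.2, 0.4))]    # structure 2 (long range)
ranges = [5.0, 20.0]

def psd2(m):
    """2x2 PSD test: nonnegative diagonal and determinant."""
    (a, off), (_, c) = m
    return a >= 0 and c >= 0 and a * c - off * off >= 0

assert all(psd2(m) for m in B)

def C(h, i, j):
    """Bivariate (cross-)covariance implied by the LMC."""
    return sum(B[k][i][j] * rho(h, ranges[k]) for k in range(2))

# A Cauchy-Schwarz-type bound then holds at every lag
for h in [0.0, 1.0, 5.0, 25.0]:
    assert C(h, 0, 1) ** 2 <= C(h, 0, 0) * C(h, 1, 1) + 1e-12
print("valid LMC")
```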

  19. Geostatistics and cost-effective environmental remediation

    International Nuclear Information System (INIS)

    Numerous sites within the U.S. Department of Energy (DOE) complex have been contaminated with various radioactive and hazardous materials by defense-related activities during the post-World War II era. The perception is that characterization and remediation of these contaminated sites will be too costly using currently available technology. Consequently, the DOE Office of Technology Development has funded development of a number of alternative processes for characterizing and remediating these sites. The former Feed-Materials Processing Center near Fernald, Ohio (USA), was selected for demonstrating several innovative technologies. Contamination at the Fernald site consists principally of particulate uranium and derivative compounds in surficial soil. A field-characterization demonstration program was conducted during the summer of 1994 specifically to demonstrate the relative economic performance of seven proposed advanced-characterization tools for measuring uranium activity of in-situ soils. These innovative measurement technologies are principally radiation detectors of varied designs. Four industry-standard measurement technologies, including conventional, regulatory-agency-accepted soil sampling followed by laboratory geochemical analysis, were also demonstrated during the program for comparative purposes. A risk-based economic-decision model has been used to evaluate the performance of these alternative characterization tools. The decision model computes the dollar value of an objective function for each of the different characterization approaches. The methodology not only can assist site operators to choose among engineering alternatives for site characterization and/or remediation, but also can provide an objective and quantitative basis for decisions with respect to the completeness of site characterization

  20. A Multifactorial Analysis of Reconstruction Methods Applied After Total Gastrectomy

    Directory of Open Access Journals (Sweden)

    Oktay Büyükaşık

    2010-12-01

    Aim: The aim of this study was to evaluate the reconstruction methods applied after total gastrectomy in terms of postoperative symptomatology and nutrition. Methods: This retrospective study was conducted on 31 patients who underwent total gastrectomy due to gastric cancer in the 2nd Clinic of General Surgery, SSK Ankara Training Hospital. Six different reconstruction methods were used and analyzed in terms of age, sex and postoperative complications. One biopsy specimen from the esophagus and two from the jejunum were taken through upper gastrointestinal endoscopy from all cases, and late-period morphological and microbiological changes were examined. Postoperative weight change, dumping symptoms, reflux esophagitis, solid/liquid dysphagia, early satiety, postprandial pain, diarrhea and anorexia were assessed. Results: Of the 31 patients, 18 were males and 13 females; the youngest was 33 years old and the oldest 69. Reconstruction without a pouch was performed in 22 cases and with a pouch in 9 cases. Early satiety, postprandial pain, dumping symptoms, diarrhea and anemia were found most commonly in cases with reconstruction without a pouch. The rate of bacterial colonization of the jejunal mucosa was identical in both groups. Reflux esophagitis was seen most commonly with omega esophagojejunostomy (EJ) and least commonly with Roux-en-Y, Tooley and Tanner 19 EJ. Conclusion: Reconstruction with a pouch after total gastrectomy remains a preferable method. (The Medical Bulletin of Haseki 2010; 48:126-31)

  1. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. Development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  2. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  3. Applying temporal network analysis to the venture capital market

    Science.gov (United States)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
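
    The basic bookkeeping behind such an analysis, slicing timestamped investment ties into snapshots and tracking a centrality measure across them, can be sketched as follows. The events below are invented toy data, not the Chinese venture-capital dataset used in the paper:

```python
from collections import defaultdict

# Toy timestamped investment events: (year, VC firm, portfolio company)
events = [
    (2010, "VC_A", "Co1"), (2010, "VC_B", "Co1"),
    (2011, "VC_A", "Co2"), (2011, "VC_C", "Co2"), (2011, "VC_A", "Co3"),
    (2012, "VC_B", "Co3"), (2012, "VC_C", "Co3"),
]

def degree_by_window(events):
    """Degree of each VC firm in the bipartite snapshot of each year."""
    snap = defaultdict(lambda: defaultdict(set))
    for t, vc, co in events:
        snap[t][vc].add(co)
    return {t: {vc: len(cos) for vc, cos in firms.items()}
            for t, firms in sorted(snap.items())}

print(degree_by_window(events))
# VC_A's degree rises from 1 (2010) to 2 (2011) -- a change in centrality
# that is only visible when the time dimension is kept, which is the point
# of a temporal-network analysis.
```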

  4. Risk Analysis of the applied RFID system : Project Stolpen

    OpenAIRE

    Grunzke, Richard

    2007-01-01

    This thesis is a risk analysis of an RFID system for a logistical application. The system works as follows: around Karlstad in Sweden there are three new weighing machines for lorries. The load weight is measured so that the police can control overweight, and for logistical reasons such as issuing invoices and optimising the supply chain. The lorries do not have to stop to be weighed; they only have to drive slowly over the weighing machine, so the loss of time is minimal. The lorries will be i...

  5. Methods of economic analysis applied to fusion research. Final report

    International Nuclear Information System (INIS)

    In this and previous efforts ECON has provided economic assessment of a fusion research program. This phase of study focused on two tasks, the first concerned with the economics of fusion in an economy that relies heavily upon synthetic fuels, and the second concerned with the overall economic effects of pursuing soft energy technologies instead of hard technologies. This report is organized in two parts, the first entitled An Economic Analysis of Coproduction of Fusion-Electric Energy and Other Products, and the second entitled Arguments Associated with the Choice of Potential Energy Futures

  6. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
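
    Of the tests listed, the two-sample t statistic is easy to sketch. The survey scores below are invented, and Welch's unequal-variance form is used as one common variant, not necessarily the exact one applied in the paper:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and its approximate degrees of freedom."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se2a, se2b = va / len(a), vb / len(b)
    t = (ma - mb) / math.sqrt(se2a + se2b)
    df = (se2a + se2b) ** 2 / (se2a ** 2 / (len(a) - 1) + se2b ** 2 / (len(b) - 1))
    return t, df

group1 = [4, 5, 3, 4, 5]   # toy safety-culture scores from one group
group2 = [2, 3, 3, 2, 4]
t, df = welch_t(group1, group2)
print(round(t, 3), round(df, 1))  # → 2.646 8.0
```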

  7. Status and Role of Geostatistics in the Education of Geographic Information System

    Institute of Scientific and Technical Information of China (English)

    魏义长; 赵东保; 李小根; 张富; 杨成杰; 姚志宏

    2011-01-01

    Geostatistical analysis is one of the most important techniques in spatial analysis, yet many students majoring in geographic information systems (GIS) are unfamiliar with it or attach little importance to it. To promote GIS students' understanding of geostatistics and to advance the development of the discipline, the authors draw on years of experience teaching and researching geostatistics and on a broad review of the domestic and international literature. Starting from the differences and connections between geostatistics, classical statistics and GIS, the paper analyzes in detail the status and role of geostatistics in GIS education, discusses methods for learning geostatistics, and concludes with an outlook on its future development. The authors expect the paper to increase recognition of geostatistics among GIS students.

  8. Regional flow duration curves: Geostatistical techniques versus multivariate regression

    Science.gov (United States)

    Pugliese, Alessio; Farmer, William H.; Castellarin, Attilio; Archfield, Stacey A.; Vogel, Richard M.

    2016-01-01

    A period-of-record flow duration curve (FDC) represents the relationship between the magnitude and frequency of daily streamflows. Prediction of FDCs is of great importance for locations characterized by sparse or missing streamflow observations. We present a detailed comparison of two methods which are capable of predicting an FDC at ungauged basins: (1) an adaptation of the geostatistical method, Top-kriging, employing a linear weighted average of dimensionless empirical FDCs, standardised with a reference streamflow value; and (2) regional multiple linear regression of streamflow quantiles, perhaps the most common method for the prediction of FDCs at ungauged sites. In particular, Top-kriging relies on a metric for expressing the similarity between catchments computed as the negative deviation of the FDC from a reference streamflow value, which we termed total negative deviation (TND). Comparisons of these two methods are made in 182 largely unregulated river catchments in the southeastern U.S. using a three-fold cross-validation algorithm. Our results reveal that the two methods perform similarly across flow regimes, with average Nash-Sutcliffe Efficiencies of 0.566 and 0.662 (0.883 and 0.829 on log-transformed quantiles) for the geostatistical and the linear regression models, respectively. Differences in the reproduction of FDCs occurred mostly for low flows with exceedance probability (i.e. duration) above 0.98.
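
    The Nash-Sutcliffe efficiency used to score both methods is one minus the ratio of the model error variance to the variance of the observations. A minimal sketch; the streamflow values are invented:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 for a perfect model, 0 for the obs mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))   # model error
    den = sum((o - mean_obs) ** 2 for o in obs)          # variance of obs
    return 1.0 - num / den

obs = [12.0, 8.0, 5.0, 3.0, 2.0]           # toy daily streamflow quantiles
print(nse(obs, obs))                        # → 1.0 (perfect prediction)
print(nse(obs, [sum(obs) / len(obs)] * 5))  # → 0.0 (no better than the mean)
```

    Anything below zero would mean the prediction is worse than simply using the observed mean, which is why NSE is a natural benchmark for comparing interpolators at ungauged sites.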

  9. Dynamical systems analysis applied to working memory data.

    Science.gov (United States)

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of 2 years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and different time-delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
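
    The "negative frequency parameter" refers to a second-order model of the form x'' = ζx' + ηx, which oscillates when η is negative. A sketch simulating the undamped case and recovering the period 2π/√(-η); the parameter values, step size and integrator are illustrative choices, not the paper's estimates:

```python
import math

eta, zeta = -4.0, 0.0        # frequency and damping parameters (assumed)
dt, x, v = 0.001, 1.0, 0.0   # step size and initial state

crossings, t = [], 0.0
for _ in range(int(10.0 / dt)):
    x_old = x
    v += (zeta * v + eta * x) * dt   # semi-implicit Euler step
    x += v * dt
    t += dt
    if x_old < 0.0 <= x:             # record upward zero crossings
        crossings.append(t)

period = crossings[1] - crossings[0]
print(round(period, 3))              # analytic period: 2*pi/sqrt(4) = pi ≈ 3.142
```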

  10. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques had rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view, so separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution in concentrations above their critical micellar concentration, so micelles form and undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
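
    The differential partitioning described above is commonly quantified by the MEKC retention factor, which corrects the familiar chromatographic expression for the finite migration window between the EOF marker time t0 and the micelle marker time tmc. A sketch with invented migration times:

```python
def mekc_retention_factor(t_r, t_0, t_mc):
    """Retention factor in MEKC; reduces to the ordinary chromatographic
    k = (t_r - t_0) / t_0 as the micelle time t_mc goes to infinity."""
    return (t_r - t_0) / (t_0 * (1.0 - t_r / t_mc))

# Invented migration times (minutes): EOF marker, analyte, micelle marker
print(round(mekc_retention_factor(t_r=2.0, t_0=1.0, t_mc=5.0), 3))  # → 1.667
```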

  11. Dynamical Systems Analysis Applied to Working Memory Data

    Directory of Open Access Journals (Sweden)

    Fidan eGasimova

    2014-07-01

    In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of two years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and different time-delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.

  12. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way around this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or less than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
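
    Two of the most common Downside Risk indicators, the downside deviation and the Sortino ratio built from it, can be sketched directly from the good-versus-bad-returns idea above. The monthly returns are toy numbers, not index data from the paper:

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the investor's target return."""
    shortfalls = [min(0.0, r - target) for r in returns]
    return math.sqrt(sum(s * s for s in shortfalls) / len(returns))

def sortino(returns, target=0.0):
    """Sharpe-like ratio that only penalizes 'bad' (below-target) returns."""
    mean_excess = sum(returns) / len(returns) - target
    return mean_excess / downside_deviation(returns, target)

monthly = [0.10, -0.05, 0.02, -0.01]          # toy monthly returns
print(round(downside_deviation(monthly), 4))  # → 0.0255
print(round(sortino(monthly), 2))             # → 0.59
```

    Unlike the ordinary standard deviation, the 10% gain in the sample contributes nothing to the risk measure: only the two below-target months are penalized.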

  13. Applying Skinner's analysis of verbal behavior to persons with dementia.

    Science.gov (United States)

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

    Skinner's (1957) analysis of verbal behavior has demonstrated a fair amount of utility for teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly persons suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only the acquisition of language but also the ability to recall items or objects that appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal-response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented. PMID:21292058

  14. Sensitivity Analysis Applied in Design of Low Energy Office Building

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik

    2008-01-01

    Building performance can be expressed by different indicators such as primary energy use, environmental load and/or indoor environmental quality, and a building performance simulation can provide the decision maker with a quantitative measure of the extent to which an integrated design solution...... satisfies the design requirements and objectives. In the design of sustainable buildings it is beneficial to identify the most important design parameters in order to more efficiently develop alternative design solutions or reach optimized design solutions. A sensitivity analysis makes it possible...... to identify the most important parameters in relation to building performance and to focus the design and optimization of sustainable buildings on these fewer, but most important, parameters. The sensitivity analyses will typically be performed at a reasonably early stage of the building design process, where...

  15. Applying importance-performance analysis to evaluate banking service quality

    Directory of Open Access Journals (Sweden)

    André Luís Policani Freitas

    2012-11-01

    Full Text Available In an increasingly competitive market, identifying the most important aspects of service quality and measuring it as perceived by customers are important actions for organizations seeking competitive advantage. This scenario is typical of the Brazilian banking sector. In this context, this article presents an exploratory case study in which Importance-Performance Analysis (IPA) was used to identify the strong and weak points of the services provided by a bank. Cronbach's alpha and correlation analyses were used to check the reliability of the questionnaire. The results are presented and some actions are defined in order to improve the quality of services.
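
    The IPA step described above reduces to placing each attribute in a quadrant defined by importance and performance cut-offs. A minimal illustration with invented survey means (the attribute names, scores, and the grand-mean crosshair are assumptions):

```python
def ipa_quadrant(importance, performance, imp_cut, perf_cut):
    """Classify an attribute into the classic IPA quadrants."""
    if importance >= imp_cut:
        return "Keep up the good work" if performance >= perf_cut else "Concentrate here"
    return "Possible overkill" if performance >= perf_cut else "Low priority"

# hypothetical mean importance/performance scores from a customer survey (1-5)
items = {
    "queue waiting time": (4.6, 2.8),
    "ATM availability":   (4.1, 4.3),
    "branch decor":       (2.5, 4.0),
    "product leaflets":   (2.2, 2.1),
}

# grand means as quadrant cut-offs (one common choice of crosshair)
imp_cut = sum(i for i, _ in items.values()) / len(items)
perf_cut = sum(p for _, p in items.values()) / len(items)

result = {name: ipa_quadrant(i, p, imp_cut, perf_cut)
          for name, (i, p) in items.items()}
```

    High-importance, low-performance attributes ("Concentrate here") are the weak points the bank would prioritize.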

  16. Applying Hybrid-Quantity Analysis in the Asia Semiconductor Industry

    Directory of Open Access Journals (Sweden)

    Chin-Yuan Fan

    2013-08-01

    Full Text Available The semiconductor market has gradually transitioned from advanced countries to the Asia-Pacific region. Since the 1980s, Taiwan has been developing its own semiconductor industry, and after 20 years of effort has become one of the world's major exporters of semiconductor products. Positioning Taiwan in relation to other countries for competitive advantage, as defined by technology and industrial development, therefore requires a better understanding of the developmental trends in semiconductor technology of the major competing countries in the Asia-Pacific region. This can further provide the Taiwanese government with additional strategic development proposals. We used a combination of patent data and data-mining methods [multidimensional scaling (MDS) analysis and K-means clustering] to explore competing technological and strategic-group relationships within the semiconductor industry in the Asia-Pacific region. We assessed the relative technological advantages of various organizations and proposed additional technology development strategy recommendations to the Taiwanese semiconductor industry.
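
    The clustering half of such a method can be sketched with a plain k-means pass over (hypothetical) 2-D MDS coordinates of firms; the coordinates, firm count, and k below are invented for illustration:

```python
import random

random.seed(1)

def kmeans(points, k, iters=20):
    """Plain k-means (Lloyd's algorithm) for grouping organizations
    by their low-dimensional patent-indicator coordinates."""
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute centers as group means; keep old center if group is empty
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# hypothetical 2-D MDS coordinates of eight semiconductor organizations
pts = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25), (0.90, 0.80),
       (1.00, 0.90), (0.95, 0.85), (0.50, 0.50), (0.55, 0.45)]
centers, groups = kmeans(pts, 3)
```

    The resulting groups play the role of strategic groups; in the study they would be interpreted against the MDS axes derived from patent statistics.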

  17. Testing geostatistical methods to combine radar and rain gauges for precipitation mapping in a mountainous region

    Science.gov (United States)

    Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.

    2010-09-01

    There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point locations but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED), using radar as the drift variable, and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processing for visibility, ground clutter and beam shielding, and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps, with their inherent challenges, at daily and hourly time resolution.
The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at
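
    The OKRE idea, kriging the gauge-minus-radar error field and adding it back to the radar estimate, can be sketched with a hand-rolled ordinary kriging on toy data (the exponential variogram parameters, station layout, and error values below are assumptions, not the Swiss setup):

```python
import math

def gamma(h, nugget=0.0, sill=1.0, rng=30.0):
    """Exponential semivariogram with assumed parameters (distances in km)."""
    return nugget + sill * (1.0 - math.exp(-h / rng)) if h > 0 else 0.0

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b_ for a, b_ in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ordinary_kriging(pts, vals, x0):
    """Ordinary kriging estimate at x0 with a sum-to-one weight constraint."""
    n = len(pts)
    d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    A = [[gamma(d(pts[i], pts[j])) for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])       # Lagrange multiplier row
    b = [gamma(d(p, x0)) for p in pts] + [1.0]
    w = solve(A, b)[:n]
    return sum(wi * vi for wi, vi in zip(w, vals))

# gauge-minus-radar errors (mm) at three stations ~25 km apart
stations = [(0.0, 0.0), (25.0, 0.0), (0.0, 25.0)]
errors = [1.2, -0.4, 0.8]
corr = ordinary_kriging(stations, errors, (10.0, 10.0))
radar_estimate = 5.0                   # hypothetical radar value at the pixel
qpe = radar_estimate + corr            # radar corrected by the kriged error
```

    KED differs in that the radar field enters the kriging system as a drift (trend) variable rather than being corrected additively afterwards.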

  18. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

    Full Text Available BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally
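
    The historic-trend part of such an analysis is, at its core, an ordinary least-squares slope per region. A minimal sketch on a synthetic anomaly series (the series and the warming rate are invented):

```python
def linear_trend(years, values):
    """OLS slope of an annual climate series (units of `values` per year)."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    return num / sum((y - my) ** 2 for y in years)

# synthetic annual temperature anomalies warming at 0.02 degC/year, 1951-2002
years = list(range(1951, 2003))
anomalies = [0.02 * (y - 1951) for y in years]
trend = linear_trend(years, anomalies)  # degC per year
```

    Mapping such slopes over grid cells or countries, and differencing GCM projections against a baseline period for the "departures", is essentially what the tool automates.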

  19. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    Science.gov (United States)

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic, in which KIc was measured fractographically using controlled-flaw beam bending techniques, and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16 mm × 4 mm × 2 mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5 mm/min. Weibull probability curves (95% confidence bounds) were calculated and a contour plot of the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except that ZCA showed pores and defects. Fracture toughness and flexural strength values were not different among the groups except for ZCA. The characteristic strength (pzirconia polycrystalline ceramics. PMID:26722988
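
    Fractographic estimation of KIc rests on the standard relation KIc = Y·σf·√c between the failure stress and the critical flaw size measured on the fracture surface. A sketch with invented numbers (the geometry factor Y and the sample values are assumptions, not the study's measurements):

```python
import math

def fracture_toughness(sigma_f, c, Y=1.3):
    """KIc (MPa·m^0.5) from failure stress sigma_f (MPa) and critical flaw
    size c (m); Y is a geometry factor (value assumed here for a
    semi-elliptical surface flaw)."""
    return Y * sigma_f * math.sqrt(c)

# hypothetical Y-TZP specimen: 900 MPa failure stress, 20 µm critical flaw
k_ic = fracture_toughness(900.0, 20e-6)
```

    In practice c is taken from SEM measurements of the flaw's depth and width, and Y depends on the flaw shape and location, so the per-specimen values carry substantial uncertainty.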

  20. Ion Beam Analysis applied to laser-generated plasmas

    Science.gov (United States)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface. Different substrates are implanted with ions accelerated from plasma by a terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  1. Quantitative methods applied in the analysis of teenagers' problems

    Directory of Open Access Journals (Sweden)

    Constanţa Popescu

    2015-12-01

    Full Text Available The theme of this article is the study of teenagers' problems using quantitative methods. The scientific approach is divided into two parts: a knowledge part and a practical part. In the first part we describe the problems of adolescents based on the national and international literature; in the second part we use quantitative methods (diagnosis, regression and survey) to achieve an in-depth analysis of the topic. Through the diagnosis we highlight changes in the number of adolescents and in their problems: poverty and delinquency. Regression functions are used to show the nature, direction and intensity of the relationship between a number of causal variables and the outcome variable. The survey aims to identify the extent to which the cultural values of the country leave their mark on the perception of the importance of family and friends for teens. The main conclusion of the research is that, despite the decrease in the number of Romanian teenagers, their problems still persist.

  2. Ion Beam Analysis applied to laser-generated plasmas

    International Nuclear Information System (INIS)

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface. Different substrates are implanted with ions accelerated from plasma by a terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  3. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  4. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  5. Identifying spatial heterogeneity of coal resource quality in a multilayer coal deposit by multivariate geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Heriawan, Mohamad Nur [Graduate School of Science and Technology, Kumamoto University, Kurokami 2-39-1, Kumamoto 860-8555 (Japan); Earth Resources Exploration Research Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung 40132 (Indonesia); Koike, Katsuaki [Department of Life and Environmental Science, Graduate School of Science and Technology, Kumamoto University, Kurokami 2-39-1, Kumamoto 860-8555 (Japan)

    2008-02-01

    This study presents a new geostatistical approach to characterizing the geometry and quality of a multilayer coal deposit, using seam thickness as a geometric property and the contents of ash, sodium and total sulphur and the heating value as quality properties. A coal deposit in East Kalimantan (Borneo), Indonesia, which has a synclinal geological structure, was chosen as the study site. Semivariogram analysis clarified the strong dependence of heating value on ash content in the top and bottom parts of each seam, and the existence of a strong correlation with sodium content over the sub-seams at the same location. The correlations between the geometry and quality of the seams were generally weak. A linear coregionalization model was used to derive the spatial correlation coefficients of two variables at each scale component from the single- and cross-semivariogram matrices. Because the data were correlated spatially within the same seam or over different seams, multivariate techniques (ordinary cokriging and factorial cokriging) were mainly used, and the resulting spatial estimates were compared to those derived using a univariate technique (ordinary kriging). Factorial cokriging was effective in decomposing the spatial correlation structures at different scales. Another important characteristic is that the sodium content shows distinct segregation: the low zones are concentrated near the boundary of the sedimentary basin, while the high zones are concentrated in the central part. The main component of sodium originates from the abundance of saline water. Therefore, it can be inferred that seawater had stronger effects on the coal depositional process in the central basin than in the border part. The geostatistical modeling results suggest that the thicknesses of all the major seams were controlled by the syncline structure, while the coal quality chiefly originated from the coal depositional and diagenetic processes. (author)
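
    The cross-semivariogram that underpins the coregionalization analysis can be computed empirically by averaging products of increments of the two variables over point pairs at a given lag. A toy 1-D sketch (the coordinates and the perfect ash/heating-value anti-correlation are invented to make the sign of the result obvious):

```python
def cross_semivariogram(coords, z1, z2, lags, tol=0.5):
    """Empirical cross-semivariogram of two co-located variables:
    gamma_12(h) = (1 / 2N(h)) * sum over pairs at lag h of
    (z1_i - z1_j)(z2_i - z2_j)."""
    out = []
    n = len(coords)
    for lag in lags:
        acc, cnt = 0.0, 0
        for i in range(n):
            for j in range(i + 1, n):
                h = abs(coords[i] - coords[j])
                if abs(h - lag) <= tol:
                    acc += (z1[i] - z1[j]) * (z2[i] - z2[j])
                    cnt += 1
        out.append(acc / (2 * cnt) if cnt else float("nan"))
    return out

# synthetic 1-D borehole transect: heating value perfectly anti-correlated
# with ash content (a caricature of the dependence noted in the study)
x = [float(i) for i in range(20)]
ash = [(i % 5) * 2.0 for i in range(20)]
heat = [-a for a in ash]
gammas = cross_semivariogram(x, ash, heat, lags=[1.0, 2.0])
```

    A negative cross-semivariogram at all lags, as here, is the spatial signature of the inverse ash/heating-value relationship; a linear coregionalization model is then fitted jointly to the direct and cross semivariograms.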

  6. Application of geostatistical inversion to thin reservoir prediction

    Institute of Scientific and Technical Information of China (English)

    王香文; 刘红; 滕彬彬; 王连雨

    2012-01-01

    Taking the M1 thin reservoir in the H-N oilfield, southern Ecuador, as an example, this paper documents the challenges and problems of thin reservoir prediction and presents relevant techniques and methods to tackle them. Based on an analysis of the geophysical characteristics of the reservoirs and surrounding rocks, a geostatistical inversion technique is applied to identify thin (1-25 ft) reservoirs with rapid lateral changes and strong concealment. The sand distribution is refined through correlation among different data volumes, including seismic interpretation, CSSI (Constrained Sparse Spike Inversion) and geostatistical inversion, and is further checked against non-well, random-well and newly drilled well data. The accuracy of thin reservoir prediction is greatly enhanced, with a vertical resolution of up to 5 ft. The technique has been successfully applied in the H-N oilfield, and new drilling data show that all the predicted thin sand layers were encountered, with a drilling coincidence rate of 82%.

  7. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    Science.gov (United States)

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  8. Aplicação de métodos geoestatísticos para identificação de dependência espacial na análise de dados de um ensaio de espaçamento florestal em delineamento sistemático tipo leque Application of geostatistical methods to identify spatial dependence in the data analysis of a forest spacing experiment with a fan systematic design

    Directory of Open Access Journals (Sweden)

    Melissa Oda-Souza

    2008-06-01

    arrangement (non-randomized) of the plants and the high sensitivity to missing values. The aim of this work was to describe the geostatistical model and associated inference methods in the context of analyzing non-randomized experiments, reporting applied results to identify spatial dependence in a fan systematic design of Eucalyptus dunnii. Furthermore, different alternatives for treating missing values arising from failures and/or mortality of plants were proposed, analyzed and compared. Data were analyzed by three models that differed, through covariates, in how missing data were modeled. A semivariogram was built for each model, fitting three correlation function models, with parameters estimated by maximum likelihood and selected by Akaike's criterion. These models, with and without the spatial component, were compared by the likelihood ratio test. The results showed that: (1) the covariates interacted positively with the response variable, avoiding discarding data; (2) the model comparison, with and without the spatial component, did not confirm the existence of dependence; (3) the incorporation of the spatial dependence structure into the observational models recovered the capacity to make valid inferences in the absence of randomization, overcoming operational problems and guaranteeing that the data can be subjected to classical analysis.

  9. Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data

    Science.gov (United States)

    Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.

    2015-12-01

    For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, in which the rock mass has been fragmented by past structural deformation. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. The approach applies conditional probability methods to transform seismic velocities into directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated with geophysical survey data in a co-kriging approach. Applied to a real tunnel project, the method shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.

  10. High-performance computational and geostatistical experiments for testing the capabilities of 3-d electrical tomography

    Energy Technology Data Exchange (ETDEWEB)

    Carle, S. F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Daily, W. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Newmark, R. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ramirez, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Tompson, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    1999-01-19

    This project explores the feasibility of combining geologic insight, geostatistics, and high-performance computing to analyze the capabilities of 3-D electrical resistance tomography (ERT). Geostatistical methods are used to characterize the spatial variability of the geologic facies that control sub-surface variability of permeability and electrical resistivity. Synthetic ERT data sets are generated from geostatistical realizations of alluvial facies architecture. The synthetic data sets enable comparison of the "truth" to inversion results, quantification of the ability to detect particular facies at particular locations, and sensitivity studies on inversion parameters.

  11. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  12. Bayesian geostatistics in health cartography: the perspective of malaria.

    Science.gov (United States)

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
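
    The conversion from a sample of maps to a prediction of a map feature can be sketched directly: summarize each sampled map, then summarize across samples. Here the posterior map samples are faked with independent Beta draws purely for illustration (real BG samples would be spatially correlated draws from the posterior):

```python
import random

random.seed(42)

# stand-in for BG output: 1,000 posterior sample maps of parasite prevalence
# over a small 4-pixel region, each pixel drawn from Beta(2, 8)
samples = [[random.betavariate(2, 8) for _ in range(4)] for _ in range(1000)]

# turn each sampled map into the feature of interest: the regional average
regional_means = sorted(sum(m) / len(m) for m in samples)

# point prediction and a 95% credible interval across the sample of maps
point_estimate = sum(regional_means) / len(regional_means)
ci95 = (regional_means[25], regional_means[975])
```

    Because every map in the sample contributes, the interval width reflects how well the data pin down the regional average, which is the "appropriate level of predictive precision" the abstract refers to.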

  13. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg;

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical...... framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics are most developed. The theory is summarized and particulate surface...... using the global information. Then methods for choosing a proper sampling area for a single sample of dust on a table are given. The global contamination of an object is determined by a maximum likelihood estimator. Finally, it is shown how specified experimental goals can be included to determine...

  14. Principal Component Geostatistical Approach for large-dimensional inverse problems

    Science.gov (United States)

    Kitanidis, P. K.; Lee, J.

    2014-07-01

    The quasi-linear geostatistical approach is for weakly nonlinear underdetermined inverse problems, such as hydraulic tomography and electrical resistivity tomography. It provides best estimates as well as measures for uncertainty quantification. However, in its textbook implementation, the approach involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknowns. Although there are elegant methods for determining the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce it. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free in terms of the Jacobian and improves the scalability of the geostatistical inverse problem. Each iteration requires K runs of the forward problem, where K is not just much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best.
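
    The matrix-free ingredient is that a Gauss-Newton step only ever needs Jacobian-vector products, each of which one extra forward-model run can supply, so the full Jacobian is never assembled. A finite-difference sketch (the toy forward model is invented; the actual PCGA additionally uses principal components of the prior covariance, not shown here):

```python
def jvp(forward, x, v, eps=1e-6):
    """Matrix-free Jacobian-vector product J(x) v via a one-sided
    finite difference: (f(x + eps*v) - f(x)) / eps. Each product costs
    one forward run instead of m derivative evaluations."""
    fx = forward(x)
    fxe = forward([xi + eps * vi for xi, vi in zip(x, v)])
    return [(a - b) / eps for a, b in zip(fxe, fx)]

# toy nonlinear observation function f: R^2 -> R^2
forward = lambda x: [x[0] ** 2 + x[1], 3 * x[1]]

# Jacobian at (1, 2) is [[2, 1], [0, 3]], so J @ (1, 0) = (2, 0)
jv = jvp(forward, [1.0, 2.0], [1.0, 0.0])
```

    In the PCGA the vectors v are the leading principal components of the prior covariance, which is why only K forward runs per iteration are needed.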

  15. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will move through the practical introduction and on to signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  16. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    Science.gov (United States)

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teachers' knowledge of and attitudes toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA), and their use of ABA. Furthermore, this study examined whether knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  17. Geographical distribution of the annual mean radon concentrations in primary schools of Southern Serbia – application of geostatistical methods

    International Nuclear Information System (INIS)

    Between 2008 and 2011 a survey of radon (222Rn) was performed in schools of several districts of Southern Serbia. Some results have been published previously (Žunić et al., 2010; Carpentieri et al., 2011; Žunić et al., 2013). This article concentrates on the geographical distribution of the measured Rn concentrations. Applying geostatistical methods, we generate “school radon maps” of expected concentrations and of estimated probabilities that a concentration threshold is exceeded. The resulting maps show a clearly structured spatial pattern which appears related to the geological background. In particular, in areas with vulcanite and granitoid rocks, elevated radon (Rn) concentrations can be expected. The “school radon map” can therefore be considered a proxy for a map of the geogenic radon potential, and allows identification of radon-prone areas, i.e. areas in which higher Rn concentrations can be expected for natural reasons. It must be stressed that the “radon hazard”, or potential risk, estimated this way has to be distinguished from the actual radon risk, which is a function of exposure. This in turn may require (depending on the target variable which is supposed to measure risk) considering demographic and sociological reality, i.e. population density, distribution of building styles and living habits. -- Highlights: • A map of Rn concentrations in primary schools of Southern Serbia. • Application of geostatistical methods. • Correlation with geology found. • Can serve as proxy to identify radon prone areas
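
    Exceedance-probability maps of the kind described above are typically built by combining the kriging estimate at each location with its kriging standard deviation under a Gaussian error model. A minimal sketch of that single step, with made-up numbers rather than the study's data:

    ```python
    import math

    def exceedance_prob(z_hat, sigma, threshold):
        """P[Z > threshold] when the estimation error is modelled as N(z_hat, sigma^2)."""
        return 0.5 * math.erfc((threshold - z_hat) / (sigma * math.sqrt(2.0)))

    # Hypothetical kriged radon estimate of 250 Bq/m3 with a kriging standard
    # deviation of 120 Bq/m3, checked against a 300 Bq/m3 reference level
    p = exceedance_prob(250.0, 120.0, 300.0)
    print(round(p, 2))  # 0.34
    ```

    Evaluating this at every grid node turns a pair of kriging maps (estimate and standard deviation) into a probability-of-exceedance map.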

  18. Geostatistic models conditioning to production data; Condicionamento de modelos geoestatisticos a dados de producao

    Energy Technology Data Exchange (ETDEWEB)

    Cunha, Luciane B.; Rodrigues, Jose Roberto P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas

    2000-07-01

    Geostatistical modeling of reservoir heterogeneity is now widely used by geologists/engineers to fill in reservoir simulation grids. These geostatistical models are made of facies or geologic objects and are built by simulation algorithms that reflect the main statistics of the geology of deposits. Integration of dynamic data together with the geology enhances the quality of the geostatistical modeling and provides the reservoir engineers with a better basis for reservoir simulation and management. The uncertainty of simulated production scenarios is then reduced, allowing more realistic economic evaluation. The present paper deals with one of the methodologies most often encountered in the literature for incorporating production data into geostatistical models of reservoir heterogeneities. Both the limitations and the potential benefits of this method are highlighted. Finally, some examples are discussed. (author)

  19. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida Vedel;

    2012-01-01

    Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty......, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box...... flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes...

  20. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John;

    of both concentration and groundwater flow. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across...... a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners...... compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, ii) measurement uncertainty, and iii...

  1. Geostatistical methods for the integrated information; Metodos geoestadisticos para la integracion de informacion

    Energy Technology Data Exchange (ETDEWEB)

    Cassiraga, E.F.; Gomez-Hernandez, J.J. [Departamento de Ingenieria Hidraulica y Medio Ambiente, Universidad Politecnica de Valencia, Valencia (Spain)

    1996-10-01

    The main objective of this report is to describe the different geostatistical techniques for integrating geophysical and hydrological parameters. We analyze the characteristics of the estimation methods used in other studies.

  2. Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density EEG

    CERN Document Server

    Wingeier, B M; Silberstein, R B; Wingeier, Brett M.; Nunez, Paul L.; Silberstein, Richard B.

    2001-01-01

    We demonstrate an application of spherical harmonic decomposition to analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to analysis of hemispherical, irregularly sampled data. Performance of the methods and spatial sampling requirements are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wavenumber relationship in some bands.

  3. Bayesian geostatistical prediction of the intensity of infection with Schistosoma mansoni in East Africa

    OpenAIRE

    Archie C.A. Clements; MOYEED, RANA; Brooker, Simon

    2006-01-01

    A Bayesian geostatistical model was developed to predict the intensity of infection with Schistosoma mansoni in East Africa. Epidemiological data from purposively-designed and standardized surveys were available for 31,458 schoolchildren (90% aged between 6-16 years) from 459 locations across the region and used in combination with remote sensing environmental data to identify factors associated with spatial variation in infection patterns. The geostatistical model explicitly takes into accou...

  4. Statistical tool for soil biology : X. Geostatistical analysis

    OpenAIRE

    Rossi, Jean-Pierre; Lavelle, P.; Tondoh, J.E.

    1995-01-01

    Soil organisms generally display spatial distribution patterns at various scales. Classical methods for studying spatial distribution are based on various aggregation indices and on the analysis of frequency distributions. These methods do not take the position of the sampling points into account and consequently provide no information on the spatial distribution of organisms at scales larger than the sampling unit. ...

  5. Geostatistical Soil Data Analysis II. Optimal interpolation with kriging

    OpenAIRE

    Boško Miloš

    2001-01-01

    The optimal interpolation by using the technique of ordinary kriging, based on the regionalised variable theory, is described and illustrated by a case study of the top-soil (depth 0-30 cm) CaCO3 and humus content in 136 pedons at Petrovo polje, located in Dalmatinska Zagora. The spatial variability of the CaCO3 and humus content, shown by three-dimensional diagrams of the kriged estimates and the associated standard error diagrams, may be a consequence of different geology, topography and hidrolo...

  6. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  7. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

    Directory of Open Access Journals (Sweden)

    Larisa Preda

    2007-05-01

    Full Text Available This paper presents a study of fiscal policy applied to economic development. Correlations between macroeconomic and fiscal indicators are the first step in our analysis. The next step is a new model proposal for fiscal and budgetary choices. This model is applied to the data of the Romanian case.

  8. Should hydraulic tomography data be interpreted using geostatistical inverse modeling? A laboratory sandbox investigation

    Science.gov (United States)

    Illman, Walter A.; Berg, Steven J.; Zhao, Zhanfeng

    2015-05-01

    The robust performance of hydraulic tomography (HT) based on geostatistics has been demonstrated through numerous synthetic, laboratory, and field studies. While geostatistical inverse methods offer many advantages, one key disadvantage is their highly parameterized nature, which renders them computationally intensive for large-scale problems. Another issue is that geostatistics-based HT may produce overly smooth images of subsurface heterogeneity when there are few monitoring interval data. Therefore, some may question the utility of the geostatistical inversion approach in certain situations and seek alternative approaches. To investigate these issues, we simultaneously calibrated different groundwater models with varying subsurface conceptualizations and parameter resolutions using a laboratory sandbox aquifer. The compared models included: (1) isotropic and anisotropic effective parameter models; (2) a heterogeneous model that faithfully represents the geological features; and (3) a heterogeneous model based on geostatistical inverse modeling. The performance of these models was assessed by quantitatively examining the results from model calibration and validation. Calibration data consisted of steady state drawdown data from eight pumping tests, and validation data consisted of data from 16 separate pumping tests not used in the calibration effort. Results revealed that the geostatistical inversion approach performed the best among the approaches compared, although the geological model that faithfully represented stratigraphy came a close second. In addition, when the number of pumping tests available for inverse modeling was small, the geological modeling approach yielded more robust validation results. This suggests that better knowledge of stratigraphy obtained via geophysics or other means may contribute to improved results for HT.

  9. Applying static code analysis to firewall policies for the purpose of anomaly detection

    OpenAIRE

    Zaliva, Vadim

    2011-01-01

    Treating modern firewall policy languages as imperative, special purpose programming languages, in this article we will try to apply static code analysis techniques for the purpose of anomaly detection. We will first abstract a policy in common firewall policy language into an intermediate language, and then we will try to apply anomaly detection algorithms to it. The contributions made by this work are: 1. An analysis of various control flow instructions in popular firewall policy languages ...

  10. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350

  11. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kolotilina, L.; Nikishin, A.; Yeremin, A. [and others

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require the solution of lower and upper triangular systems of equations per iteration, which is difficult to parallelize without deteriorating the convergence rate. The explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
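
    To illustrate the explicit class of preconditioners discussed above: a Jacobi (diagonal) preconditioner needs only an element-wise scaling per iteration, which parallelizes trivially, in contrast to the triangular solves of implicit preconditioners. A minimal sketch with preconditioned conjugate gradients on a random SPD system (FSAI itself is considerably more elaborate; this is only a toy illustration of the same explicit, apply-by-multiplication idea):

    ```python
    import numpy as np

    def pcg(A, b, M_inv, tol=1e-10, maxiter=200):
        """Preconditioned conjugate gradients; M_inv applies the preconditioner."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv(r)
        p = z.copy()
        rz = r @ z
        for _ in range(maxiter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # SPD test matrix; the Jacobi preconditioner applies D^{-1}, a purely
    # element-wise (hence easily parallel) operation
    rng = np.random.default_rng(0)
    B = rng.standard_normal((50, 50))
    A = B @ B.T + 50 * np.eye(50)
    b = rng.standard_normal(50)
    d = np.diag(A)
    x = pcg(A, b, lambda r: r / d)
    print(np.allclose(A @ x, b, atol=1e-6))  # True
    ```

    Replacing `lambda r: r / d` with a sparse approximate-inverse multiply gives the FSAI-style application pattern: still a matrix-vector product per iteration, no forward/backward substitution.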

  12. Assimilation of Satellite Soil Moisture observation with the Particle Filter-Markov Chain Monte Carlo and Geostatistical Modeling

    Science.gov (United States)

    Moradkhani, Hamid; Yan, Hongxiang

    2016-04-01

    Soil moisture simulation and prediction are increasingly used to characterize agricultural droughts but the process suffers from data scarcity and quality. The satellite soil moisture observations could be used to improve model predictions with data assimilation. Remote sensing products, however, are typically discontinuous in spatial-temporal coverages; while simulated soil moisture products are potentially biased due to the errors in forcing data, parameters, and deficiencies of model physics. This study attempts to provide a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a fully distributed hydrologic model, with the use of recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. A geostatistical model is introduced to overcome the satellite soil moisture discontinuity issue where satellite data does not cover the whole study region or is significantly biased, and the dominant land cover is dense vegetation. The results indicate that joint assimilation of soil moisture and streamflow has minimal effect in improving the streamflow prediction, however, the surface soil moisture field is significantly improved. The combination of DA and geostatistical approach can further improve the surface soil moisture prediction.

  13. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in the improvement of the reliability in communications products. This paper intends to mainly introduce the failure analysis technology and process of integrated circuits applied in the communication products. There are many technologies for failure analysis, include optical microscopic analysis, infrared microscopic analysis, acoustic microscopy analysis, liquid crystal hot spot detection technology, optical microscopic analysis technology, micro analysis technology, electrical measurement, microprobe technology, chemical etching technology and ion etching technology. The integrated circuit failure analysis depends on the accurate confirmation and analysis of chip failure mode, the search of the root failure cause, the summary of failure mechanism and the implement of the improvement measures. Through the failure analysis, the reliability of integrated circuit and rate of good products can improve.

  14. The Geostatistical Framework for Spatial Prediction

    Institute of Scientific and Technical Information of China (English)

    张景雄; 姚娜

    2008-01-01

    Geostatistics provides a coherent framework for spatial prediction and uncertainty assessment, whereby spatial dependence, as quantified by variograms, is utilized for best linear unbiased estimation of a regionalized variable at unsampled locations. Geostatistics for prediction of continuous regionalized variables is reviewed, with key methods underlying the derivation of major variants of univariate kriging described in an easy-to-follow manner. This paper will contribute to demystification and, hence, popularization of geostatistics in geoinformatics communities.
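
    The best linear unbiased estimation mentioned above amounts to solving the ordinary kriging system: a matrix of variogram values between data points, bordered by a Lagrange row that forces the weights to sum to one. A minimal sketch with a spherical variogram and made-up sample points (illustrative only; model and parameters are assumptions):

    ```python
    import numpy as np

    def spherical(h, sill=1.0, a=10.0):
        """Spherical variogram model with range a (no nugget)."""
        h = np.asarray(h, dtype=float)
        g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h >= a, sill, g)

    def ordinary_kriging(coords, values, target):
        """Ordinary kriging estimate at `target` from samples at `coords`."""
        n = len(values)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        # Kriging system: variogram matrix bordered by the unbiasedness constraint
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = spherical(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = spherical(np.linalg.norm(coords - target, axis=1))
        w = np.linalg.solve(A, b)[:n]   # kriging weights; they sum to one
        return float(w @ values)

    coords = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [5.0, 5.0]])
    values = np.array([1.0, 2.0, 1.5, 3.0])
    print(ordinary_kriging(coords, values, np.array([2.0, 2.0])))
    ```

    Kriging is an exact interpolator: at a sampled location the solved weights collapse onto that sample, so the estimate reproduces the measured value.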

  15. Large to intermediate-scale aquifer heterogeneity in fine-grain dominated alluvial fans (Cenozoic As Pontes Basin, northwestern Spain): insight based on three-dimensional geostatistical reconstruction

    Science.gov (United States)

    Falivene, O.; Cabrera, L.; Sáez, A.

    2007-08-01

    Facies reconstructions are used in hydrogeology to improve the interpretation of aquifer permeability distribution. In the absence of sufficient data to define the heterogeneity due to geological processes, uncertainties in the distribution of aquifer hydrofacies and characteristics may appear. Geometric and geostatistical methods are used to understand and model aquifer hydrofacies distribution, providing models to improve comprehension and development of aquifers. However, these models require some input statistical parameters that can be difficult to infer from the study site. A three-dimensional reconstruction of a kilometer scale fine-grain dominated Cenozoic alluvial fan derived from more than 200 continuously cored, closely spaced, and regularly distributed wells is presented. The facies distributions were reconstructed using a genetic stratigraphic subdivision and a deterministic geostatistical algorithm. The reconstruction is only slightly affected by variations in the geostatistical input parameters because of the high-density data set. Analysis of the reconstruction allowed identification in the proximal to medial alluvial fan zones of several laterally extensive sand bodies with relatively higher permeability; these sand bodies were quantified in terms of volume, mean thickness, maximum area, and maximum equivalent diameter. These quantifications provide trends and geological scenarios for input statistical parameters to model aquifer systems in similar alluvial fan depositional settings.

  16. A geostatistical approach to data harmonization - Application to radioactivity exposure data

    Science.gov (United States)

    Baume, O.; Skøien, J. O.; Heuvelink, G. B. M.; Pebesma, E. J.; Melles, S. J.

    2011-06-01

    Environmental issues such as air, groundwater pollution and climate change are frequently studied at spatial scales that cross boundaries between political and administrative regions. It is common for different administrations to employ different data collection methods. If these differences are not taken into account in spatial interpolation procedures then biases may appear and cause unrealistic results. The resulting maps may show misleading patterns and lead to wrong interpretations. Also, errors will propagate when these maps are used as input to environmental process models. In this paper we present and apply a geostatistical model that generalizes the universal kriging model such that it can handle heterogeneous data sources. The associated best linear unbiased estimation and prediction (BLUE and BLUP) equations are presented and it is shown that these lead to harmonized maps from which estimated biases are removed. The methodology is illustrated with an example of country bias removal in a radioactivity exposure assessment for four European countries. The application also addresses multicollinearity problems in data harmonization, which arise when both artificial bias factors and natural drifts are present and cannot easily be distinguished. Solutions for handling multicollinearity are suggested and directions for further investigations proposed.

  17. Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)

    Science.gov (United States)

    Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys

    2016-02-01

    Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In fact, in hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but fail to represent the groundwater flow system, manifested through an interpolated water table above the topography. A methodology is developed in order to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points in a partly urban domain covering 25.6 km² close to Bordeaux city, France. To select the best method, a geographic information system was used to visualize surfaces reconstructed with each method. A cross-validation was carried out to evaluate the predictive performance of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).

  18. Study on the spatial pattern of rainfall erosivity based on geostatistics in Hebei Province, China

    Institute of Scientific and Technical Information of China (English)

    Mingxin MEN; Zhenrong YU; Hao XU

    2008-01-01

    The objective of this article was to study the spatial distribution pattern of rainfall erosivity. The precipitation data at each climatological station in Hebei Province, China were collected, analyzed and modeled with SPSS and ArcGIS. A simple model for estimating rainfall erosivity was developed based on the weather station data, and the annual average rainfall erosivity was calculated with this model. The prediction errors, statistical feature values and prediction maps obtained by using different interpolation methods were compared. The results indicated that the second-order ordinary kriging method performed better than both the zero- and first-order ordinary kriging methods. Within the second-order trend method, the Gaussian semi-variogram model performed better than interpolation with the spherical or exponential models. Applying geostatistics to study the spatial pattern of rainfall erosivity will help to accurately and quantitatively evaluate soil erosion risk. Our research also provides digital maps that can assist policy making in regional soil and water conservation planning and management strategies.

  19. Spatial variability of selected physicochemical parameters within peat deposits in small valley mire: a geostatistical approach

    Directory of Open Access Journals (Sweden)

    Pawłowski Dominik

    2014-12-01

    Full Text Available Geostatistical methods for 2D and 3D modelling of the spatial variability of selected physicochemical properties of biogenic sediments were applied to a small valley mire in order to identify the processes that lead to the formation of various types of peat. A sequential Gaussian simulation was performed to reproduce the statistical distribution of the input data (pH and organic matter) and their semivariances, as well as to honour the data values, yielding more ‘realistic’ models that show microscale spatial variability, despite the fact that the input sample cores were sparsely distributed in the X-Y space of the study area. The stratigraphy of peat deposits in the Ldzań mire shows a record of long-term evolution of water conditions, which is associated with the variability in water supply over time. Ldzań is a fen (a rheotrophic mire) with a through-flow of groundwater. Additionally, the vicinity of the Grabia River is marked by seasonal inundations of the southwest part of the mire and an increased share of mineral matter in the peat. In turn, the upper peat layers of some of the central part of the Ldzań mire are rather spongy, and these peat-forming phytocoenoses probably formed during permanent waterlogging.

  20. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    Science.gov (United States)

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  1. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  2. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    Science.gov (United States)

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  3. Critical Analysis of a Website: A Critique based on Critical Applied Linguistics and Critical Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rina Agustina

    2013-05-01

    Full Text Available E-learning websites are easily found by browsing the internet; most are free of charge and provide various learning materials. Spellingcity.com was one such e-learning website for teaching and learning English spelling, vocabulary and writing, offering various games and activities for young learners, 6 to 8 year-old learners in particular. Having considered those constraints, this paper aimed to analyse the website from two different views: (1) critical applied linguistics (CAL) aspects and (2) critical discourse analysis (CDA). After analysing the website using CAL and CDA, it was found that the website was adequate for beginners, in that it provided fun learning through games and challenged learners to test their vocabulary. Despite these strengths, several issues required further thought in terms of learners' broad knowledge; for example, some of the learning materials focused on states in America, which is quite difficult for EFL learners who lack adequate general knowledge. Thus, the findings implied that the website could be used as supporting learning material, accompanying textbooks and vocabulary exercise books.

  4. Soil Organic Carbon Mapping by Geostatistics in Europe Scale

    Science.gov (United States)

    Aksoy, E.; Panagos, P.; Montanarella, L.

    2013-12-01

    Accuracy in assessing the distribution of soil organic carbon (SOC) is an important issue because SOC is an important soil component that plays key roles in the functions of both natural ecosystems and agricultural systems. The SOC content varies from place to place and is strongly related to climate variables (temperature and rainfall), terrain features, soil texture, parent material, vegetation, land-use types, and human management (management and degradation) at different spatial scales. Geostatistical techniques allow for the prediction of soil properties using soil information and environmental covariates. In this study, the distribution of SOC has been predicted with the regression-kriging method at the European scale. In this prediction, soil samples collected in the LUCAS (European Land Use/Cover Area frame statistical Survey) and BioSoil projects were combined with local soil data collected from six different CZOs in Europe, and ten spatial predictors (slope, aspect, elevation, CTI, CORINE land-cover classification, parent material, texture, WRB soil classification, annual average temperature and precipitation) were used. Significant correlation between the covariates and the organic carbon dependent variable was found. Moreover, investigating the contribution of the local dataset at the watershed scale to the regional dataset at the European scale was an important challenge.

  5. Geostatistical interpolation for modelling SPT data in northern Izmir

    Indian Academy of Sciences (India)

    Selim Altun; A Burak Göktepe; Alper Sezer

    2013-12-01

    In this study, the aim was to map corrected Standard Penetration Test (SPT) values in the Karşıyaka city center by a kriging approach. Six maps were prepared by this geostatistical approach at depths of 3, 6, 9, 13.5, 18 and 25.5 m. Borehole test results obtained from 388 boreholes in central Karşıyaka were used to model the spatial variation of $(\text{N}_1)_{\text{60cs}}$ values in an area of 5.5 km2. Corrections were made to the data in hand for depth, hammer energy, rod length, sampler, borehole diameter and fines content. At various depths, the prepared variograms and the kriging method were used together to model the variation of corrected SPT data in the region, which enabled the estimation of missing data. The results revealed that the estimation ability of the models was acceptable, as validated by a number of parameters as well as by comparisons of actual and estimated data. Outcomes of this study can be used in microzonation studies, site response analyses, calculation of the bearing capacity of subsoils in the region, and the derivation of parameters that are empirically related to the corrected SPT number.

  6. Prediction of the Seasonal Change Trend of NDVI in Hainan Island by GIS Geostatistical Analysis

    Institute of Scientific and Technical Information of China (English)

    刘少军; 黄彦彬; 张京红; 李天富; 陈汇林; 陈德明

    2006-01-01

    Although vegetation indices extracted from remote sensing images reflect crop conditions well in space, they cannot predict the spatial range over which the vegetation index varies. If the average vegetation index of the different cities and counties in different seasons is known, the overall vegetation state of the region can be analysed quantitatively and the vegetation index can be predicted over a large area. Using the ArcGIS Geostatistical Analyst module, which combines a geographic information system (GIS) with geostatistics, together with the quarterly average NDVI of the different cities and counties extracted from MODIS remote sensing data, the seasonal change trend of the normalized difference vegetation index (NDVI) of Hainan Island was analysed by kriging interpolation and compared with actual sampled values. The results show that the kriging interpolation method in ArcGIS Geostatistical Analyst can predict the spatial distribution range of the vegetation index well.

  7. High-Resolution Geostatistical Petrophysical-Parameter Inversion

    Institute of Scientific and Technical Information of China (English)

    姜文龙; 杨锴

    2012-01-01

    Geostatistical inversion can characterize thin layers well because of its high resolution. We discuss the relationship between geostatistical inversion and high resolution, as well as the problems geostatistical inversion faces in petrophysical-parameter inversion, and we study algorithms for reducing the uncertainty of the inversion. The results show that as the variogram range changes, the resolution of the geostatistical inversion result changes as well, but the conventional kriging algorithm destroys the continuity of the original geologic formations when resolution is improved by reducing the range. On this basis, we introduce constraints such as geologically interpreted horizons and stratal dip into the geostatistical inversion. The method was applied to the inversion of carbonate mineral components in the ODP1144 station sea area in the South China Sea, where it clearly revealed the sedimentary characteristics of the carbonate minerals in the region.

  8. Trends in applied econometrics software development 1985-2008, an analysis of Journal of Applied Econometrics research articles, software reviews, data and code

    OpenAIRE

    Ooms, M.

    2008-01-01

    Trends in software development for applied econometrics emerge from an analysis of the research articles and software reviews of the Journal of Applied Econometrics, appearing since 1986. The data and code archive of the journal provides more specific information on software use for applied econometrics since 1995. GAUSS, Stata, MATLAB and Ox have been the most important software packages after 2001. I compare these higher-level programming languages and R in somewhat more detail. An increasing numbe...

  9. Characterisation of contaminated metals using an advanced statistical toolbox - Geostatistical characterisation of contaminated metals: methodology and illustrations

    International Nuclear Information System (INIS)

    Radiological characterisation plays an important role in the process of recycling contaminated or potentially contaminated metals. It is a platform for planning; identification of the extent and nature of contamination; assessment of potential risk impacts; cost estimation; radiation protection; management of material arising from decommissioning; and the release of the materials as well as the disposal of the generated secondary waste as radioactive waste. Key issues in radiological characterisation are the identification of objectives, the development of a measurement and sampling strategy (probabilistic, judgmental or a combination thereof), knowledge management, traceability, and the recording and processing of the obtained information. By applying an advanced combination of statistical and geostatistical methods in this concept, better performance can be achieved at a lower cost. This paper describes the benefits of using the available methods in the different stages of the characterisation, treatment and clearance processes, aiming for reliable results in line with the data quality objectives. (authors)

  10. The detection of thermophilous forest hotspots in Poland using geostatistical interpolation of plant richness

    Directory of Open Access Journals (Sweden)

    Marcin Kiedrzyński

    2014-07-01

    Full Text Available Attempts to study biodiversity hotspots on a regional scale should combine compositional and functionalist criteria. The detection of hotspots in this study uses one ecologically similar group of high conservation value species as hotspot indicators, as well as focal habitat indicators, to detect the distribution of suitable environmental conditions. The method is assessed with reference to thermophilous forests in Poland – key habitats for many rare and relict species. Twenty-six high conservation priority species were used as hotspot indicators, and ten plant taxa characteristic of the Quercetalia pubescenti-petraeae phytosociological order were used as focal habitat indicators. Species distribution data were based on a 10 × 10 km grid. The number of species per grid square was interpolated by the ordinary kriging geostatistical method. Our analysis largely determined the distribution of areas with concentrations of thermophilous forest flora, as well as regional disjunctions and geographical barriers. Indicator species richness can be interpreted as a reflection of the actual state of habitat conditions. It can also be used to determine the location of potential species refugia and possible past and future migration routes.

  11. Geostatistical inspired metamodeling and optimization of nanoscale analog circuits

    Science.gov (United States)

    Okobiah, Oghenekarho

    The current trend towards miniaturization of modern consumer electronic devices significantly affects their design. The demand for efficient all-in-one appliances leads to smaller, yet more complex and powerful nanoelectronic devices. The increasing complexity in the design of such nanoscale Analog/Mixed-Signal Systems-on-Chip (AMS-SoCs) presents difficult challenges to designers. One promising design method used to mitigate the burden of this design effort is the use of metamodeling (surrogate modeling) techniques. Their use significantly reduces the time for computer simulation and design space exploration and optimization. This dissertation addresses several issues of metamodeling-based nanoelectronic AMS design exploration. A surrogate modeling technique which uses geostatistical Kriging prediction methods in creating metamodels is proposed. Kriging prediction techniques take into account the correlation effects between input parameters for performance point prediction. We propose the use of Kriging to utilize this property for the accurate modeling of process variation effects of designs in the deep nanometer region. Different Kriging methods have been explored for this work, such as simple and ordinary Kriging. We also propose another metamodeling technique, Kriging-Bootstrapped Neural Network, that combines the accuracy and process variation awareness of Kriging with artificial neural network models for ultra-fast and accurate process-aware metamodeling design. The proposed methodologies combine Kriging metamodels with selected algorithms for ultra-fast layout optimization. The selected algorithms explored are: Gravitational Search Algorithm (GSA), Simulated Annealing Optimization (SAO), and Ant Colony Optimization (ACO). Experimental results demonstrate that the proposed Kriging metamodel based methodologies can perform the optimizations with minimal computational burden compared to traditional (SPICE-based) design flows.
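The statistical core of a Kriging metamodel is Gaussian-process regression: fit a GP to a modest number of expensive simulator runs, then query the cheap surrogate (with uncertainty) across the design space. The sketch below uses a made-up "circuit response" function as a stand-in for a SPICE simulation; the kernel choice and sample sizes are assumptions for illustration, not the dissertation's setup:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical "expensive" simulator: gain as a function of two device parameters.
def spice_like_sim(w):
    return np.sin(3 * w[:, 0]) * np.cos(2 * w[:, 1]) + 0.5 * w[:, 0]

gen = np.random.default_rng(1)
W_train = gen.uniform(0, 1, (60, 2))       # sampled design points
y_train = spice_like_sim(W_train)          # results of the costly simulations

# Kriging metamodel: a GP with an RBF (Gaussian) correlation model.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                              normalize_y=True).fit(W_train, y_train)

W_test = gen.uniform(0, 1, (200, 2))
y_pred, y_std = gp.predict(W_test, return_std=True)   # fast surrogate + uncertainty
```

An optimizer (GSA, SAO, ACO, or anything else) can then evaluate `y_pred` thousands of times at negligible cost compared to rerunning the simulator.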

  12. Applying Multiscale Entropy to the Complexity Analysis of Rainfall-Runoff Relationships

    OpenAIRE

    Chien-Ming Chou

    2012-01-01

    This paper presents a novel framework for the complexity analysis of rainfall, runoff, and runoff coefficient (RC) time series using multiscale entropy (MSE). The MSE analysis of RC time series was used to investigate changes in the complexity of rainfall-runoff processes due to human activities. Firstly, a coarse graining process was applied to a time series. The sample entropy was then computed for each coarse-grained time series, and plotted as a function of the scale factor. The proposed ...
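The MSE procedure summarized above — coarse-grain the series, then compute sample entropy at each scale factor — can be sketched as follows. This is a simplified SampEn implementation run on synthetic white noise standing in for a runoff-coefficient series; the tolerance r is fixed from the original series' standard deviation, as in the usual MSE formulation, and all parameter values are illustrative:

```python
import numpy as np

def coarse_grain(x, scale):
    # Average consecutive non-overlapping windows of length `scale`.
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, r):
    # Simplified SampEn(m, r) = -ln(A / B), where A and B count matching
    # template pairs of length m+1 and m (Chebyshev distance <= r).
    def pairs(mm):
        templ = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return (d <= r).sum() - len(templ)    # exclude self-matches
    return -np.log(pairs(m + 1) / pairs(m))

def multiscale_entropy(x, scales, m=2, r_frac=0.2):
    r = r_frac * np.std(x)   # tolerance fixed at scale 1, per the standard MSE recipe
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]

gen = np.random.default_rng(2)
series = gen.normal(size=1200)            # stand-in for a runoff-coefficient series
mse = multiscale_entropy(series, scales=[1, 2, 4, 6])
```

For white noise the curve decreases with the scale factor, the classic MSE signature that distinguishes uncorrelated noise from long-range-correlated signals.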
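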

  13. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    OpenAIRE

    Clayton, J. D.

    2016-01-01

    Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation – in terms of inverse velocity – of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long rod projectiles. The ceram...

  14. Does size matter? Separations on guard columns for fast sample analysis applied to bioenergy research

    OpenAIRE

    Bauer, Stefan; Ibanez, Ana B

    2015-01-01

    Background Increasing sample throughput is needed when large numbers of samples have to be processed. In chromatography, one strategy is to reduce column length for decreased analysis time. Therefore, the feasibility of analyzing samples simply on a guard column was explored using refractive index and ultraviolet detection. Results from the guard columns were compared to the analyses using the standard 300 mm Aminex HPX-87H column which is widely applied to the analysis of samples from many b...

  15. Structural Integrity Analysis of the RBMK Reactor Critical Structures Applying Probabilistic Methods

    OpenAIRE

    Dundulis, Gintautas; Kulak, Ronald; Alzbutas, Robertas; Uspuras, Eugenijus

    2010-01-01

    The probability-based approach that integrates deterministic and probabilistic methods was developed to analyse failures of NPP buildings and components. This methodology was applied to safety analysis of the Ignalina NPP. The application of this methodology to two postulated accidents―pipe whip impact and aircraft crash―is presented in this chapter. The NEPTUNE software system was used for the deterministic transient analysis of the pipe whip impact and aircraft crash accidents. Many det...

  16. Geostatistical evaluation of integrated marsh management impact on mosquito vectors using a before-after-control-impact (BACI) design

    Directory of Open Access Journals (Sweden)

    Dempsey Mary E

    2009-06-01

    areas led to a significant decrease (~44%) in the number of times when the larviciding threshold was reached. This reduction, in turn, resulted in a significant decrease (~74%) in the number of larvicide applications in the treatment areas post-project. The remaining larval habitat in the treatment areas had a different geographic distribution and was largely confined to the restored marsh surface (i.e. filled-in mosquito ditches); however, only ~21% of the restored marsh surface supported mosquito production. Conclusion The geostatistical analysis showed that OMWM demonstrated considerable potential for effective mosquito control and compatibility with other natural resource management goals such as restoration, wildlife habitat enhancement, and invasive species abatement. GPS and GIS tools are invaluable for large-scale project design, data collection, and data analysis, with geostatistical methods serving as an alternative or a supplement to conventional inference statistics in evaluating the project outcome.

  17. Analysis and Reconstitution on Talent Cultivating Objective Positioning for University of Applied Technology

    Institute of Scientific and Technical Information of China (English)

    成伟伟

    2016-01-01

    As an emerging, multidisciplinary type of institution, the university of applied technology is characterized by specific applied technologies, and the technical talents such universities produce have drawn great attention from society. How to position the talent-cultivation objective is critical to reducing the mismatch between the talent supply of universities and the talent demand of society. This article analyses the existing problems in positioning talent-cultivation objectives as local institutes transition from academic universities to universities of applied technology, and clarifies the characteristics of talent-training objective positioning in developed Western countries. On this basis, it proposes how domestic universities of applied technology should position and reconstitute their talent-cultivation objectives.

  18. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    Science.gov (United States)

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  19. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist;

    2015-01-01

    structure evaluation by assessing the local identifiability characteristics of the parameters. Moreover, such a procedure should be generic to make sure it can be applied independent of the structure of the model. We hereby apply a numerical identifiability approach which is based on the work of Walter and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring. In contrast, the practical identifiability analysis revealed that high values of the forward rate parameter Vf led to identifiability problems. These problems were even more pronounced at higher substrate concentrations, which illustrates the importance of a proper experimental design to avoid…

  20. TiConverter: A training image converting tool for multiple-point geostatistics

    Science.gov (United States)

    Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael

    2016-11-01

    TiConverter is a tool developed to ease the application of multiple-point geostatistics whether by the open source Stanford Geostatistical Modeling Software (SGeMS) or other available commercial software. TiConverter has a user-friendly interface and it allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing. These are the ASCII (.txt), the geostatistical software library (GSLIB) (.txt), the Isatis (.dat), and the VTK formats. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools including image resizing, smoothing, and segmenting tools. The purpose of this study is to introduce the TiConverter, and to demonstrate its application and advantages with several examples from the literature.
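The RGB-based conversion step that TiConverter automates can be illustrated in a few lines: map each distinct colour in a 2D training image to an integer facies code and serialize the grid in a GSLIB-like layout. The tiny two-facies image, the palette, and the header layout below are assumptions for illustration, not TiConverter's actual file handling:

```python
import numpy as np

# A tiny 2-facies "training image" as an RGB array
# (yellow = background facies, blue = channel facies).
rgb = np.zeros((4, 5, 3), dtype=np.uint8)
rgb[...] = (255, 255, 0)                 # background, yellow
rgb[1:3, 1:4] = (0, 0, 255)              # channel, blue

# Map each distinct colour to an integer facies code (the RGB-based mapping).
palette = {(255, 255, 0): 0, (0, 0, 255): 1}
codes = np.array([[palette[tuple(px)] for px in row] for row in rgb])

# GSLIB-style text output: title line, number of variables, variable name,
# then one value per line (row-major order assumed here).
lines = [f"TI ({rgb.shape[1]} x {rgb.shape[0]})", "1", "facies"]
lines += [str(v) for v in codes.ravel()]
gslib_text = "\n".join(lines)
```

The same dictionary-lookup idea extends to any number of facies colours; a real converter would also validate unknown colours and handle image resizing and smoothing, as TiConverter does.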

  1. Geostatistical Prediction of Ocean Outfall Plume Characteristics Based on an Autonomous Underwater Vehicle

    Directory of Open Access Journals (Sweden)

    Patrícia Alexandra Gregório Ramos

    2013-07-01

    Full Text Available Geostatistics has been successfully used to analyse and characterize the spatial variability of environmental properties. Besides providing estimated values at unsampled locations, geostatistics measures the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. This work uses universal block kriging to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign. The aim is to distinguish the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents, which are valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume’s dilution are rare, these studies may be very helpful in the future to validate dispersion models.

  2. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    Science.gov (United States)

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  3. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    Science.gov (United States)

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  4. Applied Behaviour Analysis. It Works, It's Positive; Now What's the Problem?

    Science.gov (United States)

    Kerr, Ken P.; Mulhern, F.; McDowell, C.

    2000-01-01

    Describes key findings concerning the effectiveness of applied behavior analysis (ABA) for children with autism. Discusses obstacles present in Ireland to treating children with autism using ABA techniques. Describes the work of Parents' Education as Autism Therapists and the Irish Children's Autism Network for Developmental Opportunities to…

  5. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    Science.gov (United States)

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  6. Applied Behaviour Analysis: Does Intervention Intensity Relate to Family Stressors and Maternal Well-Being?

    Science.gov (United States)

    Schwichtenberg, A.; Poehlmann, J.

    2007-01-01

    Background: Interventions based on applied behaviour analysis (ABA) are commonly recommended for children with an autism spectrum disorder (ASD); however, few studies address how this intervention model impacts families. The intense requirements that ABA programmes place on children and families are often cited as a critique of the programme,…

  7. Structure analysis of interstellar clouds - II. Applying the Delta-variance method to interstellar turbulence

    NARCIS (Netherlands)

    Ossenkopf, V.; Krips, M.; Stutzki, J.

    2008-01-01

    Context. The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. It has been applied both to simulations of interstellar turbulence and to observed molecular cloud maps. In Paper I we proposed essential improvements to the Delta-variance analysis.

  8. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    Science.gov (United States)

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  9. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    Science.gov (United States)

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  10. Reliability analysis method applied in slope stability: slope prediction and forecast on stability analysis

    Institute of Scientific and Technical Information of China (English)

    Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG

    2006-01-01

    Landslides are a kind of geologic hazard that happens all over the world, bringing huge losses of human life and property; it is therefore very important to research them. This study focused on combining single-slope and regional landslide analysis, and on combining traditional slope stability analysis with reliability analysis methods. Methods for slope prediction and forecasting and for reliability analysis are also discussed.

  11. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    Energy Technology Data Exchange (ETDEWEB)

    Kolski, Jeffrey S. [Los Alamos National Laboratory; Macek, Robert J. [Los Alamos National Laboratory; McCrady, Rodney C. [Los Alamos National Laboratory; Pang, Xiaoying [Los Alamos National Laboratory

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
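Blind source separation with ICA, as applied above to beam-position data, can be sketched on synthetic signals: mix two known sources through an unknown matrix, then recover them (up to sign, scale and ordering) with FastICA. The "betatron-like" oscillation, the slow square-wave envelope and the mixing matrix below are purely illustrative, not PSR data:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
# Two hypothetical independent sources: a fast oscillation (betatron-like)
# and a slow square-wave structure (longitudinal-like).
s1 = np.sin(7.1 * 2 * np.pi * t)
s2 = np.sign(np.sin(0.9 * 2 * np.pi * t))
S = np.column_stack([s1, s2])

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])       # unknown mixing (e.g. two BPM responses)
X = S @ A.T                      # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)     # recovered sources, up to sign/scale/order
```

Each estimated component should correlate strongly (in absolute value) with exactly one true source, which is how separated modes are identified in practice.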

  12. Monte Carlo full-waveform inversion of crosshole GPR data using multiple-point geostatistical a priori information

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2012-01-01

    We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear … into account during the inversion. The suggested inversion strategy is tested on synthetic tomographic crosshole ground-penetrating radar full-waveform data using multiple-point-based a priori information. This is, to our knowledge, the first example of obtaining a posteriori realizations of a full-waveform inverse problem. Benefits of the proposed methodology compared with deterministic inversion approaches include: (1) The a posteriori model variability reflects the states of information provided by the data uncertainties and a priori information, which provides a means of obtaining resolution analysis. (2…
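The extended Metropolis algorithm samples the posterior by proposing model perturbations consistent with the (geostatistical) prior and accepting them with the Metropolis rule on the likelihood. A minimal standard Metropolis sketch on a toy one-parameter travel-time problem shows the accept/reject logic; the forward model, noise level and prior bounds are illustrative assumptions, far simpler than the GPR full-waveform case:

```python
import numpy as np

gen = np.random.default_rng(4)

# Toy inverse problem: infer a single "velocity" v from noisy travel-time data
# over a unit-length path (forward model: t = 1 / v).
v_true, sigma = 2.0, 0.05
d_obs = 1.0 / v_true + sigma * gen.normal(size=20)

def log_likelihood(v):
    return -0.5 * np.sum((d_obs - 1.0 / v) ** 2) / sigma**2

# Metropolis random walk; a uniform prior on [0.5, 5] plays the role that the
# geostatistical prior sampler plays in the extended algorithm.
v, samples = 1.0, []
for _ in range(20000):
    v_prop = v + 0.1 * gen.normal()
    if 0.5 < v_prop < 5.0 and \
       np.log(gen.uniform()) < log_likelihood(v_prop) - log_likelihood(v):
        v = v_prop                     # accept the proposed model
    samples.append(v)
post = np.array(samples[5000:])        # discard burn-in
```

The retained samples are a posteriori realizations; their spread is the resolution analysis the abstract refers to, here for one parameter instead of a full velocity field.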

  13. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  14. Structure analysis of interstellar clouds: II. Applying the Delta-variance method to interstellar turbulence

    CERN Document Server

    Ossenkopf, V; Stutzki, J

    2008-01-01

    The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. In Paper I we proposed essential improvements to the Delta-variance analysis. In this paper we apply the improved Delta-variance analysis to i) a hydrodynamic turbulence simulation with prominent density and velocity structures, ii) an observed intensity map of rho Oph with irregular boundaries and variable uncertainties of the different data points, and iii) a map of the turbulent velocity structure in the Polaris Flare affected by the intensity dependence on the centroid velocity determination. The tests confirm the extended capabilities of the improved Delta-variance analysis. Prominent spatial scales were accurately identified and artifacts from a variable reliability of the data were removed. The analysis of the hydrodynamic simulations showed that with the injection of a turbulent velocity structure the most prominent density structures are produced on a sca...

  15. Introduction of Spatial Interpolation Methods in Geostatistical Analyst

    Institute of Scientific and Technical Information of China (English)

    秦涛

    2005-01-01

    The Geostatistical Analyst module of the mainstream GIS software ArcGIS 9 provides two broad classes of spatial interpolation methods: deterministic interpolation methods and geostatistical interpolation methods. The application and handling of the various interpolation methods in this software are introduced, and worked examples are used to compare the ranges of applicability of the different methods.
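Of the deterministic interpolators offered alongside kriging in Geostatistical Analyst, inverse distance weighting (IDW) is the simplest. A compact sketch of the method (the power parameter and sample data are arbitrary choices for illustration):

```python
import numpy as np

def idw(xy, z, targets, power=2.0):
    # Inverse-distance-weighted interpolation: each estimate is a weighted
    # average of the samples, with weights 1 / d**power.
    d = np.linalg.norm(targets[:, None, :] - xy[None, :, :], axis=2)
    out = np.empty(len(targets))
    for i, di in enumerate(d):
        hit = di < 1e-12
        if hit.any():                  # target coincides with a sample point
            out[i] = z[hit][0]
        else:
            w = 1.0 / di**power
            out[i] = w @ z / w.sum()
    return out

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
grid = np.array([[0.5, 0.5], [0.0, 0.0]])
est = idw(xy, z, grid)
```

Unlike kriging, IDW uses no spatial-correlation model and provides no error estimate, which is the essential trade-off between the two classes of methods the record describes.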

  16. A space-time geostatistical framework for ensemble nowcasting using rainfall radar fields and gauge data

    Science.gov (United States)

    Caseri, Angelica; Ramos, Maria Helena; Javelle, Pierre; Leblois, Etienne

    2016-04-01

    Floods are responsible for a major part of the total damage caused by natural disasters. Nowcasting systems providing public alerts to flash floods are very important to prevent damage from extreme events and reduce their socio-economic impacts. The major challenge of these systems is to capture high-risk situations in advance, with good accuracy in the intensity, location and timing of future intense precipitation events. Flash flood forecasting has been studied by several authors in different affected areas. The majority of the studies combine rain gauge data with radar imagery advection to improve prediction for the next few hours. Outputs of Numerical Weather Prediction (NWP) models have also been increasingly used to predict ensembles of extreme precipitation events that might trigger flash floods. One of the challenges of the use of NWP for ensemble nowcasting is to generate ensemble forecasts of precipitation within a short computation time, so that flood forecasts can be produced with sufficient lead time to issue flash flood alerts. In this study, we investigate an alternative space-time geostatistical framework to generate multiple scenarios of future rainfall for flash flood nowcasting. The approach is based on conditional simulation and an advection method applied within the Turning Bands Method (TBM). Ensemble forecasts of precipitation fields are generated based on space-time properties given by radar images and precipitation data collected from rain gauges during the development of the rainfall event. The results show that the approach developed can be an interesting alternative to capture precipitation uncertainties in location and intensity and to generate ensemble forecasts of rainfall that can be useful to improve alerts for flash floods, especially in small areas.

  17. Geostatistical Characteristics of the Structure of Spatial Variation of Electrical Power in the National 110 KV Network Including Results of Variogram Model Components Filtering

    Directory of Open Access Journals (Sweden)

    Barbara Namysłowska-Wilczyńska

    2015-03-01

    Full Text Available The paper provides results of analysing the spatial variability of electrical power using two geostatistical methods, lognormal kriging and ordinary kriging. The aim of the research was a detailed characterization and identification of the structure of electrical load variability at nodes of the 110 kV network over the whole territory of Poland, analysed on the basis of the results of the kriging techniques applied. The paper proposes a methodology using two techniques for modelling and estimating the average values Z* of electrical power, i.e. lognormal kriging and ordinary kriging. The input data for the calculations were electrical powers at nodes of the 110 kV network at the same time instant, 11:00 a.m., in the summer and winter seasons. Kriging calculations were made for several variants. Filtering was carried out for the assumed complex theoretical semivariogram models of electrical power, i.e. the models were divided into several covariance components (nugget effect, first spherical model, second spherical model), which were filtered out successively. Estimations of the average values Z* of power were then made while particular components were omitted. The results of the analyses, made considering the particular components of the semivariogram models, are shown in raster maps of the distributions of the estimated average values Z* of electrical power. This gave an overview of the variation of this parameter over the territory of the whole country and in the time domain, for the summer and winter seasons, and under the various assumed semivariogram model components. A detailed analysis of the spatial-temporal variability of the average values Z* of the electrical loads over the country allowed their range and nature of variability to be identified.

  18. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China.

    Science.gov (United States)

    Zhong, Buqing; Liang, Tao; Wang, Lingqing; Li, Kexin

    2014-08-15

    An extensive soil survey was conducted to study pollution sources and delineate contamination by heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models, including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM), were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg proved to depend mostly on indicators of anthropogenic activities, such as industrial type and distance from the urban area, while the regression tree for Cr was mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of the modeled background values for Cd, As, Pb, Hg and Cr were close to the background values previously reported for the study area, while the contamination by Cd and Hg was widespread, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management.
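
    A finite-mixture fit of the kind the FMDM performs can be sketched with a minimal EM algorithm on log-scale concentrations; the two-component data below are synthetic, purely to illustrate separating a "background" from an "anthropogenic" component:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log-concentrations: a "background" and an "anthropogenic" component
# (values are invented, not the paper's data)
background = rng.normal(loc=0.0, scale=0.3, size=600)
polluted = rng.normal(loc=1.5, scale=0.4, size=200)
x = np.concatenate([background, polluted])

def em_two_gaussians(x, n_iter=200):
    """Minimal EM fit of a two-component Gaussian mixture (a lognormal
    mixture once the data have been log-transformed, as in an FMDM)."""
    mu = np.array([x.min(), x.max()])        # crude but adequate initialisation
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample
        dens = (w / (sigma * np.sqrt(2.0 * np.pi)) *
                np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means and standard deviations
        n_k = resp.sum(axis=0)
        w = n_k / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n_k
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return w, mu, sigma

w, mu, sigma = em_two_gaussians(x)   # mu should recover roughly 0.0 and 1.5
```

    The component with the lower mean plays the role of the modeled background; its quantiles can then supply the thresholds used in the indicator kriging step.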

  19. Guidelines for depth data collection in rivers when applying interpolation techniques (kriging) for river restoration

    Directory of Open Access Journals (Sweden)

    M. Rivas-Casado

    2007-05-01

    Full Text Available River restoration appraisal requires the implementation of monitoring programmes that assess the river site before and after the restoration project. However, little work has yet been done to design effective and efficient sampling strategies. Three main variables need to be considered when designing monitoring programmes: space, time and scale. The aim of this paper is to describe the methodology applied to analyse the variation of depth in space, scale and time so that more comprehensive monitoring programmes can be developed. Geostatistical techniques were applied to study the spatial dimension (sampling strategy and density), spectral analysis was used to study the scale at which depth shows cyclic patterns, whilst descriptive statistics were used to assess the temporal variation. A brief set of guidelines has been summarised in the conclusion.
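
    The spatial (variogram) part of such an analysis can be illustrated with the classical Matheron estimator of the empirical semivariogram; the depth samples below are synthetic stand-ins for surveyed depths:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic depth samples over a 100 m x 100 m reach: smooth pattern plus noise
coords = rng.uniform(0.0, 100.0, size=(150, 2))
depth = np.sin(coords[:, 0] / 15.0) + 0.1 * rng.normal(size=150)

def empirical_semivariogram(coords, values, bins):
    """Classical Matheron estimator: half the mean squared increment per lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)     # each pair once, no self-pairs
    d, sq = d[iu], sq[iu]
    gamma = np.full(len(bins) - 1, np.nan)
    for i in range(len(bins) - 1):
        sel = (d >= bins[i]) & (d < bins[i + 1])
        if sel.any():
            gamma[i] = sq[sel].mean()
    return gamma

bins = np.linspace(0.0, 50.0, 11)
gamma = empirical_semivariogram(coords, depth, bins)   # rises with lag distance
```

    The lag at which the estimated semivariance levels off is what informs the sampling density recommended by such guidelines.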

  20. A Review of Temporal Aspects of Hand Gesture Analysis Applied to Discourse Analysis and Natural Conversation

    Directory of Open Access Journals (Sweden)

    Renata C. B. Madeo

    2013-08-01

    Full Text Available Lately, there has been an increasing interest in hand gesture analysis systems. Recent works have employed pattern recognition techniques and have focused on the development of systems with more natural user interfaces. These systems may use gestures to control interfaces or recognize sign language gestures, which can provide systems with multimodal interaction; or consist in multimodal tools to help psycholinguists to understand new aspects of discourse analysis and to automate laborious tasks. Gestures are characterized by several aspects, mainly by movements and sequences of postures. Since data referring to movements or sequences carry temporal information, this paper presents a literature review about temporal aspects of hand gesture analysis, focusing on applications related to natural conversation and psycholinguistic analysis, using the Systematic Literature Review methodology. In our results, we organized works according to type of analysis, methods, highlighting the use of Machine Learning techniques, and applications.

  1. Reservoir Characterization using geostatistical and numerical modeling in GIS with noble gas geochemistry

    Science.gov (United States)

    Vasquez, D. A.; Swift, J. N.; Tan, S.; Darrah, T. H.

    2013-12-01

    The integration of precise geochemical analyses with quantitative engineering modeling in an interactive GIS system allows for a sophisticated and efficient method of reservoir engineering and characterization. A Geographic Information System (GIS) is utilized as an advanced technique for oil field reservoir analysis by combining field engineering and geological/geochemical spatial datasets with the available systematic modeling and mapping methods, integrating the information into a spatially correlated first-hand approach to defining surface and subsurface characteristics. Three key methods of analysis are used: 1) geostatistical modeling to create a static, volumetric 3-dimensional representation of the geological body, 2) numerical modeling to develop a dynamic, interactive 2-dimensional model of fluid flow across the reservoir, and 3) noble gas geochemistry to further define the physical conditions, components and history of the geologic system. Results thus far include using engineering algorithms to interpolate electrical well-log properties (spontaneous potential, resistivity) across the field, yielding a highly accurate, high-resolution 3D model of rock properties. They also include using numerical finite-difference methods (Crank-Nicolson) to solve the equations describing the distribution of pressure across the field, yielding a 2D simulation model of fluid flow across the reservoir. Ongoing noble gas geochemistry work will include determination of the source, thermal maturity and the extent/style of fluid migration (connectivity, continuity and directionality). Future work will include developing an inverse engineering algorithm to model permeability, porosity and water saturation. This combination of new and efficient technological and analytical capabilities is geared to provide a better understanding of the field geology and hydrocarbon dynamics system, with applications to determine the presence of hydrocarbon pay zones (or…
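
    A minimal sketch of the Crank-Nicolson finite-difference scheme mentioned above, applied to 1-D pressure diffusion with fixed boundary pressures; the grid size, step count and boundary values are hypothetical, not the study's:

```python
import numpy as np

def crank_nicolson_step(p, alpha):
    """One Crank-Nicolson step for 1-D diffusion p_t = D p_xx,
    with alpha = D*dt/dx**2 and fixed (Dirichlet) boundary values."""
    n = len(p)
    A = np.zeros((n, n))
    B = np.zeros((n, n))
    A[0, 0] = A[-1, -1] = B[0, 0] = B[-1, -1] = 1.0  # hold boundaries fixed
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = -alpha / 2.0
        A[i, i] = 1.0 + alpha
        B[i, i - 1] = B[i, i + 1] = alpha / 2.0
        B[i, i] = 1.0 - alpha
    return np.linalg.solve(A, B @ p)

# Hypothetical 1-D pressure profile: one end held at 1, the other at 0
p = np.zeros(21)
p[0] = 1.0
for _ in range(500):
    p = crank_nicolson_step(p, alpha=0.5)
# p now approaches the linear steady-state profile between the two boundaries
```

    The same implicit/explicit averaging generalises to the 2-D pressure equation used in the study; Crank-Nicolson is unconditionally stable, so alpha is not limited by the explicit stability bound.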

  2. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
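
    The classical Beta Factor model that the paper's method extends can be sketched for a one-out-of-three system; the failure probability and beta value below are illustrative only:

```python
def beta_factor_unavailability(q_total, beta):
    """Classical Beta Factor model for a one-out-of-three redundant system:
    a fraction beta of a train's failure probability is common cause (fails
    all trains at once); the remainder fails trains independently."""
    q_ccf = beta * q_total
    q_ind = (1.0 - beta) * q_total
    return q_ccf + q_ind ** 3   # system fails via CCF or all three independently

# Illustrative numbers only
q_no_ccf = beta_factor_unavailability(1e-2, 0.0)     # pure independence
q_with_ccf = beta_factor_unavailability(1e-2, 0.1)
```

    Even a modest beta dominates the result, which is why the classical model tends to be pessimistic for highly redundant systems and motivated the refined model the abstract describes.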

  3. TAPPS Release 1: Plugin-Extensible Platform for Technical Analysis and Applied Statistics

    Directory of Open Access Journals (Sweden)

    Justin Sam Chew

    2016-01-01

    Full Text Available We present the first release of TAPPS (Technical Analysis and Applied Statistics System), a Python implementation of a thin software platform aimed at technical analyses and applied statistics. The core of TAPPS is a container for 2-dimensional data frame objects and a TAPPS command language. The TAPPS language is not meant to be a programming language for script and plugin development but for operational purposes; in this respect, it takes on the flavor of SQL rather than R, resulting in a shallower learning curve. All analytical functions are implemented as plugins. This results in a well-defined plugin system, which enables rapid development and incorporation of analysis functions. TAPPS Release 1 is released under the GNU General Public License 3 for academic and non-commercial use. The TAPPS code repository can be found at http://github.com/mauriceling/tapps.

  4. Common cause evaluations in applied risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system

  5. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    Science.gov (United States)

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  6. Applied behavior analysis as intervention for autism: definition, features and philosophical concepts

    Directory of Open Access Journals (Sweden)

    Síglia Pimentel Höher Camargo

    2013-11-01

    Full Text Available Autism spectrum disorder (ASD) is a lifelong pervasive developmental disorder with no known cause or cure. However, educational and behavioral interventions with a foundation in applied behavior analysis (ABA) have been shown to improve a variety of skill areas, such as the communication, social, academic, and adaptive behaviors of individuals with ASD. The goal of this work is to present the definition, features and philosophical concepts that underlie ABA and make this science an effective intervention method for people with autism.

  7. High-resolution frequency analysis as applied to the singing voice.

    Science.gov (United States)

    Morsomme, D; Remacle, M; Millet, B

    1993-01-01

    We have applied high-resolution frequency analysis to a population of singing voices. Two important elements have become apparent: (1) confirmation that the singing formant originates in the resonators, observed especially on a low fundamental, and acquired through technical skill and experience; (2) observation of the vibrato, which, isolated from the clinical study and regarded only in its graphic presentation, could have been interpreted as 'abnormal'. PMID:8253452

  8. Stability analysis of multi-infeed HVDC system applying VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe

    2010-01-01

    …on this model is analyzed under steady state and different kinds of grid-fault transient situations. Two main control methods applied to the VSC-HVDC link in this dual-infeed HVDC system are investigated, and a comparative analysis of them under transient situations is presented. A simulation model is built in PSCAD/EMTDC to verify the theoretical analysis. Simulation results indicate that the dual-infeed HVDC system can achieve higher stability than a single-infeed HVDC system, and that different control strategies on the VSC-HVDC link may influence AC voltage and active power oscillation differently during transient situations.

  9. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis… the plausibility of the model results. With respect to Pd resource management, improved formal collection of end-of-life (EOL) consumer products is identified as a key factor in increasing the recycling efficiency. In particular, the partial export of EOL vehicles represents a substantial loss of Pd from…

  10. The x-rays fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    This work is based on the use of X-ray fluorescence. The purpose of this non-destructive testing technique is to establish a routine method for checking the conformance of the industrial samples used. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms, together with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of over-determined systems of equations and the use of relaxation methods to facilitate convergence to the solutions. (author)
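
    As a simpler stand-in for the linear-programming and relaxation approach the abstract mentions, an over-determined calibration system can be solved in the least-squares sense; the sensitivity matrix and concentrations below are invented for illustration:

```python
import numpy as np

# Hypothetical XRF calibration: five measured line intensities (rows) as
# linear combinations of three element concentrations, plus a little noise
A = np.array([[1.0, 0.2, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.4, 1.0],
              [0.9, 0.3, 0.2],
              [0.2, 0.8, 0.3]])
true_c = np.array([0.5, 0.3, 0.2])
measured = A @ true_c + 0.001 * np.random.default_rng(4).normal(size=5)

# Least-squares solution of the over-determined system A c = measured
concentrations, residuals, rank, sv = np.linalg.lstsq(A, measured, rcond=None)
```

    A linear-programming formulation would instead minimise the sum of absolute residuals subject to non-negativity constraints on the concentrations; the least-squares version above only shows the over-determined structure of the problem.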

  11. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking technique: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. The skill development focused on includes: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  12. Sensitivity and uncertainty analysis applied to a repository in rock salt

    International Nuclear Information System (INIS)

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. UNCSAM provides a flexible interface to EMOSECN by substituting the sampled values into the various input files used by EMOSECN; the model calculations for this repository were performed with the EMOSECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations were carried out to provide EMOSECN with the probabilistic input data. For post-processing of the EMOSECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. the sampled values, and the output of the various EMOSECN runs were analyzed. (orig.)

  13. Sequential analysis applied to clinical trials in dentistry: a systematic review.

    Science.gov (United States)

    Bogowicz, P; Flores-Mir, C; Major, P W; Heo, G

    2008-01-01

    Clinical trials employ sequential analysis for the ethical and economic benefits it brings. In dentistry, as in other fields, resources are scarce and efforts are made to ensure that patients are treated ethically. The objective of this systematic review was to characterise the use of sequential analysis for clinical trials in dentistry. We searched various databases from 1900 through to January 2008. Articles were selected for review if they were clinical trials in the field of dentistry that had applied some form of sequential analysis. Selection was carried out independently by two of the authors. We included 18 trials from various specialties, which involved many different interventions. We conclude that sequential analysis seems to be underused in this field but that there are sufficient methodological resources in place for future applications. Evidence-Based Dentistry (2008) 9, 55-62. doi:10.1038/sj.ebd.6400587. PMID:18584009
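
    The best-known form of sequential analysis is Wald's sequential probability ratio test (SPRT), which stops a trial as soon as the evidence crosses a decision boundary; a minimal sketch for a Bernoulli outcome, with invented hypothesised rates and error levels:

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.2):
    """Wald's sequential probability ratio test for a Bernoulli rate,
    H0: p = p0 versus H1: p = p1.  Returns (decision, samples used)."""
    upper = math.log((1.0 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1.0 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, success in enumerate(observations, start=1):
        if success:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1.0 - p1) / (1.0 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)

# Illustrative data: mostly successes, so the test should stop early on H1
decision, n_used = sprt([1, 1, 0, 1, 1, 1, 1, 1, 1, 1], p0=0.3, p1=0.7)
```

    The early stopping is the source of the ethical and economic benefits the review describes: on average, far fewer patients are enrolled than with a fixed-sample design of the same error rates.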

  14. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    Science.gov (United States)

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    The present article discusses clustering analysis used in the near-infrared (NIR) spectroscopy analysis of Chinese traditional medicines, which provides a new method for their classification. The samples selected in the authors' research, whose absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' lab, were safrole, eucalypt oil, laurel oil, turpentine, clove oil and three samples of costmary oil from different suppliers. The spectra in the range of 0.70-1.7 microm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out with different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicated that cluster analysis of this group of samples is practicable. The calculated classification of the 8 samples accorded well with their characteristics; in particular, the three samples of costmary oil fell in the closest classification of the clustering analysis. PMID:18306778
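
    A correlation-based comparison of spectra, of the kind underlying such clustering, can be sketched as follows; the toy "spectra" are Gaussian absorption bands with noise, not real NIR data:

```python
import numpy as np

rng = np.random.default_rng(2)
wavelengths = np.linspace(0.7, 1.7, 200)   # micrometres, as in the paper's range

def toy_spectrum(center, noise=0.01):
    """Toy absorption spectrum: one Gaussian band plus noise (not real NIR data)."""
    return np.exp(-((wavelengths - center) / 0.1) ** 2) + noise * rng.normal(size=200)

# Three replicates sharing a band (like the three costmary oils) and two others
samples = [toy_spectrum(1.2), toy_spectrum(1.2), toy_spectrum(1.2),
           toy_spectrum(0.9), toy_spectrum(1.5)]

# Pairwise Pearson correlations: replicates correlate strongly, others do not
corr = np.corrcoef(np.vstack(samples))
```

    Hierarchical clustering on 1 - corr as a distance would then group the three replicates together, mirroring the behaviour reported for the costmary oil samples.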

  15. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments for specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but also initiated in-depth discussions about users' work, tools, technology, and requirements.

  16. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space domain, where the project approach is usually very conservative. In projects involving rockets, satellites and their facilities, such as ground support systems and simulators, among other operations critical for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on critical computer systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. The process was initially designed to be performed manually, in a gradual way. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  17. 地质统计学在固体矿山中的应用%Application of Geostatistics in Solid Mine

    Institute of Scientific and Technical Information of China (English)

    刘焕荣; 燕永锋; 杨海涛

    2013-01-01

    As a rising interdisciplinary subject, geostatistics, in recent years also known as spatial information statistics, has developed greatly over nearly 50 years of research and practice. Production practice at home and abroad shows that geostatistics has clear advantages in geoscience research and is used ever more widely in solid mining. This paper mainly introduces the application of geostatistics to the calculation of mineral resource reserves, the distributional characteristics of minerals, the classification of reserves, the optimization of the exploration grid, and mineral exploration.

  18. The assessment of spatial correlation between location of deposits and faults using geostatistical methods: case study, Yazd province

    Directory of Open Access Journals (Sweden)

    Mostafa Dehghani Ahmadabad

    2013-10-01

    Full Text Available Determining promising areas for ore deposits is one of the most important steps of prospecting at regional scales. There are many different methods for identifying these areas, including geochemical and geophysical methods, remote sensing and sophisticated statistical methods. Based on the theory of spatial relations between the dispersion pattern of ore deposits and metallogenic provinces, mineralization belts, faults and structural factors, new interpretative methods can be proposed in the preliminary exploration phase of potential areas. In this study, geostatistical methods were used, with the spatial locations of faults and known metallic deposits as the primary data source, to obtain their correlation (a case study of the metallic deposits of Yazd province). The research was performed on 807 major and minor faults and 76 metallic deposits, mainly of hydrothermal origin. The data were arranged in the ArcViewGIS software environment. The geostatistical analysis was performed by defining the regionalized variable (the distance between faults and deposits) in a Mathematica subroutine. Variography, in order to find the spatial structure, was performed on the regionalized variable using the Surpac software. It was also shown that the theory of spatial correlation holds for the defined variable. In this work, variography was used to find the direction and range of the influence of faults on deposits. The variograms indicated that the possibility of ore deposits existing in an area could depend on the direction of the faults. From the directional variogram and variogram map, the most favourable trend for further exploratory studies is shown to be along azimuth 130° with a range of 64 km. By revealing the spatial structure in different directions, the areas of mineralization related to the faults and the number of ore deposits associated with major faults have been marked.

  19. Evaluation of spatial variability of metal bioavailability in soils using geostatistics

    DEFF Research Database (Denmark)

    Owsianiak, Mikolaj; Hauschild, Michael Zwicky; Rosenbaum, Ralph K.

    2012-01-01

    …is performed using ArcGIS Geostatistical Analyst. Results show that BFs of copper span a range of 6 orders of magnitude and have significant spatial variability at local and continental scales. The model nugget variance is significantly higher than zero, suggesting the presence of spatial variability…

  20. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    NARCIS (Netherlands)

    S.J. O’Hanlon (Simon J.); H.C. Slater (Hannah C.); R.A. Cheke (Robert A.); B.A. Boatin; L.E. Coffeng (Luc); S.D.S. Pion (Sébastien); M. Boussinesq (Michel); H.G.M. Zouré (Honorat G.); W.A. Stolk (Wilma); M-G. Basáñez (María-Gloria)

    2016-01-01

    textabstractBackground: The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis

  1. Reliability analysis of protection systems in NPP applying fault-tree analysis method

    International Nuclear Information System (INIS)

    This paper demonstrates the applicability and limits of dependability analysis in nuclear power plants (NPPs), based on the reactor protection refurbishment project (RRP) at NPP Paks, and presents case studies from the reliability analysis for NPP Paks. It also investigates the solutions for the connection between the data acquisition and subsystem control units (TSs) and the voter units (VTs), analyzes the influence of voting at the VT computer level, and studies the effects of the testing procedures on the dependability parameters. (author)
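
    For a voted architecture such as the one described, the failure probability of a two-out-of-three voter with independent channels has a simple closed form; the channel failure probability below is illustrative, not taken from the paper:

```python
def two_out_of_three_failure(q):
    """Failure probability of a 2-out-of-3 voter with independent channels:
    at least two of the three channels (each failing with probability q)
    must fail for the voted output to be wrong."""
    return 3.0 * q ** 2 * (1.0 - q) + q ** 3

# Illustrative channel failure probability
q_sys = two_out_of_three_failure(1e-3)
```

    In a fault-tree model this is the top event of an OR over the three two-channel AND gates plus the three-channel AND gate; common-cause terms, which the paper's analysis also has to address, would be added on top of this independent-failure result.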

  2. Geostatistical investigation into the temporal evolution of spatial structure in a shallow water table

    Directory of Open Access Journals (Sweden)

    S. W. Lyon

    2006-01-01

    Full Text Available Shallow water tables near streams often lead to saturated, overland-flow-generating areas in catchments in humid climates. While these saturated areas are assumed to be principal biogeochemical hot-spots and important for issues such as non-point pollution sources, the spatial and temporal behavior of shallow water tables, and the associated saturated areas, is not completely understood. This study demonstrates how geostatistical methods can be used to characterize the spatial and temporal variation of the shallow water table in the near-stream region. Event-based and seasonal changes in the spatial structure of the shallow water table, which influences the spatial pattern of surface saturation and related runoff generation, can be identified and used in conjunction to characterize the hydrology of an area. This is accomplished through semivariogram analysis and indicator kriging to produce maps combining soft data (i.e., proxy information for the variable of interest) representing general shallow water table patterns with hard data (i.e., actual measurements) that represent variation in the spatial structure of the shallow water table per rainfall event. The area used was a hillslope in the Catskill Mountains region of New York State. The shallow water table was monitored for a 120 m×180 m near-stream region at 44 sampling locations at 15-min intervals. Outflow of the area was measured at the same time interval. These data were analyzed at a short time interval (15 min) and at a long time interval (months) to characterize the changes in the hydrologic behavior of the hillslope. Indicator semivariograms based on binary-transformed ground water table data (i.e., 1 if exceeding the time-variable median depth to water table and 0 if not) were created for both short and long time intervals. For the short time interval, the indicator semivariograms showed a high degree of spatial structure in the shallow water table for the spring, with increased range…
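
    The binary indicator transform and indicator semivariogram described above can be sketched on a synthetic 1-D transect; all values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical water-table depths along a 1-D transect (metres below surface)
x = np.arange(0.0, 180.0, 4.0)
depth = 0.5 + 0.3 * np.sin(x / 30.0) + 0.05 * rng.normal(size=len(x))

# Indicator transform: 1 where the water table is shallower than the median
indicator = (depth <= np.median(depth)).astype(float)

def indicator_semivariogram(x, ind, lags, tol=2.0):
    """Semivariogram of the binary indicator at the requested lag distances."""
    d = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (ind[:, None] - ind[None, :]) ** 2
    return np.array([sq[(d > h - tol) & (d <= h + tol)].mean() for h in lags])

lags = np.array([4.0, 20.0, 40.0, 80.0])
gamma_i = indicator_semivariogram(x, indicator, lags)
```

    For a median threshold the indicator semivariogram is bounded by 0.25 at the sill; its range is what the study tracks between events and seasons.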

  3. Criticality analysis of thermal reactors for two energy groups applying Monte Carlo and neutron Albedo method

    International Nuclear Information System (INIS)

    The Albedo method, applied to criticality calculations for nuclear reactors, is characterized by following the neutron currents, allowing detailed analyses of the physical phenomena of neutron interaction with the core-reflector set through the determination of the probabilities of reflection, absorption and transmission, and hence a detailed appreciation of the variation of the effective neutron multiplication factor, keff. In the present work, motivated by the excellent results presented in dissertations on thermal reactors and shielding, the methodology of the Albedo method is described for the criticality analysis of thermal reactors with two energy groups, admitting variable core coefficients for each re-entrant current. Using the Monte Carlo KENO IV code, the relation between the total fraction of neutrons absorbed in the reactor core and the fraction of neutrons that never entered the reflector but were absorbed in the core was analyzed. As comparison parameters for the results obtained by the Albedo method, the one-dimensional deterministic code ANISN (ANIsotropic SN transport code) and the Diffusion method were used. The keff results determined by the Albedo method for the type of reactor analyzed showed excellent agreement: the relative errors in keff were smaller than 0.78% between the Albedo method and ANISN, and smaller than 0.35% relative to the Diffusion method, demonstrating the effectiveness of the Albedo method applied to criticality analysis. The ease of application, simplicity and clarity of the Albedo method make it a valuable instrument for neutronic calculations applied to nonmultiplying and multiplying media. (author)
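
    The reflection, absorption and transmission probabilities that the Albedo method follows analytically can be illustrated with a toy 1-D Monte Carlo slab model; this is not the paper's two-group formulation, and the slab thickness, cross-section and absorption probability are invented:

```python
import math
import random

def slab_probabilities(thickness, sigma_t, p_absorb, n=100_000, seed=42):
    """Toy 1-D Monte Carlo slab: estimates the reflection, absorption and
    transmission probabilities for neutrons entering at one face.
    thickness is in mean free paths when sigma_t = 1 (illustrative only)."""
    rng = random.Random(seed)
    reflect = absorb = transmit = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                      # enter at the surface, moving in
        while True:
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)  # free flight
            if x < 0.0:
                reflect += 1                  # escaped back out of the slab
                break
            if x > thickness:
                transmit += 1                 # passed through the slab
                break
            if rng.random() < p_absorb:       # collision: absorbed here
                absorb += 1
                break
            mu = 1.0 if rng.random() < 0.5 else -1.0  # isotropic 1-D scatter
    return reflect / n, absorb / n, transmit / n

r, a, t = slab_probabilities(thickness=2.0, sigma_t=1.0, p_absorb=0.3)
```

    The Albedo method bookkeeps exactly these three outcomes per current, per energy group, at the core-reflector interface, instead of sampling them stochastically.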

  4. Clastic reservoir porosity mapping using seismic data and geostatistics: Callovian unit, West Lulu field

    Energy Technology Data Exchange (ETDEWEB)

    Vejbaek, O.V.

    1998-12-31

    The aim of this report was to demonstrate possible uses of seismic impedances as soft data for reservoir characterisation. To illustrate the impact of the results, an attempt to calculate oil in place was also carried out. It must, however, be emphasized that these results only apply to the Callovian portion of the Middle Jurassic West Lulu reservoir, and thus do not provide estimates for the entire Middle Jurassic West Lulu accumulation. It is important to realise that stochastic simulation does not provide exact predictions in areas outside the control of hard data. It does, however, offer the possibility of exploiting every known or surmised property of the desired (target) data population. These properties include, for example, mean, spread, spatial continuity (measured by variograms), horizontal and vertical trends, and correlation to supporting soft data (e.g. seismic impedances). Nor are predictions exact even though the term `narrowed solution space` is applied; this term merely implies that the error in prediction at any point may be less than the full range of the parameter. The quality of the predictions depends mainly on meticulous handling of data: avoiding errors like bad stratigraphic alignment of the data, obtaining good variograms, avoiding errors in the construction of the target populations, and including every pertinent attribute of the data. The result thus also depends on a full geological understanding of the problem (and the discipline of the modeller). The most important quality is the ability to provide a great number of equi-probable realisations that equally well satisfy any known or surmised property of the target data population. The goal of this study was to investigate the use of inversion-derived seismic impedances for geostatistical reservoir characterisation in a complex clastic reservoir, exemplified by the West Lulu reservoir of the Harald Field. 
The well database is rather modest, so substantial support has been gained from the
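    The report's emphasis on obtaining good variograms can be illustrated with a minimal empirical semivariogram computed on a synthetic, regularly sampled 1-D log (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D porosity "well log": correlated noise via a moving average
z = np.convolve(rng.normal(0.20, 0.03, 300), np.ones(10) / 10, mode="valid")

def empirical_variogram(z, lags):
    """gamma(h) = 0.5 * mean[(z(x) - z(x+h))^2] for each lag h (in samples)."""
    gam = []
    for h in lags:
        d = z[:-h] - z[h:]
        gam.append(0.5 * np.mean(d ** 2))
    return np.array(gam)

gamma = empirical_variogram(z, range(1, 40))
# gamma rises with lag and levels off near the sill (roughly the sample variance)
```

    Fitting a model (spherical, exponential, ...) to such an empirical curve is the step whose quality the author warns about.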

  5. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    Directory of Open Access Journals (Sweden)

    J.D. Clayton

    2016-08-01

    Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation, in terms of inverse velocity, of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long-rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. Test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory) demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., hypervelocity and thick-target) experiments, the current analysis demonstrates a dominant dependence of penetration depth on target mass density alone. Such comparisons suggest transitions from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
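    The hydrodynamic limit that the series reduces to is the classical density-ratio result. A minimal sketch (Python; the densities are textbook values, and the coefficient c1 is a hypothetical stand-in for the paper's single fitting parameter, whose value is not given here):

```python
import math

rho_p = 19300.0  # projectile (tungsten) density [kg/m^3]
rho_t = 3200.0   # target (e.g. SiC) density [kg/m^3]
c1 = 900.0       # fitted series coefficient [m/s] (hypothetical)

def depth_ratio(v):
    """First-order (linear) truncation of an inverse-velocity series:
    P/L ~ sqrt(rho_p/rho_t) * (1 - c1/v)."""
    return math.sqrt(rho_p / rho_t) * (1.0 - c1 / v)

hydro = math.sqrt(rho_p / rho_t)  # high-velocity (hydrodynamic) limit
# depth_ratio(v) approaches hydro as v grows large
```

    The exact form of the paper's series may differ; the sketch only shows how a single 1/v correction recovers the hydrodynamic limit at high velocity.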

  6. Neutron activation analysis as applied to instrumental analysis of trace elements from seawater

    International Nuclear Information System (INIS)

    Particulate matter collected from the coastal area delimited by the mouth of the river Volturno and the Sabaudia lake has been analyzed by instrumental neutron activation analysis for its content of twenty-two trace elements. The results for surface water and bottom water are reported separately, highlighting the effect of sampling depth on the concentration of many elements. The necessity of accurately 'cleaning' the filters before use is stressed

  7. Further development and application of GIS technology and geostatistics to assist in the exploration for uranium in western North America

    International Nuclear Information System (INIS)

    Microcomputer technology has been applied in a multidisciplinary approach to uranium exploration in a study area in the Northern Rocky Mountain Physiographic Province. Several techniques for multi-data-set manipulation were evaluated, and the problem of varying intrinsic data resolutions was considered. Geostatistical methods, including two-dimensional variogramming applied to geochemical and geophysical data sets, yielded interesting results which were used to outline zones of influence and to predict the occurrence or otherwise of anomalies. Image processing methods were applied to TM, geophysical, geochemical and topographic data sets in the context of a simple image-based GIS system. Several previously unreported geological features related to uranium mineralization were found. The study data set has been integrated on a local scale and in relation to regional magnetic and gravity images. GIS-style studies suggest that a combination of deeply penetrating zones of weakness, deep-seated igneous activity, and hydrothermal activity produced most of the known uranium occurrences in the test area. Based on these ideas, several target areas were delineated for further investigation

  8. Integrated software for imaging data analysis applied to edge plasma physics and operational safety

    Energy Technology Data Exchange (ETDEWEB)

    Martin, V.; Moncada, V. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Dunand, G. [Sophia Conseil Company, F-06560 Sophia Antipolis (France); Corre, Y.; Delchambre, E. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Travere, J.M., E-mail: jean-marcel.travere@cea.fr [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France)

    2011-06-15

    Fusion tokamaks are complex devices requiring many diagnostics for real-time control of the plasma and off-line physical analysis. In current tokamaks, imaging diagnostics have become increasingly used for these two purposes. Such systems produce a lot of data, encouraging physicists to use shared tools and codes for data access and analysis. While general-purpose software programs for data display and analysis are widespread, a need exists to develop similar applications for quantitative imaging data analysis applied to plasma physics. In this paper, we introduce a new integrated software program, named WOLFF, dedicated to this task. The main contribution of this software is to gather under the same framework different functionalities for (1) data access and display, (2) signal, image, and video processing, and (3) quantitative analysis based on physical models. After an overview of existing solutions for data processing in the field of plasma data, we present the WOLFF architecture and its currently implemented features. The capabilities of the software are then demonstrated through three applications in the field of physical analysis (heat and particle flux calculations) and tokamak operational safety.

  9. Geostatistical independent simulation of spatially correlated soil variables

    Science.gov (United States)

    Boluwade, Alaba; Madramootoo, Chandra A.

    2015-12-01

    The selection of best management practices to reduce soil and water pollution often requires estimation of soil properties. It is important to find an efficient and robust technique to simulate spatially correlated soil parameters. Co-kriging and co-simulation are techniques that can be used, but these methods are limited in terms of computer simulation due to the problem of solving large co-kriging systems and difficulties in fitting a valid model of coregionalization; the order of complexity increases as the number of covariables increases. This paper presents a technique for the conditional simulation of a non-Gaussian vector random field on a point-support scale, based on Independent Component Analysis (ICA). The basic principle underlying ICA is the determination of a linear representation of non-Gaussian data such that the components are considered statistically independent. With such a representation, it is easier and more computationally efficient to develop direct variograms for the components. The process is presented in two stages. The first stage involves the ICA decomposition. The second stage involves sequential Gaussian simulation of the components generated in the first stage. The technique was applied to spatially correlated extractable cations, such as magnesium (Mg) and iron (Fe), in a Canadian watershed. The approach has strong application in the stochastic quantification of uncertainties of soil attributes in soil remediation and soil rehabilitation.
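    The two-stage workflow (decompose correlated variables into independent components, then simulate each component separately and back-transform) can be sketched as follows. This is a deliberate simplification: PCA whitening stands in for the ICA decomposition (for Gaussian data the two coincide up to rotation), and plain Gaussian draws stand in for sequential Gaussian simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical correlated soil variables (e.g. Mg and Fe) at 500 sample points
n = 500
latent = rng.normal(size=(n, 2))
A = np.array([[1.0, 0.6], [0.6, 1.0]])     # mixing matrix -> cross-correlation
data = latent @ A

# Stage 1: decorrelate. PCA whitening stands in for the ICA step here.
cov = np.cov(data, rowvar=False)
vals, vecs = np.linalg.eigh(cov)
W = vecs @ np.diag(vals ** -0.5) @ vecs.T  # symmetric whitening matrix
comps = data @ W                           # decorrelated components

# Stage 2: simulate each component independently (plain Gaussian draws here,
# where the real workflow would use sequential Gaussian simulation with the
# component variograms), then back-transform to restore cross-correlation.
sim = rng.normal(size=(n, 2)) @ np.linalg.inv(W)

c = np.corrcoef(comps, rowvar=False)[0, 1]  # ~0 after decorrelation
```

    The point of the decomposition is exactly what the abstract states: once the components are (near-)independent, each can be modelled with its own direct variogram, avoiding a full coregionalization model.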

  11. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    Science.gov (United States)

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically. PMID:27295000

  12. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

    After the financial crisis that began in 2008, the international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden in the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify risks of propagation and market behavior, without the need for expensive and demanding technical architectures.
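    The kind of relational analysis proposed here can be sketched with plain Python on a toy trade network (the participants and exposures are invented):

```python
from collections import defaultdict, deque

# Hypothetical bilateral trade relations between market participants
trades = [("BankA", "BankB"), ("BankB", "BankC"), ("BankB", "FundX"),
          ("FundX", "BankD"), ("BankC", "BankD"), ("BankA", "FundY")]

graph = defaultdict(set)
for a, b in trades:
    graph[a].add(b)
    graph[b].add(a)

# Degree centrality: highly connected nodes are candidate propagation hubs
degree = {node: len(nbrs) for node, nbrs in graph.items()}
hub = max(degree, key=degree.get)

def reachable(start):
    """Counterparties that a shock at `start` could reach (BFS traversal)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nbr in graph[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen - {start}
```

    Real supervisory data would involve millions of edges and weighted exposures, but the same centrality and reachability questions scale up directly to distributed graph engines.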

  13. The Renormalization-Group Method Applied to Asymptotic Analysis of Vector Fields

    CERN Document Server

    Kunihiro, T

    1996-01-01

    The renormalization group method of Goldenfeld, Oono and their collaborators is applied to asymptotic analysis of vector fields. The method is formulated on the basis of the theory of envelopes, as was done for scalar fields. This formulation completes the discussion of the previous work for scalar equations. It is shown in a generic way that the method applied to equations with a bifurcation leads to the Landau-Stuart and the (time-dependent) Ginzburg-Landau equations. It is confirmed that this method is as powerful a theory for the reduction of dynamics as the reductive perturbation method. Some examples for ordinary differential equations, such as the forced Duffing, the Lotka-Volterra and the Lorenz equations, are worked out with this method: the time evolution of the solution of the Lotka-Volterra equation is given explicitly, while the center manifolds of the Lorenz equation are constructed in a simple way in the RG method.

  14. Analysis of possibility of applying the PVDF foil in industrial vibration sensors

    Science.gov (United States)

    Wróbel, A.

    2015-11-01

    There are many machines that use piezoelectric effects. Systems with smart materials are often applied because of their high application potential; for example, transducers can be used to obtain the required characteristics of a designed system. Every engineer and designer knows how important a proper mathematical model and method of analysis are. It is equally important to consider all parameters of the analyzed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all the system's components, and omitting the influence of any one of them results in inaccuracy in the analysis of the system. The article covers the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. A method of analysis of vibrating sensor systems is presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on those characteristics. The main scientific point of the project is to analyze and demonstrate the possibility of applying a new construction with PVDF foil, or any other material belonging to the group of smart materials, in industrial sensors. Currently, practically all manufacturers of vibration level sensors use piezoelectric ceramic plates to generate and detect the vibration of the fork.

  15. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    Science.gov (United States)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.

  16. Applied behavior analysis treatment of autism: the state of the art.

    Science.gov (United States)

    Foxx, Richard M

    2008-10-01

    The treatment of individuals with autism is associated with fad, controversial, unsupported, disproven, and unvalidated treatments. Eclecticism is not the best approach for treating and educating children and adolescents who have autism. Applied behavior analysis (ABA) uses methods derived from scientifically established principles of behavior and incorporates all of the factors identified by the US National Research Council as characteristic of effective interventions in educational and treatment programs for children who have autism. ABA is a primary method of treating aberrant behavior in individuals who have autism. The only interventions that have been shown to produce comprehensive, lasting results in autism have been based on the principles of ABA. PMID:18775372

  17. Advantages and Drawbacks of Applying Periodic Time-Variant Modal Analysis to Spur Gear Dynamics

    DEFF Research Database (Denmark)

    Pedersen, Rune; Santos, Ilmar; Hede, Ivan Arthur

    2010-01-01

    A simplified torsional model with a reduced number of degrees of freedom is used in order to investigate the potential of the technique. A time-dependent gear mesh stiffness function is introduced and expanded in a Fourier series. The necessary number of Fourier terms is determined in order to ensure sufficient accuracy of the results. The method of time-variant modal analysis is applied, and the changes in the fundamental and the parametric resonance frequencies as a function of the rotational speed of the gears are found. By obtaining the stationary and parametric parts of the time
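    The expansion of a time-dependent mesh stiffness in a truncated Fourier series can be sketched as follows (the stiffness values and contact ratio are hypothetical, and the rectangular-wave idealization is a common textbook form, not necessarily the authors' exact function):

```python
import numpy as np

# Hypothetical mesh-stiffness parameters
k_mean = 4.0e8   # mean mesh stiffness [N/m]
dk = 0.8e8       # amplitude of the stiffness variation [N/m]
eps = 1.7        # contact ratio; eps - 1 is the double-contact fraction
n_terms = 8      # number of Fourier terms retained

def mesh_stiffness(tau, n_terms):
    """Truncated Fourier series of a rectangular mesh-stiffness wave;
    tau is the mesh-cycle phase in [0, 1)."""
    duty = eps - 1.0
    k = k_mean + 0.0 * np.asarray(tau)
    for n in range(1, n_terms + 1):
        an = 2 * dk / (np.pi * n) * np.sin(np.pi * n * duty)
        k = k + an * np.cos(2 * np.pi * n * np.asarray(tau))
    return k

tau = np.linspace(0, 1, 200, endpoint=False)
k = mesh_stiffness(tau, n_terms)
# More terms give sharper transitions between single- and double-tooth contact
```

    Checking convergence of such a truncation against the resulting resonance frequencies is the "necessary number of Fourier terms" question the abstract refers to.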

  18. Improving detection of differentially expressed gene sets by applying cluster enrichment analysis to Gene Ontology

    Directory of Open Access Journals (Sweden)

    Gu JianLei

    2009-08-01

    Background: Gene set analysis based on Gene Ontology (GO) can be a promising method for the analysis of differential expression patterns. However, current studies that focus on individual GO terms have limited analytical power, because the complex structure of GO introduces strong dependencies among the terms, and some genes that are annotated to a GO term cannot be found by statistically significant enrichment. Results: We proposed a method for enriching clustered GO terms based on semantic similarity, namely cluster enrichment analysis based on GO (CeaGO), to extend the individual term analysis method. Using an Affymetrix HGU95aV2 chip dataset with simulated gene sets, we illustrated that CeaGO was sensitive enough to detect moderate expression changes. When compared to parent-based individual term analysis methods, the results showed that CeaGO may provide more accurate differentiation of gene expression results. When used with two acute leukemia (ALL and ALL/AML) microarray expression datasets, CeaGO correctly identified specifically enriched GO groups that were overlooked by other individual test methods. Conclusion: By applying CeaGO to both simulated and real microarray data, we showed that this approach could enhance the interpretation of microarray experiments. CeaGO is currently available at http://chgc.sh.cn/en/software/CeaGO/.
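    Enrichment of a gene set (here, a merged cluster of GO terms) is conventionally scored with a hypergeometric tail probability. A minimal sketch with invented counts, and without CeaGO's semantic-similarity clustering step:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the probability of drawing
    at least k annotated genes when n differentially expressed genes are
    sampled from N total genes, of which K carry the (clustered) GO label."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical numbers: 10000 genes on the chip, a merged GO cluster of 150
# genes, 200 differentially expressed genes, 12 of which fall in the cluster
p = hypergeom_enrichment_p(10000, 150, 200, 12)
```

    Clustering related GO terms before testing, as CeaGO does, increases K for the merged set and can recover enrichment that each individual term is too small to show.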

  19. Escalation research: providing new frontiers for applying behavior analysis to organizational behavior.

    Science.gov (United States)

    Goltz, S M

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to "throw good money after bad," can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis.

  20. Graphical Analysis of PET Data Applied to Reversible and Irreversible Tracers

    Energy Technology Data Exchange (ETDEWEB)

    Logan, Jean

    1999-11-18

    Graphical analysis refers to the transformation of multiple time measurements of plasma and tissue uptake data into a linear plot, the slope of which is related to the number of available tracer binding sites. This type of analysis allows easy comparisons among experiments. No particular model structure is assumed; however, it is assumed that the tracer is given by bolus injection and that both tissue uptake and the plasma concentration of unchanged tracer are monitored following tracer injection. The requirement of plasma measurements can be eliminated in some cases when a reference region is available. There are two categories of graphical methods, corresponding to two general types of ligands: those that bind reversibly during the scanning procedure and those that are irreversibly bound or trapped during the scanning procedure.
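    For a reversible tracer, the Logan transformation turns uptake data into a plot whose late-time slope estimates the total distribution volume VT = K1/k2. A minimal sketch on synthetic one-tissue-compartment data (the rate constants and input function are invented):

```python
import numpy as np

# Hypothetical one-tissue-compartment data for a reversible tracer
K1, k2 = 0.1, 0.05              # rate constants [1/min]; VT = K1/k2 = 2
t = np.linspace(0, 90, 901)     # minutes
dt = t[1] - t[0]
Cp = np.exp(-0.1 * t)           # simplified plasma input (bolus decay)

# Integrate dCt/dt = K1*Cp - k2*Ct with simple Euler steps
Ct = np.zeros_like(t)
for i in range(1, t.size):
    Ct[i] = Ct[i - 1] + dt * (K1 * Cp[i - 1] - k2 * Ct[i - 1])

# Logan transformation: y = int(Ct)/Ct versus x = int(Cp)/Ct;
# the late-time slope estimates VT
int_Ct = np.cumsum(Ct) * dt
int_Cp = np.cumsum(Cp) * dt
late = t > 30
x, y = int_Cp[late] / Ct[late], int_Ct[late] / Ct[late]
slope = np.polyfit(x, y, 1)[0]  # ~ K1/k2
```

    For an irreversible tracer the analogous Patlak plot is used, where the slope instead reflects the trapping rate.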

  1. Analysis of the concept of nursing educational technology applied to the patient

    Directory of Open Access Journals (Sweden)

    Aline Cruz Esmeraldo Áfio

    2014-04-01

    This study aims to analyze the concept of educational technology, as produced by nursing and applied to the patient. Rodgers' Evolutionary Method of Concept Analysis was used, identifying antecedents, attributes, and consequences. Thirteen articles were selected for analysis, in which the following antecedents were identified: knowledge deficiency, shortage of nursing professionals' time, the need to optimize nursing work, and the need to achieve the goals of the patients. Attributes: tool, strategy, innovative approach, pedagogical approach, mediator of knowledge, creative way to encourage the acquisition of skills, and health production instrument. Consequences: improved quality of life, encouragement of healthy behavior, empowerment, reflection, and bonding. The importance of educational technologies for nursing care is emphasized, to boost health education activities.

  2. Functional analysis and applied optimization in Banach spaces applications to non-convex variational models

    CERN Document Server

    Botelho, Fabio

    2014-01-01

    This book introduces the basic concepts of real and functional analysis. It presents the fundamentals of the calculus of variations, convex analysis, duality, and optimization that are necessary to develop applications to physics and engineering problems. The book includes introductory and advanced concepts in measure and integration, as well as an introduction to Sobolev spaces. The problems presented are nonlinear, with non-convex variational formulation. Notably, the primal global minima may not be attained in some situations, in which cases the solution of the dual problem corresponds to an appropriate weak cluster point of minimizing sequences for the primal one. Indeed, the dual approach more readily facilitates numerical computations for some of the selected models. While intended primarily for applied mathematicians, the text will also be of interest to engineers, physicists, and other researchers in related fields.

  3. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
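    The conventional areal parameters referred to above can be computed directly from a measured height map. A minimal sketch on synthetic surfaces (ISO 25178-style Sa and Sq; the height data are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical confocal height maps (micrometres) of two worn bone surfaces
smooth = rng.normal(0.0, 0.2, (64, 64))
rough = rng.normal(0.0, 1.0, (64, 64))

def sa(height):
    """Areal average roughness Sa: mean absolute deviation from the mean plane."""
    h = height - height.mean()
    return np.abs(h).mean()

def sq(height):
    """Root-mean-square areal roughness Sq."""
    h = height - height.mean()
    return np.sqrt((h ** 2).mean())

# A more heavily worn (rougher) surface yields larger Sa and Sq
```

    The multi-scale characterizations used in the study go further, computing such statistics as a function of observation scale, which is what allows wear patterns to be discriminated statistically.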

  4. Geostatistical characterization of the Callovo-Oxfordian clay variability: from conventional and high resolution log data

    International Nuclear Information System (INIS)

    Andra (National Radioactive Waste Management Agency) has conducted studies in its Meuse/Haute-Marne Underground Research Laboratory located at a depth of about 490 m in a 155-million-year-old argillaceous rock: the Callovo-Oxfordian argillite. The purpose of the present work is to obtain as much information as possible from high-resolution log data and to optimize their analysis in order to specify and characterize space-time variations of the argillites from the Meuse/Haute-Marne site, and subsequently to predict the evolution of argillite properties in a 250 km2 zone around the underground laboratory (the transposition zone). The aim is to outline a methodology to transform depth intervals into geological time intervals and thus to quantify precisely the variation of the sedimentation rate and to estimate durations, for example of bio-stratigraphical units or of hiatuses. The latter point is particularly important because a continuous time recording is often assumed in geological modelling. The spatial variations can be studied on various scales. First, well-to-well correlations are established between seven wells at different scales. Relative variations of the thickness are observed locally. Second, FMI (Full-bore Formation Micro-Imager, Schlumberger) data are studied in detail to extract as much information as possible. For example, the analysis of FMI images reveals a clear carbonate-clay inter-bedding which displays cycles. Third, geostatistical tools are used to study these cycles. The variographic analysis of conventional log data shows one-metre cycles. With FMI data, smaller periods can be detected. Variogram modelling and factorial kriging analysis suggest that three spatial periods exist. They vary vertically and laterally in the boreholes but cycle ratios are stable and similar to orbital-cycle ratios (Milankovitch cycles). The three periods correspond to eccentricity, obliquity and precession. 
Since the duration of these orbital cycles is known, depth intervals can

  5. Progress and prospect of multiple-point geostatistics

    Institute of Scientific and Technical Information of China (English)

    Yin Yanshu; Zhang Changmin; Li Jiuyong; Shi Shuyuan

    2011-01-01

    This paper summarizes the progress of multiple-point geostatistics. After a brief review of its origin, three main multiple-point geostatistical methods are introduced and their theory analyzed, and the state of development is then summarized in three respects. First, in reservoir modeling applications, the modeled environments have extended from fluvial to fan facies; the modeled content from reservoir architecture to petrophysical property distributions; the modeled scale from large geological bodies to micro pores and throats; and the scope from geological modeling to geostatistical inversion. Second, for modeling that integrates multi-disciplinary information, three methods of incorporating seismic attributes are given. Third, in terms of algorithms, the PRTT real-time processing method has been proposed to improve multiple-point geostatistical modeling, and a new multiple-point statistical growth algorithm (Growthsim) has been developed. Prospects for the future development of multiple-point geostatistics are discussed, pointing out that training images, data integration, and the coupling of modeling methods require further study.

  6. Research on the Spatial Interpolation of Concentrations of Carbon Dioxide Based on Geostatistics

    Institute of Scientific and Technical Information of China (English)

    Chen Jinfu

    2012-01-01

    Kriging is one of the main tools of geostatistics. In statistical terms, it starts from the relations and variability of variables and provides unbiased, optimal estimation of regionalized variables within a limited area; from the interpolation standpoint, it is a method of linear, optimal, unbiased interpolation of spatially distributed data. The applicable condition of kriging is that the regionalized variables exhibit spatial correlation. Taking the carbon dioxide concentration within Lin'an City as a regionalized variable, ordinary kriging was applied with different theoretical semivariogram models. Comparative analysis shows that a geostatistical interpolation approach which selects a suitable theoretical semivariogram model, guided by the semivariogram cloud and the principle of minimum test variance, can satisfactorily simulate the continuous spatial distribution pattern of regionalized variables and achieves good results.
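    The ordinary kriging procedure described above can be sketched in a few lines (the sample locations, concentrations, and variogram parameters are invented):

```python
import numpy as np

# Hypothetical CO2 samples: coordinates (km) and concentrations (ppm)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 1.5]])
vals = np.array([390.0, 394.0, 398.0, 401.0, 405.0])

def gamma(h, sill=40.0, rng_=2.0):
    """Exponential semivariogram model (hypothetical sill and range)."""
    return sill * (1.0 - np.exp(-3.0 * h / rng_))

def ordinary_krige(x0):
    """Ordinary kriging estimate at location x0 (weights sum to 1 via a
    Lagrange multiplier in the augmented system)."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(pts - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return w @ vals

est = ordinary_krige(np.array([0.5, 0.5]))
```

    With a nugget-free variogram, the estimator honors the data exactly at sample locations, which is one of kriging's defining properties as an exact interpolator.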

  7. Increasing the predictive power of geostatistical reservoir models by integration of geological constraints from stratigraphic forward modeling

    NARCIS (Netherlands)

    Sacchi, Q.; Borello, E.S.; Weltje, G.J.; Dalman, R.

    2016-01-01

    Current static reservoir models are created by quantitative integration of interpreted well and seismic data through geostatistical tools. In these models, equiprobable realizations of structural settings and property distributions can be generated by stochastic simulation techniques. The integratio

  8. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team involving PAR representatives and Tandem consultants jointly was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy applying simulation tools, sensitivity analysis tools and contingency planning techniques. Final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)
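    The stochastic valuation approach can be sketched as a plain Monte Carlo simulation over uncertain inputs (all economic figures below are invented illustrations, not Petro Andina's data):

```python
import random

random.seed(42)

# Hypothetical strategy economics (illustrative distributions only)
def npv_one_trial():
    reserves = random.lognormvariate(3.0, 0.4)   # recoverable volume, MMbbl
    price = random.gauss(70.0, 15.0)             # realized price, $/bbl
    capex = random.uniform(300.0, 600.0)         # development cost, $MM
    success = random.random() < 0.75             # chance of development
    return reserves * max(price, 0.0) - capex if success else -capex * 0.1

trials = [npv_one_trial() for _ in range(20000)]
trials.sort()
mean_npv = sum(trials) / len(trials)
p10, p90 = trials[2000], trials[18000]  # downside / upside percentiles
# Strategies are then compared on mean value and on the P10-P90 spread
```

    Sensitivity analysis in this setting amounts to rerunning the simulation while varying one input distribution at a time and observing the shift in the percentiles.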

  9. Performance Analysis of Neuro Genetic Algorithm Applied on Detecting Proportion of Components in Manhole Gas Mixture

    Directory of Open Access Journals (Sweden)

    Varun Kumar Ojha

    2012-08-01

    Full Text Available The article presents a performance analysis of a real-valued neuro genetic algorithm applied for the detection of the proportions of the gases found in a manhole gas mixture. A neural network (NN) trained using a genetic algorithm (GA) leads to the concept of a neuro genetic algorithm, which is used for implementing an intelligent sensory system for the detection of component gases present in a manhole gas mixture. Usually a manhole gas mixture contains several toxic gases like Hydrogen Sulfide, Ammonia, Methane, Carbon Dioxide, Nitrogen Oxide, and Carbon Monoxide. A semiconductor-based gas sensor array used for sensing manhole gas components is an integral part of the proposed intelligent system. It consists of many sensor elements, where each sensor element is responsible for sensing a particular gas component. Multiple sensors of different gases used for detecting a gas mixture of multiple gases result in cross-sensitivity. The cross-sensitivity is a major issue and the problem is viewed as a pattern recognition problem. The objective of this article is to present a performance analysis of the real-valued neuro genetic algorithm which is applied for multiple gas detection.

  10. Performance Analysis of Neuro Genetic Algorithm Applied on Detecting Proportion of Components in Manhole Gas Mixture

    Directory of Open Access Journals (Sweden)

    Varun Kumar Ojha

    2012-07-01

    Full Text Available The article presents a performance analysis of a real-valued neuro genetic algorithm applied for the detection of the proportions of the gases found in a manhole gas mixture. A neural network (NN) trained using a genetic algorithm (GA) leads to the concept of a neuro genetic algorithm, which is used for implementing an intelligent sensory system for the detection of component gases present in a manhole gas mixture. Usually a manhole gas mixture contains several toxic gases like Hydrogen Sulfide, Ammonia, Methane, Carbon Dioxide, Nitrogen Oxide, and Carbon Monoxide. A semiconductor-based gas sensor array used for sensing manhole gas components is an integral part of the proposed intelligent system. It consists of many sensor elements, where each sensor element is responsible for sensing a particular gas component. Multiple sensors of different gases used for detecting a gas mixture of multiple gases result in cross-sensitivity. The cross-sensitivity is a major issue and the problem is viewed as a pattern recognition problem. The objective of this article is to present a performance analysis of the real-valued neuro genetic algorithm which is applied for multiple gas detection.

  11. Lévy scaling: The diffusion entropy analysis applied to DNA sequences

    Science.gov (United States)

    Scafetta, Nicola; Latora, Vito; Grigolini, Paolo

    2002-09-01

    We address the problem of the statistical analysis of a time series generated by complex dynamics with the diffusion entropy analysis (DEA) [N. Scafetta, P. Hamilton, and P. Grigolini, Fractals 9, 193 (2001)]. This method is based on the evaluation of the Shannon entropy of the diffusion process generated by the time series imagined as a physical source of fluctuations, rather than on the measurement of the variance of this diffusion process, as done with the traditional methods. We compare the DEA to the traditional methods of scaling detection and prove that the DEA is the only method that always yields the correct scaling value, if the scaling condition applies. Furthermore, DEA detects the real scaling of a time series without requiring any form of detrending. We show that the joint use of DEA and the variance method allows one to assess whether a time series is characterized by Lévy or Gauss statistics. We apply the DEA to the study of DNA sequences and prove that their large-time scales are characterized by Lévy statistics, regardless of whether they are coding or noncoding sequences. We show that the DEA is a reliable technique and, at the same time, we use it to confirm the validity of the dynamic approach to the DNA sequences, proposed in earlier work.
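As a rough illustration of the diffusion entropy idea summarized above, the following sketch estimates the scaling exponent as the slope of the Shannon entropy of the histogram-estimated diffusion pdf against the logarithm of the window length. The window lengths, bin count and sample size are arbitrary choices for the illustration, not values from the paper.

```python
import numpy as np

def diffusion_entropy(xi, window_lengths, bins=60):
    """Shannon entropy S(t) of the diffusion process built from increments xi.

    For each window length t, overlapping sums x_k(t) = xi[k] + ... + xi[k+t-1]
    play the role of diffusion trajectories; S(t) is the entropy of their
    histogram-estimated pdf. For a scaling pdf, S(t) = A + delta * ln(t).
    """
    entropies = []
    for t in window_lengths:
        x = np.convolve(xi, np.ones(t), mode="valid")  # overlapping window sums
        p, edges = np.histogram(x, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        entropies.append(-np.sum(p * np.log(p) * dx))
    return np.array(entropies)

def scaling_exponent(xi, window_lengths):
    """Least-squares slope of S(t) against ln(t), i.e. the exponent delta."""
    s = diffusion_entropy(xi, window_lengths)
    slope, _ = np.polyfit(np.log(window_lengths), s, 1)
    return slope

rng = np.random.default_rng(0)
noise = rng.standard_normal(50_000)            # uncorrelated Gaussian increments
delta = scaling_exponent(noise, [4, 8, 16, 32, 64])
```

For uncorrelated Gaussian increments the fitted slope should come out close to the ordinary-diffusion value delta = 0.5; Lévy statistics would show up as a different exponent.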

  12. Weighted correlation network analysis (WGCNA) applied to the tomato fruit metabolome.

    Directory of Open Access Journals (Sweden)

    Matthew V DiLeo

    Full Text Available BACKGROUND: Advances in "omics" technologies have revolutionized the collection of biological data. A matching revolution in our understanding of biological systems, however, will only be realized when similar advances are made in informatic analysis of the resulting "big data." Here, we compare the capabilities of three conventional and novel statistical approaches to summarize and decipher the tomato metabolome. METHODOLOGY: Principal component analysis (PCA), batch learning self-organizing maps (BL-SOM) and weighted gene co-expression network analysis (WGCNA) were applied to a multivariate NMR dataset collected from developmentally staged tomato fruits belonging to several genotypes. While PCA and BL-SOM are appropriate and commonly used methods, WGCNA holds several advantages in the analysis of highly multivariate, complex data. CONCLUSIONS: PCA separated the two major genetic backgrounds (AC and NC), but provided little further information. Both BL-SOM and WGCNA clustered metabolites by expression, but WGCNA additionally defined "modules" of co-expressed metabolites explicitly and provided additional network statistics that described the systems properties of the tomato metabolic network. Our first application of WGCNA to tomato metabolomics data identified three major modules of metabolites that were associated with ripening-related traits and genetic background.
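The PCA step mentioned in this abstract can be sketched in a few lines; this is a generic SVD-based PCA, not the authors' pipeline, and the toy matrix below is invented for the illustration.

```python
import numpy as np

def pca(x, k):
    """PCA via SVD of the column-centred data matrix.

    Returns the scores on the first k principal components and the
    fraction of total variance they explain."""
    xc = x - x.mean(axis=0)
    u, s, vt = np.linalg.svd(xc, full_matrices=False)
    scores = u[:, :k] * s[:k]
    explained = float((s[:k] ** 2).sum() / (s ** 2).sum())
    return scores, explained

# Toy "metabolite" matrix: 4 samples, 2 perfectly correlated variables,
# so a single component captures essentially all the variance.
x = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])
scores, explained = pca(x, 1)
```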

  13. AGRICULTURAL COOPERATIVE FIRMS: BUDGETARY ADJUSTMENTS AND ANALYSIS OF CREDIT ACCESS APPLYING SCORING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Giuseppe Bonazzi

    2014-01-01

    Full Text Available Cooperatives are one of the most important types of companies in the agricultural sector. Cooperatives allow overcoming the limitations of the fragmentation of agricultural property, increasing the level of production of small-sized farms and selling the product so that it reaches a sufficient critical mass. Moreover, cooperatives are often characterized by undercapitalization and even difficult credit access, because banks conduct their analysis applying rating systems that do not take into account the typicality of the cooperative budget. To examine this topic, an analysis has been conducted on a sample of 100 cooperatives, making adjustments to the annual budget in order to consider the typicality of their annual accounts. The results of the analysis show that the suggested adjustments allow a more correct expression of the economic results and capital adequacy of the cooperative, and that the results, expressed in terms of scoring, are higher than those achieved by a traditional analysis. This methodology could improve the credit access capacity of agricultural cooperatives and thus reduce financial constraints, particularly in developing countries.

  14. Clinical usefulness of the clock drawing test applying Rasch analysis in predicting cognitive impairment.

    Science.gov (United States)

    Yoo, Doo Han; Lee, Jae Shin

    2016-07-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated by the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, which was selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
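Cutoff selection of the kind reported above is typically done by maximising Youden's index over candidate cutoffs. The sketch below uses synthetic, perfectly separable scores, not the study's data, and assumes lower scores indicate impairment.

```python
import numpy as np

def youden_cutoff(scores, impaired):
    """Choose the cutoff maximising Youden's J = sensitivity + specificity - 1.

    scores: test scores (here lower = more impaired); impaired: boolean
    ground-truth status. Returns (cutoff, J, sensitivity, specificity)."""
    best = (None, -1.0, None, None)
    for c in np.unique(scores):
        pred = scores <= c                     # score at or below cutoff -> flagged impaired
        tp = np.sum(pred & impaired)
        tn = np.sum(~pred & ~impaired)
        fp = np.sum(pred & ~impaired)
        fn = np.sum(~pred & impaired)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        if sens + spec - 1 > best[1]:
            best = (int(c), sens + spec - 1, sens, spec)
    return best

# Toy data: impaired patients score 5-10, unimpaired score 11-15.
scores = np.array([5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15])
impaired = np.array([True] * 6 + [False] * 5)
cutoff, j, sens, spec = youden_cutoff(scores, impaired)
```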

  15. Common reduced spaces of representation applied to multispectral texture analysis in cosmetology

    Science.gov (United States)

    Corvo, Joris; Angulo, Jesus; Breugnot, Josselin; Borbes, Sylvie; Closs, Brigitte

    2016-03-01

    Principal Component Analysis (PCA) is a technique of multivariate data analysis widely used in various fields like biology, ecology or economy to reduce data dimensionality while retaining the most important information. It is becoming a standard practice in multispectral/hyperspectral imaging since those multivariate data generally suffer from a high redundancy level. Nevertheless, by definition, PCA is meant to be applied to a single multispectral/hyperspectral image at a time. When several images have to be treated, running a PCA on each image would generate specific reduced spaces, which is not suitable for comparison between results. Thus, we focus on two PCA-based algorithms that can define common reduced spaces of representation. The first method arises from the literature and is computed with the barycenter covariance matrix. The second algorithm we designed with the idea of correcting standard PCA using permutations and inversions of eigenvectors. These dimensionality reduction methods are used within the context of a cosmetological study of a foundation make-up. Available data are in-vivo multispectral images of skin acquired on different volunteers in time series. The main purpose of this study is to characterize the make-up degradation, especially in terms of texture analysis. Results have to be validated by statistical prediction of the time elapsed since applying the product. PCA algorithms produce eigenimages that separately enhance skin components (pores, radiance, vessels...). From these eigenimages, we extract morphological texture descriptors and attempt a time prediction. The accuracy obtained with common reduced spaces outperforms that of classical PCA. In this paper, we detail how PCA is extended to the multiple-group case and explain the advantages of common reduced spaces when it comes to studying several multispectral images.

  16. IMPORTANCE OF APPLYING DATA ENVELOPMENT ANALYSIS IN CASE OF HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Labas Istvan

    2015-07-01

    Full Text Available Today it is increasingly accepted that, owing to scarce resources and limitless user demands, a strong target rationalism has to characterize higher educational institutions. In Hungary, the funding of the higher educational system is currently going through a total transformation, so the leadership has continuously to reckon with the changes of the environment and, in tune with these, to modify its existing goals. Nowadays it becomes more and more important to measure the effectiveness of organizations, and of organizational units pursuing the same or similar activities, relative to each other. Benchmarking helps this procedure. Benchmarking is none other than a tool of analysis and planning which allows comparing an organization with the best of its competitors. Applied to higher educational institutions, it is a procedure which focuses on comparing the processes and results of the institutions' different functional areas in order to bring to light opportunities for rationality as well as quality and performance improvement. Practices already developed and applied by other organizations can be managed and used as breakthrough possibilities on the way to more effective management. The main goal of my monograph is to show an application of the Data Envelopment Analysis (DEA) method in higher education. DEA itself is a performance measuring methodology which is a part of benchmarking and uses linear programming as a method. By means of its application, the effectiveness of different decision-making units can be compared numerically. In our forcefully varying environment, managerial decision making can be largely supported by information that numerically identifies which organizational units and activities are effective or less effective. Its advantage is that
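In general, DEA solves one linear program per decision-making unit. In the degenerate single-input, single-output case the CCR efficiency collapses to a productivity ratio benchmarked against the best unit, which is enough to sketch the idea. The numbers below are invented, not taken from the monograph.

```python
def dea_efficiency(inputs, outputs):
    """CCR-style efficiency for the single-input, single-output case:
    each unit's output/input ratio divided by the best observed ratio,
    so the most productive unit scores 1.0 and the rest score below it."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical faculties: staff counts as input, graduates as output.
efficiencies = dea_efficiency([10, 20, 30], [5, 20, 15])
```

The second unit produces the most output per unit of input, so it defines the efficiency frontier and the other two are measured against it.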

  17. Use of geostatistics on broiler production for evaluation of different minimum ventilation systems during brooding phase

    Directory of Open Access Journals (Sweden)

    Thayla Morandi Ridolfi de Carvalho

    2012-01-01

    Full Text Available The objective of this research was to evaluate different minimum ventilation systems, in relation to air quality and thermal comfort, using geostatistics in the brooding phase. The minimum ventilation systems were: Blue House I: exhaust fans + curtain management (end of the building); Blue House II: exhaust fans + side curtain management; and Dark House: exhaust fans + flag. The climate variables evaluated were: dry bulb temperature, relative humidity, air velocity, carbon dioxide and ammonia concentration, during winter time, at 9 a.m., in 80 equidistant points in the brooding area. Data were evaluated by the geostatistic technique. The results indicate that wider broiler houses (above 15.0 m in width) present the greatest ammonia and humidity concentrations. Blue House II presented the best results in relation to air quality. However, none of the studied broiler houses presented ideal thermal comfort.
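Geostatistical evaluations of point measurements like the 80-point grid above usually start from the empirical semivariogram. A minimal numpy sketch, with made-up coordinates and values:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """gamma(h): mean of 0.5 * (z_i - z_j)^2 over all point pairs whose
    separation distance falls within +/- tol of each lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    return [float(sq[(d > h - tol) & (d <= h + tol)].mean()) for h in lags]

# Four measurement points on a unit square, alternating values.
coords = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
values = np.array([0.0, 1.0, 0.0, 1.0])
gamma = empirical_semivariogram(coords, values, lags=[1.0], tol=0.2)
```

At lag 1.0 the four unit-distance pairs contribute squared half-differences of 0.5, 0, 0 and 0.5, so gamma(1.0) = 0.25.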

  18. Methodology and applications in non-linear model-based geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    Today geostatistics is used in a number of research areas, among others agricultural and environmental sciences. This thesis concerns data and applications where the classical Gaussian spatial model is not appropriate. A transformation could be used in an attempt to obtain data that are approximately Gaussian. Conditioned by an underlying and unobserved Gaussian process, the observations at the measured locations follow a generalised linear model. Concerning inference, Markov chain Monte Carlo methods are used. The study of these models is the main topic of the thesis. Construction of priors, and the use of flat...

  19. Applied Electromagnetics

    International Nuclear Information System (INIS)

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  20. Evaluation of statistical and geostatistical models of digital soil properties mapping in tropical mountain regions

    OpenAIRE

    Waldir de Carvalho Junior; Cesar da Silva Chagas; Philippe Lagacherie; Braz Calderano Filho; Silvio Barge Bhering

    2014-01-01

    Soil properties have an enormous impact on economic and environmental aspects of agricultural production. Quantitative relationships between soil properties and the factors that influence their variability are the basis of digital soil mapping. The predictive models of soil properties evaluated in this work are statistical (multiple linear regression-MLR) and geostatistical (ordinary kriging and co-kriging). The study was conducted in the municipality of Bom Jardim, RJ, using a soil database ...

  1. The Direct Sampling method to perform multiple-point geostatistical simulations

    OpenAIRE

    Mariethoz, Grégoire; Renard, Philippe; Straubhaar, Julien

    2011-01-01

    Multiple-point geostatistics is a general statistical framework to model spatial fields displaying a wide range of complex structures. In particular, it allows controlling connectivity patterns that have a critical importance for groundwater flow and transport problems. This approach involves considering data events (spatial arrangements of values) derived from a training image (TI). All data events found in the TI are usually stored in a database, which is used to retrieve conditional probab...

  2. Geostatistical Prediction of Ocean Outfall Plume Characteristics Based on an Autonomous Underwater Vehicle

    OpenAIRE

    Patrícia Alexandra Gregório Ramos

    2013-01-01

    Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work universal block kriging is novelty used to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfal...

  3. Global Appearance Applied to Visual Map Building and Path Estimation Using Multiscale Analysis

    Directory of Open Access Journals (Sweden)

    Francisco Amorós

    2014-01-01

    Full Text Available In this work we present a topological map building and localization system for mobile robots based on the global appearance of visual information. We include a comparison and analysis of global-appearance techniques applied to wide-angle scenes in retrieval tasks. Next, we define multiscale analysis, which permits improving the association between images and extracting topological distances. Then, a topological map-building algorithm is proposed. At first, the algorithm has information only of some isolated positions of the navigation area, in the form of nodes. Each node is composed of a collection of images that covers the complete field of view from a certain position. The algorithm solves the node retrieval and estimates their spatial arrangement. With these aims, it uses the visual information captured along some routes that cover the navigation area. As a result, the algorithm builds a graph that reflects the distribution and adjacency relations between nodes (map). After the map building, we also propose a route path estimation system. This algorithm takes advantage of the multiscale analysis. The accuracy of the pose estimation is not limited to the node locations but extends to intermediate positions between them. The algorithms have been tested using two different databases captured in real indoor environments under dynamic conditions.

  4. Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.

    Science.gov (United States)

    de Viguerie, Laurence; Sole, V Armando; Walter, Philippe

    2009-12-01

    X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatment even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: indeed, the absorption of X-rays through different layers will result in a modification of the intensity ratio between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. This method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and of 10% for concentrations. On a painting test sample that was rather inhomogeneous, the XRF analysis provides an average value. This method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
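The differential-attenuation idea above can be sketched with Beer-Lambert absorption: two characteristic lines of the same element are attenuated differently by an overlying layer, so their measured intensity ratio encodes the layer thickness. The attenuation coefficients below are arbitrary stand-ins, not real pigment values.

```python
import math

def attenuated_intensity(i0, mu, thickness_cm, takeoff_deg=45.0):
    """Beer-Lambert attenuation of a fluorescence line emerging through an
    overlying layer with linear attenuation coefficient mu (1/cm)."""
    path = thickness_cm / math.sin(math.radians(takeoff_deg))
    return i0 * math.exp(-mu * path)

def thickness_from_ratio(ratio, mu_a, mu_b, takeoff_deg=45.0):
    """Invert ratio = exp(-(mu_a - mu_b) * d / sin(theta)) for the layer
    thickness d, where ratio compares line A to line B after attenuation."""
    return -math.log(ratio) * math.sin(math.radians(takeoff_deg)) / (mu_a - mu_b)

# Round trip with hypothetical coefficients for two lines of one element.
mu_a, mu_b, d_true = 500.0, 200.0, 0.002        # 1/cm, 1/cm, cm (20 um layer)
ratio = attenuated_intensity(1.0, mu_a, d_true) / attenuated_intensity(1.0, mu_b, d_true)
d_est = thickness_from_ratio(ratio, mu_a, mu_b)
```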

  5. A geostatistical methodology to assess the accuracy of unsaturated flow models

    International Nuclear Information System (INIS)

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.

  6. A space and time scale-dependent nonlinear geostatistical approach for downscaling daily precipitation and temperature

    KAUST Repository

    Jha, Sanjeev Kumar

    2015-07-21

    A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here, the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 km and 10 km resolution for a twenty year period ranging from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference dataset indicates that both the seasonal average climate response together with the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool to be used for statistical downscaling to obtain local scale estimates of precipitation and temperature from General Circulation Models. This article is protected by copyright. All rights reserved.

  7. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
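A node-by-node comparison of model predictions against geostatistically interpolated values, as described in this record, can be sketched as follows. The grid shape and tolerance are illustrative, not the 16 x 16 x 36 Hanford grid.

```python
import numpy as np

def node_error_summary(model, interpolated, measurement_error):
    """Per-node error between model predictions and interpolated water
    contents on a common grid, flagging nodes whose absolute error
    exceeds the measurement error."""
    err = model - interpolated
    exceeds = np.abs(err) > measurement_error
    worst = np.unravel_index(np.argmax(np.abs(err)), err.shape)
    return {
        "rmse": float(np.sqrt(np.mean(err ** 2))),
        "fraction_within_measurement_error": float(1.0 - exceeds.mean()),
        "worst_node": tuple(int(i) for i in worst),
    }

# Tiny 2 x 2 x 2 grid where the model errs at a single node.
interpolated = np.zeros((2, 2, 2))
model = np.zeros((2, 2, 2))
model[1, 0, 1] = 0.05                  # one outlier node, e.g. a silt lens
summary = node_error_summary(model, interpolated, measurement_error=0.02)
```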

  8. A novel geotechnical/geostatistical approach for exploration and production of natural gas from multiple geologic strata, Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Overbey, W.K. Jr.; Reeves, T.K.; Salamy, S.P.; Locke, C.D.; Johnson, H.R.; Brunk, R.; Hawkins, L. (BDM Engineering Services Co., Morgantown, WV (United States))

    1991-05-01

    This research program has been designed to develop and verify a unique geostatistical approach for finding natural gas resources. The research has been conducted by Beckley College, Inc. (Beckley) and BDM Engineering Services Company (BDMESC) under contract to the US Department of Energy (DOE), Morgantown Energy Technology Center. Phase 1 of the project consisted of compiling and analyzing relevant geological and gas production information in selected areas of Raleigh County, West Virginia, ultimately narrowed to the Eccles, West Virginia, 7-1/2 minute Quadrangle. The Phase 1 analysis identified key parameters contributing to the accumulation and production of natural gas in Raleigh County, developed analog models relating geological factors to gas production, and identified specific sites to test and verify the analysis methodologies by drilling. Based on the Phase 1 analysis, five sites have been identified with high potential for economic gas production. Phase 2 will consist of drilling, completing, and producing one or more wells at the sites identified in the Phase 1 analyses. The initial well is scheduled to be drilled in April 1991. This report summarizes the results of the Phase 1 investigations. For clarity, the report has been prepared in two volumes. Volume 1 presents the Phase 1 overview; Volume 2 contains the detailed geological and production information collected and analyzed for this study.

  9. [Risk Analysis applied to food safety in Brazil: prospects and challenges].

    Science.gov (United States)

    Figueiredo, Ana Virgínia de Almeida; Miranda, Maria Spínola

    2011-04-01

    The scope of this case study is to discuss the ideas of the Brazilian Codex Alimentarius Committee (CCAB) coordinated by National Institute of Metrology, Standardization and Industrial Quality (Inmetro), with respect to the Codex Alimentarius norm on Risk Analysis (RA) applied to Food Safety. The objectives of this investigation were to identify and analyze the opinion of CCAB members on RA and to register their proposals for the application of this norm in Brazil, highlighting the local limitations and potential detected. CCAB members were found to be in favor of the Codex Alimentarius initiative of instituting an RA norm to promote the health safety of foods that circulate on the international market. There was a consensus that the Brazilian government should incorporate RA as official policy to improve the country's system of food control and leverage Brazilian food exports. They acknowledge that Brazil has the technical-scientific capacity to apply this norm, though they stressed several political and institutional limitations. The members consider RA to be a valid initiative for tackling risks in food, due to its ability to improve food safety control measures adopted by the government.

  10. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    Science.gov (United States)

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method and a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of an IT chain store. Experimental results on real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.
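The feature-extraction half of such a scheme rests on ICA. A minimal numpy-only FastICA (symmetric orthogonalisation, tanh nonlinearity) can recover linearly mixed sources, after which a regressor such as SVR would be trained on the unmixed features. This is a generic sketch with synthetic signals, not the paper's sICA/tICA/stICA implementation.

```python
import numpy as np

def fastica(x, n_iter=200, seed=0):
    """Minimal symmetric FastICA with tanh nonlinearity.

    x: (n_signals, n_samples) mixed observations.
    Returns estimated independent sources (up to order, sign and scale)."""
    x = x - x.mean(axis=1, keepdims=True)            # centre
    d, e = np.linalg.eigh(np.cov(x))
    z = (e @ np.diag(d ** -0.5) @ e.T) @ x           # whiten
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((z.shape[0], z.shape[0]))
    for _ in range(n_iter):
        wz = np.tanh(w @ z)
        # fixed-point update: E[g(wz) z^T] - E[g'(wz)] w
        w = (wz @ z.T) / z.shape[1] - np.diag((1 - wz ** 2).mean(axis=1)) @ w
        u, _, vt = np.linalg.svd(w)                  # symmetric decorrelation
        w = u @ vt
    return w @ z

# Recover a sine and a square wave from two linear mixtures.
t = np.linspace(0, 8, 2000)
sources = np.vstack([np.sin(2 * np.pi * t), np.sign(np.sin(3 * np.pi * t))])
mixing = np.array([[1.0, 0.5], [0.5, 1.0]])
estimated = fastica(mixing @ sources)
```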

  11. Raman spectroscopy and capillary electrophoresis applied to forensic colour inkjet printer inks analysis.

    Science.gov (United States)

    Król, Małgorzata; Karoly, Agnes; Kościelniak, Paweł

    2014-09-01

    Forensic laboratories are increasingly engaged in the examination of fraudulent documents and, importantly, in many cases these are inkjet-printed documents. That is why systematic approaches to inkjet printer ink comparison and identification have been carried out by both non-destructive and destructive methods. In this study, micro-Raman spectroscopy and capillary electrophoresis (CE) were applied to the analysis of colour inkjet printer inks. Micro-Raman spectroscopy was used to study the chemical composition of colour inks in situ on a paper surface. It helps to characterize and differentiate inkjet inks, and can be used to create a spectral database of inks taken from different cartridge brands and cartridge numbers. Capillary electrophoresis in micellar electrokinetic capillary chromatography mode was applied to separate colour and colourless components of inks, enabling group identification of those components which occur in a sufficient concentration (giving intensive peaks). Finally, on the basis of the obtained results, differentiation of the analysed inks was performed. Twenty-three samples of inkjet printer inks were examined and the discriminating power (DP) values for both presented methods were established in the routine work of experts during the result interpretation step. DP was found to be 94.0% (Raman) and 95.6% (CE) when all the analysed ink samples were taken into account, and 96.7% (Raman) and 98.4% (CE) when only cartridges with different index numbers were considered.
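Discriminating power as used above is the fraction of sample pairs a method can tell apart. A direct sketch, where each sample is represented by the analytical profile the method assigns to it (the labels are invented):

```python
from itertools import combinations

def discriminating_power(profiles):
    """DP = discriminated pairs / all pairs, where a pair counts as
    discriminated when the method assigns the two samples different
    analytical profiles."""
    pairs = list(combinations(profiles, 2))
    discriminated = sum(1 for a, b in pairs if a != b)
    return discriminated / len(pairs)

# Four ink samples, two of which the method cannot distinguish:
# 5 of the 6 possible pairs are discriminated.
dp = discriminating_power(["A", "A", "B", "C"])
```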

  12. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    Science.gov (United States)

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology, based on arrays of cross-sensitive chemical sensors and chemometric data processing, has attracted a lot of researchers' attention over the last years. The applications reported so far that deal with pharmaceutical tasks employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, and one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrine, saccharin sodium and citric acid in various combinations. To provide a fair and unbiased comparison, samples were evaluated under blind conditions and data processing for all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongue responses to the samples: principal component analysis (PCA), RV' matrix correlation coefficients and Tucker's congruence coefficients.
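Of the comparison methods named, PCA and the RV matrix correlation coefficient are easy to sketch in numpy. The data below are random stand-ins for e-tongue sensor readings, and only the plain RV coefficient is shown, not the RV' variant used in the study:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD; rows of X are samples, columns are sensor channels."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = S**2 / np.sum(S**2)   # variance ratio per component
    return scores, Vt[:n_components], explained

def rv_coefficient(X, Y):
    """RV coefficient: similarity of the sample configurations of X and Y."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sx, Sy = Xc @ Xc.T, Yc @ Yc.T
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))
```

The RV coefficient is invariant to rotation and uniform scaling of either data set, which is what makes it suitable for comparing instruments with different channel counts and units.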

  13. A Genre Analysis of English and Iranian Research Articles Abstracts in Applied Linguistics and Mathematics

    Directory of Open Access Journals (Sweden)

    Biook Behnam

    2014-09-01

    Full Text Available In recent years, genre studies have attracted the attention of many researchers. The aim of the present study was to observe the differences in the generic structure of abstracts written by English native and non-native (Iranian) students in two disciplines, mathematics and applied linguistics. To this end, twenty native English students' abstract texts from each discipline and the same number of non-native (Iranian) ones were selected. In this study, Hyland's (2000) five-move model was used to identify the rhetorical structure of the four sets of texts. After analyzing each text, the main moves were extracted and the frequencies of each one were calculated and compared. The cross-disciplinary and cross-linguistic analyses reveal that linguistics abstracts follow a conventional scheme, but mathematics abstracts in these two languages do not exhibit the usual norms in terms of moves. Besides, a greater difference in move structure is seen across languages in mathematics. The findings of the study have some pedagogical implications for academic writing courses for graduate students, especially students from non-English backgrounds, in order to facilitate their successful acculturation into these disciplinary communities. Keywords: genre analysis, mathematics, applied linguistics

  14. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing the effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues: the costs and benefits of a domestic carbon dioxide (CO2) tax reform, with special attention to how these costs and benefits depend on the structure of the tax system and on policy-induced changes in 'secondary' pollutants; the effects of allowing for emission permit trading through time when the long-term domestic environmental goal is specified in CO2 stock terms; and the effects on long-term projected economic growth and welfare that are due to damages from emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework

  15. The Process of Laying Concrete and Analysis of Operations Applying the Lean Approach

    Directory of Open Access Journals (Sweden)

    Vidmantas Gramauskas

    2012-11-01

    Full Text Available The paper considers the Lean philosophy of 'Just in Time', the value stream map and total quality management principles, applying them to construction. In order to follow these principles, a case study was performed, observing and recording the process of laying concrete in three houses where a lower ground floor was cast employing fiber concrete. The collected data were required to fragment the process of laying concrete into smaller operations and examine each operation independently. The examination of operations was conducted in certain units of measurement: time, the number of workers, cubic meters of concrete used, space area, etc. This information helped with distinguishing useful operations from useless actions bringing no value to the product. The previously mentioned methods can be applied to useless operations to reduce their duration or even eliminate them. The main focus is on splitting the process of laying concrete into smaller operations, analyzing each operation and adapting Lean principles. The implementation of the Lean system can reduce waste and increase the value of the final product.
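The core of the fragmentation step, separating value-adding operations from waste, reduces to a simple ratio over the recorded operation timings. A sketch with invented durations:

```python
def value_added_ratio(operations):
    """Share of total time spent on value-adding operations.
    `operations` is a list of (name, duration_minutes, adds_value) tuples,
    as might be recorded while observing the concrete-laying process."""
    total = sum(t for _, t, _ in operations)
    value = sum(t for _, t, v in operations if v)
    return value / total
```

Operations with a low ratio contribution (waiting, rework, transport) are the candidates for reduction or elimination under the Lean approach described above.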

  16. CONTROL AND STABILITY ANALYSIS OF THE GMC ALGORITHM APPLIED TO pH SYSTEMS

    Directory of Open Access Journals (Sweden)

    Manzi J.T.

    1998-01-01

    Full Text Available This paper deals with the control of the neutralization processes of strong acid-strong base and weak acid-strong base systems using the Generic Model Control (GMC) algorithm. The control strategy is applied to a pilot plant where hydrochloric acid-sodium hydroxide and acetic acid-sodium hydroxide systems are neutralized. The GMC algorithm includes a nonlinear model of the process in the controller structure. The paper also provides a stability analysis of the controller for some of the uncertainties involved in the system. The results indicate that the controller stabilizes the system for a large range of uncertainties, but the performance may deteriorate when the system is subjected to large disturbances.

  17. Methods of economic analysis applied to fusion research. Fourth annual report

    International Nuclear Information System (INIS)

    The current study reported here has involved three separate tasks. The first task deals with the development of expected utility analysis techniques for economic evaluation of fusion research. A decision analytic model is developed for the incorporation of market uncertainties, as well as technological uncertainties in an economic evaluation of long-range energy research. The model is applied to the case of fusion research. The second task deals with the potential effects of long-range energy RD and D on fossil fuel prices. ECON's previous fossil fuel price model is extended to incorporate a dynamic demand function. The dynamic demand function supports price fluctuations such as those observed in the marketplace. The third task examines alternative uses of fusion technologies, specifically superconducting technologies and first wall materials to determine the potential for alternative, nonfusion use of these technologies. In both cases, numerous alternative uses are found
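The decision analytic model itself is not given in the abstract, but expected utility evaluation over a tree of market and technological uncertainties, the kind of structure the first task describes, can be sketched recursively. All probabilities and payoffs below are invented for illustration:

```python
def expected_utility(branch):
    """Recursively evaluate a chance tree: a leaf is a utility value,
    a chance node is a list of (probability, subtree) pairs."""
    if isinstance(branch, (int, float)):
        return float(branch)
    return sum(p * expected_utility(sub) for p, sub in branch)

# hypothetical fusion R&D tree: technological success/failure, then
# market conditions (high/low fossil fuel prices) in the success branch
fusion_rd = [
    (0.4, [(0.5, 100.0), (0.5, 40.0)]),  # success: payoff depends on market
    (0.6, -10.0),                        # failure: sunk research cost
]
```

A decision between research programmes would then compare `expected_utility` across the alternatives, which is the essence of the expected utility techniques the first task develops.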

  18. Analysis Applied Multivariate to the Studies of Stability in the Reactors BWR

    International Nuclear Information System (INIS)

    This work presents the application of multivariate analysis to stability studies of BWR reactors. To confirm the applicability of the Hilbert-Huang method, a set of neutronic signals acquired during an event at the Cofrentes nuclear power station is used. The peculiarity of the analyzed data is that they are non-stationary and contaminated by the actions of other plant systems, so that applying traditional autoregressive methods to these data yields unrealistic values of the decay ratio (DR). In this work, the DR obtained by the presented methodology is compared with the true DR and with the DR obtained from applying autoregressive methods to the original signal. The conclusion is clear: the DR value obtained by the methodology explained in this work is closer to the true DR than the DR resulting from applying the AR method to the original signal.

  19. Analysis of Robustness for Convex Optimization Applied to Array Antenna Pattern Synthesis

    Directory of Open Access Journals (Sweden)

    R. Torrealba

    2008-01-01

    Full Text Available This study presents an analysis of convex optimization applied to the synthesis of the radiation pattern for linear antenna arrays. The study emphasizes the application of convex optimization to array pattern synthesis considering the simultaneous elimination of several interference zones, the reduction of the power level in two spatial zones densely populated by interferences, the variation of these zones in terms of their proximity to the source of interest, the variation of the size of the interference zones, and the number of zones within the radiation pattern. Simulation results are provided. These results define certain levels at which the linear array can be exploited to achieve maximum performance.

  20. Wavelets, Curvelets and Multiresolution Analysis Techniques Applied to Implosion Symmetry Characterization of ICF Targets

    CERN Document Server

    Afeyan, Bedros; Starck, Jean Luc; Cuneo, Michael

    2012-01-01

    We introduce wavelets, curvelets and multiresolution analysis techniques to assess the symmetry of X-ray driven imploding shells in ICF targets. After denoising X-ray backlighting produced images, we determine the Shell Thickness Averaged Radius (STAR) of maximum density, r*(N, θ), where N is the percentage of the shell thickness over which to average. The non-uniformities of r*(N, θ) are quantified by a Legendre polynomial decomposition in angle, θ. Undecimated wavelet decompositions outperform decimated ones in denoising, and both are surpassed by the curvelet transform. In each case, hard thresholding based on noise modeling is used. We have also applied combined wavelet and curvelet filter techniques with variational minimization as a way to select the significant coefficients. Gains are minimal over curvelets alone in the images we have analyzed.
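The hard thresholding step mentioned above can be illustrated with a single-level Haar wavelet transform in numpy. This is a deliberately minimal stand-in for the undecimated wavelet and curvelet transforms actually used; the universal threshold shown is one common noise-model based choice:

```python
import numpy as np

def haar_forward(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise_hard(x, sigma):
    """Hard-threshold the detail coefficients with the universal
    threshold sigma * sqrt(2 ln n), then reconstruct."""
    a, d = haar_forward(x)
    thr = sigma * np.sqrt(2.0 * np.log(x.size))
    d = np.where(np.abs(d) > thr, d, 0.0)
    return haar_inverse(a, d)
```

With `sigma = 0` the threshold vanishes and the round trip is exact, which is a convenient sanity check that the transform pair is orthonormal.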

  1. Applying high resolution SyXRD analysis on sulfate attacked concrete field samples

    Energy Technology Data Exchange (ETDEWEB)

    Stroh, J. [BAM Federal Institute for Materials Research and Testing, Richard-Willstätter-Straße 11, 12489 Berlin (Germany); Schlegel, M.-C. [BAM Federal Institute for Materials Research and Testing, Richard-Willstätter-Straße 11, 12489 Berlin (Germany); Helmholtz-Zentrum Berlin (HZB), Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Irassar, E.F. [Department of Civil Engineering, National University of Buenos Aires Center State, Avenida Del Valle 5737, B7400JWI Olavarría (Argentina); Meng, B. [BAM Federal Institute for Materials Research and Testing, Richard-Willstätter-Straße 11, 12489 Berlin (Germany); Emmerling, F., E-mail: franziska.emmerling@bam.de [BAM Federal Institute for Materials Research and Testing, Richard-Willstätter-Straße 11, 12489 Berlin (Germany)

    2014-12-15

    High resolution synchrotron X-ray diffraction (SyXRD) was applied for a microstructural profile analysis of concrete deterioration after sulfate attack. The cement matrices consist of ordinary Portland cement and different amounts of supplementary cementitious materials, such as fly ash, natural pozzolana and granulated blast furnace slag. The changes of the phase composition were determined along the direction of sulfate ingress. This approach allows the identification of reaction fronts and zones of different phase compositions and conclusions about the mechanisms of sulfate attack. Two reaction fronts were localized in the initial 4 mm from the sample surface. The mechanism of deterioration caused by the exposition in the sulfate-bearing soil is discussed. SyXRD is shown to be a reliable method for investigation of cementitious materials with aggregates embedded in natural environments.

  2. Color changes in wood during heating: kinetic analysis by applying a time-temperature superposition method

    Science.gov (United States)

    Matsuo, Miyuki; Yokoyama, Misao; Umemura, Kenji; Gril, Joseph; Yano, Ken'ichiro; Kawai, Shuichi

    2010-04-01

    This paper deals with the kinetics of the color properties of hinoki (Chamaecyparis obtusa Endl.) wood. Specimens cut from the wood were heated at 90-180°C as an accelerated aging treatment. The specimens, completely dried and heated in the presence of oxygen, allowed us to evaluate the effects of thermal oxidation on wood color change. Color properties measured by a spectrophotometer showed similar behavior irrespective of the treatment temperature on each time scale. Kinetic analysis using the time-temperature superposition principle, which uses the whole data set, was successfully applied to the color changes. The calculated values of the apparent activation energy in terms of L*, a*, b*, and ΔE*ab were 117, 95, 114, and 113 kJ/mol, respectively, which are similar to the values reported in the literature for other properties such as the physical and mechanical properties of wood.
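Time-temperature superposition with an Arrhenius temperature dependence implies a shift factor between two treatment temperatures. A sketch using the reported activation energy for L* (the function itself is generic, not taken from the paper):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def shift_factor(Ea, T, Tref):
    """Arrhenius shift factor a_T: how many times slower a thermally
    activated process runs at temperature T than at reference Tref.
    Ea in J/mol, temperatures in kelvin."""
    return math.exp((Ea / R) * (1.0 / T - 1.0 / Tref))
```

With Ea = 117 kJ/mol, the shift factor between 90°C and 180°C comes out on the order of 10^3, i.e. the same color change takes roughly a thousand times longer at the low end of the treatment range, which is why the superposed master curve spans such a wide effective time axis.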

  3. Econometrics analysis of consumer behaviour: a linear expenditure system applied to energy

    International Nuclear Information System (INIS)

    In the economics literature, expenditure system specification is a well-known subject. The problem is to define a coherent representation of consumer behaviour through functional forms that are easy to estimate. This work uses the Stone-Geary Linear Expenditure System and its multi-level decision process version. The Linear Expenditure System is characterized by an easy estimation procedure, and its multi-level specification allows substitution and complementarity relations between goods. Moreover, the separability condition on the utility function, on which the Utility Tree Approach is based, justifies using an estimation procedure in two or more steps. This allows a high degree of disaggregation of expenditure categories, impossible to reach with the basic Linear Expenditure System. The analysis is applied to the energy sectors

  4. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    Science.gov (United States)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools, which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently-established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the
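The Ordinary Kriging step at the heart of the mapping can be sketched as follows. An exponential variogram is used as a stand-in, since the Spartan family model is not reproduced here, and the well coordinates and levels are invented:

```python
import numpy as np

def exp_variogram(h, sill=1.0, range_=1500.0, nugget=0.0):
    """Exponential variogram model (a stand-in for the Spartan family)."""
    return nugget + sill * (1.0 - np.exp(-3.0 * h / range_))

def ordinary_kriging(xy, z, x0, gamma=exp_variogram):
    """Ordinary Kriging estimate and variance at x0 from wells (xy, z).
    Solves the OK system [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1]."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z, float(b[:n] @ w[:n] + w[n])  # estimate, kriging variance
```

A network-reduction loop in the spirit of the abstract would re-map the area with candidate wells removed and score each candidate set by the norm of the difference between the full-network and reduced-network kriging maps.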

  5. Challenges in the implementation of a quality management system applied to radiometric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Danila C.S.; Bonifacio, Rodrigo L.; Nascimento, Marcos R.L.; Silva, Nivaldo C. da; Taddei, Maria Helena T., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN-MG), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2015-07-01

    The concept of quality in laboratories is well established as an essential factor in the search for reliable results. Since its first published version (1999), ISO/IEC 17025 has been applied in the industrial and research fields to a wide range of laboratory analyses. However, the implementation of a Quality Management System (QMS) still poses great challenges to institutions and companies. The purpose of this work is to expose the constraints related to the implementation of ISO/IEC 17025 applied to analytical assays of radionuclides, by studying the case of the Pocos de Caldas Laboratory of the Brazilian Commission for Nuclear Energy. In this lab, a project for accreditation of techniques involving the determination of radionuclides in water, soil, sediment and food samples has been conducted since 2011. The challenges presented by this project arise from the administrative view, where the governmental nature of the institution translates into uneven availability of resources, and from the organizational view, since a QMS requires inevitable changes in the organizational culture. It is important to point out that when it comes to accreditation of analyses involving radioactive elements, many aspects must be treated carefully due to their very particular nature. Among these concerns are the determination of analysis uncertainties, accessibility to international proficiency studies, transportation of international radioactive samples and CRMs, the study of parameters in the validation of analytical methods, and the lack of documentation and specialized personnel regarding quality in radiometric measurements. Through an effective management system, the institution is overcoming these challenges, moving toward ISO/IEC 17025 accreditation. (author)

  6. Optical Image Analysis Applied to Pore Network Quantification of Sandstones Under Experimental CO2 Injection

    Science.gov (United States)

    Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.

    2015-12-01

    This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin, and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which we observed thin sections. Later, the methodological and experimental processes of the investigation focused on i) adjustment and calibration of OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers), using 7 images of the same mineral scene (6 under crossed polarizers and 1 under parallel polarizers); and iii) automated identification and segmentation of pores in 2D mineral images, generating applications by executable macros. Finally, once the procedure protocols had been established, the compiled data was interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) has allowed us to corroborate the descriptions obtained by SEM and microscopic techniques, namely an increase in porosity after CO2 treatment. Automated image identification and quantification of minerals, pores and textures together with petrographic analysis can be applied to improve pore system characterization in sedimentary rocks. This research offers numerical

  7. Geomechanical analysis applied to geological carbon dioxide sequestration, induced seismicity in deep mines, and detection of stress-induced velocity anisotropy in sub-salt environments

    Science.gov (United States)

    Lucier, Amie Marie

    The role of geomechanical analysis in characterizing the feasibility of CO2 sequestration in deep saline aquifers is addressed in two investigations. The first investigation was completed as part of the Ohio River Valley CO2 Storage Project. We completed a geomechanical analysis of the Rose Run Sandstone, a potential injection zone, and its adjacent formations at the American Electric Power's 1.3 GW Mountaineer Power Plant in New Haven, West Virginia. The results of this analysis were then used to evaluate the feasibility of anthropogenic CO2 sequestration in the potential injection zone. First, we incorporated the results of the geomechanical analysis with a geostatistical aquifer model in CO2 injection flow simulations to test the effects of introducing a hydraulic fracture to increase injectivity. Then, we determined that horizontal injection wells at the Mountaineer site are feasible because the high rock strength ensures that such wells would be stable in the local stress state. Finally, we evaluated the potential for injection-induced seismicity. The second investigation concerning CO2 sequestration was motivated by the modeling and fluid flow simulation results from the first study. The geomechanics-based assessment workflow follows a bottom-up approach for evaluating regional deep saline aquifer CO2 injection and storage feasibility. The CO2 storage capacity of an aquifer is a function of its porous volume as well as its CO2 injectivity. For a saline aquifer to be considered feasible in this assessment it must be able to store a specified amount of CO2 at a reasonable cost per ton of CO2. The proposed assessment workflow has seven steps. The workflow was applied to a case study of the Rose Run sandstone in the eastern Ohio River Valley. 
We found that it is feasible in this region to inject and store 113 Mt CO2/yr for 30 years at an associated well cost of less than 1.31 US$/t CO2, but only if injectivity enhancement techniques such as hydraulic fracturing

  8. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    Directory of Open Access Journals (Sweden)

    Alexandra Ziemann

    Full Text Available Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed if the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
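Crisp-set QCA starts from a truth table that groups cases by their condition configuration and scores each configuration's outcome consistency. A minimal sketch with invented cases (real csQCA then Boolean-minimizes the consistent configurations, which is omitted here):

```python
def truth_table(cases):
    """Build a csQCA truth table.
    `cases` is a list of (conditions, outcome) where conditions is a
    tuple of 0/1 condition memberships (e.g. automated?, multi-syndrome?)
    and outcome is 0/1 (e.g. timely situational awareness achieved).
    Returns {configuration: outcome consistency in [0, 1]}."""
    table = {}
    for cond, out in cases:
        table.setdefault(cond, []).append(out)
    return {cond: sum(outs) / len(outs) for cond, outs in table.items()}
```

Configurations with consistency 1.0 are candidate sufficient combinations for the outcome; contradictory configurations (consistency strictly between 0 and 1) must be resolved or dropped before minimization.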

  9. Applying Multiscale Entropy to the Complexity Analysis of Rainfall-Runoff Relationships

    Directory of Open Access Journals (Sweden)

    Chien-Ming Chou

    2012-05-01

    Full Text Available This paper presents a novel framework for the complexity analysis of rainfall, runoff, and runoff coefficient (RC) time series using multiscale entropy (MSE). The MSE analysis of RC time series was used to investigate changes in the complexity of rainfall-runoff processes due to human activities. First, a coarse-graining process was applied to a time series. The sample entropy was then computed for each coarse-grained time series and plotted as a function of the scale factor. The proposed method was tested in a case study of daily rainfall and runoff data for the upstream Wu-Tu watershed. Results show that the entropy measures of rainfall time series are higher than those of runoff time series at all scale factors. The entropy measures of the RC time series lie between those of the rainfall and runoff time series at various scale factors. Results also show that the entropy values of rainfall, runoff, and RC time series increase as scale factors increase. The changes in the complexity of RC time series indicate changes in rainfall-runoff relations due to human activities, and provide a reference for selecting rainfall-runoff models that are capable of dealing with great complexity and of taking obvious self-similarity into account. Moreover, the robustness of the MSE results was tested, confirming that the MSE analysis is consistent and yields the same results when 25% of the data are removed, making this approach suitable for the complexity analysis of rainfall, runoff, and RC time series.
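The coarse-graining and sample entropy steps described above can be sketched in numpy. This is a simplified variant: the tolerance is recomputed from each coarse-grained series rather than fixed from the original series, and the conventional m = 2, r = 0.2 parameters are assumed:

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B) with Chebyshev tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        dist = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (np.sum(dist <= tol) - len(t)) / 2.0  # exclude self-matches
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r=0.2):
    """MSE curve: sample entropy of each coarse-grained series."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

Plotting the returned list against the scale factors reproduces the kind of entropy-versus-scale curves the study compares for rainfall, runoff, and RC series.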

  10. Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation

    Science.gov (United States)

    Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.

    2012-12-01

    The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to the incomplete understanding of the various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory-generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation presents a challenge when applied to the atmosphere, which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single particle analytical methods to relate particle physical and chemical properties with observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single particle characterization is provided by computer controlled scanning electron microscopy with energy dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor-controlled cooling-stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation, including immersion freezing and deposition ice nucleation, as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. by the presence of insoluble particles. For T cloud formation. Initial results applying single particle IN analysis using CCSEM/EDX and STXM/NEXAFS reveal that a significant amount of IN are coated by organics and, thus, are similar to the

  11. The geostatistics of the metal concentrations in sediments from the eastern Brazilian continental shelf in areas of gas and oil production

    Science.gov (United States)

    Aguiar, Jose Edvar; de Lacerda, Luiz Drude; Miguens, Flavio Costa; Marins, Rozane Valente

    2014-04-01

    Geostatistical techniques were used to evaluate the differences in the geochemistry of metals in the marine sediments along the eastern Brazilian continental margin along the states of Ceará and Rio Grande do Norte (northeastern sector) and Espírito Santo (southeastern sector). The concentrations of Al, Fe, Mn, Ba, Cd, Cu, Cr, Ni, Pb, V, Hg, and Zn were obtained from acid digestion and quantified using flame atomic absorption spectrometry (AAS), inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). The metals showed a similar order of concentration (Al > Fe > Ba > Mn > V > Ni > Pb > Cr > Zn > Cu) in both the Ceará and Rio Grande do Norte shelf regions, but a different one on the Espírito Santo shelf (Fe > Al > Mn > Ba > Zn > V > Cr > Ni > Pb > Cu). The concentrations of Hg and Cd were below the detection limit in all areas. A multivariate analysis revealed that the metals of siliciclastic origin on the continental shelf of Ceará are carried by Al. In addition, a large portion of metal deposits is connected to the iron and manganese oxides on the continental margin of Rio Grande do Norte. The metals from the continental supply on the coast of Espírito Santo (Cu, Ni, Ba, and Mn) are associated with Al, whereas Cr, Pb, V, and Zn are associated with iron in this southern area. Geochemical evaluations are needed to distinguish the origin and mineralogical differences of marine sediments within the regions. Scanning electron microscopy/energy dispersive spectrometry (SEM/EDS) applied to the sediments from the coast of Ceará showed the morphological diversity of sediment grains: biological fragments, multifaceted particles, aggregates, and crystals occurred in the three regions analyzed. Among these grains, calcite, Mg-calcite, and aragonite were predominant in the northeastern sector, whereas silicates and other minerals were predominant in the southeastern sector. Mg, K, Ti, and Zr as well as the

  12. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Science.gov (United States)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties, as they integrate uncertain spatial distributions of both concentration and groundwater flow velocity. For risk assessments or any other decisions based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be more relevant to practitioners than existing methods, which are either too simple or too computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
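The decoupled, simulation-based idea — generate many equally likely flow and concentration realizations over the control plane and integrate each one to a mass discharge — can be caricatured in a few lines of numpy. This is a minimal sketch under invented lognormal field assumptions, not the paper's conditional Bayesian geostatistical machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical control plane discretised into cells (illustrative values,
# not from the study): 200 cells of 0.5 m x 0.25 m.
n_cells, cell_area = 200, 0.5 * 0.25           # m^2 per cell
n_real = 2000                                  # equally likely realizations

def mass_discharge_realizations():
    """Crude Monte Carlo analogue of conditional simulation: sample
    heterogeneous K, gradient and concentration per realization and
    integrate the mass flux over the control plane."""
    K = rng.lognormal(np.log(1e-4), 1.0, size=(n_real, n_cells))      # m/s
    grad = rng.normal(0.005, 0.001, size=(n_real, 1))                 # [-]
    conc = rng.lognormal(np.log(10.0), 0.8, size=(n_real, n_cells))   # g/m^3
    q = K * np.abs(grad)                       # Darcy flux per cell, m/s
    md = (q * conc * cell_area).sum(axis=1)    # g/s per realization
    return md * 86400.0                        # g/day

md = mass_discharge_realizations()
p5, p50, p95 = np.percentile(md, [5, 50, 95])
print(f"mass discharge (g/day): median {p50:.1f}, 90% interval [{p5:.1f}, {p95:.1f}]")
```

A full implementation would condition the realizations on measured conductivities, heads and concentrations, and would use the Box-Cox geostatistical model of the concentration field rather than independent draws.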

  13. ANALYSIS OF A WEB INFORMATION SYSTEM APPLIED TO THE MANAGEMENT OF A COMPUTING SCHOOL

    Directory of Open Access Journals (Sweden)

    ROGER CRISTHIAN GOMES

    2010-01-01

    Full Text Available One of the tasks of an entrepreneur is to choose a computerized information system for the management of the business, regardless of its size and field of activity. Deciding whether the information system should be modeled for local use (also known as standalone) or developed for the web is increasingly common, as the Internet and its characteristics greatly facilitate the manager's work. However, one cannot simply rely on technological and market trends to resolve a question that affects the way the business is operated, administered, and managed. To choose between one type of system and the other, it is necessary to examine the advantages and disadvantages of each model in relation to the business in question. This study aimed to list the main features intrinsic to web and standalone applications. The study of these two types of applications was based on the analysis of an information system applied to a company that provides computer-training services. For this analysis, a survey of the main requirements was carried out and a prototype was modeled. It was proposed to develop the system in a web environment, using the JAVA platform with the MySQL database manager, because these tools are complete, well documented, and free, with features that help ensure the functionality and quality of a web information system.

  14. Bayesian flux balance analysis applied to a skeletal muscle metabolic model.

    Science.gov (United States)

    Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki

    2007-09-01

    In this article, the steady state condition for the multi-compartment models for cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to the flux balance analysis (FBA). We show how the bound constraints and optimality conditions such as maximizing the oxidative phosphorylation flux can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models.
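The contrast between hard optimization-based FBA and the Bayesian view — optimality encoded as a prior over the feasible flux polytope rather than as a strict objective — can be illustrated on a toy network. Everything here (the stoichiometry, the bounds, and the exponential "soft optimality" prior) is an invented example, not the paper's two-compartment skeletal muscle model or its MCMC scheme:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy network: v = [v_in, v1, v2, v3]
# Steady state: v_in = v1 and v1 = v2 + v3, with 0 <= v <= 10.
n_draw = 100_000
v2 = rng.uniform(0, 10, n_draw)
v3 = rng.uniform(0, 10, n_draw)
v1 = v2 + v3
ok = v1 <= 10.0                                # stay inside the flux polytope
V = np.column_stack([v1, v1, v2, v3])[ok]      # feasible flux vectors

# Bayesian-flavoured FBA: instead of a hard LP optimum, weight samples by a
# prior that prefers a large "oxidative" flux v2 (soft optimality condition).
w = np.exp(V[:, 2] / 2.0)
w /= w.sum()
posterior_mean_v2 = w @ V[:, 2]
flat_mean_v2 = V[:, 2].mean()
print(f"flat mean v2 = {flat_mean_v2:.2f}, soft-optimality mean v2 = {posterior_mean_v2:.2f}")
```

The soft prior pulls the posterior mean of v2 toward its LP optimum (10 here) without collapsing the distribution to a single point, which is the qualitative behaviour the Bayesian formulation exploits.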

  15. Applying a resources framework to analysis of the Force and Motion Conceptual Evaluation

    Science.gov (United States)

    Smith, Trevor I.; Wittmann, Michael C.

    2008-12-01

    We suggest one redefinition of common clusters of questions used to analyze student responses on the Force and Motion Conceptual Evaluation. Our goal is to propose a methodology that moves beyond an analysis of student learning defined by correct responses, either on the overall test or on clusters of questions defined solely by content. We use the resources framework theory of learning to define clusters within this experimental test that was designed without the resources framework in mind. We take special note of the contextual and representational dependence of questions with seemingly similar physics content. We analyze clusters in ways that allow the most common incorrect answers to give as much, or more, information as the correctness of responses in that cluster. We show that false positives can be found, especially on questions dealing with Newton’s third law. We apply our clustering to a small set of data to illustrate the value of comparing students’ incorrect responses, which would otherwise be treated as identical in a correct-or-incorrect analysis. Our work provides a connection between theory and experiment in the area of survey design and the resources framework.

  16. Analysis of Phoenix Anomalies and IV & V Findings Applied to the GRAIL Mission

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    NASA IV&V was established in 1993 to improve safety and cost-effectiveness of mission critical software. Since its inception the tools and strategies employed by IV&V have evolved. This paper examines how lessons learned from the Phoenix project were developed and applied to the GRAIL project. Shortly after selection, the GRAIL project initiated a review of the issues documented by IV&V for Phoenix. The motivation was twofold: to learn as much as possible about the types of issues that arose from the flight software product line slated for use on GRAIL, and to identify opportunities for improving the effectiveness of IV&V on GRAIL. The IV&V Facility provided a database dump containing 893 issues. These were categorized into 16 bins, and then analyzed according to whether the project responded by changing the affected artifacts or using them as-is. The results of this analysis were compared to a similar assessment of post-launch anomalies documented by the project. Results of the analysis were discussed with the IV&V team assigned to GRAIL. These discussions led to changes in the way both the project and IV&V approached the IV&V task, and improved the efficiency of the activity.

  17. Mapping Oxygen-18 in Meteoric Precipitation over Peninsular Spain using Geostatistical Tools

    Science.gov (United States)

    Capilla, J. E.; Rodriguez Arevalo, J.; Castaño Castaño, S.; Diaz Teijeiro, M.; Heredia Diaz, J.; Sanchez del Moral, R.

    2011-12-01

    Rainfall isotopic composition is a valuable source of information for understanding and modelling the complexity of natural hydrological systems. The identification of water recharge origins, water flow trajectories, residence times and pollutant movement in the hydrologic cycle can greatly benefit from this information. It is also very useful in other environmental issues associated with water movement in the biosphere. Although the potential of stable isotope data in hydrological and climatic studies is promising, so far it has been strongly limited by the availability of data. A major challenge is to extend sparse measurements of stable isotope data to surrounding geographic areas, taking into account secondary variables such as latitude, altitude and climate-related parameters. The current state of the art provides different approaches, mainly based on deterministic interpolation techniques. In Spain a network called REVIP, made up of 16 nodes, is maintained by CEDEX (Centro de Estudios y Experimentación de Obras Públicas). At REVIP nodes, stable isotopes in meteoric precipitation (oxygen-18 and deuterium) have been continuously recorded since the year 2000. This network provides a rare opportunity to study the patterns of spatial distribution over the whole country. In fact, some accurate regression models have already been proposed that map stable isotopes against latitude and altitude. Yet these regressions leave small residuals at the network nodes that are possibly caused by the local average features of climatic events. There is an ongoing effort to improve these maps that includes the identification of relevant climatic parameters and the application of geostatistical techniques. This paper describes the application of a regression kriging methodology to map oxygen-18 over peninsular Spain using REVIP data. The methodology includes a prior process to obtain normalized stable isotope concentrations that are independent of latitude and altitude, a structural
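Regression kriging of the kind described — a deterministic regression on latitude and altitude plus ordinary kriging of its residuals — can be sketched as follows. The station data and the exponential covariance model are synthetic stand-ins, not REVIP/CEDEX values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for 16 stations: planar coordinates (km), altitude (m)
# and a delta-18O value generated from an assumed linear trend plus noise.
n = 16
xy = rng.uniform(0, 800, size=(n, 2))
alt = rng.uniform(0, 2000, size=n)
lat = xy[:, 1] / 100.0                         # crude latitude proxy
d18o = -4.0 - 0.3 * lat - 0.002 * alt + rng.normal(0, 0.2, n)

# 1) regression part: delta-18O against latitude and altitude
X = np.column_stack([np.ones(n), lat, alt])
beta, *_ = np.linalg.lstsq(X, d18o, rcond=None)
resid = d18o - X @ beta

# 2) ordinary kriging of the residuals with an assumed exponential covariance
def cov(h, sill=0.05, rng_km=200.0):
    return sill * np.exp(-h / rng_km)

H = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(H)
A[n, n] = 0.0                                  # Lagrange row/column for sum(w)=1

def predict(x0, alt0):
    h0 = np.linalg.norm(xy - x0, axis=1)
    b = np.append(cov(h0), 1.0)
    w = np.linalg.solve(A, b)[:n]              # ordinary kriging weights
    trend = np.array([1.0, x0[1] / 100.0, alt0]) @ beta
    return trend + w @ resid                   # regression + kriged residual

print(predict(np.array([400.0, 400.0]), 800.0))
```

The real methodology additionally normalizes the isotope values before the structural (variogram) analysis; here the covariance parameters are simply assumed.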

  18. Design and analysis of environmental monitoring programs

    DEFF Research Database (Denmark)

    Lophaven, Søren Nymand

    2005-01-01

    can handle missing data values and utilize the spatial and temporal correlation in data. Modelling results can be used to improve reporting on the state of the marine environment in the Kattegat. The thesis also focuses on the design of monitoring networks, from which geostatistics can be successfully...... applied. Existing design methods are reviewed, and based on these a new Bayesian geostatistical design approach is suggested. This focuses on constructing monitoring networks which are efficient for computing spatial predictions, while taking the uncertainties of the parameters in the geostatistical model...... into account. Thus, it serves as a compromise between existing methods. The space-time model approaches and geostatistical design methods used in this thesis are generally applicable, i.e. with minor modifications they could equally well be applied within areas such as soil and air pollution. In Danish: Denne

  19. Non-Linear Non Stationary Analysis of Two-Dimensional Time-Series Applied to GRACE Data Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovative two-dimensional (2D) empirical mode decomposition (EMD) analysis was applied to NASA's Gravity Recovery and Climate Experiment (GRACE)...

  20. Quadratic Time-Frequency Analysis of Hydroacoustic Signals as Applied to Acoustic Emissions of Large Whales

    Science.gov (United States)

    Le Bras, Ronan; Victor, Sucic; Damir, Malnar; Götz, Bokelmann

    2014-05-01

    In order to enrich the set of attributes in setting up a large database of whale signals, as envisioned in the Baleakanta project, we investigate methods of time-frequency analysis. The purpose of establishing the database is to increase and refine knowledge of the emitted signal and of its propagation characteristics, leading to a better understanding of the animal migrations in a non-invasive manner and to a characterization of acoustic propagation in oceanic media. The higher resolution for signal extraction and a better separation from other signals and noise will be used for various purposes, including improved signal detection and individual animal identification. The quadratic class of time-frequency distributions (TFDs) is the most popular set of time-frequency tools for the analysis and processing of non-stationary signals. The two best known and most studied members of this class are the spectrogram and the Wigner-Ville distribution. However, to be used efficiently, i.e. to have highly concentrated signal components while significantly suppressing interference and noise simultaneously, TFDs need to be optimized first. The optimization method used in this paper is based on the Cross-Wigner-Ville distribution, and unlike similar approaches it does not require prior information on the analysed signal. The method is applied to whale signals, which, just like the majority of other real-life signals, can generally be classified as multicomponent non-stationary signals, and hence time-frequency techniques are a natural choice for their representation, analysis, and processing. We present processed data from a set containing hundreds of individual calls. The TFD optimization method results in a high-resolution time-frequency representation of the signals. It allows for a simple extraction of signal components from the TFD's dominant ridges. The local peaks of those ridges can then be used for the signal components' instantaneous frequency estimation, which in turn can be used as
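A bare-bones discrete Wigner-Ville distribution illustrates the "dominant ridge" idea: for a linear chirp, the ridge of the TFD tracks the instantaneous frequency. This sketch deliberately omits the lag windowing and cross-term suppression that the optimized Cross-Wigner-Ville method provides:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.
    Minimal sketch: no lag window or kernel smoothing, so cross-terms
    would appear for multicomponent signals."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)            # largest symmetric lag at time n
        m = np.arange(-mmax, mmax + 1)
        kernel = x[n + m] * np.conj(x[n - m])
        K = np.zeros(N, dtype=complex)
        K[m % N] = kernel                   # place lags circularly for the FFT
        W[n] = np.fft.fft(K).real           # Hermitian kernel -> real spectrum
    return W

# Analytic linear chirp: instantaneous frequency 0.05 -> 0.2 cycles/sample
N = 128
t = np.arange(N)
phase = 2 * np.pi * (0.05 * t + (0.15 / (2 * N)) * t ** 2)
x = np.exp(1j * phase)

W = wigner_ville(x)
# The lag product doubles the frequency axis, hence the factor 2 below;
# the ridge then estimates the instantaneous frequency at each sample.
ridge = W[:, : N // 2].argmax(axis=1) / (2 * N)
```

For the chirp the ridge rises with time, matching f(t) = 0.05 + 0.15 t/N; real whale calls are multicomponent, which is exactly why the optimized TFDs of the paper are needed.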

  1. A variable age of onset segregation model for linkage analysis, with correction for ascertainment, applied to glioma

    DEFF Research Database (Denmark)

    Sun, Xiangqing; Vengoechea, Jaime; Elston, Robert;

    2012-01-01

    We propose a 2-step model-based approach, with correction for ascertainment, to linkage analysis of a binary trait with variable age of onset and apply it to a set of multiplex pedigrees segregating for adult glioma.

  2. Applying independent component analysis to detect silent speech in magnetic resonance imaging signals.

    Science.gov (United States)

    Abe, Kazuhiro; Takahashi, Toshimitsu; Takikawa, Yoriko; Arai, Hajime; Kitazawa, Shigeru

    2011-10-01

    Independent component analysis (ICA) can be usefully applied to functional imaging studies to evaluate the spatial extent and temporal profile of task-related brain activity. It requires no a priori assumptions about the anatomical areas that are activated or the temporal profile of the activity. We applied spatial ICA to detect a voluntary but hidden response of silent speech. To validate the method against a standard model-based approach, we used the silent speech of a tongue twister as a 'Yes' response to single questions that were delivered at given times. In the first task, we attempted to estimate one number that was chosen by a participant from 10 possibilities. In the second task, we increased the possibilities to 1000. In both tasks, spatial ICA was as effective as the model-based method for determining the number in the subject's mind (80-90% correct per digit), but spatial ICA outperformed the model-based method in terms of time, especially in the 1000-possibility task. In the model-based method, calculation time increased by 30-fold, to 15 h, because of the necessity of testing 1000 possibilities. In contrast, the calculation time for spatial ICA remained as short as 30 min. In addition, spatial ICA detected an unexpected response that occurred by mistake. This advantage was validated in a third task, with 13 500 possibilities, in which participants had the freedom to choose when to make one of four responses. We conclude that spatial ICA is effective for detecting the onset of silent speech, especially when it occurs unexpectedly.
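The core of spatial ICA — unmixing statistically independent components without any model of the response profile — can be demonstrated with a minimal FastICA-style fixed-point iteration on a synthetic two-source mixture. This is not fMRI data, and the tanh contrast with symmetric decorrelation is a standard textbook choice, not necessarily the study's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two toy "activation time courses" mixed into two observed signals
# (a stand-in for voxel time series; entirely synthetic).
t = np.linspace(0, 8, 1000)
S = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t)]   # independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])             # unknown mixing matrix
X = S @ A.T                                        # observed mixtures

def fast_ica(X, n_iter=200):
    """Minimal symmetric FastICA with a tanh contrast function."""
    Xc = X - X.mean(axis=0)
    # whitening: rotate and scale so the data have identity covariance
    U, d, _ = np.linalg.svd(np.cov(Xc.T))
    Z = Xc @ (U / np.sqrt(d))
    W = rng.normal(size=(2, 2))
    for _ in range(n_iter):
        G = np.tanh(Z @ W.T)
        W = (G.T @ Z) / len(Z) - np.diag((1 - G ** 2).mean(axis=0)) @ W
        # symmetric decorrelation keeps the unmixing rows orthonormal
        U2, _, V2 = np.linalg.svd(W)
        W = U2 @ V2
    return Z @ W.T

S_hat = fast_ica(X)   # recovered sources, up to order, sign and scale
```

The recovered components match the true sources up to permutation and sign, which is why ICA needs no a priori assumption about which voxels activate or when.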

  3. Neutron Activation Analysis and High Resolution Gamma-Ray Spectrometry Applied to Areal Elemental Distribution Studies

    International Nuclear Information System (INIS)

    Schuiling (1967) applied both 'metallogenetic province' and continental drift principles to a study of the world-wide distribution of tin. A plot of tin deposit occurrences on the continents reconstituted as 'Pangaea' yielded 'tin belts' joining intercontinentally between the Americas, Africa and Europe. Discussions with Sir John Cockcroft and Sir Edward Bullard, in April 1967, led to this study of the applicability of automated, instrumental thermal neutron activation analysis techniques to large-scale areal elemental distribution determinations related to continental drift and to metallogenesis. The Enchanted Rock batholith, Llano, Texas, was selected as an initial area in which to apply this method on the basis of the availability of independent geochemical information concerning the pluton from Hutchinson (1956), Billings (1963) and Ragland (1968). Rock samples, including points from areas outside the batholith, were obtained at each of 16 sampling sites. One-gram rock samples were irradiated in a thermal neutron flux of ≈2 x 10^12 n/cm^2·s for 2 hours. Six trace elements (Hf, Ta, Co, Eu, Sc and La), and one minor element (Fe), were determined by gamma-ray spectrometry utilizing a 19 cm^3 Ge(Li) detector and a 3200-channel analyser, and were areally mapped. The results indicate continuous trends in each trace element, through various rock types, over a distance of greater than 50 miles. The trace elements of pyrite, chalcopyrite and sphalerite obtained from the Philippine Islands were measured in order to apply this procedure to minerals in a location where their areal extent has not previously been extensively studied. The methodology described above was repeated. A set of average element abundances in chalcopyrite, pyrite and sphalerite is suggested on which to base the presence or absence of an element province or combined element provinces. Preliminary results indicate the presence of a gold province in the northwestern part of Luzon Island. This technique

  4. Spatial Prediction of Soil Aggregate Stability and Aggregate-Associated Organic Carbon Content at the Catchment Scale Using Geostatistical Techniques

    Institute of Scientific and Technical Information of China (English)

    J.MOHAMMADI; M.H.MOTAGHIAN

    2011-01-01

    The association of organic carbon with secondary particles (aggregates) results in its storage and retention in soil. A study was carried out at a catchment covering about 92 km2 to predict the spatial variability of soil water-stable aggregates (WSA), mean weight diameter (MWD) of aggregates and organic carbon (OC) content in macro- (> 2 mm), meso- (1-2 mm), and micro-aggregate (< 1 mm) fractions, using geostatistical methods. One hundred and eleven soil samples were collected at the 0-10 cm depth and fractionated into macro-, meso-, and micro-aggregates by wet sieving. The OC content was determined for each fraction. A greater percentage of water-stable aggregates was found for micro-aggregates, followed by meso-aggregates. Aggregate OC content was greatest in meso-aggregates (9 g kg-1), followed by micro-aggregates (7 g kg-1), while the least OC content was found in macro-aggregates (3 g kg-1). Although a significant effect (P = 0.000) of aggregate size on aggregate OC content was found, our findings did not support the model of aggregate hierarchy. Land use had a significant effect (P = 0.073) on aggregate OC content. The coefficients of variation (CVs) for OC contents associated with each aggregate fraction indicated macro-aggregates as the most variable (CV = 71%). Among the aggregate fractions, the micro-aggregate fraction had the lowest CV value, of 27%. The mean content of WSA ranged from 15% for macro-aggregates to 84% for micro-aggregates. Geostatistical analysis showed that the measured soil variables exhibited differences in their spatial patterns in both magnitude and space at each aggregate size fraction. The relative nugget variance for most aggregate-associated properties was lower than 45%. The range value for the variogram of water-stable aggregates was almost the same (about 3 km) for the three studied aggregate size classes. The range value for the variogram of aggregate-associated OC contents ranged from about 3 km for macro

  5. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    Full Text Available Abstract Background The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a
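The Monte Carlo bias-analysis recipe — assign probability distributions to bias parameters, draw from them, adjust the conventional estimate, repeat, and summarize the resulting frequency distribution — can be sketched for a single unmeasured confounder. All distributions and the starting estimate here are illustrative, not the study's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(3)
n_iter = 50_000

# Conventional result to be adjusted (a number in the spirit of the study,
# not the actual Agricultural Health Study estimate).
rr_conventional = 2.6

# Bias model: an unmeasured confounder with sampled prevalences among the
# exposed (p1) and unexposed (p0) and a sampled confounder-disease risk ratio.
p1 = rng.beta(8, 12, n_iter)
p0 = rng.beta(4, 16, n_iter)
rr_cd = rng.lognormal(np.log(2.0), 0.3, n_iter)

# Standard external-adjustment bias factor for an unmeasured confounder,
# applied draw by draw to build the frequency distribution of adjusted RRs.
bias = (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)
rr_adjusted = rr_conventional / bias

lo, med, hi = np.percentile(rr_adjusted, [2.5, 50, 97.5])
print(f"adjusted RR median {med:.2f}, 95% simulation interval [{lo:.2f}, {hi:.2f}]")
```

Because the sampled confounder is more prevalent among the exposed, most draws have a bias factor above 1, so the simulation interval is centred below the conventional estimate and is wider than a purely random-error confidence interval.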

  6. Geostatistical Characteristic of Space -Time Variation in Underground Water Selected Quality Parameters in Klodzko Water Intake Area (SW Part of Poland)

    Science.gov (United States)

    Namysłowska-Wilczyńska, Barbara

    2016-04-01

    These data were subjected to spatial analyses using statistical and geostatistical methods. The evaluation of basic statistics of the investigated quality parameters, including histograms of their distributions, scatter diagrams between these parameters and correlation coefficients r, is presented in this article. The directional semivariogram function and the ordinary (block) kriging procedure were used to build the 3D geostatistical model. The geostatistical parameters of the theoretical models of directional semivariograms of the studied water quality parameters, calculated along the time interval and along the well depth (taking into account the terrain elevation), were used in the ordinary (block) kriging estimation. The obtained estimation results, i.e. block diagrams, made it possible to determine the levels of increased values Z* of the studied underground water quality parameters. The analysis of variability in the selected quality parameters of underground water for the analyzed area of the Klodzko water intake was enriched by reference to the results of geostatistical studies carried out for underground water quality parameters, and also for treated water in the Klodzko water supply system (iron Fe, manganese Mn, ammonium ion NH4+ contents), discussed in earlier works. Spatial and time variation in the latter parameters was analysed on the basis of data from 2007-2011 and 2008-2011. Generally, the behaviour of the underground water quality parameters has been found to vary in space and time. Thanks to the spatial analyses of the variation in the quality parameters in the Kłodzko underground water intake area, some regularities (trends) in the variation in water quality have been identified.
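The empirical semivariogram underlying such geostatistical models is simple to compute: average half the squared differences of all data pairs falling within each lag-distance bin, then fit a theoretical model to the result. A numpy sketch on synthetic data (not the Klodzko measurements):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for well measurements: locations (km) and a spatially
# structured parameter value with added measurement noise.
n = 150
pts = rng.uniform(0, 10, size=(n, 2))
z = np.sin(pts[:, 0]) + 0.3 * rng.normal(size=n)

def empirical_semivariogram(pts, z, n_bins=10, h_max=5.0):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over all pairs in each lag bin."""
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    iu = np.triu_indices(len(z), k=1)          # each pair once
    h, sq = d[iu], 0.5 * (z[iu[0]] - z[iu[1]]) ** 2
    edges = np.linspace(0, h_max, n_bins + 1)
    idx = np.digitize(h, edges) - 1
    gamma = np.array([sq[idx == b].mean() for b in range(n_bins)])
    return 0.5 * (edges[:-1] + edges[1:]), gamma

lags, gamma = empirical_semivariogram(pts, z)
```

The intercept of the fitted curve near zero lag estimates the nugget (here, the noise variance), and the distance at which gamma levels off is the range used in kriging.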

  7. [Disinfection of water: on the need for analysis and solution of fundamental and applied problems].

    Science.gov (United States)

    Mokienko, A V

    2014-01-01

    In the paper an analysis is presented of the hygienic (medical and environmental) aspects of water disinfection, as exemplified by chlorine and chlorine dioxide (CD). The concept of persistent multivariate risk for aquatic pathogens, our own view of the mechanism by which bacteria develop chlorine resistance under the influence of biocides, based on a two-step process of information and spatial interaction of the receptor and the substrate, and the hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens have been proposed. The increased significance of halogen-containing compounds (HCC) as byproducts of water chlorination, in terms of their potential danger as toxicants and carcinogens, has been substantiated. Analysis of the hygienic, medical and environmental aspects of the use of chlorine dioxide as a means of water disinfection made it possible to explain the chemistry of its biocidal effect and the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal and algicidal actions, the removal of biofilms, and the formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal and mycocidal action, and to be toxicologically harmless with respect to its influence on the organism of laboratory animals as well as to aquatic organisms receiving discharged disinfected wastewater. The necessity of a close relationship between fundamental and applied research is demonstrated: the former through in-depth study of the microbiological, molecular-genetic and epidemiological problems of water disinfection (chlorination), the latter through the introduction of alternative, including combined, technologies for water treatment and disinfection.

  8. Parameter estimation and determinability analysis applied to Drosophila gap gene circuits

    Directory of Open Access Journals (Sweden)

    Jaeger Johannes

    2008-09-01

    Full Text Available Abstract Background Mathematical modeling of real-life processes often requires the estimation of unknown parameters. Once the parameters are found by means of optimization, it is important to assess the quality of the parameter estimates, especially if parameter values are used to draw biological conclusions from the model. Results In this paper we describe how the quality of parameter estimates can be analyzed. We apply our methodology to assess parameter determinability for gene circuit models of the gap gene network in early Drosophila embryos. Conclusion Our analysis shows that none of the parameters of the considered model can be determined individually with reasonable accuracy due to correlations between parameters. Therefore, the model cannot be used as a tool to infer quantitative regulatory weights. On the other hand, our results show that it is still possible to draw reliable qualitative conclusions on the regulatory topology of the gene network. Moreover, it improves previous analyses of the same model by allowing us to identify those interactions for which qualitative conclusions are reliable, and those for which they are ambiguous.

  9. Discriminant Analysis Applied to the Time—Frequency Energy Vector in Noisy Environment

    Institute of Scientific and Technical Information of China (English)

    TIANYe

    2003-01-01

    Robust speech detection in noisy environments is an important front-end for speech processing tasks such as speech recognition, speech enhancement and speech coding. Parameters frequently used for speech detection, such as the energy in the time domain and the zero-crossing rate, exploit the properties of speech alone. Thus they show poor robustness to background noise. Speech detection in a noisy environment should exploit a parameter with which speech and noise are maximally separable. In this paper, we propose a robust speech detection algorithm with heteroscedastic discriminant analysis (HDA) applied to the time-frequency energy vector (TFEV). The TFEV consists of the log energy in the time domain, the log energy in the fixed band 250-3500 Hz, and the log Mel-scale frequency band energies. Moreover, a bottom-up algorithm with automatic threshold adjustment is used for accurate word boundary detection. Compared to algorithms based on the energy in the time domain, the ATF parameter, the energy, and the LDA-MFCC parameter, the proposed algorithm shows better performance under different types of noise.

  10. The evolution of the Journal of Applied Oral Science: a bibliometric analysis.

    Science.gov (United States)

    Ferraz, Valéria Cristina Trindade; Amadei, José Roberto Plácido; Santos, Carlos Ferreira

    2008-01-01

    The purpose of this study was to make a brief diagnosis of the evolution of the Journal of Applied Oral Science (JAOS) between 2005 and 2007, by reviewing quantitative and qualitative aspects of the articles published in the JAOS within this period. All articles published in the JAOS in the time span established for this survey were analyzed retrospectively, and a discussion was undertaken of the data referring to the main bibliometric indexes of production, authorship, bibliographic sources of the published articles, and the most frequently cited scientific journals in the main dental research fields. A total of 247 papers authored and co-authored by 1,139 contributors were reviewed, most of them original research articles. The number of authors per article was 4.61 on average. Regarding geographic distribution, the authors represented almost all of the Brazilian states. Most published articles belonged to the following dental research fields: Endodontics, Restorative Dentistry, Dental Materials and Prosthodontics. The ranking of the most frequently cited scientific journals included the most reputable publications in these dental research fields. In conclusion, between 2005 and 2007 the JAOS either maintained or considerably improved its bibliometric indexes. The analysis of the data retrieved in this study allowed an evaluation of the journal's current management strategies and identified important issues that will help outline future directions for the internationalization of this journal. PMID:19082402

  11. Superposed epoch analysis applied to large-amplitude travelling convection vortices

    Directory of Open Access Journals (Sweden)

    H. Lühr

    Full Text Available For the six months from 1 October 1993 to 1 April 1994 the recordings of the IMAGE magnetometer network have been surveyed in a search for large-amplitude travelling convection vortices (TCVs). The restriction to large amplitudes (>100 nT) was chosen to ensure proper detection of events also during times of high activity. Readings of all stations of the northern half of the IMAGE network were employed to check the consistency of the ground signature with the notion of a dual-vortex structure moving in an azimuthal direction. Applying these stringent selection criteria we detected a total of 19 clear TCV events. The statistical properties of our selection resemble the expected characteristics of large-amplitude TCVs. New and unexpected results emerged from the superposed epoch analysis. TCVs tend to form during quiet intervals embedded in moderately active periods. The occurrence of events is not randomly distributed but rather shows clustering around a few days. These clusters recur once or twice every 27 days. Within a storm cycle they show up five to seven days after the commencement. With regard to solar wind conditions, we see the events occurring in the middle of the IMF sector structure. Large-amplitude TCVs seem to require certain conditions to make solar wind transients 'geoeffective', which have the tendency to recur with the solar rotation period.

    Key words. Ionosphere (auroral ionosphere; ionosphere-magnetosphere interactions) · Magnetospheric physics (current systems)

  12. Experimental and NMR theoretical methodology applied to geometric analysis of the bioactive clerodane trans-dehydrocrotonin

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Breno Almeida; Firme, Caio Lima, E-mail: firme.caio@gmail.com, E-mail: caiofirme@quimica.ufrn.br [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil). Instituto de Quimica; Maciel, Maria Aparecida Medeiros [Universidade Potiguar, Natal, RN (Brazil). Programa de Pos-graduacao em Biotecnologia; Kaiser, Carlos R. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Instituto de Quimica; Schilling, Eduardo; Bortoluzzi, Adailton J. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Departamento de Quimica

    2014-04-15

    trans-Dehydrocrotonin (t-DCTN), a bioactive 19-nor-diterpenoid of the clerodane type isolated from Croton cajucara Benth, is one of the most investigated clerodanes in the current literature. In this work, a new approach joining X-ray diffraction data, nuclear magnetic resonance (NMR) data and theoretical calculations was applied to the thorough characterization of t-DCTN. For that, the geometry of t-DCTN was reevaluated by X-ray diffraction as well as {sup 1}H and {sup 13}C NMR data, and its geometrical parameters were compared to those obtained from the B3LYP/6-311G++(d,p) level of theory. From the evaluation of both calculated and experimental {sup 1}H and {sup 13}C NMR chemical shifts and spin-spin coupling constants, very good correlations between theoretical and experimental magnetic properties of t-DCTN were found. Additionally, the delocalization indexes between hydrogen atoms correlated accurately with theoretical and experimental spin-spin coupling constants. An additional topological analysis from the quantum theory of atoms in molecules (QTAIM) showed intramolecular interactions for t-DCTN. (author)

  13. Enhancing DInSAR capabilities for landslide monitoring by applying GIS-based multicriteria filtering analysis

    Science.gov (United States)

    Beyene, F.; Knospe, S.; Busch, W.

    2015-04-01

    Landslide detection and monitoring remain difficult with conventional differential radar interferometry (DInSAR) because most pixels of radar interferograms around landslides are affected by different error sources. These are mainly related to the high radar viewing angles and the associated spatial distortions (such as layover and shadow), temporal decorrelation owing to vegetation cover, and the speed and direction of the sliding masses. On the other hand, GIS can be used to integrate spatial datasets obtained from many sources (including radar and non-radar sources). In this paper, a GRID data model is proposed to integrate deformation data derived from DInSAR processing with other radar-derived data (coherence, layover and shadow, slope and aspect, local incidence angle) and external datasets collected from field studies of landslide sites and other sources (geology, geomorphology, hydrology). After coordinate transformation and merging of the data, candidate landslide-representing pixels with high-quality radar signals were filtered out by applying a GIS-based multicriteria filtering analysis (GIS-MCFA), which excludes grid points in areas of shadow and layover, low coherence, non-detectable and non-landslide deformations, and other possible sources of error in the DInSAR data processing. Finally, the results obtained from GIS-MCFA were verified using the external datasets (existing landslide sites collected from fieldwork, geological and geomorphological maps, rainfall data, etc.).

  14. A geostatistical approach to recover the release history of groundwater pollution events; L'approccio geostatistico per la ricostruzione della storia di rilascio di inquinanti in falda

    Energy Technology Data Exchange (ETDEWEB)

    Butera, I.; Tanda, M. G. [Milan Politecnico, Milan (Italy). Dipt. di Ingegneria Idraulica, Ambientale e del Rilevamento

    2001-08-01

    In this work, on the basis of the spatial concentration data available at a given time, the temporal release history of a pollutant is recovered by a geostatistical methodology. The problem at hand belongs to the category of inverse problems, for whose solution different approaches have been proposed in the literature. The methodology adopted in this study was developed by Snodgrass and Kitanidis (1997) for the one-dimensional flow and transport case. In this work the methodology is extended to the case of two-dimensional transport (the one-dimensional transport assumption implies non-negligible approximations, even if the transversal dispersivity is small compared to the longitudinal one). The study, applied to a literature case, considers the quality of the results and the performance of the algorithm used to implement the procedure with regard to: the plume sampling scheme (location and number of the measurement points); the impact of concentration measurement errors; the impact of errors in the estimates of the aquifer parameters (velocity, longitudinal and transversal dispersion coefficients); and erroneous identification of the hydraulic gradient direction. The results of the numerical analysis show that the method provides a plausible description of the release history together with the estimation error variance.

  15. Assessment of ground-water flow and chemical transport in a tidally influenced aquifer using geostatistical filtering and hydrocarbon fingerprinting

    International Nuclear Information System (INIS)

    Traditional environmental investigations at tidally influenced hazardous waste sites such as marine fuel storage terminals have generally failed to characterize ground-water flow and chemical transport because they have been based on only a cursory knowledge of plume geometry, chemicals encountered, and hydrogeologic setting, and on synoptic ground-water-level measurements. Single-time observations cannot be used to accurately determine flow direction and gradient in tidally fluctuating aquifers, since these measurements delineate hydraulic head at only one point in time during a tidal cycle, not the net effect of the fluctuations. In this study, a more rigorous approach was used to characterize flow and chemical transport in a tidally influenced aquifer at a marine fuel storage terminal using: (1) ground-water-level monitoring over three tidal cycles (72 hours), (2) geostatistical filtering of ground-water-level data using 25-hour and 71-hour filtering methods, and (3) hydrocarbon fingerprinting analysis. The results of the study indicate that naphtha released from one of the on-site naphtha tanks has been the predominant contributor to the hydrocarbon plume both on-site and downgradient off-site, and that net ground-water and hydrocarbon movement has been to the southeast, away from the tank, since 1989.
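
    The idea behind the 25-hour filtering method can be illustrated with a synthetic example: averaging hourly water levels over a window spanning roughly two semidiurnal tidal cycles cancels most of the tidal oscillation, leaving the net signal that determines flow direction. The series, trend, and amplitude below are illustrative assumptions, not site data.

```python
import numpy as np

# Synthetic hourly water levels over three days (72 h): a gentle regional
# trend plus a semidiurnal (12.42 h) tidal oscillation of 0.5 m amplitude.
t = np.arange(0.0, 72.0)
level = 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 12.42)

# 25-hour centred moving average: the window spans about two tidal
# cycles, so the oscillation nearly cancels while the trend passes through.
w = np.ones(25) / 25.0
filtered = np.convolve(level, w, mode="valid")   # loses 12 h at each end

# Residual tidal leakage after filtering: small compared with the
# 0.5 m tidal amplitude.
leakage = filtered - 0.01 * t[12:-12]
print(float(np.abs(leakage).max()))
```

    A 71-hour window suppresses the tide even further, at the cost of losing more data at the ends of the record.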

  16. Geostatistical assessment and validation of uncertainty for three-dimensional dioxin data from sediments in an estuarine river.

    Science.gov (United States)

    Barabás, N; Goovaerts, P; Adriaens, P

    2001-08-15

    Contaminated sediment management is an urgent environmental and regulatory issue worldwide. Because remediation is expensive, sound quantitative assessments of uncertainty about the spatial distribution of contaminants are critical, but they are hampered by the physical complexity of sediment environments. This paper describes the use of geostatistical modeling approaches to quantify uncertainty of 2,3,7,8-tetrachlorodibenzo-p-dioxin concentrations in Passaic River (New Jersey) sediments and to incorporate this information in decision-making processes, such as delineation of contaminated areas and additional sampling needs. First, coordinate transformation and analysis of three-dimensional semivariograms were used to describe and model the directional variability, accounting for the meandering course of the river. Then, indicator kriging was employed to provide models of local uncertainty at unsampled locations without requiring a prior transform (e.g. log-normal) of concentrations. Cross-validation results show that the use of probability thresholds leads to more efficient delineation of contaminated areas than a classification based on the exceedence of regulatory thresholds by concentration estimates. Depending on whether additional sampling aims at reducing prediction errors or misclassification rates, the variance of local probability distributions or a measure of the expected closeness to the regulatory threshold can be used to locate candidate locations. PMID:11529567
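
    The indicator approach described above can be sketched in a few lines: concentrations are coded as exceed/not-exceed indicators, and a weighted average of the indicators estimates the probability of exceeding the threshold at an unsampled location. For brevity, the sketch substitutes inverse-distance weights for kriging weights; the concentrations, distances, and threshold are hypothetical.

```python
import numpy as np

# Hypothetical dioxin concentrations (ppt) at five sampled locations and
# their distances (m) to one unsampled target location.
conc = np.array([12.0, 85.0, 40.0, 150.0, 9.0])
dist = np.array([30.0, 15.0, 20.0, 60.0, 45.0])
threshold = 50.0  # assumed regulatory threshold

# Indicator transform: 1 if the sample exceeds the threshold, else 0.
indicator = (conc > threshold).astype(float)

# Stand-in for kriging weights: normalised inverse-distance weights.
w = (1.0 / dist) / np.sum(1.0 / dist)

# Estimated probability of exceeding the threshold at the target ...
prob_exceed = float(np.dot(w, indicator))
# ... versus a classification based on the concentration estimate alone.
conc_estimate = float(np.dot(w, conc))
print(prob_exceed, conc_estimate > threshold)
```

    The paper's point is that decisions based on `prob_exceed` (with a probability cutoff) classify contaminated areas more efficiently than comparing `conc_estimate` directly against the regulatory threshold.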

  17. Geostatistical assessment of the impact of World War I on the spatial occurrence of soil heavy metals.

    Science.gov (United States)

    Meerschman, Eef; Cockx, Liesbet; Islam, Mohammad Monirul; Meeuws, Fun; Van Meirvenne, Marc

    2011-06-01

    Previous research showed a regional Cu enrichment of 6 mg kg(-1) in the top soil of the Ypres war zone (Belgium), caused by corrosion of WWI shell fragments. Further research was required since, in addition to Cu, As, Pb, and Zn were also used in the manufacturing of ammunition. Therefore, an additional data collection was conducted in which the initial Cu data set was tripled to 731 data points and extended to eight heavy metals (As, Cd, Cr, Cu, Hg, Ni, Pb, and Zn), which made it possible (1) to evaluate the environmental impact of the heavy metals at a regional scale and (2) to assess their regional spatial occurrence by performing an optimized geostatistical modeling. The results showed no pollution at a regional scale, but locally concentrations sometimes exceeded the soil sanitation threshold, especially for Cu, Pb, and Zn. The spatial patterns of Ni and Cr were related to variations in soil texture, whereas the occurrences of Cu and Pb were clearly linked to WWI activities. This difference in spatial behavior was confirmed by an analysis of coregionalization.

  18. Geostatistical study of spatial correlations of lead and zinc concentration in urban reservoir. Study case Czerniakowskie Lake, Warsaw, Poland

    Science.gov (United States)

    Fabijańczyk, Piotr; Zawadzki, Jarosław; Wojtkowska, Małgorzata

    2016-07-01

    The article presents a detailed geostatistical analysis of the spatial distribution of lead and zinc concentrations in the water, suspension and bottom sediments of a large urban lake exposed to intensive anthropogenic pressure within a large city. Systematic chemical measurements were performed at eleven cross-sections located along Czerniakowskie Lake, the largest lake in Warsaw, the capital of Poland. During the summer the lake is used as a public bathing area; therefore, to better evaluate human impacts, field measurements were carried out in high-use seasons. It was found that the spatial distributions of aqueous lead and zinc differ between summer and autumn. In summer several Pb and Zn hot-spots were observed, while during autumn the spatial distributions of Pb and Zn were rather homogeneous throughout the entire lake. Large seasonal differences in the spatial distributions of Pb and Zn were found in bottom sediments: autumn concentrations of both heavy metals were ten times higher than summer values. Clear cross-correlations of Pb and Zn concentrations in water, suspension and bottom sediments suggest that both Pb and Zn came to Czerniakowskie Lake from the same source.

  19. The "Discrete Trials" of Applied Behavior Analysis for Children with Autism: Outcome-Related Factors in the Case Law

    Science.gov (United States)

    Choutka, Claire Maher; Doloughty, Patricia T.; Zirkel, Perry A.

    2004-01-01

    This study provides an analysis of case law concerning applied behavior analysis (ABA) for students with autism to determine outcome-related factors. The authors classified the 68 pertinent hearing/review officer and court decisions published in EHLR ("Education for Handicapped Law Report") and IDELR ("Individuals with Disabilities Education Law…

  20. Geo-Statistical Approach to Estimating Asteroid Exploration Parameters

    Science.gov (United States)

    Lincoln, William; Smith, Jeffrey H.; Weisbin, Charles

    2011-01-01

    NASA's vision for space exploration calls for a human visit to a near earth asteroid (NEA). Potential human operations at an asteroid include exploring a number of sites and analyzing and collecting multiple surface samples at each site. In this paper two approaches to formulation and scheduling of human exploration activities are compared given uncertain information regarding the asteroid prior to visit. In the first approach a probability model was applied to determine best estimates of mission duration and exploration activities consistent with exploration goals and existing prior data about the expected aggregate terrain information. These estimates were compared to a second approach or baseline plan where activities were constrained to fit within an assumed mission duration. The results compare the number of sites visited, number of samples analyzed per site, and the probability of achieving mission goals related to surface characterization for both cases.

  1. Analysis of the Possibility of Required Resources Estimation for Nuclear Power Plant Decommissioning Applying BIM

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of construction Technology, Goyang (Korea, Republic of); Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Estimates of decommissioning cost, decommissioning strategy, and decommissioning quantities are mandatory inputs when a decommissioning plan for a nuclear power plant is prepared. Past approaches to estimating the resources required for decommissioning have carried great uncertainty, since they analyzed required resources from construction-stage information or by analyzing and consulting the decommissioning resource data of overseas nuclear power plants. This study aims at analyzing whether the resources required for decommissioning nuclear power plants can be estimated by applying BIM. To achieve this goal, the study analyzed the status quo of BIM, such as its definition, characteristics, and areas of application, and made use of them when drawing out the study results by examining the types and features of the tools realizing BIM. In order to review how BIM could be used for decommissioning nuclear power plants, the definition, characteristics and applied areas of BIM were discussed. BIM designs objects of the structures (walls, slabs, pillars, stairs, windows and doors, etc.) by 3D technology and endows each object with attribute information (function, structure and usage), thereby providing visualized information on structures for participants in construction projects. Major characteristics of BIM attribute information are as follows:
    - Geometry: The information of objects is represented by measurable geometric information.
    - Extensible object attributes: Objects include pre-defined attributes and allow the extension of other attributes. Any model that includes these attributes forms relationships with other various attributes in order to perform analysis and simulation.
    - Integration: All information, including the attributes, is integrated to ensure continuity, accuracy and accessibility, and all information used during the life cycle of structures is supported.
    This means that when information of required resources is added as another attribute other than geometric

  2. Geostatistics from Digital Outcrop Models of Outcrop Analogues for Hydrocarbon Reservoir Characterisation.

    Science.gov (United States)

    Hodgetts, David; Burnham, Brian; Head, William; Jonathan, Atunima; Rarity, Franklin; Seers, Thomas; Spence, Guy

    2013-04-01

    In the hydrocarbon industry, stochastic approaches are the main method by which reservoirs are modelled. These stochastic modelling approaches require geostatistical information on the geometry and distribution of the geological elements of the reservoir. As the reservoir itself cannot be viewed directly (only indirectly via seismic and/or well log data), this leads to a great deal of uncertainty in the geostatistics used; therefore outcrop analogues are characterised to help obtain the geostatistical information required to model the reservoir. Lidar-derived Digital Outcrop Models (DOMs) provide the ability to collect large quantities of statistical information on the geological architecture of the outcrop, far more than is possible by fieldwork alone, as the DOM allows accurate measurements to be made in normally inaccessible parts of the exposure. This increases the size of the measured statistical dataset, which in turn increases statistical significance. There are, however, many problems and biases in the data which cannot be overcome by sample size alone. These biases may relate, for example, to the orientation, size and quality of exposure, as well as the resolution of the DOM itself. Stochastic modelling approaches used in the hydrocarbon industry fall mainly into four generic categories: 1) object modelling, where the geology is defined by a set of simplistic shapes (such as channels) whose parameters, such as width, height and orientation, among others, can be defined; 2) sequential indicator simulation, where geological shapes are less well defined and the size and distribution are defined using variograms; 3) multipoint statistics, where training images are used to define shapes and relationships between geological elements; and 4) discrete fracture networks for fractured reservoirs, where information on fracture size and distribution is required. Examples of using DOMs to assist with each of these modelling approaches are presented, highlighting the

  3. Preliminary risk analysis applied to the handling of health-care waste

    Directory of Open Access Journals (Sweden)

    Carvalho S.M.L.

    2002-01-01

    Full Text Available Between 75% and 90% of the waste produced by health-care providers poses no risk or is "general" health-care waste, comparable to domestic waste. The remaining 10-25% of health-care waste is regarded as hazardous due to one or more of the following characteristics: it may contain infectious agents, sharps, or toxic or hazardous chemicals, or it may be radioactive. Infectious health-care waste, particularly sharps, has been responsible for most of the accidents reported in the literature. In this work the preliminary risk analysis (PRA) technique was used to evaluate practices in the handling of infectious health-care waste. Currently the PRA technique is used to identify and to evaluate the hazard potential of activities, products, and services from facilities and industries. The system studied was a health-care establishment with handling practices for infectious waste. Thirty-six procedures related to segregation, containment, internal collection, and storage operations were analyzed. The severity of the consequences of failure (the risk that can arise from careless management of infectious health-care waste) was classified into four categories: negligible, marginal, critical, and catastrophic. The results obtained in this study showed that about 80% of the events with critical consequences may occur during the containment operation, suggesting the need to prioritize this operation. As a result of the methodology applied in this work, a flowchart of the risk series was also obtained. The flowchart shows the events that can occur as a consequence of improper handling of infectious health-care waste, which can cause critical risks such as injuries from sharps and contamination (infection) by pathogenic microorganisms.

  4. Towards an Analysis of Review Article in Applied Linguistics: Its Classes, Purposes and Characteristics

    Science.gov (United States)

    Azar, Ali Sorayyaei; Hashim, Azirah

    2014-01-01

    The classes, purposes and characteristics associated with the review article in the field of applied linguistics were analyzed. The data were collected from a randomly selected corpus of thirty two review articles from a discipline-related key journal in applied linguistics. The findings revealed that different sub-genres can be identified within…

  5. Image smoothing of multispectral imagery based on the HNN and geo-statistics

    Institute of Scientific and Technical Information of China (English)

    Nguyen Quang Minh

    2011-01-01

    A new method for image down-scaling using geostatistical interpolation or smoothing based on the Hopfield Neural Network (HNN) and zero semivariance values is introduced. The method utilises the smoothing effect of the semivariogram matching process to produce a smoothed sub-pixel multispectral (MS) image with smaller RMSEs in comparison with bilinear interpolation. In effect, the zero semivariograms increase the spatial correlation between adjacent sub-pixels of the super-resolution image. Containing higher spatial correlation, the resulting super-resolution MS image has smaller RMSEs compared with the original coarse image.

  6. New advances in methodology for statistical tests useful in geostatistical studies

    Energy Technology Data Exchange (ETDEWEB)

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypothesis pertaining to various aspects of geostatistical investigations has been slow in developing. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
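
    Modified t tests of the kind described typically discount the nominal sample size for intercorrelation: an effective sample size replaces n in the test statistic and its degrees of freedom. The following is a minimal sketch of one common correction based on summing the positive-lag autocorrelations (an illustrative variant, not necessarily Borgman's exact formulation):

```python
import numpy as np

def effective_n(x, max_lag=None):
    """Effective sample size for an autocorrelated series:
    n_eff = n / (1 + 2 * sum of positive-lag sample autocorrelations),
    truncating the sum at the first non-positive lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = np.dot(xc, xc)
    max_lag = max_lag or n // 4
    rho_sum = 0.0
    for k in range(1, max_lag + 1):
        rho = np.dot(xc[:-k], xc[k:]) / denom
        if rho <= 0:
            break
        rho_sum += rho
    return n / (1.0 + 2.0 * rho_sum)

# AR(1)-like correlated data: the nominal n greatly overstates the
# information content, so classical t tests would be too liberal.
rng = np.random.default_rng(0)
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.8 * x[t - 1] + e[t]

n_eff = effective_n(x)
print(len(x), round(n_eff))   # n_eff is much smaller than 500
```

    Using n_eff in place of n inflates the standard error of the mean appropriately, which is the essence of adapting the classical t test to spatially or serially correlated data.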

  7. Evaluation of geostatistical parameters based on well tests; Estimation de parametres geostatistiques a partir de tests de puits

    Energy Technology Data Exchange (ETDEWEB)

    Gauthier, Y.

    1997-10-20

    Geostatistical tools are increasingly used to model permeability fields in subsurface reservoirs, which are treated as realizations of a random field characterized by several geostatistical parameters such as variance and correlation length. The first part of the thesis is devoted to the study of the relations existing between the transient well pressure (the well test) and the stochastic permeability field, using the apparent permeability concept. The well test performs a moving permeability average over larger and larger volumes with increasing time. In the second part, the geostatistical parameters are evaluated using well test data; a Bayesian framework is used and parameters are estimated using the maximum likelihood principle, by maximizing the probability density function of the well test data with respect to these parameters. This method, involving a fast well-test evaluation, provides an estimate of the correlation length and the variance over different realizations of a two-dimensional permeability field.
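
    The maximum likelihood principle used here can be sketched for a much simpler case: given observations of a Gaussian random field with an exponential covariance, the variance and correlation length are estimated by maximizing the Gaussian likelihood (here by a plain grid search on synthetic one-dimensional data; the thesis works on well-test pressure responses rather than direct field observations, and all values below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample locations along a 1-D transect and an exponential covariance model.
x = np.linspace(0.0, 10.0, 40)
d = np.abs(x[:, None] - x[None, :])

def exp_cov(d, var, length):
    return var * np.exp(-d / length)

# Synthetic log-permeability data drawn with known parameters.
true_var, true_len = 1.0, 2.0
z = rng.multivariate_normal(np.zeros(len(x)), exp_cov(d, true_var, true_len))

def neg_log_likelihood(var, length):
    """Negative Gaussian log-likelihood (constants dropped)."""
    C = exp_cov(d, var, length) + 1e-8 * np.eye(len(x))  # jitter for stability
    sign, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + z @ np.linalg.solve(C, z))

# Maximum-likelihood estimate by a simple grid search over both parameters.
variances = np.linspace(0.2, 3.0, 30)
lengths = np.linspace(0.5, 5.0, 30)
nll = np.array([[neg_log_likelihood(v, l) for l in lengths] for v in variances])
i, j = np.unravel_index(np.argmin(nll), nll.shape)
print(variances[i], lengths[j])
```

    In practice a gradient-based optimizer replaces the grid search, and the likelihood is evaluated through a forward model of the well test rather than directly on the field values.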

  8. Evaluation of Contaminations and Sources of Heavy Metals in Sediments at Samrak Delta of Nakdong River in Busan, Korea Using Geostatistical Methods

    Science.gov (United States)

    Chung, Sang Yong; Senapathi, Venkatramanan; Khakimov, Elyor; Selvam, Sekar; Oh, Yun Yeong

    2016-04-01

    This research used several geostatistical methods to assess heavy metal contamination and its sources in sediments at the Samrak Delta of the Nakdong River in Busan, Korea. The mean concentrations of heavy metals in the sediments were Fe (16.42%), Al (15.56%), Mn (0.31%), Zn, Pb, Cr (0.03%), Ni (0.02%) and Cu (0.008%), which were mainly attributed to intense industrial and irrigation activities, and also to geogenic sources. Groundwater in the sediments also contains high concentrations of heavy metals such as Fe and Mn. Canonical correlation analysis (CCA) exhibited a significant relationship between physicochemical parameters (sand, silt, clay, TOC, CaCO3) and heavy metals (Fe, Al, Mn, Zn, Pb, Cr, Ni, Cu), and showed the importance of physicochemical parameters in regulating the amount of heavy metals in sediments. An artificial neural network (ANN) showed good correlation and model efficiency for simulated outputs except for Fe, Pb and Zn. Silt, clay, TOC and CaCO3 controlled the concentrations of heavy metals in the sediments. Principal component analysis (PCA) produced two factor loadings: PCA 1 (Fe, Mn, Pb, TOC, Cr, silt and Al) with 75.4% of the variance, and PCA 2 (Cu, Ni, Zn and CaCO3) with 24.6% of the variance. This suggested that the heavy metals originated from geogenic sources and from industrial effluents. Cluster analysis (CA) was helpful for classifying the contamination sources of the heavy metals. This study suggests that geostatistical techniques are essential for the effective management of heavy metal contamination and for policy decision-making processes to reduce the contamination level of heavy metals in the deltaic region.
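
    The role PCA plays here, splitting the metals into factor groups tied to different sources, can be illustrated on toy data in which two latent "sources" drive two groups of correlated variables. The variable groupings and noise level below are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for heavy-metal concentrations at 50 sites:
# two latent sources each generate a correlated group of variables.
geogenic = rng.normal(size=50)
industrial = rng.normal(size=50)

def noise():
    return 0.2 * rng.normal(size=50)

X = np.column_stack([
    geogenic + noise(), geogenic + noise(),       # "texture-driven" group
    industrial + noise(), industrial + noise(),   # "effluent-driven" group
])

# PCA on standardised data via SVD of the centred, scaled matrix.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(explained[:2].sum())   # two components capture most of the variance
```

    The loadings in the rows of `Vt` then show which variables load on which component, mirroring the PCA 1 / PCA 2 split reported in the abstract.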

  9. Geographical analysis of parking land use in Genaveh applying AHP Model

    Directory of Open Access Journals (Sweden)

    Gh. Hosseini Lagha

    2012-01-01

    Full Text Available Extended Abstract 1- Introduction: Each year Genaveh Port receives millions of tourists (2,302,154 people during the Nowruz holidays in 1390) from all over the country, drawn by its tourist attractions and trade and recreation centers. This huge influx has created several traffic problems, rooted mainly in the shortage or incorrect positioning of parking lots in this city; the problem peaks when tourism booms in the holiday seasons. This applied study uses a descriptive-analytical research design. For field data, a questionnaire with a sample size of 320 participants (100 citizens and 220 tourists) was administered, and SPSS was used for further analysis. After assessing the area and the number of required parking lots using parking-building methods, the criteria influential in siting public parking lots were weighted through the analytic hierarchy process (AHP) in Arc GIS software, and appropriate places for launching parking lots were then identified by the OWA (Ordered Weighted Average) fuzzy method. Results show that, given the need for 1,863 parking lots in addition to the current lots in the studied area on a single day, there is no logical relation between the demanded and the existing parking lots in the city of Genaveh. Nevertheless, the present parking lots are appropriately located with regard to geographical standards, but their number is limited; hence, considering the characteristics of this city, the most logical option is the use of smart parking. 2- Theoretical Bases - Traffic: Traffic is an international term; it refers to the movement of transportation vehicles and pedestrians on roads, formed by the three elements of humans, vehicles and roads (Rezaei, 1369, p. 7). - Car stop surface: The average surface to
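
    The AHP weighting step mentioned above amounts to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio. A minimal sketch with a hypothetical three-criteria matrix (the criteria and judgments are invented for illustration, not taken from the study):

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three siting criteria
# (e.g. land price, distance to demand, access), on Saaty's 1-9 scale.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

# AHP priority weights: principal eigenvector of A, normalised to sum 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CI / RI, with CI = (lambda_max - n) / (n - 1)
# and random index RI = 0.58 for n = 3; CR < 0.1 means judgments
# are acceptably consistent.
lam = eigvals.real[k]
n = A.shape[0]
CR = (lam - n) / (n - 1) / 0.58
print(w, CR)
```

    The resulting weights would then multiply the criterion raster layers in the GIS overlay before the OWA aggregation.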

  10. Decoding dynamic brain patterns from evoked responses: A tutorial on multivariate pattern analysis applied to time-series neuroimaging data

    OpenAIRE

    Grootswagers, Tijl; Wardle, Susan G.; Carlson, Thomas A.

    2016-01-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analysing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces (BCI), these methods have only recently been applied to time-series neuroimaging data such as MEG and EEG to address experimental questions in Cognitive Neuroscience. In a tutorial-style review, we describe a broad set of options to inform future time-series decoding studies from a Cognitive N...

  11. Geostatistics project of the National Uranium Resource Evaluation program

    International Nuclear Information System (INIS)

    Additional work has been done to display radiometric data from the Lubbock quadrangle in pseudocolor maps. A digitized topographic map of the quadrangle was obtained from the U.S. Geological Survey, and this is being incorporated into the study of the radiometric data. Single-record data from the Lake Mead calibration range and from the Slayton test line have been obtained from Geometrics, Inc., and analysis of these data has begun. Principal component analyses have been used to investigate the relationship of geological formation to the location of points in a (Tl, Bi, K) coordinate system. LASL personnel attended a workshop in Grand Junction, Colorado, where some of the problems of calibrating aerial gamma-ray spectrometers were addressed.

  12. A geostatistical approach for describing spatial pattern in stream networks

    Science.gov (United States)

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. On the other hand, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain if the pattern in the empirical variogram is non-random.
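
    The key computational point, building an empirical variogram from a non-Euclidean (along-network) distance, only requires that pairwise distances be supplied as a matrix rather than derived from coordinates. A minimal sketch with hypothetical abundance values at four sites on a small network:

```python
import numpy as np

def empirical_variogram(values, dist, lags):
    """Empirical semivariogram for an arbitrary distance metric supplied
    as a matrix, e.g. distances measured along stream-network pathways."""
    gamma, counts = [], []
    iu = np.triu_indices(len(values), k=1)          # all unordered pairs
    d = dist[iu]
    sq = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2  # semivariance per pair
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi)
        counts.append(int(mask.sum()))
        gamma.append(float(sq[mask].mean()) if mask.any() else np.nan)
    return np.array(gamma), np.array(counts)

# Toy example: four sites; distances are along-network path lengths
# (km), not straight-line distances; abundance values are hypothetical.
dist = np.array([[0., 2., 3., 5.],
                 [2., 0., 1., 3.],
                 [3., 1., 0., 2.],
                 [5., 3., 2., 0.]])
vals = np.array([4.0, 5.0, 5.5, 7.0])
gamma, n = empirical_variogram(vals, dist, lags=np.array([0., 2., 4., 6.]))
print(gamma, n)
```

    An increasing `gamma` with lag, as in this toy output, is the non-random spatial pattern the authors look for; whether it differs from noise is then checked with the non-parametric randomization test they describe.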

  13. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    Science.gov (United States)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

    For the inversion procedure a genetic algorithm (GA) was used, with specific features such as elitism, a roulette-wheel selection operator and the island model implemented. Optimization was based on the water content measurements recorded at several depths. Ten scenarios were elaborated and applied to the two lysimeters in order to investigate the impact of the conceptual model, in terms of process description (mechanistic or compartmental) and geometry (number of horizons in the profile description), on the calibration accuracy. Calibration leads to good agreement with the measured water contents. The most critical parameters for improving the goodness of fit are the number of horizons and the type of process description. The best fit is found for a mechanistic model with 5 horizons, resulting in absolute differences between observed and simulated water contents of less than 0.02 cm3 cm-3 on average. Parameter estimation analysis shows that layer thicknesses are poorly constrained, whereas hydraulic parameters are much better defined.
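
    A genetic algorithm with elitism and roulette-wheel selection, as described, can be sketched on a toy two-parameter objective standing in for the water-content misfit. Everything below (objective, population size, mutation rate) is illustrative; the island model is omitted for brevity.

```python
import random

random.seed(42)

# Toy objective standing in for the misfit between simulated and
# measured water contents: minimise (a - 2)^2 + (b + 1)^2, expressed
# as a fitness to maximise.
def fitness(ind):
    a, b = ind
    return 1.0 / (1.0 + (a - 2.0) ** 2 + (b + 1.0) ** 2)

def roulette_select(pop, fits):
    """Roulette-wheel selection: pick proportionally to fitness."""
    r = random.uniform(0.0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

pop = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(40)]
for gen in range(100):
    fits = [fitness(ind) for ind in pop]
    elite = max(pop, key=fitness)            # elitism: best survives intact
    nxt = [elite[:]]
    while len(nxt) < len(pop):
        p1 = roulette_select(pop, fits)
        p2 = roulette_select(pop, fits)
        cut = random.randint(0, 1)           # one-point crossover
        child = p1[:cut + 1] + p2[cut + 1:]
        if random.random() < 0.2:            # Gaussian mutation
            k = random.randrange(2)
            child[k] += random.gauss(0.0, 0.3)
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print(best)   # converges towards (2, -1)
```

    In the actual calibration the "individual" would hold the hydraulic parameters of each horizon, and the fitness would come from running the forward soil-water model against the measured water-content profiles.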

  14. Building a geological reference platform using sequence stratigraphy combined with geostatistical tools

    Science.gov (United States)

    Bourgine, Bernard; Lasseur, Éric; Leynet, Aurélien; Badinier, Guillaume; Ortega, Carole; Issautier, Benoit; Bouchet, Valentin

    2015-04-01

    In 2012 BRGM launched an extensive program to build the new French Geological Reference platform (RGF). Among the objectives of this program is to provide the public with validated, reliable and 3D-consistent geological data, together with estimates of uncertainty. Approximately 100,000 boreholes over the whole French national territory come with a preliminary interpretation in terms of the depths of the main geological interfaces, but with an unchecked, unknown and often low reliability. The aim of this paper is to present the procedure that has been tested on two areas in France in order to validate (or not) these boreholes, with the aim of being generalized as much as possible to the nearly 100,000 boreholes awaiting validation. The approach is based on the following steps, and includes the management of uncertainty at each one: (a) Selection of a loose network of boreholes with logging or coring information enabling a reliable interpretation. This first interpretation is based on the correlation of well-log data and allows defining a 3D sequence-stratigraphic framework identifying isochronous surfaces. A litho-stratigraphic interpretation is also performed. Let "A" be the collection of all boreholes used for this step (typically 3 % of the total number of holes to be validated) and "B" the other boreholes to validate. (b) Geostatistical analysis of characteristic geological interfaces. The analysis is carried out first on the "A" type data (to validate the variogram model), then on the "B" type data and finally on "B" knowing "A". It is based on cross-validation tests and evaluation of the uncertainty associated with each geological interface. In this step, we take into account inequality constraints provided by boreholes that do not intersect all interfaces, as well as the "litho-stratigraphic pile" defining the formations and their relationships (depositional surfaces or erosion). The goal is to identify quickly and semi-automatically potential errors among the data, up to
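The cross-validation idea in step (b) can be illustrated with a minimal leave-one-out sketch: predict each borehole's interface depth from all the others and flag large residuals as suspect. Here inverse-distance weighting stands in for the kriging estimator actually used, and all coordinates and depths are invented:

```python
import numpy as np

def loo_crossval(xy, z, power=2.0):
    """Leave-one-out cross-validation of interface depths.

    Each hole is predicted from all the others by inverse-distance
    weighting (a stand-in for kriging); large residuals flag
    potentially erroneous borehole interpretations.
    """
    n = len(z)
    pred = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        d = np.linalg.norm(xy[mask] - xy[i], axis=1)
        w = 1.0 / np.maximum(d, 1e-12) ** power
        pred[i] = np.dot(w, z[mask]) / w.sum()
    return pred, z - pred

xy = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5]])
z = np.array([10., 10., 10., 10., 50.])   # the central hole is suspect
pred, resid = loo_crossval(xy, z)
```

A validation workflow would then review the holes with the largest absolute residuals rather than discard them automatically.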

  15. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study

    Science.gov (United States)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-05-01

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was carried out, listing the advantages and disadvantages of each method. All the methods were validated according to the ICH guidelines, and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.
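The arithmetic behind ratio subtraction, the first method named above, can be sketched numerically: dividing the mixture spectrum by a divisor spectrum of one component turns that component's contribution into a constant plateau, which is then subtracted. The spectra and band shapes below are invented Gaussians for illustration only, not the real absorbance data of the two drugs:

```python
import numpy as np

# Hypothetical spectra: component X extends to wavelengths where Y does not absorb.
lam = np.linspace(220, 320, 101)
a_x = np.exp(-((lam - 270.0) / 25.0) ** 2)   # divisor spectrum of X (invented)
a_y = np.exp(-((lam - 240.0) / 10.0) ** 2)   # Y absorbs only at short wavelengths
mix = 2.0 * a_x + 1.5 * a_y                  # mixture = c_x*X + c_y*Y

ratio = mix / a_x                            # = c_y * a_y/a_x + c_x (constant)
plateau = ratio[lam > 290].mean()            # region where only X absorbs
recovered_y = (ratio - plateau) * a_x        # ~ c_y * a_y, Y's spectrum recovered
```

The recovered single-component spectrum can then be quantified against a calibration curve at Y's absorbance maximum.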

  16. Analysis of design parameters for crosstalk cancellation filters applied to different loudspeaker configurations

    DEFF Research Database (Denmark)

    Lacouture Parodi, Yesenia; Rubak, Per

    2011-01-01

    Several approaches to render binaural signals through loudspeakers have been proposed in the past decades. Some studies have focused on the optimum loudspeaker arrangement whereas others have proposed more efficient filters. However, to our knowledge, the identification of optimum parameters for crosstalk cancellation filters applied to different loudspeaker configurations has not yet been addressed systematically. A study of three different inversion techniques applied to several loudspeaker arrangements is documented. Least-squares approximations in the frequency and time domains are evaluated...
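The frequency-domain least-squares inversion named above is commonly formulated per frequency bin as a regularized pseudo-inverse of the plant matrix C(w) (loudspeaker-to-ear transfer functions). The following sketch of that standard formulation assumes a generic 2x2 plant and a regularization constant beta; it is an illustration, not the filters evaluated in the paper:

```python
import numpy as np

def ls_crosstalk_filters(C, beta=1e-3):
    """Regularized least-squares crosstalk cancellation, per frequency bin:
    H(w) = (C^H C + beta*I)^{-1} C^H, so that C @ H approximates I.

    C has shape (n_freq, n_ears, n_speakers); beta trades cancellation
    accuracy against filter effort at ill-conditioned frequencies.
    """
    n_freq, n_ears, n_src = C.shape
    H = np.empty((n_freq, n_src, n_ears), dtype=complex)
    I = np.eye(n_src)
    for k in range(n_freq):
        Ck = C[k]
        H[k] = np.linalg.solve(Ck.conj().T @ Ck + beta * I, Ck.conj().T)
    return H

# toy 2x2 plant at one bin: ipsilateral gain 1, contralateral leakage 0.5
C = np.array([[[1.0, 0.5], [0.5, 1.0]]], dtype=complex)
H = ls_crosstalk_filters(C, beta=1e-6)
achieved = C[0] @ H[0]   # should be close to the identity matrix
```

A time-domain least-squares design instead minimizes the error of the convolved impulse responses, at the cost of a larger matrix inversion.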

  17. Comparative analysis of several sediment transport formulations applied to dam-break flows over erodible beds

    Science.gov (United States)

    Cea, Luis; Bladé, Ernest; Corestein, Georgina; Fraga, Ignacio; Espinal, Marc; Puertas, Jerónimo

    2014-05-01

    Transitory flows generated by dam failures have a great sediment transport capacity, which induces important morphological changes in the river topography. Several studies have been published on the coupling between the sediment transport and hydrodynamic equations in dam-break applications, in order to correctly model their mutual interaction. Most of these models solve the depth-averaged shallow water equations to compute the water depth and velocity, while a wide variety of sediment transport formulations have been used, somewhat arbitrarily, to compute the evolution of the topography. These are based on semi-empirical equations which have been calibrated under stationary and uniform conditions very different from those reached in dam-break flows. Soares-Frazão et al. (2012) proposed a benchmark test consisting of a dam break over a mobile bed, in which several teams of modellers participated using different numerical models, and concluded that the key issue which still needs to be investigated in the morphological modelling of dam-break flows is the link between the solid transport and the hydrodynamic variables. This paper presents a comparative analysis of different sediment transport formulations applied to dam-break flows over mobile beds. All the formulations analysed are commonly used in morphological studies of rivers, and include the formulas of Meyer-Peter & Müller (1948), Wong-Parker (2003), Einstein-Brown (1950), van Rijn (1984), Engelund-Hansen (1967), Ackers-White (1973), Yang (1973), and a Meyer-Peter & Müller type formula with ad-hoc coefficients. The relevance of corrections to the sediment flux direction and magnitude due to the bed slope and the non-equilibrium hypothesis is also analysed. All the formulations have been implemented in the numerical model Iber (Bladé et al. (2014)), which solves the depth-averaged shallow water equations coupled to the Exner equation to evaluate the bed evolution. Two different test cases have been
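The first formula in the list above, Meyer-Peter & Müller (1948), is a threshold-type bedload relation driven by the excess Shields stress. A sketch with the classical coefficients (the paper also tests an MPM-type formula with ad-hoc coefficients) and an invented shear-stress value:

```python
import numpy as np

def mpm_bedload(tau, d50, rho=1000.0, rho_s=2650.0, g=9.81,
                theta_c=0.047, coef=8.0):
    """Meyer-Peter & Mueller (1948) bedload transport rate per unit width.

    tau: bed shear stress [Pa]; d50: median grain diameter [m].
    Returns q_b [m^2/s]. coef=8 and theta_c=0.047 are the classical values;
    calibrated variants replace them with ad-hoc coefficients.
    """
    R = (rho_s - rho) / rho                        # submerged specific gravity
    theta = tau / ((rho_s - rho) * g * d50)        # Shields parameter
    excess = np.maximum(theta - theta_c, 0.0)      # no transport below threshold
    return coef * excess ** 1.5 * np.sqrt(R * g * d50 ** 3)

q = mpm_bedload(tau=5.0, d50=0.002)   # 2 mm gravel under 5 Pa of shear
```

In a coupled model such as those compared here, q_b feeds the Exner equation, whose divergence updates the bed elevation each time step.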

  18. Timescape: a simple space-time interpolation geostatistical Algorithm

    Science.gov (United States)

    Ciolfi, Marco; Chiocchini, Francesca; Gravichkova, Olga; Pisanelli, Andrea; Portarena, Silvia; Scartazza, Andrea; Brugnoli, Enrico; Lauteri, Marco

    2016-04-01

    Environmental sciences include both time and space variability in their datasets. Some established tools exist for both spatial interpolation and time series analysis alone, but mixing space and time variability calls for compromise: researchers are often forced to choose which is the main source of variation, neglecting the other. We propose a simple algorithm, which can be used in many fields of Earth and environmental sciences when both time and space variability must be considered on equal grounds. The algorithm has already been implemented in Java language and the software is currently available at https://sourceforge.net/projects/timescapeglobal/ (it is published under GNU-GPL v3.0 Free Software License). The published version of the software, Timescape Global, is focused on continent- to Earth-wide spatial domains, using global longitude-latitude coordinates for sample localization. The companion Timescape Local software is currently under development and will be published with an open license as well; it will use projected coordinates for a local to regional space scale. The basic idea of the Timescape Algorithm consists of converting time into a sort of third spatial dimension, with the addition of some causal constraints, which drive the interpolation including or excluding observations according to some user-defined rules. The algorithm is applicable, as a matter of principle, to anything that can be represented with a continuous variable (a scalar field, technically speaking). The input dataset should contain position, time and observed value of all samples. Ancillary data can be included in the interpolation as well. After the time-space conversion, Timescape basically follows the old-fashioned IDW (Inverse Distance Weighted) interpolation algorithm, although users have a wide choice of customization options that, at least partially, overcome some of the known issues of IDW. The three-dimensional model produced by the Timescape Algorithm can be
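The core idea, time scaled into a third "spatial" axis plus a causal constraint, can be sketched in a few lines. The scaling scheme, function names and data below are assumptions for illustration, not the published Java implementation:

```python
import numpy as np

def spacetime_idw(query_xyt, obs_xyt, obs_val, time_scale=1.0, power=2.0):
    """IDW over a space-time distance, in the spirit of the Timescape idea.

    time_scale converts the time axis into distance units so that it can be
    treated as a third spatial dimension; the causal constraint excludes
    observations from the query point's future.
    """
    qx, qy, qt = query_xyt
    past = obs_xyt[:, 2] <= qt                      # causal constraint
    pts, vals = obs_xyt[past], obs_val[past]
    d = np.sqrt((pts[:, 0] - qx) ** 2 + (pts[:, 1] - qy) ** 2
                + (time_scale * (pts[:, 2] - qt)) ** 2)
    if np.any(d < 1e-12):
        return vals[np.argmin(d)]                   # exact sample hit
    w = 1.0 / d ** power
    return np.dot(w, vals) / w.sum()

# three invented observations: (x, y, t) and values
obs = np.array([[0., 0., 0.], [1., 0., 0.], [0., 0., 5.]])
vals = np.array([1.0, 3.0, 9.0])
est = spacetime_idw((0.5, 0.0, 1.0), obs, vals, time_scale=1.0)
```

Note how the observation at t = 5 is ignored for a query at t = 1: only past samples contribute, which is the causal rule described above.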

  19. Development of a sampling strategy for young stands of Eucalyptus sp. using geostatistics

    Directory of Open Access Journals (Sweden)

    Adriana Leandra de Assis

    2009-06-01

    This work evaluated the potential of a geostatistical interpolator for defining strata and for comparing the stratification generated by the interpolator with stratification based on data records, on the basis of sampling error. Data were collected from a clonal stand of eucalyptus encompassing 164.08 ha and located in the municipality of Aracruz, Espírito Santo state. In 2003, 49 plots were allocated and in 2004 another 50 plots were distributed systematically in the area. In 2005, all plots were remeasured. The characteristic evaluated each measurement year was volume outside bark. Spherical and exponential models were fitted to the experimental semivariograms using the maximum likelihood method. The model selected, following the Akaike Information Criterion, was the exponential model. Based on the degree of spatial dependence (SD), it was possible to assess the spatial continuity structure of the characteristic of interest. The variable volume outside bark was found to be spatially structured in all measurement years and the degree of spatial dependence varied according to forest age. This indicates that statistical analyses should consider the spatial component in the inference process at the ages considered in this study, in particular for area classification based on yield. Results showed that the geostatistical interpolator can be used for establishing strata and locating permanent sample plots in young stands of Eucalyptus sp.
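The exponential semivariogram model selected above, and the degree-of-spatial-dependence index used to judge spatial structure, have simple closed forms. A sketch with invented nugget/sill/range values (using the common practical-range convention; the exact parameterization in the study may differ):

```python
import numpy as np

def exponential_variogram(h, nugget, sill, a_range):
    """Exponential semivariogram model:
    gamma(h) = nugget + (sill - nugget) * (1 - exp(-3h / a_range)).
    With this practical-range convention, gamma reaches ~95% of the
    sill at h = a_range.
    """
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * np.asarray(h) / a_range))

def spatial_dependence_degree(nugget, sill):
    """Nugget-to-sill ratio in percent, a common degree-of-spatial-dependence
    index: <25% strong, 25-75% moderate, >75% weak spatial dependence."""
    return 100.0 * nugget / sill

g = exponential_variogram([0.0, 10.0], nugget=0.2, sill=1.0, a_range=10.0)
sd = spatial_dependence_degree(0.2, 1.0)   # strong spatial dependence
```

In a stratification workflow like the one described, the fitted model feeds a kriged yield map, whose classes define the candidate strata.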

  20. Acceleration of the Geostatistical Software Library (GSLIB) by code optimization and hybrid parallel programming

    Science.gov (United States)

    Peredo, Oscar; Ortiz, Julián M.; Herrero, José R.

    2015-12-01

    The Geostatistical Software Library (GSLIB) has been used in the geostatistical community for more than thirty years. It was designed as a bundle of sequential Fortran codes, and today it is still in use by many practitioners and researchers. Despite its widespread use, few attempts have been reported to bring this package to the multi-core era. Using all CPU resources, GSLIB algorithms can handle large datasets and grids, where tasks are compute- and memory-intensive applications. In this work, a methodology is presented to accelerate GSLIB applications using code optimization and hybrid parallel processing, specifically for compute-intensive applications. Minimal code modifications are added, decreasing as much as possible the elapsed execution time of the studied routines. If multi-core processing is available, the user can activate OpenMP directives to speed up the execution using all resources of the CPU. If multi-node processing is available, the execution is enhanced using MPI messages between the compute nodes. Four case studies are presented: experimental variogram calculation, kriging estimation, and sequential Gaussian and indicator simulation. For each application, three scenarios (small, large and extra large) are tested using a desktop environment with 4 CPU cores and a multi-node server with 128 CPU nodes. Elapsed times, speedup and efficiency results are shown.
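The first case study, experimental variogram calculation, is embarrassingly parallel: the loop over data rows can be split across workers and the partial sums reduced at the end, which is essentially what an OpenMP directive over the pair loop does. A Python sketch of that decomposition (threads stand in for OpenMP here; this illustrates the work split, not the performance of the optimized Fortran routines):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def _rows_contribution(args):
    """Semivariance sums for one block of rows (one parallel task)."""
    rows, x, z, lag, tol = args
    s, c = 0.0, 0
    for i in rows:
        d = np.abs(x - x[i])
        # pairs (i, j) with j > i whose separation falls in the lag class
        mask = (np.abs(d - lag) <= tol) & (np.arange(len(x)) > i)
        s += 0.5 * float(np.sum((z[mask] - z[i]) ** 2))
        c += int(mask.sum())
    return s, c

def parallel_variogram(x, z, lag, tol, n_workers=4):
    """Experimental semivariogram for one lag class, computed by splitting
    the row loop across workers and reducing the partial sums."""
    chunks = np.array_split(np.arange(len(x)), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        parts = list(ex.map(_rows_contribution,
                            [(c, x, z, lag, tol) for c in chunks]))
    total = sum(p[0] for p in parts)
    count = sum(p[1] for p in parts)
    return (total / count if count else float("nan")), count

x = np.array([0., 1., 2., 3.])
z = np.array([0., 1., 0., 1.])
gamma1, n1 = parallel_variogram(x, z, lag=1.0, tol=0.1)
```

The MPI layer described in the paper applies the same idea one level up, distributing row blocks across compute nodes before the final reduction.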